What’s New in MATLAB and Simulink R2022a
Learn about new capabilities in MATLAB® and Simulink® to support your research, design, and development workflows. This talk highlights new tools for increasing productivity, such as interactive apps and Live Editor tasks in MATLAB for automating tasks and calculations without writing code, and new features in Simulink for running simulations in parallel. You’ll also see new capabilities for workflows involving other tools, languages, and technologies, including using Python® with MATLAB, and exporting content from Simulink as standalone functional mockup units (FMUs). Additionally, new capabilities for sharing your MATLAB code and Simulink models will be showcased, including publishing MATLAB functions as Docker container-based microservices and generating a configurable MATLAB UI from a Simulink model.
Save the Earth: Accelerate Climate Science and Electrify Everything
The climate crisis is here. Engineers and scientists are engaged to help. Engineers innovate rapidly to decarbonize energy production, electrify everything, and design sustainable products. Scientists accelerate their research to inform climate adaptation and enhance understanding through advances in cloud computing and artificial intelligence. And educators train the next generation to take these advances even further. In this talk, you will learn how scientists and engineers use MATLAB® and Simulink® to tackle this great challenge—to save the earth and build a clean, electrified future.
Rolls-Royce Pathway to Net Zero
At Rolls-Royce, we have a long, proud history as pioneers of the power that enables the modern world to function. Our customers use our products and services in industries such as aviation, shipping, and energy generation.
The climate crisis means the way humans use power must become compatible with net-zero carbon. We believe technology can be a force for good, and that, as the world emerges from the pandemic, sustainable economic growth is possible. For us, the transition to net zero is both a societal imperative and the greatest commercial opportunity of our time.
However, no individual company, sector, or technology has all the answers. That’s why we are forging partnerships across borders, industries, and technologies to seek out—and scale—solutions that can get us to net zero.
One example is our partnership with MathWorks on the development processes for these new solutions. Our control designs and software are built on a family of model-based product lines called ECOSIStem. These enable innovation in our design teams, reduce time to market, and improve software quality. ECOSIStem supports the delivery of novel projects, such as Hybrid Aero Propulsion, Intelligent Engine, and our Small Modular Reactor program, that are part of the journey to a net-zero future.
We have already pledged to reduce emissions from our own operations to net zero by 2030, and to play a leading role in enabling the sectors in which we operate to reach net zero by 2050. We are now laying out our technology pathway and setting clear short-term targets to show how we will achieve those goals.
Work Smarter, Not Harder – Electrifying Agriculture with Artificial Intelligence
Artificial intelligence is having a dramatic impact on farming operations. It removes the guesswork and instead helps farmers implement a precision-based approach to farming that offers increased efficiency and productivity, along with safety and environmental benefits. Praveen Penmetsa is leading the AI charge into agriculture with Monarch Tractor, a fully electric, driver optional, smart tractor. Join Praveen to learn about the challenges today’s farmers face and how AI is helping solve those problems for a better tomorrow.
Advancing AI and Data Science Through Industry/Academia Collaboration
The artificial intelligence (AI) and data science workforce continues to expand, and industry demands more talent to meet its needs. As these technologies accelerate rapidly in industry, universities face the challenge of aligning educational experiences with evolving workforce needs. There is also growing and urgent recognition of the potential for racial and other forms of bias to shape AI and data science, with significant social and economic impacts on individuals and communities.
Developing mutually beneficial industry/academia collaborations can bring novel insights that enhance both research and educational experiences while fostering diversity, equity, and inclusion. To drive real social impact, cross-sector efforts must also build trust to catalyze AI/data science advances that benefit society. These high-impact relationships can lead to new courses, programs, and research endeavors that will enhance AI/data science technologies and the workforce. The Atlanta University Center Data Science Initiative works across Clark Atlanta University, Morehouse College, Morehouse School of Medicine, and Spelman College and facilitates collaborations with industry.
The Initiative developed a new AI course that infused AI4ALL’s curriculum with current ethical challenges. A partnership with NielsenIQ resulted in a hands-on course where students develop strategies to address a posed business case and connect with working professionals. Through a collaboration with Greenlink Analytics, faculty will use mapping tools that uncover inequities through data. With support from Google Health, social work faculty will use data to drive the development of better treatments and methodologies that address social determinants of health. In addition, partnerships with UnitedHealth, The Coca-Cola Company, SAP, Truist, The Home Depot, and Uber expand participation in AI/data science and encourage new approaches to addressing bias and ethics in AI/data science.
This session will explore how to develop effective collaborations that lead to educational pathways for all students. It will also show how researchers in the Atlanta University Center are developing new ways to inform the development of AI and data science that minimize racial biases.
The Electronic System Architecture Modeling (eSAM) Method
Gulfstream will describe the novel Electronic System Architecture Modeling (eSAM) method that they developed over the last three years, which utilizes System Composer™. eSAM models the integration of electronic components, subsystems, and systems at the data-exchange level. The model serves as a rich electrical interface control document (ICD). We will demonstrate the customizations that MathWorks helped to create to support the realization of the eSAM vision.
We will present key characteristics of the eSAM method:
data and message model constructs
part reuse through template-instance modeling
component allocation to shared platform resources (computing and input/output interfaces)
layered modeling that leverages a black-box integration approach
It’s critical that the eSAM modeling process be easy to learn, because Gulfstream plans to roll out the modeling approach to its supplier base. System suppliers will deliver an eSAM-compliant model, and Gulfstream will integrate those models to form an integrated aircraft system model. In keeping with a true model-based development (MBD) approach, lower-level, platform-specific tools read in the System Composer eSAM model and autogenerate the configuration for the data routing between systems—for example, network switch routing tables and data gateway configurations. This alleviates the need to manually edit low-level configuration files, which are typically in the form of XML. The model is also used to generate electrical ICD reports that document the system interfaces.
The eSAM method was greatly enabled by new System Composer capabilities in the R2021b release. The modeling method is still in a development and maturation phase. Gulfstream is looking for interested partners to learn more about the eSAM modeling approach and help mature the modeling standard. If you are interested, please contact the project lead, Chris Watkins, via email: firstname.lastname@example.org.
How Is Shell Driving Its AI Future?
Scientists and engineers at Shell have been using MathWorks products for 30 years to solve their technical challenges. With the revolution in cloud computing and the explosion in open-source technologies and APIs over the last decade, Shell has been on a strategic journey to develop its own digital transformation roadmap. Our most valuable assets are our data and the people who understand that data. Extracting the most from our data in terms of insights, predictions, and enablement also requires working with strategic partners, solution providers, and vendors. We are proud to have MathWorks as part of our ecosystem and transformation. MathWorks and Shell have worked closely together through DevOps to accelerate operationalizing projects across the business and the end-to-end value chain. Notable deployments include Quest, a CO2 surface scanning and monitoring solution above Shell’s first carbon capture and storage (CCS) installation; MADA, an exploration tool allowing geoscientists to analyze stratigraphic analogues; and several manufacturing tools that allow a plant to run optimally. MathWorks tools such as MATLAB Production Server™ and MATLAB Web App Server™ form part of the Shell.ai self-service DevKit, which is the technology stack backbone for developers, scientists, and technicians to develop, build, test, and deploy their AI solutions. Looking to the future, Shell is committed to strengthening its partnerships through the Open AI Energy Initiative (OAI) and framework. The intent is to have an integrated, interoperable platform that allows for rapid development and deployment of AI solutions at scale that can be commercialized, supported, and maintained at an affordable cost. Shell understands that a cross-sector approach is the best way to develop AI tools for the energy industry and its transition to new energies.
Wi-Fi Ranging: Delivering the Ranging and Location Technologies of Tomorrow, Today
Wi-Fi ranging technology uses time-of-flight measurements to estimate the distance between two Wi-Fi devices. For over a decade, this technology has been enabling application developers and other solutions implementers to provide a variety of services, including indoor navigation, asset tracking, geofencing, access control (locking/unlocking), and device operation, all with increased accuracy and performance without sacrificing real estate or overall bill of materials (BOM) cost. Since its implementation in 2009, Qualcomm Wi-Fi ranging technology has been shipping in billions of devices globally, with clear signs of accelerated adoption. Ranging capabilities have also continued to improve over multiple generations. This progress has led to greater levels of accuracy and performance, enabling a wide range of potential use cases. This presentation provides insights into the history, use cases, performance factors, and near-term innovations of Wi-Fi ranging technology. It also highlights statistical methods that application developers and other solutions implementers can use to enhance ranging accuracy in their applications. It draws on the results of extensive measurement campaigns using Qualcomm Wi-Fi ranging technology to demonstrate achievable ranging accuracies in real-world scenarios.
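As background, the core time-of-flight arithmetic behind round-trip-time (RTT) ranging can be sketched in a few lines of MATLAB. All numbers below are illustrative assumptions, not Qualcomm measurements:

```matlab
% Illustrative RTT ranging arithmetic (all values are assumed, not measured)
c = 299792458;                 % speed of light (m/s)
t_rtt  = 133.4e-9;             % measured round-trip time (s)
t_proc = 66.7e-9;              % responder turnaround time reported by the peer (s)
d = c * (t_rtt - t_proc) / 2;  % one-way distance estimate, ~10 m here

% Statistical post-processing: a robust average over repeated noisy
% measurements reduces the error of any single estimate
rtts  = t_rtt + 5e-9*randn(100,1);        % 100 noisy RTT samples (assumed noise)
d_est = c * (median(rtts) - t_proc) / 2;  % median is robust to multipath outliers
```

The median in the last line stands in for the kinds of statistical methods the talk discusses; real implementations use more sophisticated filtering.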
System-Level Simulation and Testing of an Aperture Array Beamformer
A multi-element, multi-beam aperture array beamforming system is being developed as part of the Expanded GMRT (eGMRT) project. This prototype system is part of the proposal aimed at improving the scientific capabilities of the Giant Metrewave Radio Telescope (GMRT), one of the most sensitive instruments in the world for observing celestial objects at meter-wave radio frequencies. The initial prototype beamforming system, built on an FPGA and operating in the L-band (1.1-1.7 GHz), is undergoing testing in a free-space test range at the GMRT site. In parallel with the efforts towards the final prototype development, we carried out an end-to-end simulation of the beamforming system by modeling the Vivaldi antenna array, RF and analog signal processing systems, and digital beamformer using MATLAB® and Simulink®. The free-space testing was simulated by modeling the transmitting antenna, propagation channel, and sources of interference for testing multiple beams. The simulation was carried out in Simulink using multiple toolboxes for accurate modeling of the beamformer, including the antenna array, test range, and array signal processing algorithms. We tested phased array beamsteering, nulling, and maxSNR beamforming techniques for linear and rectangular arrays in the simulation and compared them with the experimental results. We kept the model parameters for the beamforming system simulator similar to those in the actual system. This enabled us to see the difference in performance between the actual and simulated systems. Optimal beamforming was simulated in a case with mutual coupling between the antenna elements to understand its effect on the array covariance matrix, signal-to-noise ratio, and the array calibration process. This presentation will describe the simulator architecture and various beamforming tests and compare the results with the actual system performance.
The simulator has a workflow-based framework to encapsulate future requirements of the project and to support diverse array signal processing and beamforming applications.
Developing Error Report Generation Software for Synthetic Aperture Radar
Synthetic aperture radar (SAR) is a technique that synthesizes a long antenna by exploiting the motion of the sensor, using ground-based signal processing to achieve finer azimuth resolution over a larger imaging swath. SAR system design is an iterative process. It involves the derivation of system parameters from the user requirements and takes orbital geometry and practical system realization feedback into the loop. For the system design, we developed software in MATLAB® using Phased Array System Toolbox™ and Antenna Toolbox™.
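For orientation, the "derivation of system parameters" typically starts from textbook first-order relations like the ones below; the numeric values are assumptions for illustration, not the actual payload's parameters:

```matlab
% First-order SAR design relations (standard approximations)
c  = 299792458;       % speed of light (m/s)
B  = 75e6;            % transmitted chirp bandwidth (Hz), assumed
La = 6;               % physical antenna length in azimuth (m), assumed

rangeRes   = c / (2*B);  % slant-range resolution: ~2 m for 75 MHz bandwidth
azimuthRes = La / 2;     % stripmap azimuth resolution: ~3 m, independent of range
```

The counterintuitive second relation, where a shorter antenna gives finer azimuth resolution, is exactly what the synthetic aperture buys.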
During the development of SAR payloads, we focused on developing automated checkout systems to process and validate the payload data. Our goal was to reduce processing and validation time and have minimum human interference. We also developed an automated error report generation software to process and validate SAR payload data using MATLAB, Symbolic Math Toolbox™, Signal Processing Toolbox™, and MATLAB Report Generator™. We found that the software was user friendly and achieved the automation objective quickly.
5G Vulnerability Analysis with Reinforcement Learning Toolbox
Challenge: 5G is a disruptive technology that transforms our society, yet there are many potential attack vectors that threat actors could take advantage of. To better protect our critical infrastructure and the devices on them, we need to identify and address as many vulnerabilities as we can.
Obstacle: A 5G infrastructure comprises many components and is used in many different environments. Its complexity and dynamic nature add to the challenge of identifying vulnerabilities. Data security, user privacy, confidentiality, integrity, and availability are just some of the obvious concerns with 5G. And these complicated problems cannot be solved by traditional methods.
Solution: Our 5G security team built 5G models in a synthetic simulation environment and identified threat vectors based on industry consortiums like 3GPP and NSA’s ESF. An AI-based solution was developed using Reinforcement Learning Toolbox™ to expose 5G vulnerabilities and optimize attack patterns based on an objective function. Our 5G security team identified potential mitigation techniques and used the digital twin environment to assess their effectiveness.
Tools used: MATLAB®, Reinforcement Learning Toolbox, Deep Learning Toolbox™
The simple drag-and-drop GUI and many features in Reinforcement Learning Toolbox made it easy for our engineers to analyze 5G vulnerabilities, come up with optimized solutions, and use the metrics for verification and validation purposes. As a result, our RL model achieved a 100% accuracy score. The fast built-in math and function libraries shortened development and analysis time, and the responsive technical support team solved issues quickly and professionally.
Modeling Radar and Wireless Coexistence
Congestion in the radio frequency (RF) spectrum and increasing demand for RF bandwidth have resulted in spectrum sharing between different RF users, including radar systems and wireless communication systems. Coexistence between radar and communication systems introduces unwanted effects such as interference or blockage. By understanding these effects, proper mitigation techniques can be developed to ensure satisfactory performance of both systems.
See how to model a scenario with a radar system and a wireless communication system working in the same environment and using the same spectrum. Assess the possible radar performance degradation and evaluate different mitigation techniques.
Learn how to:
Model a radar system in the vicinity of a wireless communication system, including RF and antenna subsystems, waveform designs, and a realistic propagation environment
Generate the IQ signal at the radar receiver due to target reflections and the broadcast wireless communication signal
Sense and identify received waveforms; apply beamforming techniques to avoid or minimize interference effects
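A minimal sketch of such a shared-spectrum scenario, assuming Phased Array System Toolbox and placeholder parameters (this is not the session's actual model), might look like:

```matlab
% Radar pulse propagating to a 5 km target while an in-band communication
% signal leaks into the radar receiver (all parameters are assumed)
fs = 1e6; fc = 3e9;
wav  = phased.LinearFMWaveform('SampleRate',fs,'PulseWidth',50e-6, ...
    'SweepBandwidth',0.5e6,'PRF',1e3);
tx   = phased.Transmitter('PeakPower',1e3);
chan = phased.FreeSpace('SampleRate',fs,'OperatingFrequency',fc, ...
    'TwoWayPropagation',true);
tgt  = phased.RadarTarget('MeanRCS',1,'OperatingFrequency',fc);

x    = tx(wav());                                       % one transmitted pulse
echo = tgt(chan(x,[0;0;0],[5e3;0;0],[0;0;0],[0;0;0]));  % two-way path to target

n = (0:numel(echo)-1).'/fs;
interf = 1e-6*exp(1i*2*pi*0.2e6*n);   % crude stand-in for the comms signal
rx = echo + interf;                   % received mixture to detect and mitigate
```

From a mixture like `rx`, one can then study detection degradation and try beamforming or filtering as mitigation.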
Wireless Standards and AI: Enabling Future Wireless Connectivity
Wireless standards enable global wireless connectivity and access. Modern standards such as 5G, Wi-Fi, satellite communications, and Bluetooth promote R&D in the industry and drive innovations forward. Applications of artificial intelligence (AI) techniques, used to optimize these systems, are also becoming more prevalent. Explore capabilities in MATLAB® for wireless system design and applications of AI to tackle design challenges. Through case studies and reference examples, you will learn about:
What is new in 5G, Wi-Fi, satellite communications and Bluetooth
Coexistence modeling and performance assessment in the presence of interfering signals
Application of AI for wireless design (spectrum sensing, DPD, beam management)
Standards-based positioning, localization, and ranging applications
Secure, Automated, Internet-Based mmWave Test and Measurement with Xilinx RFSoC
Wireless system design at mmWave frequencies presents unique challenges, calling for multidisciplinary engineering teams with deep expertise in RF system design, digital signal processing, embedded software, and emerging wireless standards, often scattered across geographies and home offices.
Join engineers from Avnet and Rohde & Schwarz as they demonstrate test solutions for Avnet’s Wideband mmWave Radio Development Kit for Xilinx Zynq® UltraScale+™ RFSoC Gen-3. You will observe automated measurement techniques based in MATLAB® using Rohde & Schwarz’s 5G NR signal generation and analysis for fast, remote testing through the new Rohde & Schwarz Secure Application Gateway™.
Explore the tradeoffs of testing critical wireless metrics, including ACLR and EVM, both 5G NR and standard-agnostic, from R&D to production. We will also investigate the performance of the bits-to-mmWave radio in terms of frequency response and distortion for frequency planning or optimizing the system design.
Connecting MATLAB to USRP for Wireless System Design
Learn how NI and MathWorks are collaborating to accelerate wireless system design with software-defined radios (SDRs) connected to MATLAB®. This session will introduce you to Wireless Testbench™, the latest toolbox for MATLAB that helps engineers explore and test wireless reference applications in real time on SDR hardware. You will see how MATLAB and supported SDR hardware from Ettus Research™, an NI brand, can be used for over-the-air wireless testing at maximum sample rates (250 MSPS). You can perform spectrum monitoring and signal detection of standards-based and custom signals by leveraging FPGA processing onboard the supported USRP devices. You will learn how to run MATLAB scripts to execute, configure, and probe the applications running on the SDR hardware.
Hands-On Workshop: Pocket AI and IoT: Turn Your Phone into a Smart Fitness Tracker
In this hands-on, interactive workshop, you will learn how to use sensors, AI, and IoT to build a smart fitness tracker with your own mobile device. Learn the basics of sensors, AI, machine learning, and IoT required for building this application. We will demystify the buzz around sensor analytics, machine learning, and IoT, and show you how easy it is to build smart wearables. The hands-on exercises will leave you energized, engaged, and ready to take on challenging projects in this domain. Discover how MATLAB Online™, MATLAB Drive™, ThingSpeak™, and machine learning algorithms (k-NN) are seamlessly integrated and used within MATLAB Mobile™.
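As a taste of the classification step, here is a toy k-NN example on synthetic accelerometer-style features. The workshop uses real phone data logged through MATLAB Mobile and ThingSpeak; everything below is made up for illustration:

```matlab
% Classify activity windows from two simple features: mean and standard
% deviation of acceleration magnitude (synthetic stand-in data)
rng(0)
n = 200;
walking = [9.8 + 2.0*randn(n,1), 2.0 + 0.5*randn(n,1)];   % high-variance windows
sitting = [9.8 + 0.2*randn(n,1), 0.1 + 0.05*randn(n,1)];  % low-variance windows
X = [walking; sitting];
Y = [repmat("walking",n,1); repmat("sitting",n,1)];

mdl   = fitcknn(X,Y,'NumNeighbors',5);   % k-NN classifier, k = 5
label = predict(mdl,[9.9 1.8]);          % classify a new feature window
```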
Error Mode Identification in Gas Turbines Through Predictive Maintenance
MAN Energy Solutions SE is a German multinational company that produces diesel engines and turbomachinery for marine and stationary applications. The plant in Oberhausen builds gas turbines for mechanical drive applications (mainly pipeline compressors) and power production. The engines are distributed all over the world, often to remote areas where machine failure can have severe consequences.
MathWorks engagement started in 2020 with a proof of concept using machine learning techniques to detect error conditions in gas turbines. The goal of this consulting project is to automate the time-consuming process of visualizing and manually evaluating measured sensor data, in order to determine gas turbine error conditions at an early stage.
We used GateCycle™ to generate simulated gas turbine data for two different machine types. For each machine type, we generated two data sets: one comprehensive gridded set for training and validation and one random set for testing. We used the training/validation set to build a model in MATLAB® that predicts the error type and error magnitude with high accuracy (classification validation error ~1%). The model is a combination of a classifier (predicting the error type) and a regression model (predicting the magnitude of the error). Then, we tested the model against the test data (classification test error < 10%).
Since the number of sensors in the engine is usually limited, we then reduced the number of predictors from the full set (21 predictors) down to a minimal set of 6 predictors. Even with a set of only 10 predictors, the accuracy remained in the targeted range (classification test error < 10%).
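Predictor reduction of this kind can be sketched with sequential feature selection in MATLAB; the data below is a synthetic stand-in for the simulated turbine data:

```matlab
% Sequential forward selection over 21 candidate predictors (synthetic data)
rng(1)
X = randn(500,21);                          % 21 candidate sensor predictors
y = double(X(:,3) + X(:,7) - X(:,12) > 0);  % labels driven by three of them

% Criterion: held-out misclassification count of a simple classifier
critfun = @(XT,yT,Xt,yt) sum(predict(fitcdiscr(XT,yT),Xt) ~= yt);
tokeep  = sequentialfs(critfun,X,y,'CV',5); % 5-fold cross-validated selection
selected = find(tokeep);                    % indices of the retained predictors
```

Swapping in the project's actual classifier for `fitcdiscr` and sweeping the allowed subset size reproduces the kind of accuracy-vs-sensor-count trade-off described above.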
The next step of the project will be to test the model against real field data. The long-term goal is the integration of the application into the engine control system environment, so engine condition monitoring can be done in real time.
Designing a Lidar Sensor Classifier Using a MATLAB Framework
A deep learning-based classifier is an important component in the environment perception of a moving vehicle, and the classified output is essential for the rest of the vehicle data chain. Several sensors in the vehicle network provide environment inferencing; our experiments focused on lidar, designing a classifier for object classification from lidar sensor data. Our approach targeted the needs of a production environment, realizing an open-source deep network module using MATLAB® and Deep Learning Toolbox™. We used this toolchain to design the DNN, fine-tune it, and debug missing layers. Once the model ran in the MATLAB environment, we generated production code using Embedded Coder® for deployment to the embedded platform. The generated code was then ready for compiling and deployment on NVIDIA Xavier hardware.
Automating an Audio Labeling Workflow with Deep Learning for Voice Activity Detection
Deep learning models require labeled data for training purposes. Labeling the data is an important step. For audio files, labeling involves analyzing segments of audio, listening to them, and manually assigning appropriate labels to specific time slots in the audio files. Such an audio labeling workflow is labor intensive and slows the development cycle. For instance, to train a deep learning model to classify segments into speech or noise for voice activity detection (VAD), we need to label the speech and noise segments in the audio files. In this presentation, we’ll discuss our experience using a pretrained deep learning model in a labeling algorithm and show how we automated an audio labeling workflow for VAD, thereby reducing development cycle time.
MATLAB with TensorFlow and PyTorch for Deep Learning
MATLAB® and Simulink® with deep learning frameworks, TensorFlow and PyTorch, provide enhanced capabilities for building and training your machine learning models. Via interoperability, you can take full advantage of the MATLAB ecosystem and integrate it with resources developed by the open-source community. You can combine workflows that include data-centric preprocessing, model tuning, model compression, model integration, and automatic code generation with models developed outside of MATLAB.
Explore the options and benefits, along with examples, of the various interoperability pathways available, including:
Importing and exporting models between MATLAB and TensorFlow, PyTorch, and ONNX
Coexecuting MATLAB alongside installations of TensorFlow and PyTorch
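Illustrative calls for these two pathways (the file and folder names are placeholders):

```matlab
% Import a TensorFlow SavedModel folder as a MATLAB network
net = importTensorFlowNetwork("myTFModelFolder", ...
    'OutputLayerType','classification');

% PyTorch models are commonly exchanged via the ONNX format
net2 = importONNXNetwork("model.onnx",'OutputLayerType','classification');

% Co-execute: call an installed TensorFlow directly through the py interface
tf  = py.importlib.import_module("tensorflow");
ver = char(py.getattr(tf,"__version__"));   % version of the Python-side install
```

Import converts the model into MATLAB layer objects; co-execution leaves it running in Python and only exchanges data, so the right choice depends on whether you need code generation or just results.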
Fitting AI Models for Embedded Deployment
AI is no longer limited to powerful computing environments such as GPUs or high-end CPUs; it is increasingly integrated into resource-limited systems such as patient monitors, diagnostic systems in vehicles, and manufacturing equipment. Fitting AI onto hardware with limited memory and power supply requires deliberate trade-offs between model size, accuracy, inference speed, and power consumption—and that process is still challenging in many frameworks for AI development.
Optimizing AI models for limited hardware generally proceeds in these three steps:
Model Selection: Identify less complex models and neural networks that still achieve the required accuracy
Size Reduction: Tune the hyperparameters to generate a more compact model or prune the neural network
Quantization: Further reduce size by quantizing model parameters
Additionally, especially for signal and text problems, feature extraction and selection result in more compact models. This talk demonstrates model compression techniques in MATLAB® and Simulink® by fitting a machine learning model and pruning a convolutional network for an intelligent hearing aid.
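As one concrete illustration, the quantization step might look like the sketch below, which assumes a previously trained network `net` and a calibration datastore `calDS` (both placeholders), plus the Deep Learning Toolbox Model Quantization Library:

```matlab
% Quantize a trained network's weights and activations to int8
quantObj = dlquantizer(net);   % net is a previously trained deep network
calibrate(quantObj, calDS);    % collect dynamic ranges on calibration data
qnet = quantize(quantObj);     % quantized network for simulation or deployment
```

Calibration data should resemble deployment data, since the collected dynamic ranges determine how much accuracy survives the reduction to int8.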
Machine Learning with Simulink and NVIDIA Jetson
Deep learning and machine learning techniques have the ability to solve complex problems that traditional methods can’t adequately model, such as detecting objects in an image or accurately predicting battery state-of-charge based on current and voltage measurements. While these capabilities by themselves are remarkable, the AI model typically represents only a small piece of the system. Edge and embedded systems are driven by the increasing number, performance, and bandwidth of sensors. This in turn is driving the need for higher performance computing in the systems that integrate and process the sensors, and for software that enables easy and quick deployment. Explore how to leverage the NVIDIA Jetson™ platform and Simulink® with a Model-Based Design approach to AI, making the complexity of such systems more manageable. Learn how to use simulation for adequate testing and enable easy deployment across devices.
Hands-On Workshop: Low-Code AI: Making AI Accessible to Everyone
Learn how you can apply AI in your field without extensive knowledge in programming. This hands-on session includes a quick recap on the fundamentals of AI and three exercises where you will learn how to classify human activities using MATLAB® interactive tools and apps:
Accessing and preprocessing data acquired from a mobile device
Applying clustering to the unlabeled data using the Cluster Data Live Editor Task
Classifying the labeled data using two apps: Classification Learner and Deep Network Designer
At the end of the workshop, you will be able to design and train different machine learning and deep learning models without extensive programming knowledge. You will also learn how to automatically generate code from the interactive workflow. This will not only help you to reuse the models without manually going through all the steps but also to learn programming or advance your coding skills.
Data-Centric AI for Signal Processing Applications
For some applications like autonomous driving, speech recognition, or machine translation, the adoption of AI can count on large datasets and abundant research. In those domains, investments most often focus on improving system performance through the design of ever more complex machine learning and deep learning models. On the other hand, in most industrial signal processing applications, data tend to be scarce and noisy, tailored models very rare, and traditional AI expertise hard to find.
This talk focuses on how data-centric workflows driven by domain-specific expertise can be used to significantly improve model performance and enable the adoption of AI in real-world applications. Learn more about signal data and specific recipes for improving data and label quality, reducing variance and dimensionality, and selecting optimized feature-space representations and signal transformations. Explore popular simulation-based methods for data synthesis and augmentation, and review the latest options for selecting suitable AI models to use as a starting point.
Cleaning and Preparing Time Series Data
Time series data are everywhere. Whether it is from sensors on automated vehicles and manufacturing equipment, meteorological data, or financial data from the equities market, it helps us understand the behavior of a system over time. However, real-world time series data can have many issues like missing data, outliers, noise, etc. The data needs to be cleaned and prepped first before it can be analyzed or used for model development. Unfortunately, it is not always clear how to clean this data. Which algorithm should be used for filling missing values? Should outliers be removed first or noise? How is data that is measured using different sample rates synchronized? The process is iterative and can be very time consuming. In this session, we will show you how to use timetables with the new Data Cleaner app and Live Editor tasks to identify and fix common issues in time series data. We will cover different data cleaning methods using both code and low-code techniques that can make the data prep process more efficient.
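The Data Cleaner app and Live Editor tasks generate code much like the hand-written sketch below (synthetic data and illustrative parameter choices):

```matlab
% Build a small timetable with gaps and an outlier, then clean it up
t  = datetime(2022,1,1) + minutes(0:9)';
v  = [1 2 NaN 4 5 60 7 8 NaN 10]';           % missing values and one outlier
TT = timetable(t, v);

TT  = fillmissing(TT, 'linear');                   % interpolate the gaps
TT  = filloutliers(TT, 'linear', 'movmedian', 5);  % fix outliers via moving median
TT2 = retime(TT, 'regular', 'mean', 'TimeStep', minutes(2));  % common sample rate
```

`retime` (and `synchronize`, for multiple timetables) answers the sample-rate question above; which fill and outlier methods to choose is exactly the iterative part the apps make easier to explore.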
Python for MATLAB Development
The py module in MATLAB® provides a binary interface to Python®. It opens the door to calling more than 350,000 Python modules directly from MATLAB, a truly amazing capability. This talk covers the primary steps that will help you take full advantage of Python from MATLAB:
Create Python virtual environments that pair well with your MATLAB installation
Test the stability of your MATLAB/Python environment
Extend the Python search path in MATLAB so that it can find your Python code
Pass MATLAB arguments to Python functions
Convert values returned by Python functions to native MATLAB variables
Write Python bridge modules to span language interface gaps
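In practice, the first few of these steps look something like this (the paths and module name are placeholders):

```matlab
% Point MATLAB at a Python virtual environment and sanity-check it
pyenv('Version', 'C:\envs\matpy\Scripts\python.exe');
disp(char(py.sys.version))                    % quick stability/visibility check

% Extend the Python search path so MATLAB can find your own modules
insert(py.sys.path, int32(0), 'C:\my_python_code');

% Pass MATLAB arguments (including keyword arguments) to a Python function
result = py.mymodule.process('input.dat', pyargs('verbose', true));
```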
Examples begin with simple one-line commands that read and write data to and from YAML files (with PyYAML) and progress to platform-independent ways to capture memory and CPU load (with psutil) and write Excel .xlsx files with arbitrary font styles, colors, and equations (with openpyxl). Python-to-MATLAB and MATLAB-to-Python variable conversions are demonstrated with the py2mat.m and mat2py.m utilities from Python for MATLAB Development.
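As a taste of the first example above, here is the pure-Python side of a YAML round trip with PyYAML; from MATLAB, the same functions are reached through the py interface (e.g., py.yaml.safe_load). The config contents are made up for illustration.

```python
import yaml  # PyYAML: pip install pyyaml

config = {"model": "drivetrain", "gains": [0.5, 1.2], "log": True}

text = yaml.safe_dump(config)    # Python dict -> YAML text (write this to a file)
restored = yaml.safe_load(text)  # YAML text -> Python dict (read it back)
```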
Biomechanical Analysis and Visualization
This presentation will describe the application of MATLAB® to the field of biomechanical analysis. Industrial analysts and academics worldwide use BoB Biomechanics to study human movement and calculate the forces acting on the body and generated in the bones. BoB is normally used with motion capture equipment and has been applied in sports analysis for performance enhancement and injury reduction for tennis, cricket, baseball, basketball, rowing, cycling, and weightlifting. BoB has also been used by a Formula 1 team to study the movement of pit-stop personnel, enabling the slower members to learn from the techniques of the faster ones. Learn how to develop and deploy the BoB package and its built-in mathematical tools and advanced graphics within the MATLAB environment.
Creating an Algorithm for Personalized Fitness Programming
Beginning with the principles of periodization and progressive overload that are fundamental to improving human athletic performance, and relying on the nearly two decades of experience that co-founder Aaron Adams brings as an elite athlete, personal trainer, and coach, we have created an algorithm that provides custom daily workouts tailored to each athlete. The athlete or coach provides key metrics to the algorithm such as their goals, current fitness level and abilities, and equipment and time available for training. The algorithm then builds unique daily workouts that help the athlete meet their goals with exercises they have the skills and equipment to complete.
The workouts specify weights and reps for given sets and provide recommended warm-up and cooldown work. Some give a predicted "score" (either time or reps) that the athlete should achieve based on what they've told the algorithm. An element of randomness is built into the algorithm, and given the significant number of variables at play, no two athletes are likely to ever receive the same long-term program. The workouts are accessed via our app, which is available on both iOS and Android devices. The app serves primarily as a delivery tool for the workouts and functions as an interface to the algorithm itself, which is deployed as a .NET DLL to a VM scale set hosted on Azure. With MATLAB Compiler™ and MATLAB Compiler SDK™, we can build the DLL directly within MATLAB® and deploy the exact version of the algorithm that is used for our development, testing, and analysis.
How to Turn Your Script into a Simple App
Custom-built apps are a great way to teach a concept, to automate common tasks, or to provide dashboards for interactively exploring complex data sets. And now with interactive controls in the Live Editor, if you can write a script, you can write an app. This talk will demonstrate how to convert MATLAB® scripts into simple notebook-style apps.
Hands-On Workshop: Using MATLAB with Python
Do you need to use MATLAB® and Python® together? MATLAB provides flexible, two-way integration with many programming languages, including Python. In this hands-on workshop, you’ll learn how to use MATLAB and Python together with practical examples. Specifically, you’ll learn how to:
Call Python libraries
Call user-defined Python commands, scripts, and modules
Manage and convert data
Package MATLAB algorithms to be called from Python
Mars Sample Fetch Rover: Autonomous, Robotic Sample Fetching
As part of the Mars Sample Return (ESA-NASA) mission, Airbus Defence and Space UK is developing the Mars Sample Fetch Rover. The rover's role is to autonomously drive and pick up samples left by the Perseverance rover on the Mars surface. This presentation will focus on the robotics at the front of the rover (camera systems and robotic arm) used to identify, grasp, and stow these samples without a human in the loop. This system is designed in MATLAB® and Simulink® and then translated into C code for flight software using automatic code generation tools.
Developing an Autonomous Cobot with Multimodal Control Using Model-Based Design
In recent years, diverse customer needs have led to increased demand for a wide variety of products, services, and solutions. Creating a human-robot coworking environment to satisfy these needs requires a flexible robotics system configuration.
Current robotic systems can be inflexible because robots are typically designed to carry out predetermined actions based on specific instructions. To cope with this problem, we are developing a robot that can perform multimodal control by combining the arm, hand, camera, and other sensor information. By comprehensively judging this information, an adaptive dynamic control of the robotics system can be constructed. Thus, flexible robotics movement can be performed in various products, tasks, services, and solutions.
In this session, we will introduce our development of a robot hand that incorporates multiple sensors. The robot hand and its controller were designed and verified using Model-Based Design with MATLAB® and Simulink®. Specifically, Simscape Multibody™ was used to model the motor for the robot hand and simulate the contact force acting between the robot hand and the grasped object. By linking the virtual and real control structures of the robot hand, we could seamlessly implement the control system built in Simulink on the hardware. At the end of the session, we will show the modeling of the robot arm and its trajectory planning, a practical example of linking the virtual robot arm controller to real robot arm movement, and the integrated simulation of the robot arm and hand using Robotics System Toolbox™ and ROS Toolbox. These autonomous coworking robot development processes can lead to the realization of cyber-physical systems (CPS).
Design and Simulate Scenarios for Automated Driving Applications
Development of advanced driver assist systems (ADAS) and autonomous driving applications often depends on simulation to reduce in-vehicle testing. The automotive industry is investing in standards like OpenSCENARIO to describe dynamic content in these driving simulation environments.
Learn how to author scenarios on realistic road networks designed in RoadRunner. You can use this workflow to simulate scenarios with built-in actors as well as author and integrate custom actor behaviors designed in MATLAB®, Simulink®, or CARLA. The scenarios can be exported to OpenSCENARIO for simulation and analysis in external tools if desired. Using this workflow, you can quickly author and simulate scenarios to gain system insight and test your designs.
Simulate and Deploy UAV Applications with SIL and HIL Workflows
Unmanned aerial vehicles (UAVs) are safety-critical systems where simulation and testing are essential for verifying control performance before conducting test flights. The latest developments in MathWorks tools let you integrate UAV onboard computers, ground control stations, and autopilots with plant models in Simulink® and scenario simulation using Unreal Engine® for various autonomous flight applications. In this talk, you will learn about:
Software-in-the-loop (SIL) workflow to deploy UAV waypoint following on an onboard computer (NVIDIA® Jetson™) and test it with PX4 SIL simulation
Hardware-in-the-loop (HIL) workflow using the PX4 Hardware Support Package and an onboard computer (NVIDIA Jetson)
Integrating a Simulink plant model for UAV dynamics with HIL simulation and using depth images from Unreal Engine to test the flight controllers for obstacle avoidance
Deployment on Pixhawk running in HIL mode
UAV scenario simulation with photorealistic scenes using Unreal Engine
The new UAV Scenario Designer app, which lets you design and play back UAV trajectories based on OSM city maps
Developing a Racing Catamaran Powered by Hydrogen
Within its Research and Innovation Department, Capgemini Engineering has launched the SOGREEN project. Based on the case study linked to the regulations of the Energy category of the Monaco Energy Boat Challenge, we are developing a racing catamaran propelled by a mix of energy sources: battery, solar panels, and hydrogen fuel cell. The objective is to master this hybrid propulsion system and to optimize it, particularly in the context of the demanding technical constraints imposed by the challenge.
To achieve this goal, we first built a model using Simulink® and Simscape™ libraries to provide a tool for sizing and simulating performance.
The control system (managing electrical power components and safety devices as well as recording sensor data) is implemented using a proprietary Capgemini Engineering component: the MUXlab.
Within the framework of a privileged partnership between MathWorks and Capgemini Engineering, we are developing libraries (MUXlink) and a methodology that make it possible to use models produced in Simulink directly within the MUXlab, after a simple conversion, without writing any lines of code.
Finally, the development and operation of the catamaran is an opportunity to develop and test a complete digital twin.
We raced our prototype in the 2021 Monaco Energy Boat Challenge and finished on the podium for our first participation.
Energy Storage Systems: A Flexible Grid Asset
Energy storage systems are increasingly used in grids around the world. This growth is enabled by many facets that are continuously evolving: regulatory frameworks, standards, system controls, battery performance, consumer education, and costs. At Evlo, we have facilitated the adoption of energy storage systems for the past seven years. We have deployed systems in small and large grids, including in extreme weather locations, using our proprietary real-time controller developed in part with Simulink®. In this presentation, we will demonstrate how energy storage systems can improve grid operations and facilitate increased usage of renewables.
Electric Drive Hardware-in-the-Loop (HIL) Testing: Skip the Beta Phase!
We used hardware-in-the-loop (HIL) testing to “skip the beta phase”—saving hundreds of thousands of dollars in time, materials, and rework by creating a system that tests software and control cards for a multimillion-dollar plant in real time, without the plant.
HIL technology ensures that our electric drive will perform the way we expect in the customer environment, at a fraction of the cost of an actual motor and additional testing infrastructure, and with reduced risk exposure for equipment and personnel. We can test day or night, with or without hardware. The innovation uses FPGA technology to simulate the electrical plant.
Enabling the Green Hydrogen Supply Chain with MATLAB and Simulink
Production of green hydrogen relies on the conversion of photovoltaic (solar) and/or eolic (wind) energy into hydrogen gas through electrolysis. This brings multidisciplinary challenges (concept design, planning, operation, maintenance) to guarantee a satisfactory return on investment. Learn how multidomain simulation empowers integration with the grid and energy storage, power electronics design, and techno-economic studies.
Once produced, hydrogen must be compressed and transferred from tanks to fuel cells. Electricity is then regenerated for electric propulsion in ships, trucks, or buses. But how can hydrogen, an extremely sensitive gas, be handled safely at all stages? Model-Based Design provides a solid foundation for this. In this session, you will see how safe valve and cooling controls are modeled with Simulink®.
Finally, it may be that your company is responsible for system integration. Hydrogen-based fuel cells are acquired and integrated in a complex multidomain system and co-exist with other energy entities (e.g., battery, diesel generator). How can you maximize the return for all assets? Can testing be enabled by desktop models used early in the development cycle? Join us to find out how the MathWorks toolchain makes it possible.
Deploying Motor Control Algorithms to a TI C2000 Dual-Core Microcontroller
Newer power semiconductors such as SiC and GaN are enabling higher sample rates for motor control applications. Implementing motor control algorithms on a multicore microcontroller or system-on-a-chip (SoC) processor may require an understanding of execution timing and delays due to peripherals, sensors, and device drivers. Simulation can help provide insight into how algorithms will execute and ensure proper timing is achieved. In this session, you will see how to use Simulink® and Model-Based Design to:
Model and simulate sensorless field-oriented control (FOC) using Motor Control Blockset™ and SoC Blockset™
Model the effects of device drivers and peripherals on implementation
Partition control algorithms for multicore execution
Schedule FOC algorithm execution between partitions
Generate code from the Simulink model to execute on the separate cores of the TI C2000™ dual-core microcontroller
Perform on-device profiling with streaming task execution and CPU utilization information
Rapid Prototyping of Embedded Designs Using NXP Model-Based Design Toolbox
Get to know NXP Model-Based Design Toolbox™—a connection between MathWorks and NXP ecosystems that allows rapid prototyping of complex embedded designs on NXP microcontrollers.
The Model-Based Design Toolbox can be used throughout all the application development phases, starting from an idea all the way to the final solution deployment.
Its integration with the rich MathWorks ecosystem allows ideas to be modeled as Simulink® designs while retaining the key advantages of the Model-Based Design paradigm, such as simulation, verification and validation, automatic code generation, and reusability. It also provides a streamlined way to run embedded applications on NXP microcontrollers by integrating and linking target-specific software such as drivers and libraries to Simulink applications, and it enables the cross-compilation of Simulink-generated code and its download to the microcontrollers.
Such embedded applications can range across multiple domains like automotive, industrial, and IoT, and can cover specific tasks like motor control, car access, lighting, or battery-powered solutions.
In battery-powered applications, such as electric vehicles or grid energy storage, battery management systems (BMS) play a key role in the optimal and safe power usage of the rechargeable cells. With MATLAB® and Simulink, a BMS can be designed under safe conditions, from logic control states to complex functionality like state of charge (SOC) estimation. This task can be completed either with traditional methods or by relying on data-driven models using artificial intelligence. Once the simulation results are satisfactory, the NXP Model-Based Design Toolbox can deploy the BMS application directly from Simulink on embedded targets. By using NXP Simulink blocks, users can control the microcontroller peripherals and also extend the hardware commands up to the analog front end, the NXP battery cell controllers.
In this presentation, we will highlight the main features of the NXP Model-Based Design Toolbox. We will demonstrate how to design a BMS application, covering the main development phases from idea to a prototype running on target.
Hands-On Workshop: Modeling Electrical Power Systems in Simscape Electrical
Explore modeling for electrical power system simulation using Simscape Electrical™. Through a worked example of a mixed AC/DC system, topics that will be covered include:
Introduction to Simscape Electrical
Selecting appropriate model fidelity for a given engineering task
Selecting appropriate solvers to effectively simulate a model of a given fidelity level
Ensuring behavioral consistency across a range of model fidelities
Using different sample-times for different physical networks to improve simulation performance
Preparing models for real-time simulation on multiple CPUs
Vehicle features and capabilities are transitioning from being mostly mechanically defined to being software defined. Premium vehicles today can have up to 150 million lines of code executing complex algorithms in their ECUs. OEMs are adopting agile software development methods to update and maintain software, driving the need for DevOps practices and tools. NXP, MathWorks, and AWS have collaborated on a DevOps solution for Model-Based Design for vehicle control algorithms. In this talk, AWS will present the full cloud-to-vehicle solution built with AWS developer tools by using MathWorks design tools and targeting execution on NXP vehicle network processors.
Deploying Cloud-Native MATLAB Algorithms in Kubernetes
Containers and microservice architectures have revolutionized IT infrastructure both in the cloud and on premises. Infrastructure as code has made application deployment repeatable and easily scaled, with zero downtime. Kubernetes is the industry standard for container orchestration, allowing you to abstract away where an application runs and to scale easily. Learn about our Kubernetes reference architecture and the new request-handling features that allow your production server-hosted function to handle arbitrary input.
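As a sketch of what calling a production server-hosted function looks like on the wire: MATLAB Production Server exposes deployed functions over a RESTful JSON API, where (per its documented request schema) the body carries the input arguments in an "rhs" array and the requested output count in "nargout". The archive and function names below (mathapp, mymagic) and the port are hypothetical; check the RESTful API documentation for the exact schema.

```python
import json

def build_mps_request(args, nargout=1):
    """Assemble a JSON request body for a MATLAB Production Server RESTful call."""
    # "rhs" holds the right-hand-side (input) arguments; "nargout" is the
    # number of outputs requested from the deployed MATLAB function.
    return json.dumps({"nargout": nargout, "rhs": list(args)})

# Hypothetical deployed function `mymagic(n)` in archive `mathapp`:
#   POST http://server:9910/mathapp/mymagic  with this body.
body = build_mps_request([3])
decoded = json.loads(body)
```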
Reuse of Simulink Components Within Chip-Level Design and Verification Environments
To keep pace with market needs for performance and cost efficiencies, the architecture of integrated circuits (ICs) needs to be highly optimized. Many semiconductor engineering teams are using Simulink® as their main tool to perform such optimizations, since it allows them to easily model and simulate multidomain systems, including analog and digital components. Moreover, this approach enables verification activities within Simulink, before the implementation phase has started, allowing engineers to find bugs early and shorten lead times. In this presentation, you will learn how Simulink design models and testbenches can be reused throughout the IC design cycle. We will demonstrate how design models can be converted to synthesizable RTL through HDL code generation and how Simulink verification components can be reused within UVM verification environments, avoiding duplication of effort between system-level activities (performed within Simulink) and RTL-level activities (performed using EDA tools).
Hands-On Workshop: Continuous Integration with MATLAB and GitHub Actions
Looking to get started with continuous integration and delivery (CI/CD) using MATLAB® with GitHub® Actions? Interested in learning how to automate the building and testing of a public MATLAB or Simulink® project on GitHub?
In this workshop, you will:
Use GitHub Actions to run the MATLAB tests for a public GitHub repository
Walk through the sample repository and the provided CI/CD configuration files
Use MATLAB Online™ to add new features and tests to the provided repository
To participate in this workshop, you will need two things:
Models Exchange and Virtual Integration with MATLAB and Simulink
See how Collins Aerospace Applied Research and Technology used Simulink® and the FMI standard to facilitate internal collaboration between Collins Aerospace teams and streamline external collaboration between OEM and Collins product owners. You’ll see a comparison of exporting and importing standalone FMUs versus a tool-coupling approach, identifying the difficulties and benefits of each. You’ll also explore multiple industrial and research use cases focusing on system and software validation and verification in a virtual environment, including exporting Simulink models to different external simulation ecosystems and importing third parties’ virtual test benches into Simulink.
A Software Shift Left by Utilizing Model-Based Design and MathWorks Code Generation Tools
The development of digital front-end SoCs for 5G and beyond faces time-to-market pressure. This imposes strong pressure for pre-silicon verification before an RTL freeze, ultimately building confidence for a tapeout. Software availability is the key enabler for testing large and complex integrated SoCs as one whole system. Unfortunately, the software is often on the critical path, forcing verification shortcuts like ad hoc test scripts. This wastes labor and squanders the opportunity for software shift-left testing.
Model-Based Design was used for both hardware and software reference design of a digital front-end subsystem, implementing an algorithmically complex closed-loop control system. This model offered a virtual platform to start software development even before any hardware was available. The software development was further reinforced with MathWorks code generation tools for fast code deployment and reduced software rewrites.
As a result, the design was verified with key test cases in a pre-silicon environment before an RTL freeze, for the first time, using real production software. This led to a considerable software shift left compared to earlier projects. Also, collaboration between algorithm, hardware, and software teams was improved because of the model-centric approach. Using Model-Based Design also enabled cross-team debugging of the problems found in pre-silicon verification directly in the reference model—for example, by finding more suitable parameter sets for test cases.
Adopting a new workflow and mindset is always challenging, but the transition was supported by targeted training held by MathWorks as well as technical support. The key takeaway from the challenges experienced is that dependencies from the model should be kept to a minimum, ensuring fast turnaround time in bug fixes and releases.
Fuel Cell Systems: The Challenge of Multiphysics Simulation
The push to reduce CO2 emissions is driving a change in drive systems in the mobility sector. While a clear trend towards electrification by means of battery electric vehicles (BEVs) can be seen in the passenger car segment, the picture is different for light and heavy commercial vehicles. With fuel cells, electrical energy is generated on board. In contrast to the BEV, this reduces weight and charging time, which is important in the commercial vehicle segment. As part of this work, Segula Technologies developed a multiphysical simulation tool for fuel cell system specification and control. Explore this holistic approach to modeling mechanical, electrical, hydraulic, and control systems with their potential technical limitations, as well as simulating performance. See examples of how to master these challenges using this tool.
Integrating AI-Based Virtual Sensors into Model-Based Design
Battery state of charge (SOC) is a critical signal for a battery management system (BMS). Yet, it cannot be directly measured. Virtual sensor modeling can help in situations like this, when the signal of interest cannot be measured or when a physical sensor adds too much cost and complexity to the design. Deep learning and machine learning techniques can be used as alternatives or supplements to Kalman filters and other well-known virtual sensing techniques. These AI-based virtual sensor models must integrate with other parts of the embedded system. In the case of a BMS, an AI-based SOC virtual sensor must be integrated with power limitation, fault detection, and cell balancing algorithms. Development of such a large and complex system requires integration, implementation, and testing of different components while minimizing expensive and time-consuming prototyping with actual hardware. Model-Based Design is a proven approach to accomplish this.
Learn how to develop virtual sensor models using feedforward neural networks, LSTMs, decision trees, and other AI techniques. Using the example of BMS SOC estimation, you will learn how to integrate AI models into Model-Based Design, so that you can test your design using simulation and implement it on an NXP S32K3xx board using automatic code generation. You will see how to evaluate and manage AI tradeoffs that span from model accuracy to deployment efficiency.
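To make the idea of a feedforward virtual sensor concrete, here is a minimal NumPy sketch of the network structure only: three inputs (voltage, current, temperature) mapped through one hidden layer to a SOC estimate bounded to (0, 1). The weights are random placeholders, not a trained model; the actual workflow trains such networks on measured charge/discharge data and deploys them via code generation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Untrained placeholder weights: 3 inputs -> 8 hidden units -> 1 output.
W1, b1 = 0.5 * rng.normal(size=(3, 8)), np.zeros(8)
W2, b2 = 0.5 * rng.normal(size=(8, 1)), np.zeros(1)

def soc_estimate(features):
    """Feedforward pass mapping [voltage, current, temperature] to a SOC estimate."""
    h = np.tanh(features @ W1 + b1)              # hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid keeps SOC in (0, 1)

batch = np.array([[3.7, -1.2, 25.0],   # [V, A, degC] samples (made up)
                  [4.1,  0.5, 30.0]])
soc = soc_estimate(batch)
```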
Hands-On Workshop: Automating Drone Analysis Using Simulation with MATLAB and Simscape
Are you ready to design a drone that delivers dinner? In this hands-on workshop you will use simulation to evaluate the performance of a food delivery drone on different missions. You will use MATLAB®, Simulink®, and Simscape™ to run tests and do the analysis. No experience with Simulink or Simscape is needed. We will guide you along the way.
Drones achieve maximum performance when engineers balance tradeoffs to create the best design. Evaluating the design under a wide range of conditions enables engineers to pick the combination of motor, battery, and payload parameters to meet this goal. Using simulation, you will automate the analysis of mechanical, electrical, and controller design of a quadcopter. You will use a Simscape model of the electromechanical system to evaluate the performance of the drone on delivery missions. The analysis will be automated using MATLAB and Python scripts, and you will explore the effect of payload size and speed of travel on the stability of the system. You will also use the simulation to determine the requirements of the electrical system and how to maximize range for sets of deliveries.
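The kind of scripted sweep described above can be sketched in plain Python; the power model and numbers here are invented stand-ins for the Simscape model, which supplies the real physics in the workshop.

```python
import itertools

def mission_range_km(payload_kg, speed_ms):
    """Toy surrogate for delivery range -- invented physics, illustration only."""
    battery_wh = 120.0
    # Crude power draw: hover baseline + payload penalty + cubic drag term.
    power_w = 150.0 + 40.0 * payload_kg + 0.08 * speed_ms ** 3
    flight_hours = battery_wh / power_w
    return speed_ms * 3.6 * flight_hours   # km covered on one charge

payloads = [0.5, 1.0, 1.5]   # kg
speeds = [8.0, 12.0, 16.0]   # m/s
results = {(p, v): mission_range_km(p, v)
           for p, v in itertools.product(payloads, speeds)}
best = max(results, key=results.get)   # payload/speed pair with longest range
```

In the workshop, each call to the surrogate would instead launch a Simscape simulation and log the result.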
You will gain a better understanding of modeling approaches that enable engineering design of mechatronic systems.
System and Software Development and Safety Analysis for Digital Product Development
The automotive industry has seen major changes in the span of just over a century. We are experiencing an exciting digital transformation. The need for new product development methodology, processes, tools, and architecture is amplified by trends such as electromobility, automated driving, and modern mobility services. For such applications, software is essential. Future vehicles will be distinguished by software and digital features that can be updated on an ongoing basis rather than classic characteristics such as engine displacement.
Our priority is to adopt well-defined processes, methods, and tools for digital product development by bringing in agility and predictability to key challenges:
Defining the digital blueprint for the systems upfront during product development
Adopting an MBSE approach in a holistic way
Synchronizing system architecture and software architecture
Creating traceability across the life cycle
Adhering to various standards like ASPICE and ISO 26262
Performing model-based safety analysis
Improving code quality
Reducing design effort and communication overhead
Adopting systems engineering methodology to provide a structured approach to solving complex engineering challenges
Using tools from MathWorks like System Composer™ to define digital blueprints for our engineering systems
Verifying dynamic behavior of our systems using MATLAB® and Simulink® during early stages of our development life cycle via modeling and simulation techniques
Generating production code from models to improve efficiency and quality
Easily achieving traceability between the system and other multidisciplinary elements (software, hardware, and mechanical), eliminating manual traceability and benefiting from features like suspect linking
Reducing overall development time by 30–40%.
Why Models Are Essential to Digital Engineering
Digital engineering is a trending industry buzzword. It's something that organizations strive to embrace and tool vendors claim to implement. But what is the practical reality behind the buzz? What are some of the essential aspects of an engineering ecosystem that actually provide the value promised? In this talk, Brian Douglas of Control Systems Lectures and MATLAB® Tech Talks, and Alan Moore, one of the original authors of SysML and co-author of “A Practical Guide to SysML,” discuss exactly these questions and show how models are a central and essential element of digital engineering.
Bridging System and Component Design for Vehicle Electrification Using Model-Based Systems Engineering (MBSE)
The increased complexity of vehicle architectures and the demand for new technologies like electrification are leading to a need for a systematic design approach. MBSE can address this requirement. TCS, in collaboration with MathWorks, developed an approach for adapting MBSE for vehicle electrification using System Composer™ and Simulink®. This will help organizations define system architecture, perform trade-off analysis, and improve stakeholder communication.
Explore the benefits of MBSE:
Seamless traceability across the artefacts from system requirements to architecture to detailed design
Improved stakeholder communication and reduction in overall development time
Increased collaboration between OEMs and suppliers at crucial stages of requirements and system specification finalization
Useful approach for OEM/Tier 1 suppliers already using MathWorks tools
Digital Transformation in Education: Lightning Round
Thought leaders in higher education will share best practices and examples of the digital transformation of their courses using MATLAB® and Simulink®. This session will be presented in a lightning round format. Several educators from around the world will share their thoughts and examples in less than 5 minutes each.
Using Virtual Twins for Distance Learning in Control Systems Labs
In 2020, amid the pandemic, the control systems lab for third-year mechanical engineering students at Hochschule Stralsund was changed to allow for distance learning, thereby reducing lab attendance time. The lab employed virtualization solutions from Quanser to conduct control experiments with MATLAB® and Simulink®, such as measuring the step response of a servo motor and designing a PID controller for the pitch of a 2DOF helicopter model. In this novel hybrid learning situation, each real experiment was paired with a virtual twin. Students conducted their control experiments off campus using the virtual twins, then reran their Simulink models later in the lab. This talk reports on the design of the hybrid learning environment and on the experience from the two semesters. Feedback from students clearly indicates that virtual twins may significantly improve the learning situation. The talk concludes with an outlook on the next stage of virtual labs at Hochschule Stralsund. In particular, the hybrid learning approach will be extended and rolled out to other labs. Furthermore, the Simulink and Quanser virtualization infrastructure will be hosted by the university's data center, thereby mitigating the dependency on the concrete PC environment of the lab participants.
Electric Drives: From Basic Models to Fuzzy and Neural Network Controllers
Today’s undergraduate students must understand electric drives for controlling electric vehicles, drones, robots, and more. These advanced applications require complex experimental and theoretical knowledge to develop industrial or academic applications that can help improve quality of life. However, conventional courses often fail to meet this need or to cover certain transversal skills (reasoning about complexity, self-knowledge, communication, innovative solutions) and disciplinary skills. Thus, a theoretical course that also lets students gain experimental knowledge can help build competencies that students can use in their professional lives. This course has demonstrated that students can propose practical solutions using MATLAB® and Simulink®—some have presented their proposals at conferences. We will show how to create an active approach in which students gain knowledge for designing rapid prototypes using the Tec-21 Challenge-Based Learning framework and Model-Based Design.
This course covers basic and advanced controllers such as PID, fuzzy logic, and neural network controllers. Some experiments are conducted using low-cost components such as Arduino® boards or DC motors. Students can use the same Simulink program to perform an advanced simulation in real time and rapidly move from basic controllers to advanced ones. Furthermore, students learn how to create a rapid prototype in a short period to find solutions to industrial problems. The course also gives an overview of real-time simulation as a path to high-fidelity results.
Electrification, AI, and the Future of Engineering Education
The electrification megatrend is driving the replacement of less efficient technologies and helping us achieve a more sustainable future. With the switch to power electronics, batteries, and electric machines of all sizes, it has become commonplace to deploy more and more embedded devices to control them.
At the same time, with more access to data and computing power than ever before, machine learning is providing us with new ways to develop algorithms. When combined with ever more electronic and programmable machines, we are facing the opportunity and the challenge to build increasingly autonomous systems.
How can engineers architect such complex systems, iterate quickly, and validate their designs along the way? For many companies across industries, from renewable energies to mechatronics or transportation, the answer is Model-Based Design. In this presentation, we will look at how they are leveraging MATLAB® and domain-specific tools, with Simulink® as an integration platform to model multidomain systems, validate their behavior, and deploy code for them.
With such convergence of mechanics, electronics, and software, how must the skills of future engineers evolve? We will share examples of how leading universities around the world are adapting their curricula to include more active learning with professional tools to help their students gain interdisciplinary skills and systems thinking.
Preparing Engineers for the Growing AI Workforce
Artificial intelligence (AI) is driving a massive change in the way engineers, scientists, and programmers develop and improve products and services. All engineering fields today use AI in one form or another, and many of today's industrial challenges call for engineers prepared to incorporate AI into their workflows. Find out how MathWorks tools empower engineers, including those with minimal AI experience, to develop better systems that use AI workflows. Additionally, we will discuss how to accelerate the incorporation of AI in engineering courses.
You will walk away from this session with a better understanding of how a constant dialog between industry and academia can prepare engineers for the AI megatrend.
Accelerating Research with a Personal MATLAB Parallel Cloud
A critical part of econometric analysis is having a systematic and formally designed model that gives reliability and confidence to the inference drawn. Monte Carlo experiments play an important role for econometricians testing a new model, but these frameworks require significant computing power. Given the rapid development of cloud computing services, MATLAB Parallel Server™ provides a convenient way to set up a personal parallel cluster for academic research. University students can take a similar approach even for curriculum-related work. In this talk, we will explore how using such a cluster can accelerate research that seeks to identify technological changes in the world’s productivity, as well as lessons learned from setting up and running the cluster.
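As a minimal illustration (not taken from the talk, and with invented numbers), a Monte Carlo study of an OLS estimator can be distributed across workers with `parfor`; the same code scales from a local pool to a MATLAB Parallel Server cluster:

```matlab
% Sketch: Monte Carlo evaluation of an OLS slope estimate, parallelized
% with parfor. Trial count, sample size, and the true slope are arbitrary.
nTrials = 10000;              % Monte Carlo replications
n       = 200;                % sample size per replication
betaHat = zeros(nTrials, 1);  % sliced output variable

parfor k = 1:nTrials
    x = randn(n, 1);          % simulated regressor
    y = 2*x + randn(n, 1);    % true slope = 2, Gaussian noise
    betaHat(k) = x \ y;       % OLS estimate via least squares
end

fprintf('Mean estimate: %.3f (true value 2)\n', mean(betaHat));
```

Because each replication is independent, the `parfor` loop needs no changes when the parallel pool is backed by a remote cluster instead of local workers.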
Hands-On Workshop: Introduction to Object-Oriented Programming with MATLAB
Object-oriented programming lets you define structures called objects that combine data together with functions that operate on that data. In MATLAB®, you can create objects that model the behavior of devices and systems in the real world. Those objects can then be used as building blocks in applications used to simulate and analyze complex systems. Using object-oriented programming in MATLAB, you can manage software complexity by organizing your code into logical components that are easier to maintain and extend. Your objects can evolve and change over time without introducing incompatibilities in client code. In this workshop you will learn about the benefits of object-oriented programming. You will create a simple class with properties and methods. You will use attributes to control the visibility and accessibility of properties and methods to create a well-defined interface and hide internal complexity. You will see how inheritance can be used to define a hierarchy of related classes.
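To make the workshop topics concrete, here is a minimal sketch (illustrative only; the class name and properties are invented, not workshop material) of a MATLAB class with properties, a method, and an access attribute:

```matlab
% Sketch: a simple class whose Speed property is readable from outside
% but settable only by the class's own methods.
classdef Motor
    properties (SetAccess = private)
        Speed (1,1) double = 0   % current speed, hidden behind the interface
    end
    methods
        function obj = accelerate(obj, delta)
            % Increase speed; callers never touch Speed directly
            obj.Speed = obj.Speed + delta;
        end
    end
end
```

Inheritance then follows the same pattern: a subclass such as `classdef ServoMotor < Motor` reuses the interface while adding or specializing behavior.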
The Women in Tech session brings together a panel of women engineers sharing their insights and journeys in the world of science and engineering. This live session provides an opportunity to learn, be inspired, and network. All MATLAB EXPO attendees are welcome to attend.