Keynote: Are You Ready for AI? Is AI Ready for You?
AI, or artificial intelligence, is powering a massive shift in how engineers, scientists, and programmers develop and improve products and services. Eighty-five percent of executives expect to gain or strengthen their competitive advantage through the use of AI, but is AI really poised to transform your research, products, or business?
In this presentation, head of MATLAB® Product Management, Michelle Hirsch, demystifies AI, challenging you to look for opportunities to leverage it in your work. You will also learn how MATLAB and Simulink® are giving engineers and scientists AI capabilities that were once available only to highly specialized software developers and data scientists.
Addressing Complexity in Automotive Software Using Model-Based Design
This presentation focuses on addressing complexity in automotive software using Model-Based Design.
What's New in MATLAB and Simulink
Learn about the new capabilities in the latest releases of MATLAB® and Simulink® that will help your research, design, and development workflows become more efficient. MATLAB highlights include updates for writing and sharing code with the Live Editor, developing and sharing MATLAB apps with App Designer, and managing and analyzing data. Simulink highlights include updates to the Simulation Manager that allow you to run multiple simulations in parallel and new smart editing capabilities to build up models even faster. There are also new tools that make it easier to automatically upgrade your projects to the latest release.
Data Analytics Applications
Predictive Maintenance Using MATLAB and Simulink
For many industrial applications, accurately determining the time to maintenance in advance avoids larger, costly fixes down the line. This talk will cover how MATLAB® and Simulink® provide a platform that lets you explore different machine learning, signal processing, and dynamic modeling techniques to develop an algorithm that can accurately determine when your machine will require maintenance. In case you don't have the sensor data required to train your algorithm, you can use Simulink models of your machines to generate synthetic data that is representative of faulty behavior. After you have trained and validated your algorithm, you can then integrate it with your embedded devices and enterprise IT platforms.
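The workflow described above (extract condition indicators from healthy and synthetically generated faulty signals, then train a classifier) can be sketched in a language-agnostic way. The Python below is a minimal illustration: the signal model, the RMS feature, and the threshold classifier are hypothetical stand-ins, not the session's actual algorithms.

```python
import math
import random

random.seed(0)

def rms(signal):
    """Root-mean-square amplitude, a common vibration condition indicator."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

def make_signal(faulty, n=256):
    """Synthetic vibration trace; a faulty machine gets an extra harmonic
    (a stand-in for fault data generated from a Simulink plant model)."""
    sig = []
    for i in range(n):
        x = math.sin(2 * math.pi * i / 32) + random.gauss(0, 0.1)
        if faulty:
            x += 0.8 * math.sin(2 * math.pi * i / 8)  # fault signature
        sig.append(x)
    return sig

# "Training": place a decision threshold between the mean healthy
# and mean faulty feature values.
healthy = [rms(make_signal(False)) for _ in range(20)]
faulty = [rms(make_signal(True)) for _ in range(20)]
threshold = (sum(healthy) / len(healthy) + sum(faulty) / len(faulty)) / 2

def needs_maintenance(signal):
    """Classify a new measurement against the learned threshold."""
    return rms(signal) > threshold
```

In practice the feature set would be richer (spectral peaks, kurtosis, remaining-useful-life estimates) and the classifier would come from a machine learning library, but the train-then-deploy shape is the same.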
Using Fleet Analytics and MATLAB to Build Strategies for BS-VI Development
The motivation for this project at Honda is to develop strategies for meeting BS-VI norms and to create tests for fuel economy and emissions constraints. In this project, data is collected in large volumes through telematics from field vehicles of different makes operating in multiple geographical and climatic conditions. These varying operating conditions and driving patterns lead to varying vehicle performance, drive efficiency, and emissions. They also lead to vastly different calibration needs, since performance changes with the riding and operating conditions of each field vehicle.
The main challenge in this project was the sheer volume of data that had to be processed before any valid analysis could be produced, a classic big data problem. Data exploration and feature engineering were aimed at understanding the fuel economy profile for different geographical areas, its variation with temperature and terrain, and at generating drive cycles that capture real-world driving scenarios. The Honda team also needed to scale up computation to reduce processing time on this huge amount of data.
In this presentation, you will learn how the key challenges described above were addressed using MATLAB® and toolboxes.
Shubham Garg, Honda
Scaling up MATLAB Analytics with Kafka and Cloud Services
As the size and variety of your engineering data have grown, so has the capability to access, process, and analyze those (big) engineering data sets in MATLAB®. With the rise of streaming data technologies and large-scale cloud infrastructure, the volume and velocity of this data have increased significantly, motivating new approaches to handling data in motion. This presentation and demo highlight the use of MATLAB as a data analytics platform together with best-in-class stream processing frameworks and cloud infrastructure to express MATLAB based workflows that enable decision-making in “near real time” through the application of machine learning models. It demonstrates how to use MATLAB Production Server™ to deploy these models on streams of data from Apache® Kafka®, showing a full workflow from developing a machine learning model in MATLAB to deploying it on a real-world-sized problem running on the cloud.
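The stream-scoring pattern described here (apply a trained model to records as they arrive and act in near real time) can be sketched without any Kafka-specific code. In this minimal Python sketch, `score` is a hypothetical stand-in for a deployed model, and a plain iterable stands in for the decoded Kafka topic:

```python
from collections import deque

def score(window):
    """Hypothetical stand-in for a deployed model: the windowed mean."""
    return sum(window) / len(window)

def process_stream(messages, window_size=5, limit=10.0):
    """Maintain a sliding window over the stream and record the indices
    where the model's score crosses the alert limit."""
    window = deque(maxlen=window_size)
    alerts = []
    for i, value in enumerate(messages):
        window.append(value)
        if len(window) == window_size and score(window) > limit:
            alerts.append(i)
    return alerts
```

A real deployment would replace the iterable with a Kafka consumer and the `score` function with a model hosted on MATLAB Production Server.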
Developing Optimization Algorithms for Real-World Applications
Efficient utilization of resources is a prime consideration when streamlining processes, whether the goal is to reduce operational or computational costs. Day-to-day applications require trying out various approaches and then selecting reliable optimization routines. For example, how do you decide how many components to purchase to increase production throughput while respecting constraints on component pricing? Given the huge amount of process data collected, how do you determine the sensitive parameters that significantly affect the output? What values should these parameters take, and which component settings should be tweaked, to maximize returns at minimal cost? These and many other factors may need to be considered to set up efficient workflows or design processes with the objective of maximizing rewards and minimizing risks.
This talk describes tools and techniques in MATLAB® that can help you make informed engineering decisions by introducing the traditional design optimization approach for tackling the above-mentioned scenarios. You will learn how to:
- Formulate the mathematical problem for a given task
- Use optimization techniques to arrive at the optimal solution
- Use discrete event simulation in conjunction with the optimization task
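As a concrete, toy version of the component-purchase question above: choose how many of each of two components to buy to maximize total yield within a budget. The prices, yields, and exhaustive search below are purely illustrative; a real workflow would formulate this for a solver rather than brute force.

```python
def best_purchase(prices, yields, budget):
    """Exhaustively search component counts (a, b) to maximize total
    yield subject to a spending budget. Feasible only for tiny problems."""
    best_value, best_counts = 0.0, None
    for a in range(budget // prices[0] + 1):
        for b in range(budget // prices[1] + 1):
            cost = a * prices[0] + b * prices[1]
            value = a * yields[0] + b * yields[1]
            if cost <= budget and value > best_value:
                best_value, best_counts = value, (a, b)
    return best_value, best_counts

# Hypothetical data: unit prices (3, 5), unit yields (4, 7), budget 12.
```

The brute-force loop makes the formulation explicit: decision variables (counts), objective (total yield), and constraint (budget), which is exactly what a solver-based formulation encodes symbolically.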
Tackling Big Data Using MATLAB
With the rise of analytics across industry segments, the size and complexity of the data collected have increased enormously. Handling and understanding that data becomes challenging, particularly when it does not fit in memory. MATLAB® provides a single, high-performance environment for building analytics and makes it easy, convenient, and scalable to analyze and process big data without having to learn big data programming.
Topics covered include:
- Accessing big data in a variety of formats (spreadsheets, images, text) from files, datastores, and the Hadoop® Distributed File System
- Visualizing, cleaning, and processing the data to analyze trends
- Running MATLAB based analytics on Apache® Spark™
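The idea that makes larger-than-memory analysis possible is simple: read the data in chunks and accumulate only small summary statistics. A minimal Python sketch of that pattern (the chunk reader is a stand-in for a datastore):

```python
def chunked(iterable, size):
    """Yield fixed-size chunks, mimicking a datastore read loop."""
    chunk = []
    for item in iterable:
        chunk.append(item)
        if len(chunk) == size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk

def streaming_mean(chunks):
    """Accumulate count and total per chunk, so the full data set
    never needs to fit in memory at once."""
    total = 0.0
    count = 0
    for chunk in chunks:
        total += sum(chunk)
        count += len(chunk)
    return total / count
```

Frameworks generalize the same pattern: map a computation over chunks, then reduce the per-chunk summaries.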
Controls and Embedded Systems
Designing Efficient Power Electronics Systems Using Simulation
Designing efficient power electronics systems has become critical with the evolving electric grid, the rise of electric vehicles, and the expansion of variable-speed motors for improving efficiency in industrial applications. Key challenges that power electronics engineers face in designing such efficient power converters include reducing the size of components, determining the various losses of the power electronics system, and designing feedback control algorithms and testing the power electronics controllers in real time.
In this talk, with the help of real-world examples, MathWorks engineers showcase how the above-mentioned challenges can be addressed using a simulation-based approach. You will learn more about how to:
- Model ideal and detailed nonlinear power electronics switches quickly
- Model multi-domain components in a single environment
- Design feedback control algorithms and perform real-time simulation
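As a toy illustration of the "determine various losses" step, here is a first-order loss budget for a single power switch: conduction loss as I²R plus a triangular-overlap approximation of switching loss. All device figures are illustrative assumptions, not values from the talk.

```python
def switch_losses(i_rms, r_on, v_dc, i_avg, f_sw, t_transition):
    """First-order losses for one power switch:
    conduction = I_rms^2 * R_on
    switching  = 0.5 * V * I * t * f for each of turn-on and turn-off."""
    p_conduction = i_rms ** 2 * r_on
    p_switching = 2 * (0.5 * v_dc * i_avg * t_transition * f_sw)
    return p_conduction, p_switching

# Hypothetical device: 10 A RMS, 10 mOhm, 400 V bus, 10 kHz, 100 ns edges.
```

Simulation-based loss analysis refines these back-of-the-envelope terms with detailed nonlinear switch models and thermal coupling.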
Lithium-Ion Battery Parameter Estimation for HIL, SIL, and MIL Validation
With the rising focus on electric vehicles, lithium-ion batteries are widely used in the automotive market (EVs and PHEVs). Battery cells are arranged to form battery packs, and it becomes very important to protect the battery from overcharge, deep discharge, and thermal runaway.
The battery management system (BMS) is used to estimate the state of charge (SOC) and state of health (SOH) and to protect the battery packs from failure. To validate and robustly test a BMS in a hardware-in-the-loop (HIL), model-in-the-loop (MIL), or software-in-the-loop (SIL) setup, accurate battery parameters are required.
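The SOC estimate mentioned above is often seeded with simple coulomb counting: integrate current over time against rated capacity. A minimal Python sketch, using a hypothetical 2 Ah cell; a real BMS adds open-circuit-voltage correction, temperature effects, and aging compensation:

```python
def update_soc(soc, current_a, dt_s, capacity_ah):
    """Coulomb-counting SOC update: integrate current (positive = discharge)
    against rated capacity, clamping the result to [0, 1]."""
    soc -= (current_a * dt_s) / (capacity_ah * 3600.0)
    return min(1.0, max(0.0, soc))

# Hypothetical 2 Ah cell discharged at 2 A for half an hour:
soc = update_soc(1.0, 2.0, 1800.0, 2.0)  # 0.5
```

Because coulomb counting drifts with sensor bias, validated battery parameters are what allow model-based estimators to correct it, which is why parameter estimation matters for HIL/MIL/SIL testing.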
In this presentation, you will learn
Full Vehicle Simulation for Electrification and Automated Driving Applications
The latest trends in the automotive world, such as powertrain electrification and automated driving, require engineers to have an accurate full vehicle simulation model, which can help them make design tradeoffs and verify their control algorithms before physical prototype components or vehicles are available. Building a full vehicle simulation model that satisfies these needs requires significant investment in domain and tool knowledge and considerable time.
In this talk, with the help of real-world examples, MathWorks engineers showcase how the recent developments in MathWorks solutions can address the above-mentioned challenges, thereby accelerating the vehicle development process. You will learn how to:
- Use a standard model architecture that can be reused throughout the development process
- Perform powertrain matching analysis and component selection
- Use these models for chassis control design and optimization
Verification and Validation of High-Integrity Systems
Simulation with Model-Based Design is a key capability for understanding the behavior of increasingly complex designs. MathWorks verification and validation products complement simulation with additional rigor, automation, and insight to verify that your designs are functionally correct, comply with standards such as ISO 26262 and DO-178C, and are correctly implemented on target hardware. This talk discusses new capabilities to support requirements modeling; automated guideline checking; and test coverage analysis, including dynamic testing and static analysis of models and code. You will learn how to apply these capabilities systematically throughout a production development process to achieve higher quality and productivity.
Generating Industry-Standard Production C Code Using Embedded Coder
Generating production-ready code automatically using Embedded Coder® is a widely adopted practice in multiple industry segments, including automotive, aerospace, and defense. Automatic code generation enables efficient adoption of Model-Based Design, reducing the number of iterations in a typical product development cycle and eliminating errors introduced by manual coding. Generating optimized, standards-compliant, production-ready code requires adherence to design and coding standards such as AUTOSAR and MISRA®, and to safety standards like DO-178, ISO 26262, and IEC 61508. This talk highlights the Embedded Coder features you can use to generate code that meets these industry standards, as well as the flexibility they offer when configuring the model and generating optimized production-ready code.
Signal Processing Systems
Designing and Testing Voice Interfaces through Microphone Array Modeling, Audio Prototyping, and Text Analytics
Voice assistants have shifted expectations for the future of human-machine interfaces. They are great examples of IoT products integrating different sensors, device connectivity, and advanced algorithms. Successful innovators tackling similarly complex problems today need agile development tools that can leverage existing resources and produce prototypes early on.
In this talk, you will learn more about:
- The process of modeling and simulating microphone arrays for the development of voice interfaces for IoT devices
- Early prototyping through real-time streaming and processing of audio signals
- Speech processing and analysis
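One core array-processing operation behind such voice interfaces is delay-and-sum beamforming: advance each microphone channel by its known arrival delay for a chosen look direction, then average. A minimal integer-sample Python sketch (an illustration under simplifying assumptions, not the session's implementation):

```python
def delay_and_sum(channels, delays):
    """Delay-and-sum beamformer with integer-sample delays: advance each
    channel by its arrival delay so the look direction aligns, then average.
    Samples shifted past the buffer edge are treated as zero."""
    n = len(channels[0])
    out = []
    for i in range(n):
        acc = 0.0
        for ch, d in zip(channels, delays):
            j = i + d
            acc += ch[j] if 0 <= j < n else 0.0
        out.append(acc / len(channels))
    return out
```

Real microphone-array models add fractional delays, per-element weighting, and acoustic propagation, but the align-then-sum structure is the same.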
Verifying the Hardware Implementation of Automotive Radar Signal Processing with MATLAB
One of the key challenges offered by automotive radar designs is the functional and performance verification of signal processing hardware implementations like
This methodology serves the basic requirement of checking
Shashank Venugopal presents the team’s experiences of integrating checks based on MATLAB into SV/UVM-based digital verification environments using DPI-C flow.
Sainath K, Design Engineer, NXP
Shashank Venugopal, Design Engineer, NXP
5G: What’s Behind the Next Generation of Mobile Communications?
Learn how MATLAB® and Simulink® help you develop 5G wireless systems, including new digital, RF, and antenna array technologies that enable the ambitious performance goals of the new mobile communications standard.
This talk presents and demonstrates available tools and techniques for designing and testing 5G new radio physical layer algorithms; massive MIMO architectures and hybrid beamforming techniques for mmWave frequencies; and details on modeling and mitigating channel and RF impairments.
Designing and Integrating Antenna Arrays with Multi-Function Radar Systems
The multi-function radar system is an emerging technology that enables radars to perform multiple tasks, such as searching and tracking, simultaneously. Modeling the antenna and integrating it with the rest of the system is critical for detecting and addressing issues early. MATLAB® helps you design antennas and antenna arrays, rapidly try different configurations, and integrate them earlier at the system level.
In this talk, you will learn how to model antenna and antenna arrays and integrate them with multi-function radar systems. Topics covered include:
- Analyzing the performance of custom printed antennas and fabricating them using Gerber files
- Performing array analysis by computing coupling among antenna elements
- Integrating antenna models with the rest of the system
- Modeling and simulating multi-functional capabilities of radars
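A basic quantity behind such array analysis is the array factor of a uniform linear array, which describes how the element outputs combine versus angle. A minimal Python sketch under simplifying assumptions (uniform weights, ideal isotropic elements, no mutual coupling):

```python
import cmath
import math

def array_factor(n_elements, spacing_wl, theta_deg, steer_deg=0.0):
    """|Array factor| of a uniform linear array. Element spacing is given
    in wavelengths; angles are measured from broadside."""
    psi = 2 * math.pi * spacing_wl * (
        math.sin(math.radians(theta_deg)) - math.sin(math.radians(steer_deg)))
    return abs(sum(cmath.exp(1j * n * psi) for n in range(n_elements)))

# An 8-element, half-wavelength-spaced array peaks at 8 in the steered direction.
```

Full-wave analysis replaces the isotropic-element assumption with computed element patterns and coupling, which is exactly the refinement the talk's coupling analysis provides.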
Designing and Verifying Digital and Mixed-Signal Systems
When designing mixed-signal systems, isolated analog and digital design flows can cause integration issues to be discovered late, which can then lead to delayed projects and costly design rework. In addition, slow simulations in traditional EDA tools limit design exploration and exhaustive test coverage during verification, which can lead to a non-optimal and, in the worst case, faulty design. System-level behavioral modeling using MathWorks tools allows issues to be discovered earlier in the design cycle. Topics covered include:
- System-level modeling and simulation of analog and mixed-signal circuits
- Generation of synthesizable digital RTL code
- Verification of SPICE models and HDL code by co-simulating with EDA tools
- Analysis of simulation data with advanced post-processing
Robotics and Autonomous Systems
Automated Driving Development with MATLAB and Simulink
ADAS and autonomous driving technologies are redefining the automotive industry, changing all aspects of transportation, from daily commutes to long-haul trucking. Engineers across the industry use Model-Based Design with MATLAB® and Simulink® to develop their automated driving systems. This talk demonstrates how MATLAB and Simulink serve as an integrated development environment for the different domains required for automated driving, including perception, sensor fusion, and control design.
This talk covers:
- Perception algorithm design using deep learning
- Sensor fusion design and verification
- Control algorithm design with Model Predictive Control Toolbox™
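At its core, the sensor fusion mentioned above combines a predicted state with a new measurement, weighting each by its uncertainty; the scalar form of this Kalman-style update fits in a few lines. The numbers below are illustrative and unrelated to the talk's examples:

```python
def fuse(estimate, variance, measurement, meas_variance):
    """Scalar Kalman-style update: blend estimate and measurement by
    inverse variance. The fused variance is always smaller than either input."""
    gain = variance / (variance + meas_variance)
    fused = estimate + gain * (measurement - estimate)
    fused_variance = (1.0 - gain) * variance
    return fused, fused_variance
```

With equal uncertainties the result is the midpoint: fusing an estimate of 10.0 (variance 4.0) with a measurement of 12.0 (variance 4.0) gives 11.0 with variance 2.0. Production trackers extend this to vectors, multiple sensors, and data association.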
Research is a systematic investigative process. L&T Technology Services works in autonomous driving, expanding its knowledge and working with various OEMs and Tier 1 suppliers to discover new facts and implementations in a limited time. Although many problems turn out to have several solutions (means to close the gap or correct the deviation), difficulties arise when such means are neither obvious nor immediately available. L&T Technology Services faced many such challenges in developing autonomous applications, including a sensor fusion model, deep learning architectures, machine learning, lidar-based object detection, and a control model, all of which can be time-consuming to develop.
The company identified MATLAB® as a key tool whose toolboxes enable it to move faster and prove concepts precisely. AEB (automatic emergency braking) is an important feature of automated driving systems: the goal is to provide a correct, timely, and reliable control signal so the system can act on an impending collision with objects in front of the vehicle. AEB poses various practical challenges. A single-sensor (monocular camera) setup may be right for an ADAS system, whereas a system that takes action needs absolute certainty and requires supplementary sensors.
In this presentation, Gopinath Chidambaram will explain how the challenges L&T Technology Services faced in autonomous system development have been addressed using MATLAB and Simulink®.
Demystifying Deep Learning
Deep learning can achieve state-of-the-art accuracy for many tasks considered algorithmically unsolvable with traditional machine learning, including classifying objects in a scene or recognizing optimal paths in an environment. Gain practical knowledge of the deep learning domain and discover new MATLAB® features that simplify these tasks and eliminate low-level programming. From prototype to production, you’ll see demonstrations on building and training neural networks and hear a discussion on automatically converting a model to CUDA® to run natively on GPUs.
Deploying Deep Neural Networks to Embedded GPUs and CPUs
Designing and deploying deep learning and computer vision applications to embedded CPU and GPU platforms is challenging because of resource constraints inherent in embedded devices. A MATLAB® based workflow facilitates the design of these applications, and automatically generated C or CUDA® code can be deployed on boards like the Jetson TX2 and DRIVE™ PX to achieve very fast inference.
The presentation illustrates how MATLAB supports all major phases of this workflow. Starting with algorithm design, the algorithm may employ deep neural networks augmented with traditional computer vision techniques and can be tested and verified within MATLAB. Next, these networks are trained using GPU and parallel computing support for MATLAB on the desktop, on a cluster, or in the cloud. Finally, GPU Coder™ generates portable and optimized C/C++ and/or CUDA® code from the MATLAB algorithm, which is then cross-compiled and deployed to CPUs and/or a Tegra® board. Benchmarks show that the performance of the auto-generated CUDA code is ~2.5x faster than MXNet, ~5x faster than Caffe2, ~7x faster than TensorFlow®, and on par with a TensorRT™ implementation.
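At inference time, the generated C/C++ or CUDA code ultimately executes kernels such as a dense linear transform followed by an activation for each network layer. A minimal single-layer Python sketch (illustrative weights; no relation to the benchmarked networks):

```python
def dense_relu(x, weights, biases):
    """One fully connected layer followed by ReLU: y = max(0, Wx + b).
    Generated code emits optimized, vectorized versions of loops like this."""
    out = []
    for row, b in zip(weights, biases):
        acc = b + sum(w * xi for w, xi in zip(row, x))
        out.append(max(0.0, acc))
    return out
```

The speedups quoted above come from fusing and parallelizing exactly these per-layer loops for the target CPU or GPU.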
Developing Algorithms for Robotics and Autonomous Systems
Robotics researchers and engineers use MATLAB® and Simulink® to design and tune algorithms for perception, planning, and controls; model real-world systems; and automatically generate code, all from one software environment. In this presentation, you will learn how to develop complex autonomous systems that use multiple sensors, need continuous planning and decision making, and have controls and motion requirements. Model-Based Design is an approach for adopting these interconnected technologies and making them work seamlessly. It centers on the use of system models throughout the development process for design, analysis, simulation, automatic code generation, and verification. Through the lens of an autonomous drone example, see how perception techniques such as deep learning can be integrated with algorithms for motion planning and control of autonomous flying systems.