Keynote: Are You Ready for AI? Is AI Ready for You?
AI, or artificial intelligence, is powering a massive shift in how engineers, scientists, and programmers develop and improve products and services. 85% of executives expect to gain or strengthen their competitive advantage through the use of AI, but is AI really poised to transform your research, products, or business?
In this presentation, Michelle Hirsch, head of MATLAB® Product Management, demystifies AI, challenging you to look for opportunities to leverage it in your work. You will also learn how MATLAB and Simulink® are giving engineers and scientists AI capabilities that were once available only to highly specialized software developers and data scientists.
Advanced Automotive Technology Trends and Model-Based Design Approach
Today’s global automotive trends focus on connected, autonomous, shared, and electric mobility solutions. As these trends take hold, automobiles are becoming smarter, bringing a voluminous increase in E/E content as well as software complexity in both size and architecture. At the same time, there is contradictory pressure for shorter development times and newer, faster models to stay competitive. Because differentiators are increasingly based on this smartness, a development methodology and environment that are modular, scalable, reusable, and maintainable become necessary.
What's New in MATLAB and Simulink
Learn about the new capabilities in the latest releases of MATLAB® and Simulink® that will help your research, design, and development workflows become more efficient. MATLAB highlights include updates for writing and sharing code with the Live Editor, developing and sharing MATLAB apps with App Designer, and managing and analyzing data. Simulink highlights include updates to the Simulation Manager that allow you to run multiple simulations in parallel and new smart editing capabilities to build up models even faster. There are also new tools that make it easier to automatically upgrade your projects to the latest release.
Data Analytics Applications
Predictive Maintenance Using MATLAB and Simulink
For many industrial applications, accurately determining the time to maintenance in advance avoids larger, costly fixes down the line. This talk will cover how MATLAB® and Simulink® provide a platform that lets you explore different machine learning, signal processing, and dynamic modeling techniques to develop an algorithm that can accurately determine when your machine will require maintenance. In case you don't have the sensor data required to train your algorithm, you can use Simulink models of your machines to generate synthetic data that is representative of faulty behavior. After you have trained and validated your algorithm, you can then integrate it with your embedded devices and enterprise IT platforms.
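The workflow described above can be sketched in a few lines of MATLAB. This is a minimal, hypothetical illustration, not the talk's code: it assumes a table `sensorData` of extracted features (possibly generated from a Simulink fault model) with a categorical label column `Fault`, and trains a bagged-tree fault classifier.

```matlab
% Hypothetical predictive-maintenance sketch: train and evaluate a
% fault classifier on logged or synthetic sensor features.
% 'sensorData.mat', 'sensorData', and 'Fault' are illustrative names.
load sensorData.mat                                  % table of features + labels

cv     = cvpartition(sensorData.Fault, 'HoldOut', 0.3);   % train/test split
trainT = sensorData(training(cv), :);
testT  = sensorData(test(cv), :);

mdl  = fitcensemble(trainT, 'Fault');                % bagged-tree ensemble
pred = predict(mdl, testT);
accuracy = mean(pred == testT.Fault);                % held-out accuracy
```

A model validated this way could then be generated as C code for an embedded device or deployed to an IT platform, as the abstract describes.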
A Systematic Control Approach to Improve the Energy Efficiency of an Industrial Cooling Tower
A cooling tower extracts the heat generated by chemical and process industries and rejects it to the atmosphere. The cooling tower often constitutes a major part of a plant's energy consumption because it is not operated within its optimum range; operating it optimally can yield significant energy savings.
Since the cooling tower system is an integral part of almost all industries, saving energy here is an effective way to cope with present-day environmental conditions. The operation of a cooling tower depends on both mass transfer and heat transfer, and the driving forces are relative humidity and ambient temperature. Because environmental conditions cannot be controlled, the cooling tower itself must be tuned for optimized operation. The objective of the present work is to dynamically model a cooling tower and find its optimized operating conditions. To this end, an equilibrium model was created at steady-state conditions to predict the equilibrium number of stages required for optimal operation of existing industrial cooling towers.
In this presentation, you will learn how MATLAB® was used to create the mathematical model and how the optimization is performed on the model. You will also learn how the model was validated by simulating an ASPEN Plus model for the cooling tower with reported data.
Scaling up MATLAB Analytics with Kafka and Cloud Services
As the size and variety of your engineering data have grown, so has the capability to access, process, and analyze those (big) engineering data sets in MATLAB®. With the rise of streaming data technologies and large-scale cloud infrastructure, the volume and velocity of this data have increased significantly, prompting new approaches to handle data in motion. This presentation and demo highlight the use of MATLAB as a data analytics platform alongside best-in-class stream processing frameworks and cloud infrastructure to express MATLAB-based workflows that enable decision-making in “near real time” through the application of machine learning models. It demonstrates how to use MATLAB Production Server™ to deploy these models on streams of data from Apache® Kafka®. The demonstration shows a full workflow, from developing a machine learning model in MATLAB to deploying it against a real-world-sized problem running on the cloud.
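Functions deployed to MATLAB Production Server are ordinary MATLAB functions compiled for the server. A common pattern, sketched below with illustrative names (`scoreStream`, `trainedModel.mat`, and the feature layout are assumptions, not the talk's code), is to load the trained model once per worker and then score each incoming record from the stream:

```matlab
function score = scoreStream(features)
% Hypothetical scoring function for deployment to MATLAB Production Server.
% Each call scores one batch of records arriving from the Kafka stream.
persistent mdl
if isempty(mdl)
    s = load('trainedModel.mat');    % load the trained model once per worker
    mdl = s.mdl;
end
score = predict(mdl, features);      % apply the machine learning model
end
```

The `persistent` variable avoids reloading the model on every request, which matters when scoring high-velocity streams.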
Developing Optimization Algorithms for Real-World Applications
Efficient utilization of resources is one of the prime considerations when streamlining processes, whether the goal is to reduce operational or computational costs. Day-to-day applications require trying out various approaches and then selecting reliable optimization routines. For example, how do you decide how many components to purchase to increase production throughput while respecting constraints on component pricing? Using the huge amount of process data that is collected, how do you determine the sensitive parameters that significantly affect the output? What should the values of these parameters be, and which settings should be tweaked to maximize returns at minimal cost? These and many more factors may need to be considered to set up efficient workflows or design processes with the objective of maximizing rewards and minimizing risks.
This talk describes tools and techniques in MATLAB® that can help you make informed engineering decisions, by introducing the traditional design optimization approach for tackling the above-mentioned scenarios. You will learn how to:
- Formulate the mathematical problem for a given task
- Use optimization techniques to arrive at the optimal solution
- Use discrete event simulation in conjunction with the optimization task
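The component-purchase question above can be formulated as an integer linear program. The sketch below is purely illustrative (the prices, yields, and budget are made-up numbers, not from the talk); it maximizes throughput gain subject to a budget using `intlinprog` from Optimization Toolbox:

```matlab
% Illustrative formulation: choose integer purchase counts x to
% maximize total yield subject to a spending budget.
price  = [20; 35; 50];          % unit price of each component type (assumed)
yield  = [3; 5; 8];             % throughput gain per unit (assumed)
budget = 500;

f      = -yield;                % maximize yield = minimize negative yield
intcon = 1:3;                   % all purchase counts must be integers
A = price';  b = budget;        % total spend must stay within budget
lb = zeros(3,1);  ub = [];

x = intlinprog(f, intcon, A, b, [], [], lb, ub);   % optimal purchase counts
```

The same pattern extends to the talk's other scenarios: encode the objective and constraints mathematically, then hand them to the appropriate solver.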
Tackling Big Data Using MATLAB
With the rise of analytics across industry segments, the size and complexity of collected data have increased enormously. Handling and understanding that data thus becomes challenging, particularly when it does not fit in memory. MATLAB® provides a single, high-performance environment for building analytics and makes it easy, convenient, and scalable to analyze and process big data without having to learn big data programming.
Topics covered include:
- Accessing big data in a variety of file formats, such as spreadsheets, images, and text, from files, datastores, and the Hadoop® Distributed File System
- Visualizing, cleaning, and processing the data to analyze trends
- Running MATLAB based analytics on Apache® Spark™
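The "without learning big data programming" claim rests largely on datastores and tall arrays. A minimal sketch, with illustrative file and variable names (`hugeLog*.csv` and `Temperature` are assumptions):

```matlab
% Out-of-memory analysis with a datastore and tall arrays.
ds = datastore('hugeLog*.csv');              % files too large for memory
tt = tall(ds);                               % tall table backed by the datastore

avgTemp = mean(tt.Temperature, 'omitnan');   % deferred (lazy) computation
result  = gather(avgTemp);                   % evaluates in chunks when gathered
```

The same tall-array code runs unchanged against a Spark cluster when the datastore points at HDFS, which is what the last bullet refers to.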
Controls and Embedded Systems
Designing Efficient Power Electronics Systems Using Simulation
Designing efficient power electronics systems has become critical with the evolution of the electric grid, the rise of electric vehicles, and the expansion of variable-speed motors for increasing efficiency in industrial applications. Key challenges power electronics engineers face in designing such efficient power converters include how to reduce the size of components, how to determine the various losses of the power electronics system, and how to design feedback control algorithms and test the power electronics controllers in real time.
In this talk, with the help of real-world examples, MathWorks engineers showcase how the above-mentioned challenges can be addressed using a simulation-based approach. You will learn more about how to:
- Model ideal and detailed nonlinear power electronics switches quickly
- Model multi-domain components in a single environment
- Design feedback control algorithms and perform real-time simulation
Modeling of a Road Condition Estimator Using Machine Learning
Developing logic without a good understanding of the domain and its various scenarios is a challenging task. However, with the increase in computational ability and advancements in artificial intelligence, data-driven modeling techniques such as machine learning offer a new perspective for addressing the issue and making the best possible use of the data.
This talk centers on the use of signal processing and machine learning tools in the field of automotive engineering. The team at Mahindra Truck and Bus Division tried to find a solution to an existing problem using a machine learning approach. The company shares the challenges the team faced in conceiving and building the algorithm and how they utilized MATLAB® tools to solve the identified problem.
Full Vehicle Simulation for Electrification and Automated Driving Applications
The latest trends in the automotive world, such as powertrain electrification and automated driving, require engineers to have an accurate full vehicle simulation model, which can help them make design tradeoffs and verify their control algorithms before physical prototype components or vehicles are available. Building a full vehicle simulation model that satisfies these needs requires significant investment in domain and tool knowledge and can be very time-consuming.
In this talk, with the help of real-world examples, MathWorks engineers showcase how the recent developments in MathWorks solutions can address the above-mentioned challenges, thereby accelerating the vehicle development process. You will learn how to:
- Use a standard model architecture that can be reused throughout the development process
- Perform powertrain matching analysis and component selection
- Use these models for chassis control design and optimization
Verification and Validation of High-Integrity Systems
Simulation with Model-Based Design is a key capability for understanding the behavior of increasingly complex designs. MathWorks verification and validation products complement simulation with additional rigor, automation, and insight to verify that your designs are functionally correct, in compliance with standards such as ISO 26262 and DO-178C, and correctly implemented on target hardware. This talk discusses new capabilities to support requirements modeling; automated guideline checking; and test coverage analysis, including dynamic testing and static analysis of models and code. You will learn how to apply these capabilities systematically throughout a production development process to achieve higher quality and productivity.
Generating Industry Standards Production C Code Using Embedded Coder
Generating production-ready code automatically using Embedded Coder® has been a widely adopted practice in multiple industry segments, including automotive, aerospace, and defense. Automatic code generation enables efficient adoption of Model-Based Design, reducing the number of iterations in a typical industry-based product development cycle and eliminating errors introduced due to manual coding. Generating optimized, industry standards-compliant, and production-ready code requires adherence to design and coding standards, such as AUTOSAR, MISRA®, and safety standards like DO-178, ISO 26262, and IEC 61508. This talk highlights the features of Embedded Coder you can use to generate code that meets industrial standards, as well as the flexibility they offer when configuring the model and generating optimized production-ready code.
Signal Processing Systems
Designing and Testing Voice Interfaces through Microphone Array Modeling, Audio Prototyping, and Text Analytics
Voice assistants have shifted expectations on the future of human-machine interfaces. They are great examples of IoT products integrating the use of different sensors, device connectivity, and advanced algorithms. Successful innovators tackling similarly complex problems today need agile development tools that can leverage existing resources and create prototypes early on.
In this talk, you will learn more about:
- The process of modeling and simulating microphone arrays for the development of voice interfaces for IoT devices
- Early prototyping through real-time streaming and processing of audio signals
- Speech processing and analysis
Numerical Simulation of a Tsunami on a GPU
On December 26, 2004, an undersea megathrust earthquake triggered a series of deadly tsunamis that killed more than 200,000 people and caused extensive damage to property in the countries bordering the Indian Ocean. These devastating tsunamis led to the establishment of early tsunami warning systems in tsunami-prone regions, with the prime motivation of detecting tsunamis in advance and issuing warnings to prevent loss of life and property. The major components of such systems are the detection of tsunamigenic earthquakes, continuous monitoring of sea levels, and numerical simulation of tsunamis to estimate water levels and travel times. The major challenge is the numerical simulation itself: solving the governing shallow water equations as an initial value problem on a large domain over a long time interval. For effective early warnings, the shorter the simulation time, the better.
In this presentation, Dr. Siva Srinivas Kolukula investigates the speedup achievable for tsunami propagation simulations on a GPU using Parallel Computing Toolbox™. The governing linear shallow water equations are solved using the finite difference method, and the numerical simulation is accelerated with MATLAB®'s straightforward GPU computing support. Good simulation speed is observed even on modest GPUs. The results are compared with real-time water-level observations to validate the MATLAB code and are found to be a good match.
Dr. Siva Srinivas Kolukula, Project Scientist - B, Indian National Centre for Ocean Information Services
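To give a flavor of the approach, here is a toy 1-D linear shallow-water time-stepping loop on the GPU. This is not the presenter's code: the depth, grid, time step, initial condition, and periodic boundaries are all illustrative assumptions, and a real tsunami model would be 2-D with bathymetry and open boundaries. Moving the arrays to the GPU with `gpuArray` is essentially the only change from a CPU version.

```matlab
% Toy 1-D linear shallow-water solver, finite differences on the GPU.
N = 1e6;  dx = 100;  dt = 0.1;         % grid and time step (assumed values)
g = 9.81;  H = 4000;                   % gravity, ocean depth [m] (assumed)

eta = gpuArray.zeros(1, N);            % free-surface elevation
u   = gpuArray.zeros(1, N);            % depth-averaged velocity
eta(N/2) = 1;                          % initial hump as a toy source

for k = 1:1000
    % centered differences with periodic boundaries via circshift
    u   = u   - dt * g * (circshift(eta, -1) - circshift(eta, 1)) / (2*dx);
    eta = eta - dt * H * (circshift(u,  -1) - circshift(u,  1)) / (2*dx);
end
etaHost = gather(eta);                 % bring the result back to CPU memory
```

Because every operation inside the loop is element-wise or a shift, the whole update runs as GPU kernels without any explicit CUDA programming.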
5G: What’s Behind the Next Generation of Mobile Communications?
Learn how MATLAB® and Simulink® help you develop 5G wireless systems, including new digital, RF, and antenna array technologies that enable the ambitious performance goals of the new mobile communications standard.
This talk presents and demonstrates available tools and techniques for designing and testing 5G new radio physical layer algorithms; massive MIMO architectures and hybrid beamforming techniques for mmWave frequencies; and details on modeling and mitigating channel and RF impairments.
Designing and Integrating Antenna Arrays with Multi-Function Radar Systems
The multi-function radar system is an emerging technology that enables radars to perform multiple tasks, such as searching and tracking, simultaneously. Modeling the antenna and integrating it with the system is critical for detecting and addressing issues early. MATLAB® helps you design antennas and antenna arrays, rapidly try different configurations, and integrate them earlier at the system level.
In this talk, you will learn how to model antenna and antenna arrays and integrate them with multi-function radar systems. Topics covered include:
- Analyzing the performance of custom printed antennas and fabricating them using Gerber files
- Performing array analysis by computing coupling among antenna elements
- Integrating antenna models with the rest of the system
- Modeling and simulating multi-functional capabilities of radars
Designing and Prototyping Digital Systems on SoC FPGAs
Digital system designers are increasingly moving to SoC FPGAs for the implementation of their applications due to the high-speed compute capabilities of FPGAs and the ability to perform complex operations on DSPs or MCUs. This combination of ARM® cores with the programmable logic of a conventional FPGA requires designers to adopt a hardware-software co-design methodology. With Model-Based Design, design teams can simulate models of complete systems, partition designs between hardware and software, and use automatic C/C++ and HDL code generation to prototype on Xilinx® Zynq® or Intel® SoC platforms in an integrated workflow.
In this talk, you will learn how to move from design to prototype on SoC FPGAs through:
- Exploration of hardware-software architecture partitioning
- Automatic HDL and C code generation for FPGA fabric and ARM MCU
- Generation of interface logic and software between FPGA and ARM
- Implementation and prototyping on Xilinx Zynq and Intel SoC platforms
Robotics and Autonomous Systems
Automated Driving Development with MATLAB and Simulink
ADAS and autonomous driving technologies are redefining the automotive industry, changing all aspects of transportation, from daily commutes to long-haul trucking. Engineers across the industry use Model-Based Design with MATLAB® and Simulink® to develop their automated driving systems. This talk demonstrates how MATLAB and Simulink serve as an integrated development environment for the different domains required for automated driving, including perception, sensor fusion, and control design.
This talk covers:
- Perception algorithm design using deep learning
- Sensor fusion design and verification
- Control algorithm design with Model Predictive Control Toolbox™
Research is a systematic investigative process. L&T Technology Services works in autonomous driving, building knowledge and collaborating with various OEMs and Tier 1 suppliers to discover new facts and implementations in a limited time. Although many problems turn out to have several solutions (the means to close a gap or correct a deviation), difficulties arise when such means are neither obvious nor immediately available. L&T Technology Services faced many such challenges in developing autonomous applications, including a sensor fusion model, a deep learning architecture, machine learning, lidar-based object detection, and a control model, all of which can be time-consuming to develop.
The company identified MATLAB® as a key tool whose toolboxes enable them to move faster and prove concepts precisely. AEB (automated emergency braking) is an important feature in automated driving systems, where the goal is to provide correct, timely, and reliable control signals so the system can act on an impending collision with objects in front of the vehicle. AEB poses various practical challenges: a single-sensor (monocular camera) system may be adequate for an ADAS warning function, whereas a system that takes action needs near-absolute certainty and therefore requires supplementary sensors.
In this presentation, Gopinath Chidambaram will explain how the challenges L&T Technology Services faced in autonomous system development have been addressed using MATLAB and Simulink®.
Demystifying Deep Learning
Deep learning can achieve state-of-the-art accuracy for many tasks considered algorithmically unsolvable using traditional machine learning, including classifying objects in a scene or recognizing optimal paths in an environment. Gain practical knowledge of the domain of deep learning and discover new MATLAB® features that simplify these tasks and eliminate the low-level programming. From prototype to production, you’ll see demonstrations on building and training neural networks and hear a discussion on automatically converting a model to CUDA® to run natively on GPUs.
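As a flavor of the "eliminate the low-level programming" claim, defining and training a small image classifier in MATLAB is a matter of listing layers and calling one training function. The sketch below is a generic illustration, not the talk's demo; `XTrain` and `YTrain` are assumed training images and labels.

```matlab
% Minimal network definition and training with Deep Learning Toolbox.
% XTrain (28x28x1xN images) and YTrain (categorical labels) are assumed.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')   % 3x3 conv, 16 filters
    reluLayer
    fullyConnectedLayer(10)                        % 10 output classes
    softmaxLayer
    classificationLayer];

opts = trainingOptions('sgdm', 'MaxEpochs', 5, ...
                       'Plots', 'training-progress');
net  = trainNetwork(XTrain, YTrain, layers, opts); % trains on GPU if present
```

The same `net` object can later be handed to the CUDA code generation workflow mentioned at the end of the abstract.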
Deploying Deep Neural Networks to Embedded GPUs and CPUs
Designing and deploying deep learning and computer vision applications to embedded CPU and GPU platforms is challenging because of resource constraints inherent in embedded devices. A MATLAB® based workflow facilitates the design of these applications, and automatically generated C or CUDA® code can be deployed on boards like the Jetson TX2 and DRIVE™ PX to achieve very fast inference.
The presentation illustrates how MATLAB supports all major phases of this workflow. Starting with algorithm design, the algorithm may employ deep neural networks augmented with traditional computer vision techniques and can be tested and verified within MATLAB. Next, these networks are trained using GPU and parallel computing support for MATLAB on the desktop, a cluster, or the cloud. Finally, GPU Coder™ generates portable and optimized C/C++ and/or CUDA® code from the MATLAB algorithm, which is then cross-compiled and deployed to CPUs and/or a Tegra® board. Benchmarks show that the performance of the auto-generated CUDA code is ~2.5x faster than MXNet, ~5x faster than Caffe2, ~7x faster than TensorFlow®, and on par with a TensorRT™ implementation.
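Code generation in this workflow boils down to configuring GPU Coder and invoking `codegen` on an inference entry-point function. The sketch below is a hedged illustration: `myPredict` and the input size are hypothetical stand-ins for an entry-point function that loads a trained network and calls its `predict` method.

```matlab
% Generate CUDA code for a hypothetical inference function 'myPredict'
% that takes a single 224x224x3 image (names and sizes are assumptions).
cfg = coder.gpuConfig('lib');      % target: CUDA static library
cfg.TargetLang = 'C++';

codegen -config cfg myPredict -args {ones(224, 224, 3, 'single')}
```

The resulting library can then be cross-compiled for boards such as the Jetson TX2 mentioned above.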
Developing Algorithms for Robotics and Autonomous Systems
Robotics researchers and engineers use MATLAB® and Simulink® to design and tune algorithms for perception, planning, and controls; model real-world systems; and automatically generate code, all from one software environment. In this presentation, you will learn how to develop complex autonomous systems that integrate multiple sensors, require continuous planning and decision making, and have control and motion requirements. Model-Based Design is an approach for adopting these interconnected technologies and making them work seamlessly. It centers on the use of system models throughout the development process for design, analysis, simulation, automatic code generation, and verification. Through the lens of an autonomous drone example, see how perception techniques, such as deep learning, can be integrated with algorithms for motion planning and control of autonomous flying systems.