Keynote: Are You Ready for AI? Is AI Ready for You?
AI, or artificial intelligence, is powering a massive shift in how engineers, scientists, and programmers develop and improve products and services. 85% of executives expect to gain or strengthen their competitive advantage through the use of AI, but is AI really poised to transform your research, products, or business?
In this presentation, Michelle Hirsch, head of MATLAB® Product Management, demystifies AI, challenging you to look for opportunities to leverage it in your work. You will also learn how MATLAB and Simulink® are giving engineers and scientists AI capabilities that were once available only to highly specialized software developers and data scientists.
The Role of Simulation in Contemporary Industrial Research – Sharing Experiences
The presentation gives a quick overview of some of the areas of industrial research being carried out at GE Global Research. Over the last decade, the time available to move an idea from the laboratory to the field has shrunk drastically, with no compromise on accuracy. The role of simulation in this journey, including the use of various features and toolboxes available from MathWorks, will be highlighted through specific case studies in the energy and transportation segments.
What's New in MATLAB and Simulink
Learn about the new capabilities in the latest releases of MATLAB® and Simulink® that will help your research, design, and development workflows become more efficient. MATLAB highlights include updates for writing and sharing code with the Live Editor, developing and sharing MATLAB apps with App Designer, and managing and analyzing data. Simulink highlights include updates to the Simulation Manager that allow you to run multiple simulations in parallel and new smart editing capabilities to build up models even faster. There are also new tools that make it easier to automatically upgrade your projects to the latest release.
Data Analytics Applications
Predictive Maintenance Using MATLAB and Simulink
For many industrial applications, accurately determining the time to maintenance in advance avoids larger, costly fixes down the line. This talk will cover how MATLAB® and Simulink® provide a platform that lets you explore different machine learning, signal processing, and dynamic modeling techniques to develop an algorithm that can accurately determine when your machine will require maintenance. If you don't have the sensor data required to train your algorithm, you can use Simulink models of your machines to generate synthetic data that is representative of faulty behavior. After you have trained and validated your algorithm, you can integrate it with your embedded devices and enterprise IT platforms.
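As an illustration of the machine learning path described above, the sketch below trains a simple fault classifier on summary features extracted from synthetic vibration records. All data, feature choices, and variable names are hypothetical, not from the talk; it assumes Statistics and Machine Learning Toolbox is available.

```matlab
% Illustrative sketch only (synthetic data): extract per-record features
% from raw sensor signals and train a classifier to flag faulty behavior.
rng(0);
healthy = randn(100, 1000);                                % 100 healthy vibration records
faulty  = randn(100, 1000) ...
          + 0.8*repmat(sin(2*pi*0.05*(1:1000)), 100, 1);   % fault adds a tonal component
records = [healthy; faulty];
labels  = [zeros(100,1); ones(100,1)];                     % 0 = healthy, 1 = faulty

% Simple per-record features: mean, standard deviation, peak amplitude
features = [mean(records,2), std(records,0,2), max(abs(records),[],2)];

mdl       = fitcensemble(features, labels);                % ensemble of decision trees
predicted = predict(mdl, features);
fprintf('Training accuracy: %.2f\n', mean(predicted == labels));
```

In a real workflow, the features would come from measured or Simulink-generated fault data, and validation would use held-out records rather than the training set.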
Deep Learning Based Modeling of a Gas Turbine
With rising complexity in control algorithms for systems like gas turbine engines, a good dynamic model is necessary for validating the algorithm. Dynamic models can be developed in multiple ways. White-box modeling approaches rely on energy and thermodynamic balance equations, which require assumptions and linearization methods to simplify and solve complex dynamics. Models and control systems designed using simplified, linearized equations are therefore not accurate enough to capture system dynamics precisely.
In this presentation, you will learn how to use data from the gas turbine system to model its dynamics with better accuracy using neural networks and deep learning methods.
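One standard neural approach for identifying input-output dynamics of this kind is a NARX (nonlinear autoregressive with exogenous input) network, sketched below on synthetic signals. The input and response data here are made up for illustration; a real application would use measured turbine data. Deep Learning Toolbox is assumed.

```matlab
% Minimal NARX sketch (synthetic data): identify a dynamic system from
% input-output records, as one might for a turbine model.
u = num2cell(sin(0.1*(1:500)) + 0.1*randn(1,500));   % assumed actuator input
y = num2cell(filter(0.2, [1 -0.8], cell2mat(u)));    % assumed plant response

net = narxnet(1:2, 1:2, 10);        % 2 input delays, 2 feedback delays, 10 neurons
[X, Xi, Ai, T] = preparets(net, u, {}, y);
net = train(net, X, T, Xi, Ai);

% Close the loop for multi-step-ahead simulation of the plant output
netc = closeloop(net);
```

The closed-loop form is what you would run against validation data to judge whether the learned dynamics generalize beyond one-step-ahead prediction.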
Dr. P.S.V. Nataraj, IIT Bombay
Scaling up MATLAB Analytics with Kafka and Cloud Services
As the size and variety of your engineering data have grown, so has the capability to access, process, and analyze those (big) engineering data sets in MATLAB®. With the rise of streaming data technologies and large-scale cloud infrastructure, the volume and velocity of this data have increased significantly, prompting new approaches to handle data in motion. This presentation and demo highlight the use of MATLAB as a data analytics platform alongside best-in-class stream processing frameworks and cloud infrastructure, expressing MATLAB-based workflows that enable decision-making in “near real time” through the application of machine learning models. It demonstrates how to use MATLAB Production Server™ to deploy these models on streams of data from Apache® Kafka®. The demonstration shows a full workflow, from developing a machine learning model in MATLAB to deploying it against a real-world-sized problem running on the cloud.
Developing Optimization Algorithms for Real-World Applications
Efficient utilization of resources is one of the prime considerations when streamlining processes, whether the goal is to reduce operational or computational costs. Day-to-day applications require trying out various approaches and then selecting reliable optimization routines. For example, how do you decide how many of each component to purchase to increase production throughput while respecting constraints on component pricing? Using the huge amount of process data that is collected, how do you determine the sensitive parameters that significantly affect the output? What should the values of these parameters be, and which settings should be tweaked to maximize returns at minimal cost? These and many more factors may need to be considered to set up efficient workflows or design processes with the objective of maximizing rewards and minimizing risks.
This talk describes tools and techniques in MATLAB® that can help you make informed engineering decisions, by introducing the traditional design optimization approach for tackling the above-mentioned scenarios. You will learn how to:
- Formulate the mathematical problem for a given task
- Use optimization techniques to arrive at the optimal solution
- Use discrete event simulation in conjunction with the optimization task
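The component-purchase question above can be posed as an integer linear program. The sketch below uses Optimization Toolbox's `intlinprog`; the yields, prices, and budget are made-up numbers for illustration only.

```matlab
% Hypothetical formulation: choose integer component counts to maximize
% added throughput subject to a purchasing budget.
yield  = [3; 5; 4];            % throughput added per unit of each component (assumed)
price  = [20; 45; 30];         % unit price of each component (assumed)
budget = 500;                  % total spend allowed (assumed)

f      = -yield;               % intlinprog minimizes, so negate to maximize
intcon = 1:3;                  % all decision variables are integer counts
A      = price';  b = budget;  % linear constraint: total spend <= budget
lb     = zeros(3,1); ub = 20*ones(3,1);

x = intlinprog(f, intcon, A, b, [], [], lb, ub);
fprintf('Units to purchase: %s\n', mat2str(round(x')));
```

The same pattern extends to the talk's other scenarios by swapping in a different objective (for example, a simulation-based cost evaluated inside the solver loop) and additional constraints.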
Tackling Big Data Using MATLAB
With the rise of analytics across industry segments, we see a huge increase in the size and complexity of collected data. Handling and understanding this data thus becomes challenging, particularly when it does not fit in memory. MATLAB® provides a single, high-performance environment for building analytics and makes it easy, convenient, and scalable to analyze and process big data without having to learn big data programming.
Topics covered include:
- Accessing big data in a variety of formats (spreadsheets, images, text) from files, datastores, and the Hadoop® Distributed File System
- Visualizing, cleaning, and processing the data to analyze trends
- Running MATLAB-based analytics on Apache® Spark™
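The out-of-memory workflow listed above is typically built on datastores and tall arrays. The sketch below assumes a hypothetical folder of CSV files containing a numeric `Temperature` column; the file layout and variable names are illustrative.

```matlab
% Sketch of a tall-array workflow over data that does not fit in memory.
ds = datastore('sensorLogs/*.csv');    % reads the files in chunks, not all at once
t  = tall(ds);                         % tall table backed by the datastore

% Operations on tall arrays are deferred: these lines build a computation
% graph rather than touching the data.
m = mean(t.Temperature, 'omitnan');
s = std(t.Temperature, 0, 'omitnan');

% gather triggers the actual chunked evaluation, combining both results
% in a single pass over the files.
[avgTemp, sdTemp] = gather(m, s);
```

Because evaluation is deferred until `gather`, MATLAB can fuse multiple statistics into one pass over the data, which is what makes the same code scale from a laptop to Spark.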
Controls and Embedded Systems
Designing Efficient Power Electronics Systems Using Simulation
Designing efficient power electronics systems has become critical with the evolving needs of the electric grid, the rise of electric vehicles, and the expansion of variable speed motors for increasing efficiency in industrial applications. Key challenges that power electronics engineers face in designing such efficient power converters include how to reduce the size of components; how to determine the various losses of the power electronics system; and how to design feedback control algorithms and test the power electronics controllers in real time.
In this talk, with the help of real-world examples, MathWorks engineers showcase how the above-mentioned challenges can be addressed using a simulation-based approach. You will learn more about how to:
- Model ideal and detailed nonlinear power electronics switches quickly
- Model multi-domain components in a single environment
- Design feedback control algorithms and perform real-time simulation
Development and Deployment of Virtual Kinematics and Compliance Test System
Designing and improving vehicle dynamics attributes has become increasingly important in commercial vehicle development. Upfront simulation of vehicle dynamics characteristics early in development offers huge time and cost advantages over complete dependence on tuning with physical vehicles. Importantly, it enables robust design by optimizing vehicle systems for the targeted performance.
For high-fidelity simulation of vehicle dynamics behaviors, the significance of proper kinematics and compliance (K&C) parameters can’t be overstated. Generally, these parameters are measured using a physical K&C test rig. A K&C rig for commercial vehicle testing costs several hundred million rupees and takes a few years to set up. In the absence of a physical rig, a virtual K&C measurement methodology was developed using Simscape Multibody™ and other MATLAB® and Simulink® toolboxes.
The virtual K&C system comprises three modules, one each for the vehicle, the rig, and control, which are interfaced to function as a measurement rig. The model comprises rigid bodies, flexible members, a 3D kinematic chain, joints, and a hydraulic system. Parameterized modeling enables swift changes of vehicle configuration. Predicted parameters showed high correlation with measured values, increasing engineering confidence.
In this presentation, you will learn how this work resulted in lean and cost-efficient vehicle dynamics simulation, investigation of numerous combinations, and an optimized design for one of the company's major ongoing vehicle development programs.
Muralidharan C, Divisional Manager, Vehicle Dynamics, Ashok Leyland
Hardware and Software Co-Design for Motor Control Applications
Electric motors are everywhere and are finding new applications every day. The technology to control motors is also evolving to be based on new platforms, such as Xilinx® Zynq®, that combine embedded processors with the programmable logic of FPGAs.
In this talk, you will learn how C and HDL code generation are used to produce implementations on Xilinx Zynq SoCs. You will also explore practical methods for developing motor controllers targeting Zynq SoCs, including the use of new HDL debugging capabilities.
Verification and Validation of High-Integrity Systems
Simulation with Model-Based Design is a key capability to understanding the behavior of increasingly complex designs. MathWorks verification and validation products complement simulation with additional rigor, automation, and insight to verify your designs are functionally correct, in compliance with standards such as ISO 26262 and DO-178C, and correctly implemented on target hardware. This talk discusses new capabilities to support requirements modeling; automated guideline checking; and test coverage analysis including dynamic testing and static analysis of model and code. You will learn how to apply these capabilities systematically throughout a production development process to achieve higher quality and productivity.
Generating Industry-Standard Production C Code Using Embedded Coder
Generating production-ready code automatically using Embedded Coder® has been a widely adopted practice in multiple industry segments, including automotive, aerospace, and defense. Automatic code generation enables efficient adoption of Model-Based Design, reducing the number of iterations in a typical industry product development cycle and eliminating errors introduced by manual coding. Generating optimized, standards-compliant, production-ready code requires adherence to design and coding standards such as AUTOSAR and MISRA®, and to safety standards like DO-178, ISO 26262, and IEC 61508. This talk highlights the features of Embedded Coder you can use to generate code that meets industry standards, as well as the flexibility they offer when configuring the model and generating optimized production-ready code.
Signal Processing Systems
Designing and Testing Voice Interfaces through Microphone Array Modeling, Audio Prototyping, and Text Analytics
Voice assistants have shifted expectations on the future of human-machine interfaces. They are great examples of IoT products integrating the use of different sensors, device connectivity, and advanced algorithms. Successful innovators tackling similarly complex problems today need agile development tools that can leverage existing resources and create prototypes early on.
In this talk, you will learn more about:
- The process of modeling and simulating microphone arrays for the development of voice interfaces for IoT devices
- Early prototyping through real-time streaming and processing of audio signals
- Speech processing and analysis
Numerical Simulation of a Tsunami on a GPU
On December 26, 2004, an undersea megathrust earthquake triggered a series of deadly tsunamis that caused enormous loss of life and property in the countries bordering the Indian Ocean. These devastating tsunamis led to the establishment of early tsunami warning systems in tsunami-prone regions, with the prime motivation of detecting tsunamis in advance and issuing warnings to prevent loss of life and damage. The major components of tsunami detection are identifying tsunamigenic earthquakes, continuously monitoring sea levels, and numerically simulating tsunamis to estimate water levels and travel times. The major computational challenge lies in the numerical simulation itself: solving the governing shallow water equations as an initial value problem on a large domain over a long time interval. To contribute to early warnings, the faster the simulation runs, the better.
In the presentation, Dr. Siva Srinivas Kolukula investigates the achievable speed of tsunami propagation simulations on a GPU using Parallel Computing Toolbox™. The governing linear shallow water equations are solved using the finite difference method, and the numerical simulation is accelerated through MATLAB®'s simple GPU computing support, yielding good simulation speed. The results are compared with real-time water-level observations to validate the MATLAB code and show good agreement.
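To make the approach concrete, here is a one-dimensional analogue of such a solver: an explicit finite-difference step of the linear shallow water equations on a `gpuArray` (Parallel Computing Toolbox). The grid size, depth, time step, and initial condition are assumed for illustration and are not taken from the talk.

```matlab
% Illustrative 1-D linear shallow-water update on the GPU.
g  = 9.81; h0 = 4000;                % gravity (m/s^2), uniform ocean depth (m)
dx = 1e3;  dt = 0.5*dx/sqrt(g*h0);   % grid spacing and a CFL-safe time step
N  = 1e5;
x   = (1:N)';
eta = gpuArray(exp(-(x - N/2).^2/2e5));   % initial free-surface bump
u   = zeros(N, 1, 'gpuArray');            % depth-averaged velocity

for n = 1:200
    % momentum: du/dt = -g * d(eta)/dx ; continuity: d(eta)/dt = -h0 * du/dx
    u(2:N)     = u(2:N)     - dt*g/dx  * diff(eta);
    eta(1:N-1) = eta(1:N-1) - dt*h0/dx * diff(u);
end
eta = gather(eta);                   % bring the result back to the CPU
```

Because every array in the loop lives on the GPU, the same vectorized code that runs on the CPU executes on the device; only the `gpuArray` construction and final `gather` change.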
Dr. Siva Srinivas Kolukula, Project Scientist - B, Indian National Centre for Ocean Information Services
5G: What’s Behind the Next Generation of Mobile Communications?
Learn how MATLAB® and Simulink® help you develop 5G wireless systems, including new digital, RF, and antenna array technologies that enable the ambitious performance goals of the new mobile communications standard.
This talk presents and demonstrates available tools and techniques for designing and testing 5G new radio physical layer algorithms; massive MIMO architectures and hybrid beamforming techniques for mmWave frequencies; and details on modeling and mitigating channel and RF impairments.
Designing and Integrating Antenna Arrays with Multi-Function Radar Systems
The multi-function radar system is an emerging technology, enabling radars to perform multiple tasks, such as searching and tracking, simultaneously. Modeling the antenna and integrating it with the system is critical for detecting and addressing issues early. MATLAB® helps you design antennas and antenna arrays, rapidly try different configurations, and integrate them earlier at the system level.
In this talk, you will learn how to model antennas and antenna arrays and integrate them with multi-function radar systems. Topics covered include:
- Analyzing the performance of custom printed antennas and fabricating them using Gerber files
- Performing array analysis by computing coupling among antenna elements
- Integrating antenna models with the rest of the system
- Modeling and simulating multi-functional capabilities of radars
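The array-analysis steps listed above can be sketched with Antenna Toolbox as follows. The 2.4 GHz operating frequency, element count, and half-wavelength spacing are assumed values for illustration.

```matlab
% Hypothetical sketch: design a printed patch, form a linear array, and
% quantify inter-element coupling via the array's S-parameters.
el  = design(patchMicrostrip, 2.4e9);     % patch antenna resonant near 2.4 GHz
arr = linearArray('Element', el, ...
                  'NumElements', 4, ...
                  'ElementSpacing', 0.0625);   % half-wavelength at 2.4 GHz (m)

pattern(arr, 2.4e9);                      % far-field radiation pattern
S = sparameters(arr, 2.4e9);              % off-diagonal terms show coupling
```

The same antenna object can be handed to phased-array and radar system models, which is what enables the early system-level integration the abstract describes; Gerber-file export would follow from the finalized printed design.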
Designing and Prototyping Digital Systems on SoC FPGAs
Digital system designers are increasingly moving to SoC FPGAs for the implementation of their applications due to the high-speed compute capabilities of FPGAs and the ability to perform complex operations on DSPs or MCUs. This combination of ARM® cores along with the programmable logic of a conventional FPGA requires designers to adopt hardware and software co-design methodology. With Model-Based Design, design teams can simulate models of complete systems, partition designs between hardware and software, and use automatic C/C++ and HDL code generation to prototype on Xilinx® Zynq® or Intel® SoC platforms in an integrated workflow.
In this talk, you will learn how to move from design to prototype on SoC FPGAs through:
- Exploration of hardware-software architecture partitioning
- Automatic HDL and C code generation for FPGA fabric and ARM MCU
- Generation of interface logic and software between FPGA and ARM
- Implementation and prototyping on Xilinx Zynq and Intel SoC platforms
Robotics and Autonomous Systems
Automated Driving Development with MATLAB and Simulink
ADAS and autonomous driving technologies are redefining the automotive industry, changing all aspects of transportation, from daily commutes to long-haul trucking. Engineers across the industry use Model-Based Design with MATLAB® and Simulink® to develop their automated driving systems. This talk demonstrates how MATLAB and Simulink serve as an integrated development environment for the different domains required for automated driving, including perception, sensor fusion, and control design.
This talk covers:
- Perception algorithm design using deep learning
- Sensor fusion design and verification
- Control algorithm design with Model Predictive Control Toolbox™
Research is a systematic investigative process. L&T Technology Services works on autonomous driving, building knowledge and working with various OEMs and Tier-1 suppliers to discover new facts and implementations in limited time. Although many problems turn out to have several solutions (the means to close the gap or correct the deviation), difficulties arise where such means are either not obvious or not immediately available. L&T Technology Services likewise faced many challenges in developing autonomous applications, such as a sensor fusion model, deep learning architectures, machine learning, lidar-based object detection, and a control model, all of which can be time-consuming to develop.
The company identified MATLAB® as a key tool, providing toolboxes that enable it to move faster and prove concepts precisely. AEB (automated emergency braking) is an important feature in automated driving systems, where the goal is to provide a correct, timely, and reliable control signal so the system can act on an impending collision with objects in front of the vehicle. AEB poses various practical challenges: a single-sensor (monocular camera) system may suffice for an ADAS warning function, whereas a system that takes action needs near-absolute certainty and requires supplementary sensors.
In this presentation, Gopinath Chidambaram will explain how the challenges L&T Technology Services faced in autonomous system development have been addressed using MATLAB and Simulink®.
Demystifying Deep Learning
Deep learning can achieve state-of-the-art accuracy for many tasks considered algorithmically unsolvable using traditional machine learning, including classifying objects in a scene or recognizing optimal paths in an environment. Gain practical knowledge of the domain of deep learning and discover new MATLAB® features that simplify these tasks and eliminate the low-level programming. From prototype to production, you’ll see demonstrations on building and training neural networks and hear a discussion on automatically converting a model to CUDA® to run natively on GPUs.
Deploying Deep Neural Networks to Embedded GPUs and CPUs
Designing and deploying deep learning and computer vision applications to embedded CPU and GPU platforms is challenging because of resource constraints inherent in embedded devices. A MATLAB® based workflow facilitates the design of these applications, and automatically generated C or CUDA® code can be deployed on boards like the Jetson TX2 and DRIVE™ PX to achieve very fast inference.
The presentation illustrates how MATLAB supports all major phases of this workflow. Starting with algorithm design, the algorithm may employ deep neural networks augmented with traditional computer vision techniques and can be tested and verified within MATLAB. Next, these networks are trained using MATLAB's GPU and parallel computing support on the desktop, a cluster, or the cloud. Finally, GPU Coder™ generates portable and optimized C/C++ and/or CUDA® code from the MATLAB algorithm, which is then cross-compiled and deployed to CPUs and/or a Tegra® board. Benchmarks show that the performance of the auto-generated CUDA code is ~2.5x faster than MXNet, ~5x faster than Caffe2, ~7x faster than TensorFlow®, and on par with a TensorRT™ implementation.
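The final code-generation step typically looks like the sketch below. The entry-point function name (`myPredict`), network file (`trainedNet.mat`), and input size are hypothetical; GPU Coder and Deep Learning Toolbox are assumed.

```matlab
% --- myPredict.m (hypothetical entry-point function) ---
% function out = myPredict(in)
%     persistent net;
%     if isempty(net)
%         net = coder.loadDeepLearningNetwork('trainedNet.mat');
%     end
%     out = predict(net, in);
% end

% Configure GPU Coder to emit a CUDA library using the cuDNN backend,
% then generate code for a single-precision 224x224x3 image input.
cfg = coder.gpuConfig('lib');
cfg.DeepLearningConfig = coder.DeepLearningConfig('cudnn');
codegen -config cfg myPredict -args {ones(224,224,3,'single')}
```

The `persistent` network object keeps the weights loaded across calls, so the generated C++/CUDA inference code initializes the network once rather than on every frame.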
Developing Algorithms for Robotics and Autonomous Systems
Robotics researchers and engineers use MATLAB® and Simulink® to design and tune algorithms for perception, planning, and controls; model real-world systems; and automatically generate code, all from one software environment. In this presentation, you learn how to develop autonomous systems that are complex, carry multiple sensors, need continuous planning and decision making, and have controls and motion requirements. Model-Based Design is an approach for adopting these interconnected technologies and making them work seamlessly. It centers on the use of system models throughout the development process for design, analysis, simulation, automatic code generation, and verification. Through the lens of an autonomous drone example, see how perception techniques, such as deep learning, can be integrated with algorithms for motion planning and control of autonomous flying systems.