Featured Presentations
Dr. Supriya Sarkar, Head of Environmental Research Group, Tata Steel
Gautam Ponnappa PC, Training Engineer, MathWorks
Naga Chakrapani Pemmaraju, Senior Application Engineer, MathWorks India
R Vijayalayan, Manager – Control Design Application Engineering, MathWorks India
Durvesh Kulkarni, Senior Application Engineer, MathWorks India
Gaurav Dubey, Senior Team Lead – Pilot Engineering, MathWorks India
Vaishnavi H R, Application Engineer, MathWorks India
Swathi Balki, Pilot Engineer, MathWorks India
Vinod Thomas, Senior Training Engineer, MathWorks India
Shashank Venugopal, Design Engineer, NXP
Abhisek Roy, Application Engineer, MathWorks India
Ramanuja Jagannathan, Engineering Development Group
Arun Mathamkode, Application Support Engineer
Sruthi Geetha
Veer Alakshendra
Keynote: Are You Ready for AI? Is AI Ready for You?
Michelle Hirsch, Head of MATLAB Product Management, MathWorks
AI, or artificial intelligence, is powering a massive shift in how engineers, scientists, and programmers develop and improve products and services. 85% of executives expect to gain or strengthen their competitive advantage through the use of AI, but is AI really poised to transform your research, products, or business?
In this presentation, head of MATLAB® Product Management, Michelle Hirsch, demystifies AI, challenging you to look for opportunities to leverage it in your work. You will also learn how MATLAB and Simulink® are giving engineers and scientists AI capabilities that were once available only to highly-specialized software developers and data scientists.
Developing Algorithms for Robotics and Autonomous Systems
Dhirendra Singh, Senior Application Engineer, MathWorks India
Abhisek Roy, Application Engineer, MathWorks India
Robotics researchers and engineers use MATLAB® and Simulink® to design and tune algorithms for perception, planning, and controls; model real-world systems; and automatically generate code, all from one software environment. In this presentation, you will learn how to develop complex autonomous systems that integrate multiple sensors, require continuous planning and decision making, and must meet control and motion requirements. Model-Based Design is an approach for adopting these interconnected technologies and making them work together seamlessly. It centers on the use of system models throughout the development process for design, analysis, simulation, automatic code generation, and verification. Through the lens of an autonomous drone example, see how perception techniques, such as deep learning, can be integrated with algorithms for motion planning and control of autonomous flying systems.
What's New in MATLAB and Simulink R2018a
Prashant Rao, Technical Manager, MathWorks India
Learn about the new capabilities in the latest releases of MATLAB® and Simulink® that will help your research, design, and development workflows become more efficient. MATLAB highlights include updates for writing and sharing code with the Live Editor, developing and sharing MATLAB apps with App Designer, and managing and analyzing data. Simulink highlights include updates to the Simulation Manager that allow you to run multiple simulations in parallel and new smart editing capabilities to build up models even faster. There are also new tools that make it easier to automatically upgrade your projects to the latest release.
A Systematic Control Approach to Improve the Energy Efficiency of an Industrial Cooling Tower
Dr. Pinakpani Biswas, Principal Scientist, Tata Steel
Dr. Supriya Sarkar, Head of Environmental Research Group, Tata Steel
A cooling tower rejects the heat generated by chemical and process industries to the atmosphere. Cooling towers often account for a major share of industrial energy consumption, largely because they are not operated within their optimum range; operating them optimally can therefore yield significant energy savings.
Since cooling towers are integral to almost all industries, saving this energy is an effective response to present-day environmental concerns. A cooling tower's operation depends on both heat and mass transfer, with relative humidity and ambient temperature as the driving conditions. Because these ambient conditions cannot be controlled, the tower itself must be tuned for optimal operation. The objective of this work is to dynamically model a cooling tower and find its optimal operating conditions. To that end, an equilibrium model was created at steady-state conditions to predict the number of equilibrium stages required for optimal operation of existing industrial cooling towers.
In this presentation, you will learn how MATLAB® was used to create the mathematical model and how optimization was performed on it. You will also learn how the model was validated against an ASPEN Plus simulation of the cooling tower using reported data.
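As a toy illustration of the equilibrium-stage idea (not the authors' model), the Python sketch below counts the stages needed to cool water toward the air wet-bulb limit, assuming each stage removes a fixed fraction of the remaining approach; all numbers are hypothetical.

```python
# Toy equilibrium-stage count for a counterflow cooling tower.
# Hypothetical parameters for illustration; the actual model in the talk
# is built from full heat- and mass-transfer balances in MATLAB.

def stages_required(t_in, t_target, t_wet_bulb, stage_effectiveness=0.4):
    """Count equilibrium stages needed to cool water from t_in to t_target.

    Each stage is assumed to remove a fixed fraction of the remaining
    approach to the air wet-bulb temperature (the thermodynamic limit).
    """
    if t_target <= t_wet_bulb:
        raise ValueError("target below wet-bulb temperature is unreachable")
    t, n = t_in, 0
    while t > t_target:
        t = t - stage_effectiveness * (t - t_wet_bulb)
        n += 1
    return n

print(stages_required(45.0, 33.0, 28.0))  # -> 3 stages for these toy values
```

Raising the per-stage effectiveness (a stand-in for better packing or airflow) reduces the stage count, which is the kind of tradeoff an optimizer can explore.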
Deep Learning Based Modeling of a Gas Turbine
Dr. P.S.V. Nataraj, IIT Bombay
With the rising complexity of control algorithms for systems such as gas turbine engines, a good dynamic model is necessary for validating the algorithms. Dynamic models can be developed in multiple ways. White-box modelling approaches rely on energy and thermodynamic balance equations, but these equations are difficult to solve exactly for such complex systems.
Hence, assumptions and linearization methods are required to simplify and solve the complex dynamics. However, models and control systems designed using simplified, linearized equations are not accurate enough to capture the system dynamics precisely.
In this presentation, you will learn how to make use of data from the gas turbine system to model the dynamics of the system with better accuracy using a neural network and deep learning method.
Developing Optimization Algorithms for Real-World Applications
Dr. Lakshminarayan Viju Ravichandran, Senior Education Technical Evangelist, MathWorks
Gautam Ponnappa PC, Training Engineer, MathWorks
Efficient utilization of resources is a prime consideration when streamlining processes, whether to reduce operational or computational costs. Day-to-day applications require trying out various approaches and then selecting reliable optimization routines. For example, how do you decide how many components to purchase to increase production throughput while respecting constraints on component pricing? Using the huge amounts of process data collected, how do you determine the sensitive parameters that significantly affect the output? What should the values of these parameters be, and which component settings should be tweaked to maximize returns at minimal cost? These and many more factors may need to be considered to set up efficient workflows or design processes that maximize rewards and minimize risks.
This talk describes tools and techniques in MATLAB® that can help you make informed engineering decisions, by introducing the traditional design optimization approach for tackling the above-mentioned scenarios. You will learn how to:
- Formulate the mathematical problem for a given task
- Use optimization techniques to arrive at the optimal solution
- Use discrete event simulation in conjunction with the optimization task
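As a hedged illustration of the first two bullets, the sketch below formulates the component-purchase question as a tiny integer program and solves it by brute force; the prices, yields, and budget are invented, and the talk itself uses MATLAB optimization solvers rather than enumeration.

```python
# Choose how many of each component to buy to maximize added throughput
# under a budget. Prices and yields are hypothetical; a real workflow
# would hand this to an integer-programming solver.
from itertools import product

prices = {"A": 3.0, "B": 5.0}          # cost per unit (hypothetical)
yield_per_unit = {"A": 2.0, "B": 3.5}  # added throughput per unit
budget = 20.0

best = max(
    (qty for qty in product(range(8), repeat=2)
     if qty[0] * prices["A"] + qty[1] * prices["B"] <= budget),
    key=lambda q: q[0] * yield_per_unit["A"] + q[1] * yield_per_unit["B"],
)
print(best)  # -> (0, 4): four units of B exhaust the budget most profitably
```

Even this toy shows the structure of the formulation: decision variables, an objective, and a budget constraint, which carries over directly to solver-based approaches.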
Predictive Maintenance Using MATLAB and Simulink
Amit Doshi, Senior Application Engineer, MathWorks
For many industrial applications, accurately determining the time to maintenance in advance avoids larger, costly fixes down the line. This talk will cover how MATLAB® and Simulink® provide a platform that lets you explore different machine learning, signal processing, and dynamic modeling techniques to develop an algorithm that can accurately determine when your machine will require maintenance. In case you don't have the sensor data required to train your algorithm, you can use Simulink models of your machines to generate synthetic data that is representative of faulty behavior. After you have trained and validated your algorithm, you can then integrate it with your embedded devices and enterprise IT platforms.
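The workflow just described can be sketched in a few lines: extract a condition indicator from synthetic "healthy" and "faulty" signals, learn a threshold, and flag new machines. This Python sketch is illustrative only; the signal parameters are invented, and the talk's approach uses Simulink fault models and MATLAB machine learning tools.

```python
# Minimal predictive-maintenance sketch: an RMS condition indicator
# separates synthetic healthy and faulty vibration signals, and a
# threshold between the two classes flags machines needing maintenance.
import math, random

random.seed(0)

def rms(signal):
    return math.sqrt(sum(x * x for x in signal) / len(signal))

def synthetic_signal(fault_amplitude, n=1000):
    # baseline vibration plus an extra component when a fault is present
    return [math.sin(0.1 * i) + fault_amplitude * math.sin(0.5 * i)
            + random.gauss(0, 0.1) for i in range(n)]

healthy = [rms(synthetic_signal(0.0)) for _ in range(20)]
faulty = [rms(synthetic_signal(0.8)) for _ in range(20)]
threshold = (max(healthy) + min(faulty)) / 2  # simple decision boundary

new_machine = rms(synthetic_signal(0.8))
print("maintenance needed:", new_machine > threshold)
```

In practice the indicator would be one of many engineered features, and the threshold would be replaced by a trained classifier or remaining-useful-life model.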
Scaling up MATLAB Analytics with Kafka and Cloud Services
Pallavi Kar, Application Engineer, MathWorks
As the size and variety of your engineering data have grown, so has the capability to access, process, and analyze those (big) engineering data sets in MATLAB®. With the rise of streaming data technologies and large-scale cloud infrastructure, the volume and velocity of this data have increased significantly, driving new approaches to handling data in motion. This presentation and demo highlight the use of MATLAB as a data analytics platform alongside best-in-class stream processing frameworks and cloud infrastructure, expressing MATLAB-based workflows that enable decision-making in “near real time” through the application of machine learning models. It demonstrates how to use MATLAB Production Server™ to deploy these models on streams of data from Apache® Kafka®. The demonstration shows a full workflow, from developing a machine learning model in MATLAB to deploying it against a real-world-sized problem running on the cloud.
Tackling Big Data Using MATLAB
Alka Nair, Application Engineer, MathWorks
With the rise of analytics across industry segments, we see a huge increase in the size and complexity of the data collected. Handling and understanding this data thus becomes challenging, particularly when it does not fit in memory. MATLAB® provides a single, high-performance environment for building analytics and makes it easy, convenient, and scalable to analyze and process big data without having to learn big data programming.
Topics covered include:
- Accessing big data in a variety of formats, including spreadsheets, images, and text files, through datastores and the Hadoop® Distributed File System
- Visualizing, cleaning, and processing the data to analyze trends
- Running MATLAB based analytics on Apache® Spark™
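The idea behind processing data that does not fit in memory can be shown in miniature: stream the data chunk by chunk and combine partial reductions, which is what MATLAB tall arrays and Spark do at scale. A minimal Python sketch (illustrative, not MATLAB's implementation):

```python
# Out-of-core mean in miniature: a generator simulates a datastore that
# yields one chunk at a time, and partial sums are combined so no more
# than one chunk is ever held in memory.

def chunks(n_total, chunk_size):
    """Simulate a datastore: yield one chunk of values at a time."""
    for start in range(0, n_total, chunk_size):
        yield [float(i) for i in range(start, min(start + chunk_size, n_total))]

total, count = 0.0, 0
for chunk in chunks(n_total=1_000_000, chunk_size=10_000):
    total += sum(chunk)   # per-chunk partial reduction
    count += len(chunk)

print(total / count)  # -> 499999.5, the mean of 0..999999
```

Any reduction that can be expressed as combinable partial results (sums, counts, histograms, min/max) scales out the same way.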
Using Fleet Analytics and MATLAB to Build Strategies for BS-VI Development
Shubham Garg, Honda
The motivation for this project at Honda is to develop strategies for BS-VI norms and create tests for fuel economy and emission constraints. In this project, data is collected in large volumes through telematics from field vehicles of different makes operating across multiple geographical and climatic conditions. These varying operating conditions and driving patterns lead to varying vehicle performance, drive efficiency, and emissions. They also create widely differing calibration needs, since performance changes with the riding and operating conditions of each field vehicle.
The main challenge in this project was the sheer volume of data that had to be processed before any valid analysis could be produced, making it a big data problem. Data exploration and feature engineering were aimed at understanding the fuel-economy profile across geographical areas, its variation with temperature and terrain, and at generating drive cycles that capture real-world driving scenarios. The team at Honda also needed to scale up computation to reduce processing time for this huge amount of data.
In this presentation, you will learn how the key challenges described above were addressed using MATLAB® and toolboxes.
Designing Efficient Power Electronics Systems Using Simulation
Vivek Raju, Senior Application Engineer, MathWorks India
Naga Chakrapani Pemmaraju, Senior Application Engineer, MathWorks India
Designing efficient power electronics systems has become critical with the evolving electric grid, the rise of electric vehicles, and the expansion of variable-speed motors for increasing efficiency in industrial applications. Key challenges that power electronics engineers face in designing such efficient power converters include how to reduce the size of components, how to determine the various losses of the power electronics system, and how to design feedback control algorithms and test power electronics controllers in real time.
In this talk, with the help of real-world examples, MathWorks engineers showcase how the above-mentioned challenges can be addressed using a simulation-based approach. You will learn more about how to:
- Model ideal and detailed nonlinear power electronics switches quickly
- Model multi-domain components in a single environment
- Design feedback control algorithms and perform real-time simulation
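To make the simulation-based approach concrete, here is a hedged sketch of an averaged ideal buck converter (LC output filter with resistive load) stepped with forward Euler; the component values are illustrative, and the talk uses Simscape Electrical models with detailed nonlinear switches rather than an averaged hand-rolled loop.

```python
# Averaged ideal buck converter: the switch is replaced by its duty-cycle
# average, and the LC filter with resistive load is integrated with
# forward Euler. All component values are hypothetical.

Vin, D = 24.0, 0.5          # input voltage and duty cycle
L, C, R = 100e-6, 100e-6, 10.0
dt, steps = 1e-7, 200_000   # 20 ms of simulated time

iL, vC = 0.0, 0.0
for _ in range(steps):
    diL = (D * Vin - vC) / L            # averaged inductor voltage
    dvC = (iL - vC / R) / C             # capacitor current balance
    iL, vC = iL + dt * diL, vC + dt * dvC

print(round(vC, 2))  # -> 12.0 V steady state, i.e. D * Vin
```

Replacing the averaged switch with an actual PWM waveform is what brings in the ripple, switching losses, and real-time testing questions the talk addresses.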
Development and Deployment of Virtual Kinematics and Compliance Test System
Designing and improving vehicle dynamics attributes has become significantly important in commercial vehicle development. Upfront simulation of vehicle dynamics characteristics early in development offers huge time and cost advantages over complete dependency on tuning with physical vehicles. Importantly, it enables robust design by optimizing vehicle systems for the targeted performance.
For high-fidelity simulation of vehicle dynamics behavior, the significance of proper kinematics and compliance (K&C) parameters can’t be overstated. Generally, these parameters are measured using a physical K&C test rig. A K&C rig for commercial vehicle testing costs several hundred million rupees and takes a few years to set up. In the absence of a physical rig, a virtual K&C measurement methodology was developed using Simscape Multibody™ and other MATLAB® and Simulink® toolboxes.
The virtual K&C system comprises three modules (vehicle, rig, and control) that are interfaced to function as a measurement rig. The model comprises rigid bodies, flexible members, a 3D kinematic chain, joints, and a hydraulic system. Parameterized modelling enables swift changes of vehicle configuration. Parameters predicted by the model correlated highly with measured values, increasing engineering confidence.
In this presentation, you will learn how this work resulted in lean and cost-efficient vehicle dynamics simulation, investigation of numerous design combinations, and an optimized design for one of the major ongoing vehicle development programs.
Full Vehicle Simulation for Electrification and Automated Driving Applications
Prasanna Deshpande, Team Lead – Control Design Application Engineering, MathWorks India
R Vijayalayan, Manager – Control Design Application Engineering, MathWorks India
The latest trends in the automotive world, such as powertrain electrification and automated driving, require engineers to have an accurate full vehicle simulation model that helps them make design tradeoffs and verify their control algorithms before physical prototype components or vehicles are available. Building a full vehicle simulation model that satisfies these needs demands significant investment in domain and tool knowledge and considerable time.
In this talk, with the help of real-world examples, MathWorks engineers showcase how the recent developments in MathWorks solutions can address the above-mentioned challenges, thereby accelerating the vehicle development process. You will learn how to:
- Use a standard model architecture that can be reused throughout the development process
- Perform powertrain matching analysis and component selection
- Use these models for chassis control design and optimization
Generating Industry-Standard Production C Code Using Embedded Coder
Rajat Arora, Application Engineer, MathWorks India
Durvesh Kulkarni, Senior Application Engineer, MathWorks India
Generating production-ready code automatically using Embedded Coder® is a widely adopted practice in multiple industry segments, including automotive, aerospace, and defense. Automatic code generation enables efficient adoption of Model-Based Design, reducing the number of iterations in a typical product development cycle and eliminating errors introduced by manual coding. Generating optimized, standards-compliant, production-ready code requires adherence to design and coding standards such as AUTOSAR and MISRA®, and to safety standards like DO-178, ISO 26262, and IEC 61508. This talk highlights the features of Embedded Coder you can use to generate code that meets these industry standards, as well as the flexibility they offer when configuring the model and generating optimized production-ready code.
Hardware and Software Co-Design for Motor Control Applications
Durvesh Kulkarni, Senior Application Engineer, MathWorks India
Gaurav Dubey, Senior Team Lead – Pilot Engineering, MathWorks India
Electric motors are everywhere and are finding new applications every day. The technology to control motors is also evolving to be based on new platforms, such as Xilinx® Zynq®, that combine embedded processors with the programmable logic of FPGAs.
In this talk, you will learn how C and HDL code generation are used to produce implementations on Xilinx Zynq SoCs. You will also explore practical methods for developing motor controllers targeting Zynq SoCs, including the use of new HDL debugging capabilities.
Lithium-Ion Battery Parameter Estimation for HIL, SIL, and MIL Validation
Thayalan Shanmugam, Assistant Project Manager, Renault-Nissan Technology and Business Center India Pvt. Ltd. (RNTBCI)
Praveen Kumar, Senior Software Engineer, Renault-Nissan Technology and Business Center India Pvt. Ltd. (RNTBCI)
With the rising focus on electric vehicles, lithium-ion batteries are widely used in the automotive market (EVs and PHEVs). Battery cells are assembled into battery packs, and it becomes very important to protect the battery from overcharge, deep discharge, and thermal runaway.
The battery management system (BMS) is used to estimate the state of charge (SOC) and state of health (SOH) and to protect the battery packs from failure. To validate and robustly test a BMS in hardware-in-the-loop (HIL), model-in-the-loop (MIL), or software-in-the-loop (SIL) systems, exact battery parameters are required.
In this presentation, you will learn how the battery parameters are identified for validating the BMS, using experimental data from a test bench and a mathematical model of the battery packs.
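A minimal version of such parameter identification can be sketched as an ordinary least-squares fit of a trivial battery model V = OCV - R0*I to test-bench samples; the real work fits richer equivalent-circuit models (RC branches, SOC-dependent OCV) with MATLAB estimation tools, and all values below are synthetic.

```python
# Identify OCV and internal resistance R0 of the model V = OCV - R0 * I
# by ordinary least squares on synthetic (current, voltage) samples.

true_ocv, true_r0 = 3.7, 0.05  # ground truth used to synthesize data
currents = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
voltages = [true_ocv - true_r0 * i for i in currents]  # noise-free here

n = len(currents)
mean_i = sum(currents) / n
mean_v = sum(voltages) / n
# slope of V vs I is -R0; the intercept is OCV
slope = (sum((i - mean_i) * (v - mean_v) for i, v in zip(currents, voltages))
         / sum((i - mean_i) ** 2 for i in currents))
r0_est = -slope
ocv_est = mean_v - slope * mean_i
print(ocv_est, r0_est)  # -> recovers 3.7 and 0.05
```

With noisy data and a dynamic (RC) model, the same idea becomes a nonlinear least-squares problem over time-series residuals, which is what dedicated estimation tools solve.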
Modeling of a Road Condition Estimator Using Machine Learning
K H Shankar Narayanan, Engineer, Vehicle Attributes and Analytics, Mahindra Truck and Bus Division
Developing control logic without a good understanding of the domain and its various scenarios is challenging. However, with increasing computational power and advances in artificial intelligence, data-driven modeling techniques such as machine learning offer a new way to address the issue and make the best possible use of the data.
This talk centers on the use of signal processing and machine learning techniques in the field of automotive engineering. The team at Mahindra Truck and Bus Division set out to solve an existing problem using a machine learning approach. The company shares the challenges the team faced in conceiving and building the algorithm and how they used MATLAB® tools to solve the identified problem.
Verification and Validation of High-Integrity Systems
Chethan CU, Senior Pilot Engineer, MathWorks India
Vaishnavi H R, Application Engineer, MathWorks India
Simulation with Model-Based Design is a key capability for understanding the behavior of increasingly complex designs. MathWorks verification and validation products complement simulation with additional rigor, automation, and insight to verify that your designs are functionally correct, comply with standards such as ISO 26262 and DO-178C, and are correctly implemented on target hardware. This talk discusses new capabilities that support requirements modeling, automated guideline checking, and test coverage analysis, including dynamic testing and static analysis of models and code. You will learn how to apply these capabilities systematically throughout a production development process to achieve higher quality and productivity.
5G: What’s Behind the Next Generation of Mobile Communications?
Tabrez Khan, Senior Application Engineer, MathWorks India
Learn how MATLAB® and Simulink® help you develop 5G wireless systems, including new digital, RF, and antenna array technologies that enable the ambitious performance goals of the new mobile communications standard.
This talk presents and demonstrates available tools and techniques for designing and testing 5G new radio physical layer algorithms; massive MIMO architectures and hybrid beamforming techniques for mmWave frequencies; and details on modeling and mitigating channel and RF impairments.
Designing and Integrating Antenna Arrays with Multi-Function Radar Systems
Shashank Kulkarni, Ph.D., Principal Developer, MathWorks India
Swathi Balki, Pilot Engineer, MathWorks India
Multi-function radar is an emerging technology that enables a radar to perform multiple tasks, such as searching and tracking, simultaneously. Modeling the antenna and integrating it with the rest of the system is critical to detecting and addressing issues early. MATLAB® helps you design antennas and antenna arrays, rapidly try different configurations, and integrate them at the system level early.
In this talk, you will learn how to model antenna and antenna arrays and integrate them with multi-function radar systems. Topics covered include:
- Analyzing the performance of custom printed antennas and fabricating them using Gerber files
- Performing array analysis by computing coupling among antenna elements
- Integrating antenna models with the rest of the system
- Modeling and simulating multi-functional capabilities of radars
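As a taste of the array analysis in the bullets above, the sketch below computes the array factor of a uniform linear array with half-wavelength spacing and finds its broadside main lobe; the element count is illustrative, and the talk's coupling-aware, full-wave analysis is far more detailed.

```python
# Array factor of a uniform linear array (no element pattern, no coupling):
# the coherent sum of element phases at each observation angle.
import cmath, math

def array_factor(n_elements, d_over_lambda, theta_deg):
    """Magnitude of the array factor at angle theta from broadside."""
    psi = 2 * math.pi * d_over_lambda * math.sin(math.radians(theta_deg))
    return abs(sum(cmath.exp(1j * k * psi) for k in range(n_elements)))

pattern = {theta: array_factor(8, 0.5, theta) for theta in range(-90, 91, 5)}
peak_angle = max(pattern, key=pattern.get)
print(peak_angle, round(pattern[peak_angle], 1))  # -> 0 8.0 (broadside lobe)
```

Adding a progressive phase shift to each element steers this peak, which is the kernel of the beam-steering behavior a multi-function radar exploits.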
Designing and Prototyping Digital Systems on SoC FPGAs
Hitu Sharma, Application Engineer, MathWorks India
Vinod Thomas, Senior Training Engineer, MathWorks India
Digital system designers are increasingly moving to SoC FPGAs for the implementation of their applications due to the high-speed compute capabilities of FPGAs and the ability to perform complex operations on DSPs or MCUs. This combination of ARM® cores along with the programmable logic of a conventional FPGA requires designers to adopt hardware and software co-design methodology. With Model-Based Design, design teams can simulate models of complete systems, partition designs between hardware and software, and use automatic C/C++ and HDL code generation to prototype on Xilinx® Zynq® or Intel® SoC platforms in an integrated workflow.
In this talk, you will learn how to move from design to prototype on SoC FPGAs through:
- Exploration of hardware-software architecture partitioning
- Automatic HDL and C code generation for FPGA fabric and ARM MCU
- Generation of interface logic and software between FPGA and ARM
- Implementation and prototyping on Xilinx Zynq and Intel SoC platforms
Designing and Testing Voice Interfaces through Microphone Array Modeling, Audio Prototyping, and Text Analytics
Vidya Viswanathan, Application Engineer, MathWorks India
Voice assistants have shifted expectations on the future of human-machine interfaces. They are great examples of IoT products integrating the use of different sensors, device connectivity, and advanced algorithms. Successful innovators tackling similarly complex problems today need agile development tools that can leverage existing resources and create prototypes early on.
In this talk, you will learn more about:
- The process of modeling and simulating microphone arrays for the development of voice interfaces for IoT devices
- Early prototyping through real-time streaming and processing of audio signals
- Speech processing and analysis
Designing and Verifying Digital and Mixed-Signal Systems
Aniruddha Dayalu, Principal Application Engineer, MathWorks India
When designing mixed-signal systems, isolated analog and digital design flows can cause integration issues to be discovered late, which can lead to delayed projects and costly design rework. In addition, slow simulations in traditional EDA tools limit design exploration and exhaustive test coverage during verification, which can lead to a non-optimal or, in the worst case, faulty design. System-level behavioral modeling using MathWorks tools allows issues to be discovered earlier in the design cycle, saving precious time and effort, while rapid behavioral simulations enable extensive architecture exploration and increased verification coverage. In this talk, you will learn about behavioral modeling, rapid design exploration, and verification of mixed-signal systems through:
- System level modeling and simulation of analog and mixed-signal circuits
- Generation of synthesizable digital RTL code
- Verification of SPICE models and HDL code by co-simulating with EDA tools
- Analysis of simulation data with advanced post-processing
Numerical Simulation of a Tsunami on a GPU
Dr. Siva Srinivas Kolukula, Project Scientist - B, Indian National Centre for Ocean Information Services
On December 26, 2004, an undersea megathrust earthquake triggered a series of deadly tsunamis that caused enormous loss of life and property in the regions bordering the Indian Ocean. These devastating tsunamis led to the establishment of early warning systems in tsunami-prone regions, with the prime motivation of detecting tsunamis in advance and issuing warnings to prevent loss of life and property. The major components of tsunami detection are identifying tsunamigenic earthquakes, continuously monitoring sea levels, and numerically simulating the tsunami to estimate water levels and travel times. The major computational challenge is the numerical simulation itself: solving the governing shallow water equations as an initial value problem over a large domain and a long time interval. Because warnings must be issued as early as possible, the faster the simulation runs, the better.
In the presentation, Dr. Siva Srinivas Kolukula investigates the achievable speed of tsunami propagation simulations on a GPU using Parallel Computing Toolbox™. The governing linear shallow water equations are solved with the finite difference method, and the simulation is accelerated using MATLAB®'s GPU computing support. Good speedups are observed even with straightforward GPU use from MATLAB. The results are compared with real-time water-level observations to validate the MATLAB code and are found to be a good match.
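A one-dimensional toy of the solver (not the presenter's code) helps fix ideas: the linear shallow water equations stepped with centered finite differences on a periodic domain, with total mass conserved by construction. All grid and depth values are illustrative, and the real simulations run 2-D domains on GPUs.

```python
# 1-D linear shallow water equations on a periodic grid, integrated with
# a semi-implicit (Euler-Cromer) scheme: update velocity first, then use
# the new velocity to update the surface elevation.
import math

g, depth = 9.81, 1000.0        # gravity, ocean depth (m)
nx, dx = 100, 1000.0           # grid points, spacing (m)
c = math.sqrt(g * depth)       # wave speed ~ 99 m/s
dt = 0.5 * dx / c              # CFL-stable time step

# initial hump of water, zero velocity
eta = [math.exp(-((i - nx // 2) * dx / 5000.0) ** 2) for i in range(nx)]
u = [0.0] * nx

mass0 = sum(eta)
for _ in range(200):
    # du/dt = -g d(eta)/dx ; d(eta)/dt = -depth du/dx  (centered, periodic)
    u = [u[i] - dt * g * (eta[(i + 1) % nx] - eta[i - 1]) / (2 * dx)
         for i in range(nx)]
    eta = [eta[i] - dt * depth * (u[(i + 1) % nx] - u[i - 1]) / (2 * dx)
           for i in range(nx)]

print(abs(sum(eta) - mass0) < 1e-9)  # -> True: mass is conserved
```

Every grid point's update is independent of the others within a step, which is exactly the data parallelism a GPU exploits.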
System-Level Radar Simulation Using Model-Based Design
Dr. Dyana A, Scientist D, LRDE, DRDO
Phased array radar systems consist of different subsystems for the antenna, waveform, RF signal, and data processing. These systems also depend on external entities such as targets, clutter, jammers, and channels. Such complex systems pose challenges in understanding the interactions among multiple domains and in developing an efficient radar system.
In this presentation, you will learn how simulation is used for the efficient design of phased array radar systems.
Verifying the Hardware Implementation of Automotive Radar Signal Processing with MATLAB
Sainath K, Design Engineer, NXP
Shashank Venugopal, Design Engineer, NXP
One of the key challenges posed by automotive radar designs is the functional and performance verification of signal processing hardware implementations such as sigma-delta ADCs, decimation chains, and filters. Verification of these signal processing blocks can be done in either the time domain or the frequency domain. Time-domain checks involve developing reference verification models, which are used to check the correctness of the DUT implementation.
This methodology serves the basic requirement of checking the functional correctness of the DUT but involves developing reference models that are highly sensitive to DUT design changes. The team at NXP therefore looked for a verification approach based on the powerful signal analysis functions of MATLAB®, reducing repetitive model development and increasing verification productivity. The verification metrics used to evaluate the DUT implementation are FFT, SNR, and THD, computed using built-in MATLAB functions.
Shashank Venugopal presents the team’s experiences of integrating checks based on MATLAB into SV/UVM-based digital verification environments using DPI-C flow.
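The frequency-domain metrics mentioned above can be illustrated in miniature: estimate the SNR of a sampled tone by separating the fundamental bin of a DFT from the rest of the spectrum. This sketch is a simplification of what MATLAB's snr() does (no windowing or harmonic exclusion), and all parameters are invented.

```python
# SNR of a noisy tone via a direct DFT: the fundamental bin carries the
# signal power, and every other (positive-frequency) bin is counted as noise.
import cmath, math, random

random.seed(1)
n, tone_bin = 256, 16  # tone placed exactly on a bin, so there is no leakage
signal = [math.sin(2 * math.pi * tone_bin * i / n) + random.gauss(0, 0.01)
          for i in range(n)]

# direct DFT (O(n^2)) is fine for a small verification vector
spectrum = [abs(sum(signal[i] * cmath.exp(-2j * math.pi * k * i / n)
                    for i in range(n))) for k in range(n // 2)]

signal_power = spectrum[tone_bin] ** 2
noise_power = sum(m * m for k, m in enumerate(spectrum) if k != tone_bin)
snr_db = 10 * math.log10(signal_power / noise_power)
print(round(snr_db, 1))  # tens of dB for this low noise level
```

A THD check is the same pattern with the harmonic bins (2x, 3x the fundamental) summed instead of the noise floor.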
Automated Driving Development with MATLAB and Simulink
Manohar Reddy, Senior Application Engineer, MathWorks India
ADAS and autonomous driving technologies are redefining the automotive industry, changing all aspects of transportation, from daily commutes to long-haul trucking. Engineers across the industry use Model-Based Design with MATLAB® and Simulink® to develop their automated driving systems. This talk demonstrates how MATLAB and Simulink serve as an integrated development environment for the different domains required for automated driving, including perception, sensor fusion, and control design.
This talk covers:
- Perception algorithm design using deep learning
- Sensor fusion design and verification
- Control algorithm design with Model Predictive Control Toolbox™
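A minimal building block behind the sensor fusion bullet is the Kalman update; the sketch below fuses noisy position measurements from two hypothetical sensors into a single estimate of a static state. Noise levels are invented; production designs use multi-object trackers in MATLAB and Simulink.

```python
# 1-D Kalman updates fusing two sensors of different accuracy: each update
# blends the current estimate with a measurement in proportion to their
# relative variances.
import random

random.seed(2)
true_pos, x_est, p_est = 10.0, 0.0, 100.0   # true state, estimate, variance

def fuse(x_est, p_est, measurement, r):
    """One Kalman update for a static state."""
    k = p_est / (p_est + r)                  # Kalman gain
    return x_est + k * (measurement - x_est), (1 - k) * p_est

for _ in range(50):
    radar = true_pos + random.gauss(0, 1.0)   # noisier sensor
    camera = true_pos + random.gauss(0, 0.5)  # more accurate sensor
    x_est, p_est = fuse(x_est, p_est, radar, 1.0 ** 2)
    x_est, p_est = fuse(x_est, p_est, camera, 0.5 ** 2)

print(round(x_est, 1))  # converges near the true position, 10.0
```

A full tracker adds a motion model between updates and data association across multiple objects, but the variance-weighted blend shown here is the core of the fusion step.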
Autonomous Drive
Gopinath Chidambaram, Software Architect, L&T Technology Services
Research is a systematic investigative process. L&T Technology Services works on autonomous drive, building knowledge and working with various OEMs and Tier 1s to discover new facts and implementations in limited time. Although many problems turn out to have several solutions (means to close the gap or correct the deviation), difficulties arise when such means are neither obvious nor immediately available. L&T Technology Services likewise faced many challenges in developing autonomous applications, such as a sensor fusion model, a deep learning architecture, machine learning, lidar-based object detection, and a control model, all of which can be time-consuming to develop.
The company identified MATLAB® as a key tool whose toolboxes enable them to move faster and prove concepts precisely. AEB (automated emergency braking) is an important feature of automated driving systems; its goal is to provide a correct, timely, and reliable control signal so the system can act on an impending collision with objects in front of the vehicle. AEB poses various practical challenges: a single-sensor (monocular camera) system may be appropriate for an ADAS warning function, whereas a system that takes action needs much higher certainty and requires supplementary sensors.
In this presentation, Gopinath Chidambaram will explain how the challenges L&T Technology Services faced in autonomous system development have been addressed using MATLAB and Simulink®.
Demystifying Deep Learning
Dr. Amod Anandkumar, Senior Team Lead, MathWorks India
Deep learning can achieve state-of-the-art accuracy for many tasks considered algorithmically unsolvable using traditional machine learning, including classifying objects in a scene or recognizing optimal paths in an environment. Gain practical knowledge of the domain of deep learning and discover new MATLAB® features that simplify these tasks and eliminate the low-level programming. From prototype to production, you’ll see demonstrations on building and training neural networks and hear a discussion on automatically converting a model to CUDA® to run natively on GPUs.
Deploying Deep Neural Networks to Embedded GPUs and CPUs
Rishu Gupta, Ph.D., Senior Application Engineer, MathWorks India
Designing and deploying deep learning and computer vision applications to embedded CPU and GPU platforms is challenging because of resource constraints inherent in embedded devices. A MATLAB® based workflow facilitates the design of these applications, and automatically generated C or CUDA® code can be deployed on boards like the Jetson TX2 and DRIVE™ PX to achieve very fast inference.
The presentation illustrates how MATLAB supports all major phases of this workflow. In the algorithm design phase, deep neural networks can be augmented with traditional computer vision techniques and tested and verified within MATLAB. Next, these networks are trained using MATLAB's GPU and parallel computing support on the desktop, a cluster, or the cloud. Finally, GPU Coder™ generates portable, optimized C/C++ and/or CUDA® code from the MATLAB algorithm, which is then cross-compiled and deployed to CPUs and/or a Tegra® board. Benchmarks show that the performance of the auto-generated CUDA code is ~2.5x faster than MXNet, ~5x faster than Caffe2, ~7x faster than TensorFlow®, and on par with TensorRT™ implementations.