Hyderabad Abstracts

Beyond the “I” in AI

09:45–10:15

Insight. Implementation. Integration.

AI, or artificial intelligence, is transforming the products we build and the way we do business. It also presents new challenges for those who need to build AI into their systems. Creating an “AI-driven” system requires more than developing intelligent algorithms. It also requires:

  • Insights from domain experts to generate the tests, models, and scenarios required to build confidence in the overall system
  • Implementation details, including data preparation, compute-platform selection, modeling and simulation, and automatic code generation
  • Integration into the final engineered system

Mike Agostini demonstrates how engineers and scientists are using MATLAB® and Simulink® to successfully design and incorporate AI into the next generation of smart, connected systems.

Mike Agostini, MathWorks


Model-Based Product Development and Analytics – Transforming Enablers in Systems Development

10:15–10:45

Vijay Chari,
United Technologies


What’s New in MATLAB and Simulink

10:45–11:15

Learn about new capabilities in the MATLAB® and Simulink® product families to support your research, design, and development workflows. This talk highlights features for deep learning, wireless communications, automated driving, and other application areas. You will see new tools for defining software and system architectures, and modeling, simulating, and verifying designs.

Prashant Rao, MathWorks

AI Techniques in MATLAB for Signal, Time-Series, and Text Data

11:45–12:15

Developing predictive models for signal, time-series, and text data using artificial intelligence (AI) techniques is growing in popularity across a variety of applications and industries, including speech classification, radar target classification, physiological signal recognition, and sentiment analysis.

In this talk, you will learn how MATLAB® empowers engineers and scientists to apply deep learning beyond the well-established vision applications. You will see demonstrations of advanced signal and audio processing techniques such as automated feature extraction using wavelet scattering and expanded support for ground truth labeling. The talk also shows how MATLAB covers other key elements of the AI workflow:

  • Use of signal preprocessing techniques and apps to improve the accuracy of predictive models
  • Use of transfer learning and wavelet analysis for radar target and ECG classification
  • Interoperability with other deep learning frameworks through importers and ONNX converter for collaboration in the AI ecosystem
  • Scalability of computations with GPUs, multi-GPUs, or on the cloud
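
To give a flavor of that workflow, here is a minimal sketch (not taken from the talk) of wavelet scattering feature extraction followed by a classical classifier. It assumes Wavelet Toolbox and Statistics and Machine Learning Toolbox, and that signals (an N-by-L matrix of recordings) and labels (an N-by-1 vector) already exist in the workspace.

    fs = 250;                                    % assumed sampling rate (Hz)
    sf = waveletScattering('SignalLength', size(signals,2), ...
                           'SamplingFrequency', fs);

    features = [];                               % grown in the loop for brevity
    for k = 1:size(signals,1)
        S = featureMatrix(sf, signals(k,:)');    % scattering coefficients for one signal
        features(k,:) = mean(S, 2)';             % time-average each scattering path
    end

    mdl = fitcecoc(features, labels);            % multiclass SVM on the features
    cv  = crossval(mdl, 'KFold', 5);
    disp(kfoldLoss(cv))                          % estimated classification error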

Dr. Shayoni Datta, MathWorks


Machine Learning-Based Tool for Predictive Analysis in Computational Pathology

12:15–12:45

The advantage of leveraging machine learning techniques in digital pathology is that it provides the capability to learn, to extract knowledge, and to make predictions from a combination of heterogeneous data (i.e. the histological images, the patient history, and the omics data). The ability to mine sub-visual image features from digital pathology slide images—features that sometimes may not be visually discernible—offers the opportunity for better quantitative modeling of disease appearance, and hence, possibly improved prediction of disease aggressiveness and patient outcomes.

Onward Health has built a tool with expert doctor inputs, state-of-the-art computer vision techniques, and machine learning algorithms. The imaging algorithms are designed to detect different types of cells. Once cells are detected, the algorithm processes each one to determine various properties, such as shape, texture, color distribution features, and distance to the tumor and other spatial neighbors. The tool employs machine learning techniques to:

  • Automatically identify the location of tumor or stroma
  • Cluster patients based on their survival probabilities or risk score similarity
  • Combine the image features along with other patient-related features, such as age, gender, history, medications, and genetics, to extract different types of insights and inferences

The company’s vision is that the tool will deliver multiple benefits, including cross-verifiable evidence of tumors, better resource management, and deeper insight into tumor behavior.
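
Purely as an illustration of the kind of per-cell image features involved (this is not Onward Health’s pipeline), the sketch below extracts simple shape and intensity features from a roughly segmented patch using Image Processing Toolbox; the file name and the clustering step are hypothetical.

    I    = imread('histology_patch.png');          % hypothetical image patch
    gray = rgb2gray(I);
    bw   = imbinarize(imcomplement(gray));         % rough nuclei segmentation
    bw   = bwareaopen(bw, 30);                     % drop small specks

    stats = regionprops('table', bw, gray, ...
            'Area', 'Eccentricity', 'MeanIntensity');

    % Per-patch summary features (cell size, shape, staining intensity)
    patchFeatures = [mean(stats.Area), mean(stats.Eccentricity), ...
                     mean(stats.MeanIntensity)];

    % With one such feature row per patient (patientFeatures, hypothetical),
    % patients could be grouped into risk clusters:
    % riskGroup = kmeans(zscore(patientFeatures), 3);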

Dinesh Koka,
Onward Health

Vineet Sharma,
Onward Health


Industrial IoT and Digital Twins

12:15–12:45

Industrial IoT has driven the rise of connected devices that stream information and optimize operational behavior over the course of a device’s lifetime.

This presentation covers how to develop and deploy MATLAB® algorithms and Simulink® models as digital twin and IoT components on assets, edge devices, or the cloud for anomaly detection, control optimization, and other applications. It includes an introduction to how assets, edge, and OT/IT components are connected.

The talk features customer use cases starting from design to final operation, the underlying technology, and results.

Pallavi Kar, MathWorks


Optimization in Energy Management Systems

14:30–15:00

Energy management systems (EMS) for homes, buildings, factories, and communities are an important part of the trend towards smarter systems, providing better energy system planning, dispatch, resilience, and operation. These systems manage both generation and consumption to respond optimally to variations in demand, market prices, and environmental conditions.

To develop an EMS, an engineer needs to use data analytics, control, simulation, and optimization for:

  • Electric demand forecasting
  • Electrical system modeling and simulation
  • Operations optimization
  • Tradeoff analysis

MATLAB® and Simulink® provide an integrated platform with both data analytics and Model-Based Design. You can build predictive models of demand and optimization models to minimize cost in MATLAB. Then, combine these with a system model built with Simulink and Simscape™ that integrates power electronics and controls. Deployment can be to embedded systems, or to enterprise or cloud environments.
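
A minimal sketch of the optimization piece, with made-up price and demand data and a greatly simplified battery model (assumes Optimization Toolbox):

    T      = 24;                                  % hours in the planning horizon
    price  = 0.10 + 0.08*rand(T,1);               % assumed energy price ($/kWh)
    demand = 5 + 2*rand(T,1);                     % assumed site load (kW)

    prob  = optimproblem('ObjectiveSense', 'minimize');
    gridP = optimvar('gridP', T, 'LowerBound', 0);                    % power purchased (kW)
    battP = optimvar('battP', T, 'LowerBound', -3, 'UpperBound', 3);  % battery power (+ = charging)
    soc   = optimvar('soc',   T, 'LowerBound', 0, 'UpperBound', 10);  % stored energy (kWh)

    prob.Objective = price' * gridP;                                  % minimize energy cost
    prob.Constraints.balance = gridP == demand + battP;               % meet the load
    prob.Constraints.storage = soc == 5 + tril(ones(T)) * battP;      % simple state-of-charge bookkeeping

    sol = solve(prob);
    plot(1:T, sol.gridP)                                              % optimal purchase schedule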


Predictive Maintenance with MATLAB

15:00–15:30

Predictive maintenance reduces operational costs for organizations running and manufacturing expensive equipment by predicting failures from sensor data. However, identifying and extracting useful information from sensor data is a process that often requires multiple iterations as well as a deep understanding of the machine and its operating conditions.

In this talk, you will learn how MATLAB® and Predictive Maintenance Toolbox™ combine machine learning with traditional model-based and signal processing techniques to create hybrid approaches for predicting and isolating failures. You will also see built-in apps for extracting, visualizing, and ranking features from sensor data without writing any code. These features can then be used as condition indicators for fault classification and remaining useful life (RUL) algorithms.

Predictive maintenance algorithms make the greatest impact when they are developed for a fleet of machines and deployed in production systems. This talk will show you how to validate your algorithms, and then integrate them with your embedded devices and enterprise IT/OT platforms.
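
For context, here is a minimal Predictive Maintenance Toolbox sketch of the RUL step; healthIndicator is an assumed table of time stamps and a condition indicator, and the threshold value is made up.

    mdl = exponentialDegradationModel('LifeTimeUnit', 'hours');

    update(mdl, healthIndicator);          % fit the model to the observed indicator
    threshold = 1.5;                       % assumed failure threshold for the indicator
    estRUL = predictRUL(mdl, threshold);   % estimated remaining useful life (hours)
    disp(estRUL)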

Amit Doshi, MathWorks


Developing and Deploying Machine Learning Solutions for Embedded Applications

16:15–16:45

Machine learning is a powerful tool for solving complex modeling problems across a broad range of industries. The benefits of machine learning are being realized in applications everywhere, including predictive maintenance, health monitoring, financial portfolio forecasting, and advanced driver assistance. However, developing predictive models for signals obtained from sensors is not a trivial task. Moreover, there is an increasing need for smart sensor signal processing algorithms that can be deployed either on edge nodes and embedded devices or on the cloud, depending on the application. MATLAB® and Simulink® provide a platform for exploring and analyzing time-series data and a unified workflow for embedded software development, from prototyping to production, including C code generation, processor-in-the-loop testing, and rapid prototyping on popular hardware platforms.

In this talk, you will learn about:

  • Time-frequency feature extraction techniques, such as wavelets, for machine learning workflows
  • Automatic C code generation for preprocessing, feature extraction, and machine learning algorithms
  • Rapid prototyping on embedded hardware such as Raspberry Pi™ and Android™
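
A minimal sketch of the code-generation step, with hypothetical function, file, and variable names (assumes Statistics and Machine Learning Toolbox and MATLAB Coder):

    % Desktop: train a model and save it in a code-generation-compatible form
    mdl = fitcsvm(trainFeatures, trainLabels);        % assumed training data
    saveLearnerForCoder(compact(mdl), 'svmModel');    % writes svmModel.mat

    % Entry-point function, saved separately as predictFault.m:
    %   function label = predictFault(features) %#codegen
    %       mdl = loadLearnerForCoder('svmModel');
    %       label = predict(mdl, features);
    %   end

    % Generate a C library for a 1-by-10 feature vector
    codegen predictFault -args {zeros(1,10)} -config:lib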

Nitin Rai, MathWorks


Building and Sharing Desktop and Web Apps

16:15–16:45

After algorithms are developed in MATLAB®, engineers often build user interfaces (apps), which can then be distributed across entire organizations. These apps enable teams to automate workflows, get quick quantitative analysis, avoid human error, and collaborate more effectively. Apps and components can be shared both as standalone desktop applications and as software components that integrate with web and enterprise applications.

In this talk, you will learn how to: 

  • Develop responsive MATLAB applications with rich data visualizations
  • Share these applications with other MATLAB users and non-MATLAB users
  • Deploy MATLAB applications to enterprise production systems and the web
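
As a flavor of the programmatic route (apps are more often built interactively in App Designer), here is a minimal uifigure sketch with an axes and a button that refreshes a plot; the plotted data is arbitrary.

    fig = uifigure('Name', 'Signal Viewer');
    ax  = uiaxes(fig, 'Position', [20 80 360 300]);
    btn = uibutton(fig, 'Text', 'Refresh', ...
        'Position', [20 30 100 30], ...
        'ButtonPushedFcn', @(src, evt) plot(ax, cumsum(randn(100,1))));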

Automated Driving System Design and Simulation

11:45–12:15

ADAS and autonomous driving systems are redefining the automotive industry and changing all aspects of transportation, from daily commutes to long-haul trucking. MATLAB® and Simulink® provide the ability to develop the perception, planning, and control components used in these systems.

In this talk, you will learn about these tools through examples that ship in R2019a, including:

  • Perception: Design LiDAR, vision, radar, and sensor fusion algorithms with recorded and live data
  • Planning: Visualize street maps, design path planners, and generate C/C++ code
  • Controls: Design a model-predictive controller for traffic jam assist, test with synthetic scenes and sensors, and generate C/C++ code
  • Deep learning: Label data, train networks, and generate GPU code
  • Systems: Simulate perception and control algorithms, as well as integrate and test hand code
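
A minimal scenario sketch (not one of the shipping examples) built with Automated Driving Toolbox, using made-up road geometry and speeds:

    scenario = drivingScenario;
    road(scenario, [0 0; 100 0]);                 % straight 100 m road segment

    ego = vehicle(scenario, 'ClassID', 1);
    trajectory(ego, [1 0; 99 0], 10);             % ego vehicle at 10 m/s

    leadCar = vehicle(scenario, 'ClassID', 1);
    trajectory(leadCar, [30 0; 99 0], 15);        % faster lead vehicle ahead

    while advance(scenario)
        poses = targetPoses(ego);                 % ground-truth poses in the ego frame
    end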

Dr. Amod Anandkumar, MathWorks


LiDAR-Based Exploration of Unknown Indoor Space by a Robotic System

12:15–12:45

The exploration of unknown environments is a fundamental problem for mobile robots, as it involves all the basic capabilities of such systems (e.g., perception, planning, localization, and navigation). From a practical viewpoint, exploration is a central task in many applications, such as planetary missions, intervention in hostile areas, and automatic map building.

This presentation focuses on a technique known as frontier-based exploration. The rationale of this approach is that the robot must move towards the boundary (the frontier) between safe explored areas and unknown territory to maximize the information gain from new perceptions. The talk discusses how to explore indoor spaces in the absence of a predefined map, how to navigate the space while avoiding static obstacles, and how to implement the exploration and navigation modules for ROS in MATLAB® and simulate the virtual Husky robot in ROS Gazebo using Robotics System Toolbox™.
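
A minimal connectivity sketch (assumed ROS master address and standard topic names) showing the MATLAB-to-ROS plumbing such an exploration module relies on:

    rosinit('192.168.1.10');                     % connect to the ROS master (assumed IP)

    scanSub = rossubscriber('/scan');            % laser scan topic (assumed name)
    scan    = receive(scanSub, 10);              % wait up to 10 s for a message
    ranges  = scan.Ranges;                       % distances consumed by the frontier logic

    cmdPub = rospublisher('/cmd_vel', 'geometry_msgs/Twist');
    cmd    = rosmessage(cmdPub);
    cmd.Linear.X  = 0.2;                         % creep forward
    cmd.Angular.Z = 0.0;
    send(cmdPub, cmd);

    rosshutdown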

Deepak Agarwal,
EbyT Technologies Pvt. Ltd.


LiDAR Processing for Automated Driving

12:15–12:45

The use of LiDAR as a sensor for perception in Level 3 and Level 4 automated driving functionality is gaining popularity. MATLAB® and Simulink® can acquire and process LiDAR data for algorithm development for automated driving functions such as free space and obstacle detection. With the point-cloud processing functionality in MATLAB, you can develop algorithms for LiDAR processing, and visualize intermediate results to gain insight into system behavior.

This talk shows new capabilities including:

  • Acquiring live and offline data from Velodyne® sensors
  • Registering LiDAR point clouds 
  • Segmenting objects and detecting obstacles
  • Applying deep learning to LiDAR data
  • Generating C/C++ and CUDA® code from LiDAR processing algorithms
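
A minimal point-cloud sketch (assumed recording file and sensor model; requires Computer Vision Toolbox) that removes the ground plane and clusters the remaining returns into obstacles:

    veloReader = velodyneFileReader('drive.pcap', 'HDL32E');   % assumed recording
    ptCloud    = readFrame(veloReader, 1);

    groundIdx = segmentGroundFromLidarData(ptCloud);           % logical mask of ground points
    obstacles = select(ptCloud, ~groundIdx, 'OutputSize', 'full');

    [labels, numClusters] = segmentLidarData(obstacles, 1.0);  % 1 m clustering tolerance
    pcshow(obstacles.Location, labels)                         % color points by cluster
    title(numClusters + " objects detected")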

Avinash Nehemiah, MathWorks


Developing Autonomous Robots Using MATLAB and Simulink

14:30–15:00

Autonomous robots are being developed in many industries, from industrial warehouses to consumer products. This talk demonstrates new features in MATLAB® and Simulink® for the different functional domains of robotics, including hardware design, perception, planning and decision making, and control design. It describes the challenges and walks through a common robotics workflow. Some of the topics that will be covered include:

  • Developing kinematic and dynamic models of robots
  • Perception algorithm design using deep learning
  • Path planning with obstacle avoidance
  • Supervisory logic and control using Stateflow®
  • Model-Based Design for developing and testing robotics systems
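
By way of illustration, here is the kinematic-model idea in its simplest form: forward kinematics of a planar two-link arm in plain MATLAB, with assumed link lengths and joint angles.

    L1 = 0.5;  L2 = 0.3;                 % assumed link lengths (m)
    theta = deg2rad([30, 45]);           % joint angles

    elbow = [L1*cos(theta(1)), L1*sin(theta(1))];
    tip   = elbow + [L2*cos(theta(1)+theta(2)), L2*sin(theta(1)+theta(2))];

    plot([0 elbow(1) tip(1)], [0 elbow(2) tip(2)], '-o')
    axis equal
    title('Two-link arm forward kinematics')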

Develop and Test Vehicle Controllers for ADAS and Automated Driving Applications through System Simulation

15:00–15:30

When developing and testing sensor fusion algorithms and vehicle controllers for ADAS and automated driving applications, engineers need to consider the complex interplay between multiple sensors and the dynamics of the vehicle as it operates in a wide variety of scenarios, many of which may not be available in recorded data during development. With multidomain system modeling and simulation-based testing in MATLAB® and Simulink®, you can close the loop in simulation, thus accelerating the development process and reducing costly and hazardous in-vehicle testing.

In this talk, you will learn how to:

  • Design model predictive control-based vehicle controllers
  • Create synthetic scenarios and test sensor fusion and control algorithms using system simulation
  • Improve simulation fidelity with gaming engine integration, vehicle dynamics modeling, and automated scenario creation from recorded data
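
A minimal sketch of the controller-design step, with an assumed first-order plant and made-up tuning values (requires Model Predictive Control Toolbox and Control System Toolbox):

    plant = tf(1, [10 1]);               % assumed speed response to acceleration command
    Ts    = 0.1;                         % controller sample time (s)

    mpcobj = mpc(plant, Ts, 20, 3);      % prediction horizon 20, control horizon 3
    mpcobj.MV.Min = -3;                  % assumed braking limit
    mpcobj.MV.Max =  2;                  % assumed acceleration limit

    sim(mpcobj, 200, 15)                 % track a 15 m/s set point for 200 steps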

Abhisek Roy, MathWorks


Deep Learning and Reinforcement Learning Workflows in AI

16:15–16:45

AI, or artificial intelligence, is powering a massive shift in the roles that computers play in our personal and professional lives. Two new workflows, deep learning and reinforcement learning, are transforming industries and improving applications such as diagnosing medical conditions, driving autonomous vehicles, and controlling robots.

This talk dives into how MATLAB® supports deep learning and reinforcement learning workflows, including:

  • Automating preparation and labeling of training data
  • Interoperability with open source deep learning frameworks
  • Training deep neural networks on image, signal, and text data
  • Tuning hyper-parameters to accelerate training time and increase network accuracy
  • Generating multi-target code for NVIDIA®, Intel®, and ARM®
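
A minimal transfer-learning sketch (assumed image folder; requires Deep Learning Toolbox and the ResNet-18 support package):

    imds = imageDatastore('myImages', ...                     % assumed labeled image folder
        'IncludeSubfolders', true, 'LabelSource', 'foldernames');
    [trainImds, valImds] = splitEachLabel(imds, 0.8, 'randomized');

    net        = resnet18;
    lgraph     = layerGraph(net);
    numClasses = numel(categories(imds.Labels));

    % Swap the final layers for the new classes
    lgraph = replaceLayer(lgraph, 'fc1000', ...
        fullyConnectedLayer(numClasses, 'Name', 'fc_new'));
    lgraph = replaceLayer(lgraph, 'ClassificationLayer_predictions', ...
        classificationLayer('Name', 'output'));

    % Resize images to the network input size and train
    inputSize = net.Layers(1).InputSize;
    augTrain  = augmentedImageDatastore(inputSize(1:2), trainImds);
    augVal    = augmentedImageDatastore(inputSize(1:2), valImds);

    opts = trainingOptions('sgdm', 'InitialLearnRate', 1e-4, ...
        'MaxEpochs', 5, 'ValidationData', augVal);
    trainedNet = trainNetwork(augTrain, lgraph, opts);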

Avinash Nehemiah, MathWorks


Deploying Deep Neural Networks to Embedded GPUs and CPUs

16:45–17:15


Dr. Rishu Gupta, MathWorks

Systems Engineering: Requirements to Architecture to Simulation

11:45–12:15

Systems engineering and model-based systems engineering can mean different things to different groups, but most definitions share a common set of concepts, including starting from a set of system-level requirements that drive a system decomposition and requirements allocation process. Trade-off studies are then performed on system architecture alternatives to produce a candidate architecture, from which the design is developed and then simulated to verify that the requirements are met.

This presentation shows how MathWorks tools can support this workflow by allowing users to: 

  • Capture, view, analyze, and manage requirements
  • Develop a system architecture model from the requirements, existing Simulink® models, ICDs, externally created architectures, or combinations of the above
  • Examine the system architecture model using different views for different concerns
  • Allocate (link) requirements to architectural components and perform coverage and change impact analysis
  • Perform trade studies to compare, assess, or optimize the system architecture
  • Design components specified in the system architecture model
  • Simulate the system composition to verify system-level behavior

Gaurav Dubey, MathWorks


Optimizing Robotic Systems with Simscape

12:45–13:15

Robotic systems are everywhere—manufacturing lines, amusement parks, and even in your house. Optimizing the performance of a robotic system is a complex task that involves mechanical, electrical, and algorithm design. In this presentation, you will see how Simscape™ enables you to model the physical system so that you can minimize power consumption and increase the robustness of your design.


Developing a Battery Management System Using Simulink

12:45–13:15

Battery management systems (BMS) ensure maximum performance, safe operation, and optimal lifespan of battery pack energy storage systems under diverse charge-discharge and environmental conditions. With Simulink®, engineers can use simulations to model feedback and supervisory control algorithms that monitor cell voltage and temperature, estimate state-of-charge (SOC) and state-of-health (SOH) across the pack, control charging and discharging rates, balance SOC across the battery cells, and isolate the battery from source and load when necessary. From early design tradeoffs to hardware-in-the-loop (HIL) testing of BMS hardware, Simulink helps engineers perform desktop simulations to ensure the BMS performs as intended under all desired operating conditions and meets design durability requirements. In this talk, you’ll learn how Simulink helps engineers from electrical, thermal, and software backgrounds collaborate throughout the development cycle of BMS algorithms.
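
As a toolbox-free illustration of one BMS building block, here is a simple coulomb-counting SOC estimate with assumed values, the kind of open-loop estimate that voltage- and temperature-based corrections then refine:

    capacity_Ah = 2.5;                               % assumed cell capacity
    dt          = 1;                                 % sample time (s)
    current_A   = [2*ones(1,600), -1*ones(1,600)];   % charge, then discharge

    soc    = zeros(size(current_A));
    soc(1) = 0.5;                                    % assumed initial state of charge
    for k = 2:numel(current_A)
        soc(k) = soc(k-1) + current_A(k)*dt/(capacity_Ah*3600);
    end
    soc = min(max(soc, 0), 1);                       % clamp to physical limits

    plot(soc), xlabel('Sample'), ylabel('SOC')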

Prasanna Deshpande, MathWorks


Control Software Development and Testing Using MATLAB

14:30–15:00

With the increasing complexity and continuous evolution of control systems in the automotive domain, developing and maintaining application software with the conventional methodology of handwritten C code is becoming tedious. It is particularly difficult for technology demonstration projects where time is a critical factor. To address this challenge, the model-based development approach is gaining traction among Tier 1 suppliers and OEMs. Automatic verification, validation, and code generation make the development process much more efficient and effective. This not only saves development time, but it also avoids error-prone hand coding, enabling error prevention and early error detection.

In this talk, a model-based approach for application software is discussed in which the control system is developed using Simulink®, Stateflow®, and MATLAB®. For better optimization, standard library functions are created and reused in various sub-modules. Requirements traceability, control function development, and tagging with test cases are all handled in Simulink. To maintain traceability of the models to system requirements, the description of the functionality is written in Simulink blocks once modeling is complete. In this way, ambiguity between the developed model and the system requirements is eliminated, and higher coverage is achieved.

Aditya Chendke,
Mahindra & Mahindra Limited

Nabal Pandey,
Mahindra & Mahindra Limited


Developing and Implementing Digital Control for Power Converters

15:00–15:30

Using a buck-boost power converter example, this talk explains how Simulink® and Simscape Electrical™ are used to develop, simulate, and implement a controller that maintains desired output voltage in the presence of input voltage variations and load changes to achieve fast and stable response. The presentation covers:

  • Modeling passive circuit elements, power semiconductors, and varying power sources and loads
  • Simulating the converter in continuous and discontinuous conduction modes
  • Determining power losses and simulating thermal behavior of the converter
  • Tuning the controller to meet design requirements such as rise time, overshoot, and settling time
  • Generating C code from the controller model for implementation on a Texas Instruments™ C2000™ microcontroller
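
A minimal sketch of the tuning step, using a simple buck-type averaged model and assumed component values for illustration (requires Control System Toolbox):

    s   = tf('s');
    Vin = 12;  L = 100e-6;  C = 220e-6;  R = 5;        % assumed converter values

    plant = Vin / (L*C*s^2 + (L/R)*s + 1);             % duty-cycle-to-output-voltage model

    pi_ctrl = pidtune(plant, 'PI', 2*pi*2e3);          % target roughly 2 kHz loop bandwidth
    cl = feedback(pi_ctrl*plant, 1);

    step(cl)                                           % inspect rise time, overshoot, settling time
    stepinfo(cl)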

Naini Dawar, MathWorks


Simplifying Requirements-Based Verification with Model-Based Design

16:15–16:45

With Model-Based Design, informal textual requirements can be modeled and simulated to verify behavior earlier, and code can then be automatically generated for an embedded target. Requirements often include temporal properties that define complex timing-dependent signal logic, and informal text can be incomplete or inconsistent, which can lead to errors and miscommunication in design and test.

This talk shows you how to model requirements and use the Logical and Temporal Assessments editor in Simulink Test™ to translate informal text requirements into unambiguous assessments with clear, defined semantics that can identify inconsistencies. The temporal assessment language, based on metric temporal logic, provides precise, formal semantics that are highly expressive and extensible for authoring readable assessments. You will learn how to enter assessments with conditions, events, signal values, delays, and responses using the interactive form-based editor. You can view each assessment as an easy-to-understand, English-like statement, or as graphical representations that let you visualize the results and debug design errors.

Vamshi Kumbham, MathWorks


Enterprise-Scale Software Verification for C/C++ Code

16:45–17:15

Polyspace® products help you statically verify embedded software written in C and C++. They can find bugs and prove the absence of overflow, divide-by-zero, out-of-bounds array access, and other run-time errors, and they can check compliance with C/C++ coding standards such as MISRA® and AUTOSAR. Recent developments in Polyspace products help development teams improve software quality, safety, and security across their enterprise. Analysis execution is automated using continuous integration tools such as Jenkins™. Results can be published for web browser-based code review to triage and resolve coding errors. Integration with defect tracking tools like Jira helps manage identified defects. Dashboards display information that development managers can use to monitor software quality, project status, number of defects, and code metrics.

Vaishnavi H.R., MathWorks

5G New Radio Fundamentals: Understanding the Next Generation of Wireless Technology

11:45–12:15

Development of 5G products is accelerating with the first device and network deployments in 2019. 5G New Radio (NR) technology introduces a flexible architecture that will enable the ultra-fast, low-latency communications needed for next-generation mobile broadband networks and applications such as connected autonomous cars, smart buildings and communities, digital healthcare, industrial IoT, and immersive education. The flexibility of the 5G NR standard will make design and test more complex.

Engineers developing 5G enabling technologies and connected devices need a solid understanding of the fundamental concepts behind the 5G NR specification.

This talk demonstrates the key 5G physical layer technologies and concepts. You will learn about the structure of 5G waveforms; how the waveforms are constructed, modulated, and processed; beam management in massive MIMO systems; and methods for simulating and measuring link-level performance.
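
A minimal link-level sketch (assumed parameters; requires 5G Toolbox) showing symbol modulation, a TDL fading channel, and soft demodulation:

    bits  = randi([0 1], 2048, 1);
    txSym = nrSymbolModulate(bits, 'QPSK');

    tdl = nrTDLChannel;                        % tapped-delay-line fading channel
    tdl.DelayProfile = 'TDL-C';
    tdl.SampleRate   = 15.36e6;                % assumed sample rate
    tdl.NumTransmitAntennas = 1;
    tdl.NumReceiveAntennas  = 1;

    rxSym    = tdl(txSym);                     % apply the channel
    noiseVar = 0.01;
    rxSym    = rxSym + sqrt(noiseVar/2) * complex(randn(size(rxSym)), randn(size(rxSym)));

    softBits = nrSymbolDemodulate(rxSym, 'QPSK', noiseVar);   % LLRs for the decoder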

Tabrez Khan, MathWorks


Seamless System Design of RF Transceivers and Antennas for Wireless Systems

12:45–13:15

Wireless engineers are pursuing 5G and other advanced technologies to achieve gigabit data rates, ubiquitous coverage, and massive connectivity for many applications such as IoT and V2X. The need to improve performance and coexist with multiple communications standards and devices while reducing overall area and power imposes challenging requirements on RF front ends. Gaining insight into such complex systems and performing architectural analysis and tradeoffs require a design model that includes DSP, RF, antenna and channel, as well as impairments.

In this talk, you will learn how to model antenna arrays and integrate them in RF front ends for the development of wireless communications, including:

  • Analyzing the performance of antennas with arbitrary geometry
  • Performing array analysis by computing coupling among antenna elements
  • Modeling the architecture of RF front ends
  • Developing baseband and RF beamforming algorithms
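
A minimal Antenna Toolbox sketch (assumed operating frequency) that designs a patch element, forms a small linear array, and inspects coupling and the far-field pattern:

    fc  = 2.4e9;                                   % assumed operating frequency
    ant = design(patchMicrostrip, fc);             % patch element resonant near fc

    arr = linearArray('Element', ant, 'NumElements', 4, ...
                      'ElementSpacing', 0.0625);   % half wavelength at 2.4 GHz

    S = sparameters(arr, fc);                      % coupling between the elements
    pattern(arr, fc)                               % far-field array pattern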

Vidya Viswanathan, MathWorks


End-to-End Airborne Radar System and Signal Processor Design

14:30–15:00

The Honeywell® IntuVue® RDR-4000 weather radar is an advanced, mechanically scanned radar system that provides a 3D display of airborne weather hazards and alerts for weather-related hazards such as turbulence and wind shear. It uses sophisticated signal processing techniques to process the raw digitized RF data into final weather-related warnings. Because it is not a phased array radar system, it is not directly amenable to modeling with Phased Array System Toolbox™; still, the toolbox provides the basic “nuts-and-bolts” to build a sophisticated model.

Yogesh Gharote,
Honeywell Technology Solutions Lab Pvt Ltd


Sensor Fusion and Tracking for Next-Generation Radars

14:30–15:00

The technology required to design and field a multifunction radar drives a corresponding increase in system-level complexity. In addition to performing multiple radar-related tasks such as search and track, these systems are often called upon to perform other functions, such as weather observation, communications, or EW, all with the same hardware resources. To meet the demand for shorter development cycles, modeling and simulation early in this type of project can lower risk and accelerate the pace of the project.

With these trends, sensor fusion and tracking technology is central to data processing and resource management. You can extend your signal processing workflows to directly integrate with data processing algorithms, which also results in efficient resource management and control.

In this talk, you will learn to design, simulate, and analyze systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness for surveillance systems. Through several examples, you will see how to:

  • Define and import scenarios and trajectories for simulation
  • Generate synthetic detection data for radar, EO/IR, sonar, and RWR sensors, along with GPS/IMU sensors for localization
  • Design data association algorithms for real and synthetic data
  • Perform “what-if” analysis on tracker configurations
  • Evaluate system accuracy and performance with standard benchmarks, metrics, and animated plots
  • Model a closed-loop, multifunction radar that performs search and track
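
A minimal tracker sketch (made-up detection; requires Sensor Fusion and Tracking Toolbox) showing the detection-to-tracker plumbing behind these workflows:

    tracker = trackerGNN('FilterInitializationFcn', @initcvekf, ...
                         'ConfirmationThreshold', [2 3]);

    time = 0;
    det  = objectDetection(time, [10; 20; 0], ...          % x, y, z position (m)
                           'MeasurementNoise', eye(3));

    tracks = tracker({det}, time);                          % update the tracker at t = 0

    % Subsequent scans call tracker(detections, time) with new detections and
    % increasing time stamps; confirmed tracks carry the fused state estimates.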

Abhishek Tiwari, MathWorks


Design and Verification of Mixed-Signal and SerDes Systems

16:15–16:45

The design and integration of mixed-signal integrated circuits is becoming increasingly challenging due to analog effects that cannot be neglected and to complex embedded digital signal processing algorithms and control logic.

This talk introduces workflows in MATLAB® and Simulink® that help achieve fast system-level simulation speed, perform comprehensive design space exploration, and develop high-quality behavioral models. You will see a realistic PLL example that shows the entire design and simulation workflow. Through practical examples, you will also learn how to reuse a Simulink model or a MATLAB function in EDA tools to develop high-quality verification testbenches.

Aniruddha Dayalu, MathWorks


Adopting Model-Based Design for FPGA, ASIC, and SoC Development

16:45–17:15

The competing demands of functional innovation, aggressive schedules, and product quality have significantly strained traditional FPGA, ASIC, and SoC development workflows.

This talk shows how you can use Model-Based Design with MATLAB® and Simulink® for algorithm- and system-level design and verification, including how to:

  • Verify the functionality of algorithms in the system context
  • Refine algorithms with data types and architectures suitable for FPGA, ASIC, and SoC implementation
  • Prototype and debug models running live on hardware connected to MATLAB or Simulink
  • Generate and re-generate verified design and verification models for the hardware engineering team
  • Keep the workflow connected to speed verification closure and meet functional safety requirements
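
A minimal HDL Coder sketch of the generation step, with hypothetical model and subsystem names:

    model = 'dsp_filter_model';                          % hypothetical Simulink model
    load_system(model);

    hdlsetup(model);                                     % apply recommended HDL settings
    makehdl([model '/DUT'], 'TargetLanguage', 'VHDL');   % generate VHDL for the DUT subsystem
    makehdltb([model '/DUT']);                           % generate an HDL testbench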

Hitu Sharma, MathWorks

Accelerate Design, Verification, and Deployment of Complex Algorithms onto FPGA/SoC

11:45–13:15

Model-Based Design enables:

  • Design and automatic generation of optimized, readable VHDL or Verilog for any FPGA, ASIC, or SoC
  • Exploration of hardware tradeoffs and verification of the system architecture before beginning implementation
  • Earlier verification of algorithms and system functionality, faster adaptation to specification changes, and evaluation of more design alternatives, with an option to deploy and debug on evaluation hardware directly from Simulink®
  • Co-simulation of models and tests together with handwritten or generated HDL code running in a Mentor Graphics® or Cadence® simulator
  • Simulation of hardware and software architectures, including: 
    • Memory
    • Internal/external connectivity
    • Task scheduling

Vinod Thomas, MathWorks

Swathi Balki, MathWorks


Deploying AI Algorithms on Cloud for Near Real-Time Decision Making

14:30–15:00

With the increasing popularity of AI, new frontiers are emerging in predictive maintenance and manufacturing decision science. However, there are many complexities associated with modeling plant assets, training predictive models for them, and deploying these models at scale, including:

  • Generating failure data: real failure data can be difficult to obtain, but physical simulations can be used to create synthetic data covering a variety of failure conditions
  • Ingesting high-frequency data from many sensors: time-aligning these streams makes it difficult to design a streaming architecture

This talk will focus on building a system to address these challenges using MATLAB®, Simulink®, Apache™ Kafka®, and Microsoft® Azure®. You will see a physical model of an engineering asset and learn how to develop a machine learning model for that asset. To deploy the model as a scalable and reliable cloud service, we will incorporate time-windowing and manage out-of-order data with Apache Kafka.

Pallavi Kar, MathWorks


Model-Based Design for Autonomous Aerial Systems

16:15–17:15

Developing autonomous aerial systems requires engineers to analyze the behavior of various subsystems; simulate sensor, perception, and control algorithms as an integrated platform; and deploy to the actual hardware.

Model-Based Design with MATLAB® and Simulink® is a modular development approach that enables engineering teams to move from internal research and development to design and implementation in a single environment.

In this master class, MathWorks engineers will showcase:

  • Single environment for building UAVs, from requirements to deployment
  • Modeling environmental effects and 6DOF aircraft simulations
  • Designing a drone autopilot and testing its performance under simulated flight conditions
  • Simulating communication with multiple agents and with the ground station
  • Deploying and testing correctness of the flight controller’s generated code