Beyond the “I” in AI
Insight. Implementation. Integration.
AI, or artificial intelligence, is transforming the products we build and the way we do business. It also presents new challenges for those who need to build AI into their systems. Creating an “AI-driven” system requires more than developing intelligent algorithms. It also requires:
- Insights from domain experts to generate the tests, models, and scenarios required to build confidence in the overall system
- Implementation details including data preparation, compute-platform selection, modeling and simulation, and automatic code generation
- Integration into the final engineered system
Mike Agostini demonstrates how engineers and scientists are using MATLAB® and Simulink® to successfully design and incorporate AI into the next generation of smart, connected systems.
What’s New in MATLAB and Simulink
Learn about new capabilities in the MATLAB® and Simulink® product families to support your research, design, and development workflows. This talk highlights features for deep learning, wireless communications, automated driving, and other application areas. You will see new tools for defining software and system architectures, and modeling, simulating, and verifying designs.
Data Science and Predictive Analytics
AI Techniques in MATLAB for Signal, Time-Series, and Text Data
Developing predictive models for signal, time-series, and text data using artificial intelligence (AI) techniques is growing in popularity across a variety of applications and industries, including speech classification, radar target classification, physiological signal recognition, and sentiment analysis.
In this talk, you will learn how MATLAB® empowers engineers and scientists to apply deep learning beyond the well-established vision applications. You will see demonstrations of advanced signal and audio processing techniques such as automated feature extraction using wavelet scattering and expanded support for ground truth labelling. The talk also shows how MATLAB covers other key elements of the AI workflow:
- Use of signal preprocessing techniques and apps to improve the accuracy of predictive models
- Use of transfer learning and wavelet analysis for radar target and ECG classification
- Interoperability with other deep learning frameworks through importers and ONNX converter for collaboration in the AI ecosystem
- Scalability of computations with GPUs, multi-GPUs, or on the cloud
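The talk itself demonstrates MATLAB tooling; as a language-neutral illustration of the idea behind automated feature extraction (compressing a raw signal into a compact, classifier-ready vector), here is a minimal Python sketch using banded spectral energies as a simple stand-in for wavelet scattering. All signal parameters are made up for the example.

```python
import numpy as np

def band_energy_features(x, n_bands=16):
    """Summarize a signal by log energy in equal-width frequency bands.
    An illustrative stand-in for automated feature extraction (e.g.,
    wavelet scattering) ahead of training a classifier."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    bands = np.array_split(spectrum, n_bands)
    return np.log(np.array([b.sum() for b in bands]) + 1e-12)

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
tone = np.sin(2 * np.pi * 50 * t)                      # 50 Hz "class A" signal
noise = np.random.default_rng(0).normal(size=t.size)   # "class B" signal
fa, fb = band_energy_features(tone), band_energy_features(noise)
print(np.argmax(fa))  # energy concentrates in the band containing 50 Hz
```

A classifier trained on such fixed-length vectors, rather than on raw samples, is the pattern the preprocessing bullet above refers to.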
Developing and Deploying Optimization Strategy for Engine Calibrations
Diesel engine calibration is a careful balance between meeting emissions constraints and achieving the lowest possible fuel consumption. In one instance of striking this balance, some areas of the calibration produced enough smoke to cause EGR cooler fouling issues under certain conditions.
This presentation discusses how Cummins addressed this challenge, using an optimization process and a MATLAB®-based toolkit to improve robustness to smoke while maintaining the remaining emissions constraints. The Cummins-developed toolkit uses many MATLAB toolboxes, including Statistics and Machine Learning Toolbox™, Parallel Computing Toolbox™, Optimization Toolbox™, and MATLAB Compiler™.
Akansha Saxena, Cummins
Industrial IoT and Digital Twins
Industrial IoT has brought the rise of connected devices that stream information and optimize operational behavior over the course of a device’s lifetime.
This presentation covers how to develop and deploy MATLAB® algorithms and Simulink® models as digital twin and IoT components on assets, edge devices, or cloud for anomaly detection, control optimization, and other applications. It includes an introduction to how assets, edge, and OT/IT components are connected.
The talk features customer use cases starting from design to final operation, the underlying technology, and results.
Artificial Intelligence Engine Idle Improvement
Engine idle control at Tata Motors Limited is currently performed by a PID controller. The company is working to improve idle quality by reducing overshoot, undershoot, saturation time, and fuel consumption by 10% compared to production vehicles, using artificial intelligence (AI) tools from MathWorks. The team can already run AI-based engine models in software-in-the-loop and plans to build them into the ECU and test in the vehicle.
Tata Motors Limited
Predictive Maintenance with MATLAB
Predictive maintenance reduces operational costs for organizations running and manufacturing expensive equipment by predicting failures from sensor data. However, identifying and extracting useful information from sensor data is a process that often requires multiple iterations as well as a deep understanding of the machine and its operating conditions.
In this talk, you will learn how MATLAB® and Predictive Maintenance Toolbox™ combine machine learning with traditional model-based and signal processing techniques to create hybrid approaches for predicting and isolating failures. You will also see built-in apps for extracting, visualizing, and ranking features from sensor data without writing any code. These features can then be used as condition indicators for fault classification and remaining useful life (RUL) algorithms.
Predictive maintenance algorithms make the greatest impact when they are developed for a fleet of machines and deployed in production systems. This talk will show you how to validate your algorithms, and then integrate them with your embedded devices and enterprise IT/OT platforms.
Developing and Deploying Machine Learning Solutions for Embedded Applications
Machine learning is a powerful tool for solving complex modeling problems across a broad range of industries. The benefits of machine learning are being realized in applications everywhere, including predictive maintenance, health monitoring, financial portfolio forecasting, and advanced driver assistance. However, developing predictive models for signals obtained from sensors is not a trivial task. Moreover, there is an increasing need for developing smart sensor signal processing algorithms, which can be either deployed on edge nodes and embedded devices or on the cloud, depending on the application. MATLAB® and Simulink® provide a platform for exploring and analyzing time-series data and a unified workflow for the development of embedded software by providing a workflow from prototyping to production, including C code generation, processor-in-the-loop testing, and rapid prototyping on popular hardware platforms.
In this talk, you will learn about:
- Time-frequency feature extraction techniques, such as wavelets, for machine learning workflows
- Automatic C code generation for preprocessing, feature extraction, and machine learning algorithms
- Rapid prototyping on embedded hardware such as Raspberry Pi™ and Android™
Building and Sharing Desktop and Web Apps
After algorithms are developed in MATLAB®, engineers often build user interfaces (apps), which can then be distributed across entire organizations. These apps enable teams to automate workflows, get quick quantitative analysis, avoid human error, and collaborate more effectively. Apps and components can be shared both as standalone desktop applications and as software components that integrate with web and enterprise applications.
In this talk, you will learn how to:
- Develop responsive MATLAB applications with rich data visualizations
- Share these applications with other MATLAB users and non-MATLAB users
- Deploy MATLAB applications to enterprise production systems and the web
Deep Learning and Autonomous Systems
Automated Driving System Design and Simulation
ADAS and autonomous driving systems are redefining the automotive industry and changing all aspects of transportation, from daily commutes to long-haul trucking. MATLAB® and Simulink® provide the ability to develop the perception, planning, and control components used in these systems.
In this talk, you will learn about these tools through examples that ship in R2019a, including:
- Perception: Design LiDAR, vision, radar, and sensor fusion algorithms with recorded and live data
- Planning: Visualize street maps, design path planners, and generate C/C++ code
- Controls: Design a model-predictive controller for traffic jam assist, test with synthetic scenes and sensors, and generate C/C++ code
- Deep learning: Label data, train networks, and generate GPU code
- Systems: Simulate perception and control algorithms, as well as integrate and test hand code
Model-in-Vehicle Validation Methodology for ADAS Features
Cognizant Technology Solutions has developed an interface for all sensors in MATLAB® and Simulink® and established vehicle communications using a high-end automotive-grade CPU. This CPU allows the company to run the Simulink model in real time and test the ADAS features, along with the sensors, on the vehicle in real-world driving scenarios. The company calls this technique model-in-vehicle (MIV) validation. MIV enables in-vehicle verification and validation of algorithms early in development, for scenarios that are difficult to replicate virtually, in parallel with the classical methodology of model-in-the-loop (MIL), software-in-the-loop (SIL), and hardware-in-the-loop (HIL) testing.
Cognizant Technology Solutions
LiDAR Processing for Automated Driving
The use of LiDAR as a sensor for perception in Level 3 and Level 4 automated driving functionality is gaining popularity. MATLAB® and Simulink® can acquire and process LiDAR data for algorithm development for automated driving functions such as free space and obstacle detection. With the point-cloud processing functionality in MATLAB, you can develop algorithms for LiDAR processing, and visualize intermediate results to gain insight into system behavior.
This talk shows new capabilities including:
- Acquiring live and offline data from Velodyne® sensors
- Registering LiDAR point clouds
- Segmenting objects and detecting obstacles
- Applying deep learning to LiDAR data
- Generating C/C++ and CUDA® code from LiDAR processing algorithms
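The point-cloud workflow above uses MATLAB; as a language-neutral sketch of one of the simplest steps, obstacle detection via ground removal, here is a Python example that separates a synthetic point cloud by a height threshold. Real pipelines use plane fitting rather than this simplification, and all numbers here are invented.

```python
import numpy as np

def remove_ground(points, z_thresh=0.2):
    """Split an N x 3 point cloud into obstacle and ground points by
    thresholding height above the lowest return. A deliberately simple
    stand-in for plane-fitting ground segmentation."""
    ground_level = points[:, 2].min()
    mask = points[:, 2] > ground_level + z_thresh
    return points[mask], points[~mask]

rng = np.random.default_rng(1)
# Flat ground plane with a little sensor noise, plus a box-shaped obstacle
ground = np.c_[rng.uniform(-10, 10, (100, 2)), rng.normal(0.0, 0.02, 100)]
box = np.c_[rng.uniform(2, 3, (20, 2)), rng.uniform(0.5, 1.5, 20)]
obstacles, ground_pts = remove_ground(np.vstack([ground, box]))
print(len(obstacles), len(ground_pts))
```

Segmenting the remaining obstacle points into individual objects (the clustering step in the bullet list) would follow the same array-based pattern.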
Developing Autonomous Robots Using MATLAB and Simulink
Autonomous robots are being developed in many industries from industrial warehouses to consumer products. This talk demonstrates new features in MATLAB® and Simulink® for the different functional domains of robotics, including hardware design, perception, planning and decision making, and control design. It describes the challenges and walks through a common robotics workflow. Some of the topics that will be covered include:
- Developing kinematic and dynamic models of robots
- Perception algorithm design using deep learning
- Path planning with obstacle avoidance
- Supervisory logic and control using Stateflow®
- Model-Based Design for developing and testing robotics systems
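The path-planning bullet above can be made concrete with a small example. This Python sketch (not the MATLAB tooling the talk covers) runs a breadth-first planner on an occupancy grid, which is among the simplest planners that avoids obstacles while guaranteeing a shortest grid path.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first path planner on an occupancy grid.
    grid[r][c] == 1 marks an obstacle cell. Returns a list of
    (row, col) cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:          # walk parents back to start
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in parent:
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],   # wall blocking the direct route
        [0, 0, 0]]
path = plan_path(grid, (0, 0), (2, 0))
print(path)  # detours around the wall
```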
Develop and Test Vehicle Controllers for ADAS and Automated Driving Applications through System Simulation
When developing and testing sensor fusion algorithms and vehicle controllers for ADAS and automated driving applications, engineers need to consider the complex interplay between multiple sensors and dynamics of the vehicle, while the vehicle operates in a wide variety of scenarios, many of which may not have been available in recorded data during development. With multidomain system modeling and simulation-based testing in MATLAB® and Simulink®, you can close the loop in simulation, thus accelerating the development process and reducing costly and hazardous in-vehicle testing.
In this talk, you will learn how to:
- Design model predictive control-based vehicle controllers
- Create synthetic scenarios and test sensor fusion and control algorithms using system simulation
- Improve simulation fidelity with gaming engine integration, vehicle dynamics modeling, and automated scenario creation from recorded data
Deep Learning and Reinforcement Learning Workflows in AI
AI, or artificial intelligence, is powering a massive shift in the roles that computers play in our personal and professional lives. Two new workflows, deep learning and reinforcement learning, are transforming industries and improving applications such as diagnosing medical conditions, driving autonomous vehicles, and controlling robots.
This talk dives into how MATLAB® supports deep learning and reinforcement learning workflows, including:
- Automating preparation and labeling of training data
- Interoperability with open source deep learning frameworks
- Training deep neural networks on image, signal, and text data
- Tuning hyperparameters to accelerate training time and increase network accuracy
- Generating multi-target code for NVIDIA®, Intel®, and ARM®
Deploying Deep Neural Networks to Embedded GPUs and CPUs
Designing and deploying deep learning and computer vision applications to embedded GPU and CPU platforms like NVIDIA® Jetson AGX Xavier™ and DRIVE AGX is challenging because of the resource constraints inherent in embedded devices. A MATLAB®-based workflow facilitates the design of these applications, and automatically generated C/C++ or CUDA® code can be deployed to achieve up to 2X faster inference than other deep learning frameworks.
This talk walks you through the workflow. Starting with algorithm design, you can employ deep neural networks augmented with traditional computer vision techniques which can be tested and verified within MATLAB. Bring live sensor data from peripheral devices on your Jetson/DRIVE platforms to MATLAB running on your host machine for visualization and analysis. Train your deep neural networks using GPUs and CPUs on the desktop, cluster, or cloud. Finally, GPU Coder™ and MATLAB Coder™ generate portable and optimized CUDA and/or C/C++ code from the MATLAB algorithm, which is then cross-compiled and deployed to Jetson or DRIVE, ARM®, and Intel® based platforms.
Systems Modeling, Implementation, and Verification
Systems Engineering: Requirements to Architecture to Simulation
Systems engineering and model-based systems engineering can mean different things to different groups, but most definitions share a common set of concepts. They start from a set of system-level requirements, which drive a system decomposition and requirements allocation process. Trade-off studies are then performed on system architecture alternatives to produce a candidate architecture, from which the design is developed and then simulated to verify that the requirements are met.
This presentation shows how MathWorks tools can support this workflow by allowing users to:
- Capture, view, analyze, and manage requirements
- Develop a system architecture model from the requirements, existing Simulink® models, ICDs, and externally created architectures or combinations of the above
- Examine the system architecture model using different views for different concerns
- Allocate (link) requirements to architectural components and perform coverage and change impact analysis
- Perform trade studies to compare, assess, or optimize the system architecture
- Design components specified in the system architecture model
- Simulate the system composition to verify system-level behavior
HIL Testing of AMT Control Strategy Using Simscape Plant Models
The control strategy for an automated manual transmission (AMT) is tested using a gearbox model running on a hardware-in-the-loop (HIL) system. The gearbox is part of a parallel hybrid (P2) powertrain configuration. The gearbox plant model uses components from the Simscape™ and Simscape Driveline™ libraries of Simulink®. The plant model can simulate gearshift forces, including synchronization forces, shift and select detent forces, shift fork leverages, rotational and translational inertias, shift sleeve free fly, and dog clutch engagement. Gearshift is performed using two electromechanical actuators, one for shifting and the other for selecting the rail. Detailed actuator models are also created using data provided by the supplier and validated against technical specifications such as full-load speed, no-load speed, and current drawn at various loads. The control software for the AMT is designed in Simulink and Stateflow® using Model-Based Design and flashed onto a rapid prototyping ECU. The plant model, consisting of the gearbox and the gearshift actuators, runs on the HIL system.
Ajitsinh A. Yadav
Developing a Battery Management System Using Simulink
Battery management systems (BMS) ensure maximum performance, safe operation, and optimal lifespan of battery pack energy storage systems under diverse charge-discharge and environmental conditions. With Simulink®, engineers can use simulations to model feedback and supervisory control algorithms that monitor cell voltage and temperature, estimate state-of-charge (SOC) and state-of-health (SOH) across the pack, control charging and discharging rates, balance SOC across the battery cells, and isolate the battery from source and load when necessary. Starting from early design tradeoffs to hardware-in-the-loop (HIL) testing of BMS hardware, Simulink can help engineers perform desktop simulations to ensure the BMS performs as intended under all desired operating conditions and meets design durability requirements. In this talk, you’ll learn how Simulink helps engineers from electrical, thermal, and software backgrounds collaborate throughout the development cycle of BMS algorithms.
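The SOC estimation mentioned above can be illustrated outside Simulink with the simplest possible estimator, coulomb counting, sketched here in Python. Real BMS algorithms fuse this with voltage-based correction (for example, an extended Kalman filter); the cell capacity and current profile below are invented for the example.

```python
import numpy as np

def coulomb_count_soc(current_a, dt_s, capacity_ah, soc0=1.0):
    """Estimate battery state of charge (SOC) by integrating current.
    Positive current discharges the cell. A minimal sketch; production
    BMS estimators also correct SOC using measured cell voltage."""
    charge_ah = np.cumsum(current_a) * dt_s / 3600.0  # amp-seconds -> Ah
    soc = soc0 - charge_ah / capacity_ah
    return np.clip(soc, 0.0, 1.0)

# 2 A constant discharge of a 10 Ah cell for one hour, sampled at 1 Hz
current = np.full(3600, 2.0)
soc = coulomb_count_soc(current, dt_s=1.0, capacity_ah=10.0)
print(soc[-1])  # 20% of capacity consumed
```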
Home Appliances Controls Development Using Model-Based Design
In the past two years, Model-Based Design has been used extensively at Whirlpool for embedded software development to overcome the difficulties and complexities that typically arise during the design lifecycle of embedded software for closed-loop control systems. With Model-Based Design, Whirlpool provides a single design environment in which developers can use a single model throughout the entire lifecycle for data analysis, model visualization, testing and validation, and ultimately product deployment, with or without automatic code generation.
This presentation discusses how Whirlpool has deployed Model-Based Design for controls algorithm development, starting with a single home appliance platform and spreading across its other platforms, including cooking, dishwasher, and refrigeration, because of the numerous advantages the workflow offers. It also explains how the team has brought maturity to the testing process using different MathWorks tools. Whirlpool has deployed a model-based methodology following these steps:
- Requirement capturing and review
- High-level and low-level requirements
- Review and finalization
- Importing Dymola Plant Model into Simulink® using FMU Import (FMI Co-Sim and Model Exchange Interface)
- Development of control model using MATLAB®, Stateflow®, and Simulink
- Data management using data dictionaries
- Model verification using Model Advisor
- Model design error checking using Simulink Design Verifier™
- Model validation using Simulink validation and verification tools and Simulink Test™
- Model validation using rapid control prototyping
- Software-in-the-loop checking
- Code generation
Whirlpool of India Ltd
Developing and Implementing Digital Control for Power Converters
Using a buck-boost power converter example, this talk explains how Simulink® and Simscape Electrical™ are used to develop, simulate, and implement a controller that maintains desired output voltage in the presence of input voltage variations and load changes to achieve fast and stable response. The presentation covers:
- Modeling passive circuit elements, power semiconductors, and varying power sources and loads
- Simulating the converter in continuous and discontinuous conduction modes
- Determining power losses and simulating thermal behavior of the converter
- Tuning the controller to meet design requirements such as rise time, overshoot, and settling time
- Generating C code from the controller model for implementation on a Texas Instruments™ C2000™ microcontroller
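The tuning bullet above can be sketched numerically. This Python example (not the Simscape Electrical™ model from the talk) closes a discrete PI voltage loop around a first-order averaged converter model; all gains, voltages, and time constants are illustrative values chosen for the sketch.

```python
def simulate_pi(v_ref=5.0, v_in=12.0, kp=0.05, ki=2.0,
                dt=1e-4, steps=5000, tau=5e-3):
    """Discrete PI voltage loop around a first-order averaged model of a
    step-down converter. Returns the output voltage after `steps`
    control periods. Parameters are illustrative, not from the talk."""
    v_out, integ = 0.0, 0.0
    for _ in range(steps):
        err = v_ref - v_out
        integ += err * dt
        # PI law with duty-cycle saturation between 0 and 1
        duty = min(max(kp * err + ki * integ, 0.0), 1.0)
        # Averaged plant: output follows duty * v_in with time constant tau
        v_out += dt / tau * (duty * v_in - v_out)
    return v_out

print(simulate_pi())  # settles close to the 5 V reference
```

Sweeping `kp` and `ki` in such a loop is a crude analogue of the rise-time/overshoot tuning the talk performs with simulation tools.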
Simplifying Requirements-Based Verification with Model-Based Design
With Model-Based Design, informal textual requirements can be modeled and simulated to verify behavior earlier, and code for an embedded target can then be generated automatically. Such requirements often include temporal properties that define complex timing-dependent signal logic, and they can be incomplete or inconsistent, leading to errors and miscommunication in design and test.
This talk shows you how you can model requirements and use the Logical and Temporal Assessments editor in Simulink Test™ to translate informal text requirements into unambiguous assessments with clear, defined semantics that can identify inconsistencies. The temporal assessment language, based on metric temporal logic, provides precise, formal semantics that is highly expressive and extensible to author readable assessments. You will learn how to enter assessments with conditions, events, signal values, delays, and responses using the interactive form-based editor. You can view the assessment in an English language-like statement that is easy to understand or view graphical representations that allow you to visualize the results and debug design errors.
Enterprise-Scale Software Verification for C/C++ Code
Polyspace® products help you statically verify embedded software written in C and C++. They can find bugs and prove the absence of overflow, divide-by-zero, out-of-bounds array access, and other run-time errors, and they can check compliance with C/C++ coding standards such as MISRA® and AUTOSAR. Recent developments in Polyspace products help development teams improve software quality, safety, and security across the enterprise. Analysis execution can be automated using continuous integration tools such as Jenkins™. Results can be published for web browser-based code review to triage and resolve coding errors. Integration with defect tracking tools such as Jira helps manage identified defects. Dashboards display information that development managers can use to monitor software quality, project status, number of defects, and code metrics.
Signal Processing Systems: From Design to Implementation
5G New Radio Fundamentals: Understanding the Next Generation of Wireless Technology
Development of 5G products is accelerating with the first device and network deployments in 2019. 5G New Radio (NR) technology introduces a flexible architecture that will enable the ultra-fast, low-latency communications needed for next-generation mobile broadband networks and applications such as connected autonomous cars, smart buildings and communities, digital healthcare, industrial IoT, and immersive education. The flexibility of the 5G NR standard will make design and test more complex.
Engineers developing 5G enabling technologies and connected devices need a solid understanding of the fundamental concepts behind the 5G NR specification.
This talk demonstrates the key 5G physical layer technologies and concepts. You will learn about the structure of 5G waveforms; how the waveforms are constructed, modulated, and processed; beam management in massive MIMO systems; and methods for simulating and measuring link-level performance.
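How OFDM-based waveforms like those in 5G NR are "constructed, modulated, and processed" can be shown in a few lines. The Python sketch below builds one OFDM symbol from QPSK data, adds a cyclic prefix, and demodulates it; it is a bare-bones illustration only, omitting the flexible numerology, reference signals, and channel coding that the NR specification adds.

```python
import numpy as np

# Minimal OFDM symbol: map QPSK symbols onto subcarriers, IFFT to the
# time domain, prepend a cyclic prefix, then demodulate via FFT.
n_fft, n_cp = 64, 16
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=(n_fft, 2))
qpsk = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

tx_time = np.fft.ifft(qpsk)                      # OFDM modulation
tx = np.concatenate([tx_time[-n_cp:], tx_time])  # add cyclic prefix

rx = tx[n_cp:]                                   # strip cyclic prefix
rx_symbols = np.fft.fft(rx)                      # OFDM demodulation
print(np.max(np.abs(rx_symbols - qpsk)))         # ideal channel: ~0 error
```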
5G NR PHY Implementation, Algorithm Design, and New Waveform Research in MATLAB
The team at Sooktha Consulting Private Limited is developing the physical layer of 5G NR in an agile fashion. Because the physical layer is a key and highly complex component of their base station product, it is critical to test the implementation against an established reference before integrating with an RF front end or attempting to interoperate with third-party implementations of the UE side. Further, if issues arise when interoperating with third-party UEs, a reference is needed to validate the implementation in those scenarios. The team expects their MATLAB® implementation to act as that reference, both to speed up their implementation and to validate it. Finally, to customize their product for India's rural connectivity or Internet of Things deployments, they may have to define new algorithms or waveforms and need a reliable reference platform on which to develop and test them. They plan to use 5G Toolbox™ and other packages as that platform, on which they can confidently design and analyze different algorithms and waveforms based on 3GPP 5G NR technology and contribute to the specifications through TSDSI and 3GPP.
Sagar Shriram Salwe, Sooktha Consulting Private Limited
Seamless System Design of RF Transceivers and Antennas for Wireless Systems
Wireless engineers are pursuing 5G and other advanced technologies to achieve gigabit data rates, ubiquitous coverage, and massive connectivity for many applications such as IoT and V2X. The need to improve performance and coexist with multiple communications standards and devices while reducing the overall area and power imposes challenging requirements on RF front ends. Gaining an insight into such complex systems and performing architectural analysis and tradeoffs require a design model that includes DSP, RF, antenna and channel, as well as impairments.
In this talk, you will learn how to model antenna arrays and integrate them in RF front ends for the development of wireless communications, including:
- Analyzing the performance of antennas with arbitrary geometry
- Performing array analysis by computing coupling among antenna elements
- Modeling the architecture of RF front ends
- Developing baseband and RF beamforming algorithms
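The beamforming bullet above rests on a simple idea: phase-shifting the signals at each array element steers the main beam. As a language-neutral illustration (the talk itself uses MATLAB toolboxes), this Python sketch computes the array factor of a uniform linear array with delay-and-sum weights.

```python
import numpy as np

def array_factor(n_elems, d_lambda, steer_deg, angles_deg):
    """Normalized array factor of a uniform linear array with
    phase-shift (delay-and-sum) weights steered to steer_deg.
    d_lambda is the element spacing in wavelengths."""
    k = 2 * np.pi * d_lambda
    n = np.arange(n_elems)
    # Steering weights cancel the inter-element phase at the steer angle
    w = np.exp(-1j * k * n * np.sin(np.radians(steer_deg)))
    a = np.exp(1j * k * np.outer(n, np.sin(np.radians(angles_deg))))
    return np.abs(w @ a) / n_elems

angles = np.linspace(-90, 90, 181)
af = array_factor(n_elems=8, d_lambda=0.5, steer_deg=20, angles_deg=angles)
print(angles[np.argmax(af)])  # main beam sits at the 20 degree steer angle
```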
Antenna Array Simulation and Beamforming for the Expanded GMRT
The Giant Metrewave Radio Telescope (GMRT) is one of the most sensitive instruments in the world for observing celestial objects at radio frequencies. A project proposal called the Expanded GMRT (eGMRT) aims to enhance the scientific capabilities of the GMRT by expanding the telescope's field of view using a multi-element feed at the focus followed by a beamformer. During the prototype development phase, the team at NCRA-TIFR is building an FPGA-based focal plane array (FPA) beamformer in the L-band with 300 MHz bandwidth and the ability to process 30 independent beams using 144 antenna elements.
The development of the FPA beamformer utilizes MATLAB®, Simulink®, and other MathWorks toolboxes for antenna array simulation, optimization of beamformer weights, FPGA implementation, and data analysis. The simulation of the closely spaced Vivaldi antenna array was carried out using Antenna Toolbox™ and Phased Array System Toolbox™. A GUI-based simulation tool was developed for simulating the radiation pattern through user-configurable antenna selection and array configuration. It also helped in understanding the radiation patterns of beams at different offsets from the direction of the main beam and served as a reference for comparison with the results from practical beamformer testing.
The digital design of the beamformer system was carried out using a model-based approach in Simulink. Xilinx® System Generator blocks were used for implementation of the design. This approach significantly reduced the development time and helped in addressing the incremental changes and debugging.
In this presentation, Kaushal D. Buch describes the array simulation technique, model-based implementation of the beamformer, and recent results from the ongoing prototype development. The presentation also describes how different MathWorks products are useful to a project which uses antenna theory, signal processing, and FPGA design.
Kaushal D. Buch,
Giant Metrewave Radio Telescope, NCRA-TIFR
Sensor Fusion and Tracking for Next-Generation Radar
The technology required to design and field a multifunction radar drives a corresponding increase in system-level complexity. In addition to performing multiple radar tasks such as search and track, these systems are often called upon to support other applications such as weather observation, communications, or EW, all with the same hardware resources. To meet the desire for shorter development cycles, modeling and simulation early in this type of project can lower risk and accelerate the pace of the project.
With these trends, sensor fusion and tracking technology is central to data processing and resource management. You can extend your signal processing workflows to directly integrate with data processing algorithms, which also results in efficient resource management and control.
In this talk, you will learn to design, simulate, and analyze systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness for surveillance systems. Through several examples, you will see how to:
- Define and import scenarios and trajectories for simulation
- Generate synthetic detection data for radar, EO/IR, sonar, and RWR sensors, along with GPS/IMU sensors for localization
- Design data association algorithms for real and synthetic data
- Perform “what-if” analysis on tracker configurations
- Evaluate system accuracy and performance with standard benchmarks, metrics, and animated plots
- Model a closed-loop, multifunction radar that performs search and track
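At the core of the tracking workflow above is a state estimator run against each track. As a minimal, language-neutral stand-in for the multi-object trackers the talk covers, this Python sketch runs a 1-D constant-velocity Kalman filter on noisy position measurements; all noise parameters are invented for the example.

```python
import numpy as np

def kalman_cv_track(zs, dt=1.0, q=0.01, r=1.0):
    """1-D constant-velocity Kalman filter: state [position, velocity],
    position-only measurements. q and r are process and measurement
    noise variances (illustrative values)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q, R = q * np.eye(2), np.array([[r]])
    x, P = np.array([[zs[0]], [0.0]]), np.eye(2)
    for z in zs[1:]:
        x = F @ x                           # predict
        P = F @ P @ F.T + Q
        y = z - (H @ x)[0, 0]               # innovation
        S = (H @ P @ H.T + R)[0, 0]
        K = P @ H.T / S                     # Kalman gain
        x = x + K * y                       # update
        P = (np.eye(2) - K @ H) @ P
    return x.ravel()

# Target moving at 2 m/s observed through unit-variance position noise
rng = np.random.default_rng(0)
truth = 2.0 * np.arange(50)
zs = truth + rng.normal(0.0, 1.0, 50)
pos, vel = kalman_cv_track(zs)
print(pos, vel)  # estimates near the true final position and velocity
```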
Design and Verification of Mixed-Signal and SerDes Systems
The design and integration of mixed-signal integrated circuits is increasingly challenging, due both to analog effects that cannot be neglected and to complex embedded digital signal processing algorithms and control logic.
This talk introduces workflows in MATLAB® and Simulink® that help achieve fast system-level simulation speed, perform comprehensive design space exploration, and develop high-quality behavioral models. You will see a realistic PLL example that shows the entire design and simulation workflow. Through practical examples, you will also learn how to reuse a Simulink model or a MATLAB function in EDA tools to develop high quality verification testbenches.
Adopting Model-Based Design for FPGA, ASIC, and SoC Development
The competing demands of functional innovation, aggressive schedules, and product quality have significantly strained traditional FPGA, ASIC, and SoC development workflows.
This talk shows how you can use Model-Based Design with MATLAB® and Simulink® for algorithm- and system-level design and verification, including how to:
- Verify the functionality of algorithms in the system context
- Refine algorithms with data types and architectures suitable for FPGA, ASIC, and SoC implementation
- Prototype and debug models running live on hardware connected to MATLAB or Simulink
- Generate and re-generate verified design and verification models for the hardware engineering team
- Keep the workflow connected to speed verification closure and meet functional safety requirements
Comprehensive Workflow for AUTOSAR Classic and Adaptive Using Model-Based Design
Modeling and code generation for AUTOSAR software components lets you automate the process of specifying and synchronizing lengthy identifiers in designs, code, and description files. Join us to learn about Simulink® support for AUTOSAR, including modeling AUTOSAR Classic and Adaptive software components, simulating AUTOSAR compositions and ECUs, and generating C and C++ production code. A MathWorks engineer will give a brief overview of the latest AUTOSAR standards, including the Classic and Adaptive Platforms, and provide product demonstrations showing how you can use Simulink, AUTOSAR Blockset™, and Embedded Coder™ to design, simulate, verify, and generate code for AUTOSAR application software components.
Deploying AI Algorithms on Cloud for Near Real-Time Decision Making
With the increasing popularity of AI, new frontiers are emerging in predictive maintenance and manufacturing decision science. However, there are many complexities associated with modeling plant assets, training predictive models for them, and deploying these models at scale, including:
- Generating failure data, which can be difficult to obtain, but physical simulations can be used to create synthetic data with a variety of failure conditions.
- Ingesting high-frequency data from many sensors, where time-alignment makes it difficult to design a streaming architecture.
This talk will focus on building a system to address these challenges using MATLAB®, Simulink®, Apache™ Kafka®, and Microsoft® Azure®. You will see a physical model of an engineering asset and learn how to develop a machine learning model for that asset. To deploy the model as a scalable and reliable cloud service, we will incorporate time-windowing and manage out-of-order data with Apache Kafka.
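The time-windowing and out-of-order handling mentioned above can be sketched independently of Kafka. This Python example groups timestamped sensor events into tumbling windows and drops events that arrive later than an allowed-lateness watermark; the function and parameter names are illustrative, not part of any Kafka API.

```python
from collections import defaultdict

def window_events(events, window_s, allowed_lateness_s):
    """Group (timestamp, value) events into tumbling time windows,
    tolerating out-of-order arrival up to a watermark. A minimal sketch
    of the windowing a streaming pipeline would perform."""
    windows = defaultdict(list)
    watermark = float("-inf")
    dropped = []
    for ts, value in events:
        # Watermark trails the latest event time by the allowed lateness
        watermark = max(watermark, ts - allowed_lateness_s)
        if ts < watermark:
            dropped.append((ts, value))   # arrived too late to include
            continue
        windows[int(ts // window_s) * window_s].append(value)
    return dict(windows), dropped

# Events arrive out of order; lateness up to 5 s is tolerated
events = [(1.0, "a"), (3.0, "b"), (2.0, "c"), (12.0, "d"), (4.0, "e")]
wins, dropped = window_events(events, window_s=10.0, allowed_lateness_s=5.0)
print(wins, dropped)  # "e" arrives after the watermark has passed it
```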
Virtual Engine Calibration: DPF Regeneration Example
To reduce calibration workload, MathWorks developed a virtual engine calibration workflow by providing controller and plant models, calibration optimization tools, and an automated process to convert measured data into calibration parameters to achieve accurate vehicle-level powertrain simulation models. In this master class, MathWorks engineers will focus on how to:
- Quickly use a powertrain reference application as a starting point for the system model
- Generate optimal calibration tables using model-based calibration methods on measured data
- Automate filling of calibration tables in the torque structure
- Add DPF regeneration post-inject fueling logic and a DPF after-treatment plant model
- Assess the fuel economy impact of DPF regeneration events at the vehicle level on a standard drive cycle