Insight. Implementation. Integration.
AI, or artificial intelligence, is transforming the products we build and the way we do business. It also presents new challenges for those who need to build AI into their systems. Creating an “AI-driven” system requires more than developing intelligent algorithms. It also requires:
- Insights from domain experts to generate the tests, models, and scenarios required to build confidence in the overall system
- Implementation details, including data preparation, compute-platform selection, modelling and simulation, and automatic code generation
- Integration into the final engineered system
Join us as Loren demonstrates how engineers and scientists are using MATLAB® and Simulink® to successfully design and incorporate AI into the next generation of smart, connected systems.
In this session, Ned and Joe introduce new capabilities in the MATLAB® product family in Releases 2019a and 2019b. They share their insights into how MATLAB is designed to be the language of choice for millions of engineers and scientists worldwide. Attend this session for a unique opportunity to learn from two of the key designers of MATLAB.
Lying hidden within the servers and notebooks of the manufacturing communities, there exists a wealth of untapped knowledge. It is long overdue that we brush off this diligently collected process data and begin to learn the secrets it hides.
Bob will discuss his (occasionally challenging) journey to extract useful insights from a treasure trove of manufacturing data. Demonstrating the power of data analytics in a time-pressured environment, he shows how learning from previous datasets can provide real scientific insight.
During the talk, he will highlight how MATLAB® was integral throughout, aiding rapid prototyping of ideas along with data extraction, processing, visualization, and analysis.
AI, or artificial intelligence, is powering a massive shift in the roles that computers play in our personal and professional lives. Two new workflows, deep learning and reinforcement learning, are transforming industries and improving applications such as diagnosing medical conditions, driving autonomous vehicles, and controlling robots. This talk dives into how MATLAB® supports deep learning and reinforcement learning workflows, including:
- Automating preparation and labelling of training data
- Interoperability with open-source deep learning frameworks
- Training deep neural networks on image, signal, and text data
- Tuning hyper-parameters to reduce training time and increase network accuracy
- Generating multi-target code for NVIDIA®, Intel®, and ARM®
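As a toy illustration of the hyper-parameter tuning step above (not code from the talk; the scoring function is a stand-in for training and evaluating a real network), a grid search over two hyper-parameters might look like:

```python
import itertools

# Stand-in "validation accuracy": peaks at lr=0.01, batch=64 by construction.
# A real workflow would train a network and measure accuracy here.
def mock_accuracy(lr, batch):
    return 1.0 - abs(lr - 0.01) * 10 - abs(batch - 64) / 1000

# Exhaustive grid over learning rate and mini-batch size
grid = itertools.product([0.001, 0.01, 0.1], [32, 64, 128])
best_lr, best_batch = max(grid, key=lambda p: mock_accuracy(*p))
```

Grid search is the simplest strategy; random search or Bayesian optimization scale better as the number of hyper-parameters grows.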
The Spinach Toolbox is a fast open-source spin dynamics simulation library written in MATLAB® that enables thousands of researchers worldwide to run quantum mechanical modelling of magnetic systems. This toolbox is developed by the Spin Dynamics Group (http://spindynamics.org) at Southampton and makes extensive use of both parallel computing and sparse matrix arithmetic on GPUs to simulate microscopic magnetic processes. The package solves the Liouville-von Neumann equation (an extension of Schrödinger's equation to ensembles), whose general solution is known but can be very hard to compute because the complexity scales exponentially with system size. The way forward is to approximate the solution using linear scaling methods. This presentation gives a few examples of these extreme-scale simulations and discusses their implementation in parallel MATLAB.
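As a minimal, hypothetical illustration of the equation's structure (a single spin-1/2 system in Python with NumPy/SciPy, nothing like Spinach's extreme-scale machinery), the vectorized Liouville-von Neumann equation can be propagated with a sparse matrix exponential:

```python
import numpy as np
from scipy.sparse import csr_matrix, identity, kron
from scipy.sparse.linalg import expm_multiply

# Spin-1/2 angular momentum operators
Sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
Sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2

H = 2 * np.pi * csr_matrix(Sz)   # Zeeman Hamiltonian, 1 Hz offset
I2 = identity(2, format="csr", dtype=complex)

# Liouvillian superoperator on the row-major vectorized density matrix:
# vec([H, rho]) = (H kron I - I kron H^T) vec(rho)
L = -1j * (kron(H, I2) - kron(I2, H.T))

rho = np.eye(2) / 2 + Sx         # state with transverse magnetization
rho_vec = rho.reshape(-1)

dt = 0.01
for _ in range(25):              # propagate to t = 0.25 s (quarter period)
    rho_vec = expm_multiply(L * dt, rho_vec)

rho_t = rho_vec.reshape(2, 2)
sx_expect = np.trace(rho_t @ Sx).real   # transverse magnetization at t = 0.25 s
```

For large spin systems the Liouvillian dimension grows exponentially, which is exactly why sparse storage and restricted state-space approximations are essential in practice.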
An increasingly data-centric world poses a whole new set of questions that all businesses must strive to answer, and Leonardo is no different.
This talk will provide insight into some of the challenges associated with data analysis and replay encountered on large-scale engineering projects, and will describe the actions taken to address those challenges through the introduction of a new Hadoop®-based big data platform managed by Cloudera® software.
Andrew and Martin will show how MATLAB® allows engineers to interact with the Hadoop ecosystem from the comfort of their own desktop, and the role MATLAB plays in processing, analysis, and replay workflows when embedded into Hadoop tools.
Engineering teams have more data available to them than ever before. Data from tests, operations, production, and other sources present opportunities for data-based design decisions and new data-based products and services. But many teams struggle to come up with a consistent set of tools and processes for extracting value from this data.
This talk presents a variety of new MATLAB® features for accessing, organizing, and analyzing data, with a special focus on the MATLAB datastore framework for working with large collections of files and MATLAB datatypes for organizing and preprocessing sensor data. See how these tools enable engineering teams to grow from ad-hoc data analysis to building centralized tools for organisation-wide use, laying a foundation for production apps and analytics.
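A rough Python analogy to the datastore idea described above (illustrative only; the file names and the `temp` column are invented) is to iterate a folder of files chunk by chunk, so the full collection never has to fit in memory:

```python
import csv
import tempfile
from pathlib import Path

# Lazily yield one file's worth of sensor readings at a time
def file_datastore(folder, pattern="*.csv"):
    for path in sorted(Path(folder).glob(pattern)):
        with open(path, newline="") as f:
            yield [float(row["temp"]) for row in csv.DictReader(f)]

# Build a small demo folder (stand-in for real sensor logs)
root = Path(tempfile.mkdtemp())
for i, vals in enumerate([[20.0, 21.5], [19.0], [22.0, 23.0, 18.5]]):
    with open(root / f"log{i}.csv", "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["temp"])
        w.writerows([[v] for v in vals])

# Reduce across chunks without loading everything at once
total, count = 0.0, 0
for chunk in file_datastore(root):
    total += sum(chunk)
    count += len(chunk)
mean_temp = total / count
```

The same read-a-chunk, update-a-running-result pattern is what lets a collection of files scale out to cluster backends.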
The Internet of Things (IoT) is an exciting and rapidly growing segment of the tech economy. Skyrad has invented several IoT sensors, including LeakBot (www.leakbot.io), a clip-on flow sensor, and HeatDoctor, a remote central heating diagnostics system.
During the development of these products, Skyrad explored a number of different approaches. These included running prototype MATLAB® algorithms in the cloud, translating MATLAB code to embedded C code by hand, deploying algorithms with MATLAB Production Server™, and ultimately running algorithms on the smart sensors themselves, designed using Simulink® and deployed using Embedded Coder®.
Samuel will share how Skyrad developed the product, why they chose the particular architecture at each stage, and what was successful (and less successful). He will also give advice to anyone looking to develop or deploy IoT systems, or to deploy MATLAB in the cloud.
Industrial IoT has brought the rise of connected devices that stream information and optimise operational behaviour over the course of a device's lifetime.
This presentation covers how to develop and deploy MATLAB® algorithms and Simulink® models as digital twin and IoT components on assets, edge devices, or the cloud for anomaly detection, control optimization, and other applications. It includes an introduction to how assets, edge devices, and OT/IT components are connected.
The talk features customer use cases starting from design to final operation, the underlying technology, and results.
Systems engineering and model-based systems engineering can mean different things to different groups, but most definitions share a common set of concepts: a set of system-level requirements drives a system decomposition and requirements allocation process; trade studies on system architecture alternatives produce a candidate architecture; and from that architecture the design is developed and simulated to verify that the requirements are met.
This presentation shows how MATLAB® and Simulink® support this workflow when combined with Simulink Requirements™ and System Composer™ by allowing users to:
- Capture, view, analyze, and manage requirements
- Develop a system architecture model from the requirements, existing Simulink models, ICDs, and externally created architectures or combinations of the above
- Examine the system architecture model using different views for different concerns
- Allocate (link) requirements to architectural components and perform coverage and change impact analysis
- Perform trade studies to compare, assess, or optimize the system architecture
- Design components specified in the system architecture model
- Simulate the system composition to verify system-level behavior
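As a hypothetical sketch of the trade-study step in the list above (the criteria, weights, and scores are invented for illustration), a simple weighted-sum comparison of candidate architectures could look like:

```python
# Invented criteria weights (must sum to 1) and per-architecture scores (0-10)
weights = {"cost": 0.4, "mass": 0.3, "reliability": 0.3}
candidates = {
    "arch_A": {"cost": 7, "mass": 5, "reliability": 9},
    "arch_B": {"cost": 9, "mass": 6, "reliability": 6},
    "arch_C": {"cost": 5, "mass": 9, "reliability": 8},
}

# Weighted-sum utility of one candidate architecture
def score(arch):
    return sum(weights[c] * arch[c] for c in weights)

best = max(candidates, key=lambda name: score(candidates[name]))
```

Real trade studies typically add sensitivity analysis on the weights, since the ranking can flip when stakeholder priorities shift.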
Once safety has been established, the next roadblock for autonomous vehicles to become fully accepted is passenger comfort, specifically motion sickness. Additionally, when all road vehicles are autonomous, the passenger experience will be the key differentiator for manufacturers. In this talk, Michael will discuss how simulation models are being used by Ricardo to assess human passenger comfort in autonomous vehicles. He will cover the vehicle dynamics, human dynamics, and neural processing models necessary to calculate a motion sickness index. Michael will discuss the role of tools such as Vehicle Dynamics Blockset™ and Simscape Multibody™ in the development of these models, and the work that lies ahead.
ADAS and autonomous driving systems are redefining the automotive industry and changing all aspects of transportation, from daily commutes to long-haul trucking. MATLAB® and Simulink® provide the ability to develop the perception, planning, and control components used in these systems.
In this talk, you will learn about these tools through examples that ship in R2019a, including:
- Perception: Design LiDAR, vision, radar, and sensor fusion algorithms with recorded and live data
- Planning: Visualize street maps, design path planners, and generate C/C++ code
- Controls: Design a model-predictive controller for traffic jam assist, test with synthetic scenes and sensors, and generate C/C++ code
- Deep learning: Label data, train networks, and generate GPU code
- Systems: Simulate perception and control algorithms, as well as integrate and test hand code
Cummins Generator Technologies (CGT, established 1904) designs, manufactures, and validates alternators under the renowned brands of STAMFORD and AvK. Based in Stamford, Lincolnshire, UK, CGT has branch manufacturing plants in Romania, India, and China. With a nominal range of 7.5 kVA to 11,200 kVA, CGT alternators are suitable for a variety of applications.
In the modern business environment, to achieve product reliability and commercial growth, a company must be able to accurately simulate and analyze machine performance across multiple varied, realistic conditions in a low-cost and time-efficient manner. MathWorks products offer an integrated simulation environment for Model-Based Design. Simscape™, Simulink®, and MATLAB® provide the blocks, tools, and algorithms for efficiently designing and testing physical and control systems.
In this talk, Peenki describes how using Simscape products to develop multidomain synchronous machine models allowed CGT engineers to shorten the alternator design cycle. The models, calibrated using validated field data, allowed CGT to accurately and efficiently predict the transient response for the synchronous machines over a range of alternator fault conditions.
Most new electrical and electromechanical systems require multiple models to support the key design questions. Simple behavioural models that enable optimization of the overall system are required early in the design. As a design matures, the system-level assumptions must then be validated by more detailed subcomponent models. Moreover, subcomponent design itself requires more detailed models to ensure constituent parts stay within permitted operating envelopes, thus helping ensure the subcomponent will be reliable in service.
In this presentation, Rick will show how to develop Simscape™ models with different levels of fidelity that span system, control, and component design tasks. Models of both a photovoltaic generator and an electric traction drive will be used for illustration. Workflows to import SPICE subcircuits and magnetic finite element data will be shown, plus abstraction methods to map this detailed design data to support design validation at the system level. Two ways of extracting stability margins from switched converter models will also be demonstrated. The presentation will conclude with a tutorial Simscape language example on how to build a faulted component model.
Getting Started with MATLAB and Simulink
In this session, Laura introduces MATLAB®, the interactive environment and high-level language for numerical computation, visualisation, and programming. Topics discussed and demonstrated in this session include:
- Importing data from Microsoft® Excel®, text files, databases, and devices
- Exploring and visualising data using interactive tools
- Performing mathematical analysis on the data
- Automating your analysis and creating reports
See how easy it is to get started using MATLAB.
Modern techniques in image processing and computer vision have transitioned from processing pixels to extracting features to learning models directly from data to recognize, detect, and segment objects in scenes.
In this talk, you will learn about the capabilities in MATLAB® for creating feature-based workflows, training machine learning models, and leveraging deep convolutional neural networks (CNNs). Demos include:
- Feature detection and extraction
- Machine learning techniques for training object detectors
- Creation of a bag-of-words model for object classification
- Ground truth images and video, and training new classifiers
- Deep learning for object detection using YOLO v2
- Semantic segmentation using U-Net and SegNet
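As a toy sketch of the bag-of-words idea from the demo list above (the two-dimensional "features" and three-word vocabulary are invented; real systems quantize high-dimensional descriptors against a learned vocabulary), the histogram construction looks like:

```python
# Index of the closest vocabulary "visual word" for one feature vector
def nearest_word(feature, vocab):
    return min(range(len(vocab)),
               key=lambda k: sum((f - v) ** 2 for f, v in zip(feature, vocab[k])))

# Occurrence histogram of visual words: the image's bag-of-words descriptor
def bag_of_words(features, vocab):
    hist = [0] * len(vocab)
    for f in features:
        hist[nearest_word(f, vocab)] += 1
    return hist

vocab = [(0.0, 0.0), (1.0, 1.0), (0.0, 1.0)]           # assumed vocabulary
features = [(0.1, 0.1), (0.9, 1.1), (0.2, 0.0), (0.1, 0.9)]
hist = bag_of_words(features, vocab)
```

The histogram, once normalized, becomes the fixed-length input to a conventional classifier such as an SVM.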
In this session, Tim introduces the Simulink® product family. Topics include:
- Basic concepts for Simulink and Stateflow®
- Problems that are appropriate for Simulink rather than MATLAB®
- Using Simulink and Stateflow to effectively capture your complete system modelling needs
- How you can use code generation to take your designs from desktop to hardware
This presentation is ideal for Simulink beginners and MATLAB users interested in learning more about Simulink.
Increasingly autonomous systems are being developed at a rapid pace. These systems range from road vehicles that meet the various NHTSA levels of autonomy, through consumer quadcopters capable of autonomous flight and remote piloting, to package delivery drones, flying taxis, and robots for disaster relief and space exploration. Work on autonomous systems spans industries and includes academia as well as government agencies.
In this talk, you will learn to design, simulate, and analyze systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness. By fusing data from multiple sensors, you get a better result than would be possible from the output of any individual sensor.
Several autonomous system examples are explored to show you how to:
- Define trajectories and create multiplatform scenarios
- Simulate measurements from inertial and GPS sensors
- Generate object detections with radar, EO/IR, sonar, and RWR sensor models
- Design multi-object trackers as well as fusion and localization algorithms
- Evaluate system accuracy and performance on real and synthetic data
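The claim that fused sensors beat any single sensor can be illustrated with a minimal inverse-variance fusion of two noisy measurements (the numbers are invented; real trackers apply this idea recursively, e.g. in a Kalman filter):

```python
# Combine two independent estimates, weighting each by its inverse variance.
# The fused variance is always lower than either input variance.
def fuse(z1, var1, z2, var2):
    w1, w2 = 1.0 / var1, 1.0 / var2
    z = (w1 * z1 + w2 * z2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return z, var

# A coarse GPS-like fix fused with a finer radar-like range measurement
z, var = fuse(102.0, 4.0, 99.0, 1.0)
```

The fused estimate lands closer to the more trustworthy sensor, and its uncertainty is smaller than either sensor's alone.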
Developing predictive models for signal, time-series, and text data using artificial intelligence (AI) techniques is growing in popularity across a variety of applications and industries, including speech classification, radar target classification, physiological signal recognition, and sentiment analysis.
In this talk, you will learn how MATLAB® empowers engineers and scientists to apply deep learning beyond the well-established vision applications. You will see demonstrations of advanced signal and audio processing techniques such as automated feature extraction using wavelet scattering and expanded support for ground truth labelling. The talk also shows how MATLAB covers other key elements of the AI workflow:
- Use of signal preprocessing techniques and apps to improve the accuracy of predictive models
- Use of transfer learning and wavelet analysis for radar target and ECG classification
- Interoperability with other deep learning frameworks through importers and ONNX™ converter for collaboration in the AI ecosystem
- Scalability of computations with GPUs, multi-GPUs, or on the cloud
Designing and deploying deep learning and computer vision applications to embedded GPU and CPU platforms like NVIDIA® Jetson AGX Xavier™ and DRIVE AGX is challenging because of the resource constraints inherent in embedded devices. A MATLAB®-based workflow facilitates the design of these applications, and automatically generated C/C++ or CUDA® code can be deployed to achieve up to 2X faster inference than other deep learning frameworks.
This talk walks you through the workflow. Starting with algorithm design, the algorithm may employ deep learning networks augmented with traditional computer vision techniques and can be tested and verified within MATLAB. Live sensor data from peripheral devices on your Jetson/DRIVE platforms can be brought into MATLAB running on your host machine for visualization and analysis. Deep learning networks are trained using GPUs and CPUs on the desktop, cluster, or cloud. Finally, GPU Coder™ and MATLAB Coder™ generate portable and optimized CUDA® and/or C/C++ code from the MATLAB algorithm, which is then cross-compiled and deployed to Jetson or DRIVE, ARM®, and Intel® based platforms.
Engineers and scientists increasingly adopt practices from software development to write programs that are easy to debug, verify, and maintain. In this session, you'll learn how to integrate MATLAB® with source control systems such as Git™ and GitHub® and with continuous integration servers such as Jenkins™, an approach that also facilitates Agile development. You'll additionally learn how to test code with the MATLAB Unit Test Framework and manage code with projects.
With Model-Based Design, informal textual requirements can be modeled and simulated to verify behavior earlier, and code can then be generated automatically for an embedded target. These requirements can include temporal properties that define complex timing-dependent signal logic, but they can also be incomplete or inconsistent, which can lead to errors and miscommunication in design and test.
This talk shows you how you can model requirements and use the Logical and Temporal Assessments editor in Simulink Test™ to translate informal text requirements into unambiguous assessments with clear, defined semantics that can expose inconsistencies. The temporal assessment language, based on metric temporal logic, provides precise, formal semantics that are expressive and extensible, allowing you to author readable assessments. You will learn how to enter assessments with conditions, events, signal values, delays, and responses using the interactive form-based editor. You can view an assessment as an English-like statement that is easy to understand, or as graphical representations that let you visualize the results and debug design errors.
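As a loose, hypothetical analogy to a bounded-response temporal assessment (not Simulink Test's actual semantics or syntax), the pattern "whenever a trigger occurs, a response must follow within N samples" can be checked over logged Boolean signals like this:

```python
# Bounded response over sampled Boolean signals: every sample where
# `trigger` holds must be followed, within `n` steps (inclusive of the
# triggering sample), by a sample where `response` holds.
def bounded_response(trigger, response, n):
    for i, t in enumerate(trigger):
        if t and not any(response[i:i + n + 1]):
            return False
    return True

# Example: overspeed must be answered by braking within 2 samples
overspeed = [False, True, False, False, True, False]
braking   = [False, False, True, False, False, True]
ok = bounded_response(overspeed, braking, 2)
```

Formalizing a requirement this way forces the timing bound and the trigger/response signals to be stated explicitly, which is exactly where informal text requirements tend to be ambiguous.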
The systems required for Industry 4.0, IoT, and other digital trends will be multidisciplinary, with autonomous and AI functionality, and pervasively connected to larger systems of systems. How will engineers and scientists acquire the skills and know-how to conceive and develop such systems? Integrating engineering knowledge about signals and systems, data-driven AI methods, and computational thinking approaches will become critical.
Learn how you can use MATLAB® and Simulink® to develop the skills and techniques for academic research and industrial applications to create smarter systems. Discover, learn, and apply new concepts and technologies for emerging engineering and scientific challenges using Model-Based Design.
As the number of MATLAB® and Simulink® users in an organisation grows, retaining awareness of engineers' developing skills, and of the models and code they generate, is challenging. Effort is duplicated as multiple users reinvent the wheel, and cultural challenges arise in adopting new tools and bringing the whole engineering workforce along on the wave of change. A user community facilitates knowledge sharing and provides methods for greater communication, education, and standardisation of common working practices. The resulting benefits drive productivity and efficiency while cutting through organisational complexity. This presentation will discuss the approaches taken at BAE Systems, a large and diverse organisation, to establish a vibrant, self-organising user community to support sharing, self-learning, and personal development.
Matthew Offredi, BAE Systems
Battery management systems (BMS) ensure maximum performance, safe operation, and optimal lifespan of battery pack energy storage systems under diverse charge-discharge and environmental conditions. With Simulink®, engineers can use simulations to model feedback and supervisory control algorithms that monitor cell voltage and temperature, estimate state-of-charge (SOC) and state-of-health (SOH) across the pack, control charging and discharging rates, balance SOC across the battery cells, and isolate the battery from source and load when necessary. From early design tradeoffs through hardware-in-the-loop (HIL) testing of BMS hardware, Simulink can help engineers perform desktop simulations to ensure the BMS performs as intended under all desired operating conditions and meets design durability requirements. In this talk, you'll learn how Simulink helps engineers from electrical, thermal, and software backgrounds collaborate throughout the development cycle of BMS algorithms.
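As a minimal sketch of one building block behind SOC estimation (plain coulomb counting with invented pack parameters; real BMS algorithms combine this with voltage-based correction, e.g. a Kalman filter):

```python
# One coulomb-counting update: integrate current (positive = discharge)
# into state of charge, clamped to the physical range [0, 1].
def update_soc(soc, current_a, dt_s, capacity_ah):
    soc -= current_a * dt_s / (capacity_ah * 3600.0)
    return min(1.0, max(0.0, soc))

# Discharge an assumed 50 Ah pack at 5 A for one hour, sampled at 1 Hz
soc = 0.80
for _ in range(3600):
    soc = update_soc(soc, 5.0, 1.0, 50.0)
```

Pure coulomb counting drifts with current-sensor bias, which is why production BMS designs periodically correct the estimate against the measured cell voltage.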
Embedded software complexity and quality requirements are always increasing. In this session you will see how Polyspace® products allow you to achieve the highest levels of software quality with reduced testing effort. Polyspace can prove your code is free from certain critical run-time errors, such as overflows, divide-by-zero, or out-of-bounds array accesses, using formal methods-based static code analysis. Discover how Polyspace helps you to quickly find defects and check for violations of coding guidelines like MISRA®, CWE, and CERT C/C++. Along the way you will learn how the analysis can be performed interactively or used with continuous integration tools such as Jenkins™ for software integration testing. Highlights include the latest features for browser-based code review, team collaboration dashboards, and integration with defect tracking tools such as Jira.
Predictive maintenance reduces operational costs for organizations running and manufacturing expensive equipment by predicting failures from sensor data. However, identifying and extracting useful information from sensor data is a process that often requires multiple iterations as well as a deep understanding of the machine and its operating conditions.
In this talk, you will learn how MATLAB® and Predictive Maintenance Toolbox™ combine machine learning with traditional model-based and signal processing techniques to create hybrid approaches for predicting and isolating failures. You will also see built-in apps for extracting, visualizing, and ranking features from sensor data without writing any code. These features can then be used as condition indicators for fault classification and remaining useful life (RUL) algorithms.
Predictive maintenance algorithms make the greatest impact when they are developed for a fleet of machines and deployed in production systems. This talk will show you how to validate your algorithms, then integrate them with your embedded devices and enterprise IT/OT platforms.
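As an illustrative sketch of a simple RUL estimate (a straight-line fit to an invented condition-indicator trend, extrapolated to a failure threshold; production RUL models are typically probabilistic and far richer):

```python
# Fit indicator = slope * t + intercept by least squares, then solve for
# the time at which the indicator crosses the failure threshold.
def rul_linear(times, indicator, threshold):
    n = len(times)
    mt, mi = sum(times) / n, sum(indicator) / n
    slope = (sum((t - mt) * (x - mi) for t, x in zip(times, indicator))
             / sum((t - mt) ** 2 for t in times))
    intercept = mi - slope * mt
    t_fail = (threshold - intercept) / slope
    return t_fail - times[-1]           # hours remaining past the last sample

# Invented vibration-RMS trend; failure declared at RMS = 10
hours = [0, 10, 20, 30, 40]
rms   = [2.0, 2.5, 3.0, 3.5, 4.0]
rul = rul_linear(hours, rms, 10.0)
```

Even this toy version shows the two ingredients every RUL algorithm needs: a condition indicator that trends with degradation, and a threshold that defines failure.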
Women in Tech Ignite Lunch
MathWorks will be hosting a Women in Tech Ignite lunch during this year's MATLAB EXPO. In this informal networking event, we will welcome and celebrate women working in science and engineering with MATLAB® and Simulink®. Join us to hear from leading technical experts, to network with MathWorks leaders and engineers, and to exchange ideas with industry peers.