I have to say that I’m constantly surprised and impressed by the things the clever chaps and chapesses at MathWorks come up with. Just when I think I’ve seen and heard it all, they spring into action and introduce me to things I’d never even thought about before.
MathWorks is a privately held corporation, so it can be a tad difficult to lay one’s hands on certain information, but I’m informed that the company has more than 6,000 employees, which is a lot more than I would have guessed if you’d asked me before I started writing this column.
I was also amazed to discover that, although MathWorks was formed in 1984, its key product — MATLAB — first saw the light of day in the 1970s. MATLAB was created by Cleve Barry Moler, an American mathematician and computer programmer specializing in numerical analysis who was chairman of the computer science department at the University of New Mexico at the time.
Today, MathWorks’ major products include MATLAB and Simulink. MATLAB (an abbreviation of “MATrix LABoratory”) is a multi-paradigm programming language and numeric computing environment that allows matrix manipulations, plotting of functions and data, implementation of algorithms, creation of user interfaces, and interfacing with programs written in other languages. Although MATLAB is intended primarily for numeric computing, access to symbolic computing abilities is provided by means of an optional toolbox that uses the MuPAD symbolic engine.
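For anyone who hasn’t played with it, a few lines of MATLAB give the flavor of the matrix manipulation and plotting side of things. This is just my own trivial sketch, not anything official from MathWorks:

```matlab
% Solve a small linear system A*x = b using the backslash operator
A = magic(3);                   % built-in 3x3 test matrix
b = [1; 2; 3];
x = A \ b;

% Plot a function alongside some sampled data points
t = linspace(0, 2*pi, 100);
plot(t, sin(t), 'b-');                        % continuous curve
hold on;
plot(t(1:10:end), sin(t(1:10:end)), 'ro');    % sampled points
xlabel('t (radians)');
ylabel('sin(t)');
title('A trivially simple MATLAB plot');
```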
Meanwhile, Simulink is a MATLAB-based graphical programming environment for modeling, simulating, and analyzing multidomain dynamical systems. Simulink’s primary interface is a graphical block diagramming tool and a customizable set of block libraries. It offers tight integration with the rest of the MATLAB environment and can either drive MATLAB or be scripted from it.
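As a quick illustration of the “scripted from it” part, the sketch below loads a model, tweaks a block parameter, and runs a simulation entirely from the MATLAB command line. The model name, block path, and parameter here are purely hypothetical stand-ins of my own devising:

```matlab
mdl = 'myVehicleModel';                     % hypothetical model name

load_system(mdl);                           % load the model without opening the editor
set_param([mdl '/Gain'], 'Gain', '2');      % change a block parameter programmatically
simOut = sim(mdl, 'StopTime', '10');        % run a 10-second simulation

% Retrieve logged outputs (assuming output logging to 'yout' is enabled)
y = simOut.get('yout');
```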
Now, you can use MATLAB and Simulink to do almost anything your heart desires on the numerical analysis and simulation fronts, but it can take you a long time if you decide to implement everything yourself from the ground up. To save you time and effort, the folks at MathWorks provide a cornucopia of what they call Toolboxes (you can peruse a comprehensive list on their Products page). It can be a little hard to pin down a comprehensive definition of what Toolboxes actually are, but I tend to think of them as allowing you to start doing what you want to do at a higher level of abstraction. Another way of looking at things is that they do a lot of the grunt work for you and allow you to hit the ground running.
For example, I’ve written several MathWorks-centric columns over the past couple of years, and various Toolboxes have featured in all of them. If you have a few moments to spare, you may wish to cast your orbs over the following: Digital Twins Promote Predictive Maintenance (Did you know MATLAB has Deep Learning and Predictive Maintenance Toolboxes, or that Simulink has an add-on called Simscape you can use to create physics-based models?), Want to Learn AI? But Where to Go? (MathWorks has awesome artificial intelligence (AI) capabilities with regard to developing and deploying machine learning (ML), deep learning (DL), and reinforcement learning (RL) apps), and The MathWorks Satellite Communications Toolbox is Out of This World! (The Satellite Communications Toolbox lets us simulate, analyze, and test satellite communications systems and links).
As part of the Satellite Communications column, I was exposed to the concept of constellations of low Earth orbit (LEO) satellites that are starting to zip across the sky. These come with the promise of our future smartphones adding satellite links as just one of their myriad modalities, thereby allowing us to call home to mom from the top of the tallest mountain, the bottom of the deepest valley, or the heart of the remotest outpost of civilization (which, on the off chance you were wondering, some might claim to be Huntsville, Alabama, where — by some strange quirk of fate — I currently hang my hat).
The point of all this is that I never paused to ponder what the guys and gals at MathWorks had to offer when it came to designing artifacts of an automotive nature. All this changed recently when I had a chat with Govind Malleichervu and Wensi Jin, both of whom are automotive industry managers at MathWorks (don’t let the “manager” moniker fool you, because they are engineers to the core).
Now, designing regular automobiles is one thing, but ever since the Autonomous-Car Chaos of the 2004 DARPA Grand Challenge, the automotive industry and the general public have become increasingly enthralled by the promise of automotive automation, from advanced driver assistance systems (ADAS) to fully autonomous driving (AD). Although developments were slow at first, they’ve really ramped up in the past 10 years or so, largely coinciding with the modern era of AI compute, which concept I discussed in my recent Are We Poised to Turn the Optical Computing Corner? column.
As an aside, have you seen the Arc de Triomphe de l’Étoile in Paris as depicted in this video? When I one day hear that autonomous vehicles have reached the stage where they can navigate their way around this magnificent monument, I will doff my cap to their creators.
The point is that the myriad developments that ensued from the DARPA Grand Challenge spawned the need for ever more sophisticated levels of modeling and simulation. There are all sorts of interrelated aspects to this. For example, you have to be able to create a scene, perhaps starting with a straight stretch of road, then adding some curves, then some intersections, along with signs, traffic lights, and suchlike.
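To give you a sense of what this looks like at the command line, here’s a minimal sketch of my own (the road coordinates are made up purely for illustration) that uses the Automated Driving Toolbox’s drivingScenario object to create an empty scene and lay down a straight stretch of two-lane road followed by a gentle curve:

```matlab
% Requires the Automated Driving Toolbox
scenario = drivingScenario;                 % start with an empty scene

% A straight stretch of two-lane road defined by its center-line points
straightCenters = [0 0 0; 50 0 0];          % [x y z] in meters
road(scenario, straightCenters, 'Lanes', lanespec(2));

% A gentle curve picking up where the straight section ends
curveCenters = [50 0 0; 70 5 0; 85 20 0];
road(scenario, curveCenters, 'Lanes', lanespec(2));

plot(scenario);                             % bird's-eye view of the scene
```

(The Driving Scenario Designer app lets you do the same sort of thing interactively, which is the approach taken in the videos we’ll get to in a moment.)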
The next step is to create a scenario. In addition to your scene, this will include all of the actors (vehicles, pedestrians, animals), along with environmental conditions like the weather and any light sources. The term “ego vehicle” refers to the vehicle that contains the sensors that perceive the surrounding environment. Your ego vehicle needs to be modeled to include both its sensors (cameras, lidar, radar, etc.) and its dynamics (lateral control, longitudinal control, etc.).
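Continuing my sketch from above, with waypoints, speeds, and sensor parameters plucked out of thin air for illustration, adding an ego vehicle, a lead car trundling along ahead of it, and a forward-facing vision sensor might look something like this:

```matlab
% The ego vehicle and its trajectory (waypoints in meters, speed in m/s)
egoVehicle = vehicle(scenario, 'ClassID', 1);        % ClassID 1 = car
trajectory(egoVehicle, [1 0 0; 45 0 0], 15);

% A slower lead car ahead of the ego vehicle
leadCar = vehicle(scenario, 'ClassID', 1);
trajectory(leadCar, [20 0 0; 50 0 0], 12);

% A forward-facing camera-style sensor mounted near the ego vehicle's nose
sensor = visionDetectionGenerator('SensorIndex', 1, ...
    'SensorLocation', [3.7 0], ...                   % [x y] on the ego body (m)
    'MaxRange', 100);
```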
Sometimes the car’s sensors provide conflicting data, much like when I’m driving down what appears to be a clear stretch of road and my wife (Gina the Gorgeous) screams “WATCH OUT” at the top of her (not inconsiderable) voice inches from my ear (one day I hope to discover what she’s talking about). In the autonomous vehicle arena, there’s a need to perform sensor fusion to provide redundancy and a more holistic view of the world, including the ability to determine which inputs are currently more reliable and need to be acted upon.
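I won’t pretend the following is how a production fusion stack would be architected, but as a flavor of the sort of thing that’s involved, here’s a minimal sketch using the multiObjectTracker object, with a single fabricated detection standing in for the streams of camera, radar, and lidar data you’d have in real life:

```matlab
% A multi-object tracker that fuses detections from multiple sensors;
% initcvkf initializes a simple constant-velocity Kalman filter
tracker = multiObjectTracker('FilterInitializationFcn', @initcvkf);

% In a real application, 'detections' would be a cell array of
% objectDetection objects gathered from cameras, radar, lidar, and so on.
% Here we fabricate a single detection purely for illustration.
detections = {objectDetection(0, [10; 0; 0])};   % time = 0 s, position = [10 0 0] m

confirmedTracks = updateTracks(tracker, detections, 0);
```

The tracker takes care of the bookkeeping of associating detections with tracks, confirming tracks that keep being seen, and dropping those that aren’t, which is a large part of deciding which inputs to trust at any given moment.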
There’s no point in testing things like visual perception systems (including object detection and recognition) only in simulations featuring pristine conditions. Thus, over time, you’ll add more fidelity to your library of scenarios, such as bumpy roads and faded road markings and signs, along with an ego vehicle model that dynamically responds to those bumps, thereby affecting the data returned by the sensors. Throw in a snowstorm or a swarm of fireflies and rerun the simulation on a moonless night. Now we’re talking.
Simulating this sort of scenario permits iterative refinement of algorithms for perception (including sensor fusion), planning, and control. These algorithms are then used to generate software, which in turn forms part of system-level simulation. All of these simulations may be run either interactively or automatically (on your desktop, on a cluster, or in the cloud).
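To give a feel for the “automatically” part, a parameter sweep might look something like the sketch below (the model name and the egoSpeed variable are hypothetical), using Simulink.SimulationInput objects and parsim, which will distribute the runs across parallel workers if the Parallel Computing Toolbox is available:

```matlab
mdl = 'laneChangeSystem';                      % hypothetical model name

% Build an array of simulation inputs, each with a different ego speed
speeds = 10:5:30;                              % m/s
for k = numel(speeds):-1:1                     % count down to preallocate the array
    in(k) = Simulink.SimulationInput(mdl);
    in(k) = setVariable(in(k), 'egoSpeed', speeds(k));
end

% Run the sweep; with the Parallel Computing Toolbox, the runs are farmed
% out to workers on your desktop, a cluster, or in the cloud
out = parsim(in, 'ShowProgress', 'on');
```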
All of which brings us back to the rich collection of MathWorks toolboxes, which include Lidar Toolbox, Radar Toolbox, Image Processing Toolbox, Computer Vision Toolbox, Sensor Fusion and Tracking Toolbox, Vehicle Dynamics Blockset, Automated Driving Toolbox, and RoadRunner, to name but a few.
Photo of real-world road scene (left) and a RoadRunner re-creation (right) (Image source: MathWorks)
To be honest, I could waffle on for hours regaling you with all the wonders I’ve seen (“I’ve seen things you people wouldn’t believe…” as Roy Batty says in Blade Runner), but a picture is worth a thousand words and a video is worth a thousand pictures, so it’s fortunate that the folks at MathWorks like creating videos.
Some videos of interest with respect to these discussions are as follows. Two feature the Automated Driving Toolbox. In Part 1 we learn virtual simulation basics, including how to create scenarios or import them into the app. Meanwhile, Part 2 is all about sensors: adding them, changing parameters, visualizing sensor detections, and then exporting to MATLAB or Simulink.
One point of interest is that you can select the fidelity of your virtual world depending on the use cases you need to simulate. MathWorks provides two environments for virtual worlds: Cuboid and Unreal Engine. Cuboid world representations can be used to simulate driving scenarios, apply sensor models, and generate synthetic data for testing automated driving algorithms (controls, sensor fusion, and path planning) in simulated environments. The two videos above are based on cuboid world representations.
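Pulling my earlier snippets together, the heart of a cuboid-world simulation is a loop along the following lines (again, my own illustrative sketch rather than anything official), which steps the scenario forward, asks the sensor model for synthetic detections at each time step, and feeds them to the tracker:

```matlab
% Step the scenario and generate synthetic detections from the ego sensor
while advance(scenario)
    time = scenario.SimulationTime;

    % Ground-truth poses of the other actors, in ego-vehicle coordinates
    targets = targetPoses(egoVehicle);

    % Synthetic detections from the vision sensor model
    [dets, numDets, isValidTime] = sensor(targets, time);

    if isValidTime
        confirmedTracks = updateTracks(tracker, dets, time);
    end
end
```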
Where things start to get really tasty, visually speaking, is that you can also use the Unreal Engine from Epic Games to develop, test, and visualize the performance of driving algorithms in a 3D simulated environment. In addition to the algorithms noted with respect to the cuboid world, you can use the Unreal Engine world to develop and test perception algorithms driven by camera data from different camera models. Check out this Synthesize Sensors with Unreal Engine Driving Simulation video. Last, but certainly not least, you might wish to take a peek at this Autonomous Navigation for Highway Driving: Design and Simulate Lane Change Maneuver System video.
All I can say is that I’m well impressed. My poor old noggin is spinning with all the things I’ve seen and heard. When an autonomous vehicle is eventually created that can successfully navigate the Arc de Triomphe, I’ll bet that tools from MathWorks will play a large part in its success. How about you? Have you any thoughts you’d care to share about anything you’ve seen here?