
Cars, Connection and Silicon

Automotive Thoughts from Embedded World

How do you sum up embedded world, this year spread over three days, with nearly 40,000 people and over 1,000 exhibitors, all around a theme of Securely Connecting the Embedded World? Well, you can’t – not sensibly. Instead, I am going to look at a thread that recurred in the dozens of conversations I had over those three days and that is currently a hot topic. Obviously, the IoT was also a recurring theme, with many of the new product announcements specifically targeting it. But the topic that kept coming up was cars: not just fully autonomous cars, but also those with lower levels of autonomy, such as vehicles with Advanced Driver Assistance Systems (ADAS). A handy guide to the levels of automation is the SAE’s taxonomy, which classes them from 0 = total driver control to 5 = totally autonomous car.
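For readers who want the taxonomy at their fingertips, here is a minimal sketch of the six levels; the one-line descriptions are my paraphrase rather than the official SAE wording.

```c
/* A minimal sketch of the SAE levels of driving automation (0-5).
   The one-line descriptions are a paraphrase, not official SAE wording. */
#include <stdio.h>

typedef enum {
    SAE_LEVEL_0 = 0, /* no automation: the driver does everything                 */
    SAE_LEVEL_1,     /* driver assistance, e.g. adaptive cruise control            */
    SAE_LEVEL_2,     /* partial automation: steering + speed, driver monitors      */
    SAE_LEVEL_3,     /* conditional automation: driver must be ready to take over  */
    SAE_LEVEL_4,     /* high automation: no driver needed within a defined domain  */
    SAE_LEVEL_5      /* full automation: no driver needed at all                   */
} sae_level_t;

static const char *sae_level_name(sae_level_t level)
{
    static const char *names[] = {
        "0 - no automation",
        "1 - driver assistance",
        "2 - partial automation",
        "3 - conditional automation",
        "4 - high automation",
        "5 - full automation"
    };
    return names[level];
}

int main(void)
{
    for (int l = SAE_LEVEL_0; l <= SAE_LEVEL_5; l++)
        printf("SAE level %s\n", sae_level_name((sae_level_t)l));
    return 0;
}
```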

The car manufacturing landscape is undergoing a massive transformation. In the middle of the twentieth century, most vehicle manufacturers were largely vertically integrated, making all the elements of the car – Ford even went to the extent of buying iron ore and coal to make its own steel and tried (not very successfully) to create a rubber plantation in South America. But, by the end of the century, they had essentially become assemblers with a long tail of specialist suppliers. In the jargon, the direct suppliers to the manufacturers (often referred to as the OEMs) are labelled Tier One; their suppliers are Tier Two (which is where semiconductor companies have traditionally sat), and so on. Essentially, the manufacturer would order a specific product from a Tier One but not be directly involved in specifying sub-systems or choosing the Tier Twos.

Until recently, the OEM world could be crudely divided into several sectors. The largest, in terms of the number of companies, comprises the many manufacturers in the Far East, building mostly low-cost, relatively low-tech vehicles, usually only for their local markets. At the other end of the spectrum are the new entrants, such as Google (and probably Apple), who are going straight for autonomous vehicles. The middle ground of the traditional OEMs was effectively divided into technological progressives – mainly German and Japanese – and technology laggards – mainly the traditional Detroit companies. Just to muddy the waters, we have also seen the growth of fully electric vehicles, such as Tesla’s cars and the Nissan Leaf, and hybrids, such as the Prius.

Today, this is all in a state of flux, and electronics is the driver (if you will forgive the pun). And, again, we are looking at a spectrum of causes. At one end is the simple replacement of electro-mechanical systems by electronic ones – for example (if you go back that far), the spark to a cylinder in a petrol engine was traditionally routed through a rotating distributor; today, it is normally supplied by an electronic ignition control system. At the other end of the spectrum is the high level of artificial intelligence required for the autonomous car. All of this is going to require tons of silicon and many millions of lines of code, and the semiconductor manufacturers and tools companies are falling over each other to secure a large chunk of this market. At the same time, the OEMs are either getting more closely involved with the semiconductor guys or buying companies that would previously have been Tier Ones.

A few years ago, it was fashionable to illustrate presentations about electronics in vehicles with a slide showing the different functions that were potentially available for an electronic solution; today, it would be very difficult to provide a slide that showed functions where electronics were not involved.

The headlines are focusing on fully autonomous cars – Mentor’s announcement of the DRS360 platform, which will support SAE level 5 and which Kevin covered last week (https://eejournal.com/archives/articles/20170411-mentor-adas/), is only one of a slew of news stories. But there is still significant concern about the concept. One comment I heard was, “I will believe in fully autonomous cars when I see them coping with Delhi traffic.” But, even without the Delhi traffic to cope with, there is still a wide range of other issues.

MIT’s Media Lab has created a set of scenarios that aim to generate discussion about the issues faced by a self-driving car. In the Moral Machine (moralmachine.mit.edu), you are asked to consider what the decision should be when, for example, a brake failure forces an autonomous car to choose between killing pedestrians who are crossing the road legally or running into a barrier, which will kill the car’s occupants. Now it can be argued – and has been – that these are entirely artificial situations. But there are serious issues underlying them.

We have already seen hysteria over the Tesla death – where the driver was clearly abusing the system. That same day, around 100 other people would have died in vehicle accidents in the United States alone, and around 3,500 in the rest of the world – and very few of these even made it into the local news. An accident in Arizona, where an Uber “self-driving car” rolled after being side-swiped by another vehicle, also hit world news, even though no one was seriously injured. Accidents like that happen so frequently that local papers and news outlets never even report them.

Much of this is due to the novelty effect – autonomous cars are new, so accidents involving them are regarded as newsworthy. But another factor is the general attitude that always asks, “Who is to blame?” If the Tesla hadn’t been in Autopilot mode, and the driver, who was exceeding the speed limit, had simply failed to notice the truck he hit, there would be no discussion – it would have been the driver’s fault. But he was relying (in a way the instructions said he shouldn’t) on the Tesla’s sensors – which failed to see the truck. Was it Tesla’s fault?

How are systems designers going to create the algorithms that help an autonomous car react to a complex situation? Who is going to sign off these decisions for the car manufacturer? And what role will the legislative authorities take? And, given such stories as Toyota’s spaghetti code and unintended acceleration and Volkswagen’s gaming of emissions tests, there will be those who doubt the car companies’ ability to produce the artificial intelligence software at the heart of autonomy. One route that car companies are beginning to follow is to acquire companies that demonstrably have the skills needed – effectively a new version of vertical integration. For example, Ford recently struck a $1 billion deal to buy Argo, an AI start-up that will work at arm’s length and, according to Ford, will be able to sell its software and sensor suite to other manufacturers.

The sensors, such as optical, radar, and lidar, that will provide the information for decision-making are already being deployed for ADAS – either simply to improve the driver’s information, as with rear-view cameras, or, with added image processing, to provide an enhanced view of what is ahead or alongside. But they also feed into systems that actively assist the driver. These systems started with top-end vehicles and are now working their way downward.

For example, the 2017 VW Golf in the UK has a “Traffic Jam Assist” option, which is a “camera and radar sensor controlled warning system that helps you keep your distance in congested traffic and avoid a typical congestion type collision. Should your vehicle drift out of lane, the system accelerates, brakes and steers as appropriate, as long as it detects the driver’s hands on the wheel.” The same technologies are behind Park Assist – effectively, the car will parallel park or angle park by itself. Pedestrian Protection is standard on all models and “features a camera and radar sensor controlled warning system. This alerts you should it detect any pedestrians at the edge of the road or on the carriageway, via acoustic and optical signals as well as a gentle jolt of the brakes.” There are numerous connectivity options, and Volkswagen has even added gesture recognition to the touch-controlled centre console.
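To make the Traffic Jam Assist description a little more concrete, here is a purely hypothetical sketch of the kind of decision logic such a feature implies; the sensor structure, thresholds, and outputs are invented for illustration and have nothing to do with VW’s actual implementation.

```c
/* Hypothetical sketch of "Traffic Jam Assist" style logic as described above:
   keep a distance to the vehicle ahead and correct lane drift, but only while
   the driver's hands are on the wheel. All names, thresholds, and outputs are
   illustrative inventions, not a real vendor API. */
#include <stdbool.h>
#include <stdio.h>

typedef struct {
    bool   hands_on_wheel;   /* from a steering-torque sensor              */
    double gap_to_lead_m;    /* radar: distance to the vehicle ahead (m)   */
    double lane_offset_m;    /* camera: lateral offset from lane centre (m)*/
    double speed_mps;        /* current vehicle speed (m/s)                */
} sensor_state_t;

typedef struct {
    double accel_mps2;       /* requested acceleration (negative = brake)  */
    double steer_correction; /* small steering correction, -1.0 .. +1.0    */
    bool   handover_warning; /* ask the driver to take over                */
} assist_command_t;

#define MIN_GAP_M        8.0 /* assumed minimum following distance         */
#define MAX_LANE_DRIFT_M 0.3 /* assumed tolerated drift before steering    */

static assist_command_t traffic_jam_assist(const sensor_state_t *s)
{
    assist_command_t cmd = {0.0, 0.0, false};

    if (!s->hands_on_wheel) {          /* system disengages and warns driver */
        cmd.handover_warning = true;
        return cmd;
    }
    if (s->gap_to_lead_m < MIN_GAP_M)  /* too close: brake gently            */
        cmd.accel_mps2 = -1.5;
    else if (s->gap_to_lead_m > 2.0 * MIN_GAP_M && s->speed_mps < 15.0)
        cmd.accel_mps2 = 0.8;          /* gap opening up: creep forward      */

    if (s->lane_offset_m > MAX_LANE_DRIFT_M)
        cmd.steer_correction = -0.2;   /* drifting right: steer left         */
    else if (s->lane_offset_m < -MAX_LANE_DRIFT_M)
        cmd.steer_correction = 0.2;    /* drifting left: steer right         */

    return cmd;
}

int main(void)
{
    sensor_state_t s = { true, 6.0, 0.4, 5.0 };  /* too close and drifting   */
    assist_command_t c = traffic_jam_assist(&s);
    printf("accel %.1f m/s^2, steer %.1f, warn %d\n",
           c.accel_mps2, c.steer_correction, c.handover_warning);
    return 0;
}
```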

The importance of sensors was demonstrated earlier this year with Intel’s purchase of Mobileye for $15.3 billion. The idea is that Intel will add Mobileye’s cameras, sensor chips, in-car networking, roadway mapping, machine learning, cloud software, and data fusion and management to its own products to serve ADAS and autonomous driving. Remember, with the Arria 10 SoCs, Intel has the equivalent of the Xilinx Zynq products that Mentor is using as hardware for the DRS360 platform.

Another issue in autonomous driving/ADAS is knowing the car’s geographical location accurately. Options being discussed include the use of GPS (particularly as newer services are much more accurate than earlier ones), either on its own or linked to detailed mapping (Google Maps on steroids). An alternative approach says that you don’t need accurate knowledge; approximate knowledge backed by the sensor information is good enough for driving. Within urban areas, it is possible that vehicle-to-infrastructure communication (the car talking to traffic lights and monitors) will be another way forward.
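As a back-of-the-envelope illustration of that “approximate position plus sensors” argument, here is a toy sketch that blends a GPS fix with a dead-reckoned estimate using an arbitrary confidence weight; a production system would use a proper Kalman filter and map data, and all the numbers here are made up.

```c
/* Toy sketch of fusing a noisy GPS fix with a dead-reckoned estimate from
   wheel/inertial sensors, weighting each by an assumed confidence. Real
   systems use Kalman filtering and detailed maps; this only illustrates
   the principle. */
#include <stdio.h>

typedef struct { double x_m, y_m; } position_t;

/* Weighted blend: gps_weight = 1.0 trusts GPS completely,
   0.0 trusts dead reckoning completely. */
static position_t fuse(position_t gps, position_t dead_reckoned, double gps_weight)
{
    position_t out;
    out.x_m = gps_weight * gps.x_m + (1.0 - gps_weight) * dead_reckoned.x_m;
    out.y_m = gps_weight * gps.y_m + (1.0 - gps_weight) * dead_reckoned.y_m;
    return out;
}

int main(void)
{
    position_t gps = { 105.0, 43.0 };  /* GPS fix, a few metres off        */
    position_t dr  = { 102.5, 41.2 };  /* odometry/IMU estimate            */
    /* In an urban canyon, GPS confidence drops, so weight it less. */
    position_t est = fuse(gps, dr, 0.3);
    printf("fused position: %.1f, %.1f\n", est.x_m, est.y_m);
    return 0;
}
```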

Vehicle communication (V2X) is another area where there is still considerable debate: channels and protocols are being vigorously argued over, and commercial interests are vying with each other to become the de facto standard. A similar debate is going on over communication within the car. Here we have traditional vehicle communication technologies like CAN (Controller Area Network) and FlexRay, which carry the safety-critical traffic, alongside MOST (Media Oriented Systems Transport) for multimedia. These are being joined by the recently announced IEEE 802.3bw (100BASE-T1) Ethernet standard, aimed at all aspects of the automotive sector.
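CAN, at least, is well-trodden ground, and it is easy to experiment with from a Linux machine. As a small, concrete illustration – not a comment on the standards debate above – here is a minimal sketch that sends a single frame over Linux’s SocketCAN interface, assuming a can0 interface has already been brought up; the ID and payload are arbitrary.

```c
/* Minimal SocketCAN example: send one frame on an existing "can0" interface.
   The CAN ID (0x123) and payload are arbitrary and purely illustrative. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <net/if.h>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <linux/can.h>
#include <linux/can/raw.h>

int main(void)
{
    int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);      /* raw CAN socket        */
    if (s < 0) { perror("socket"); return 1; }

    struct ifreq ifr = {0};                         /* resolve can0's index  */
    strncpy(ifr.ifr_name, "can0", IFNAMSIZ - 1);
    if (ioctl(s, SIOCGIFINDEX, &ifr) < 0) { perror("ioctl"); return 1; }

    struct sockaddr_can addr = {0};                 /* bind to the interface */
    addr.can_family  = AF_CAN;
    addr.can_ifindex = ifr.ifr_ifindex;
    if (bind(s, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("bind");
        return 1;
    }

    struct can_frame frame = {0};                   /* build a 2-byte frame  */
    frame.can_id  = 0x123;
    frame.can_dlc = 2;
    frame.data[0] = 0x11;
    frame.data[1] = 0x22;

    if (write(s, &frame, sizeof(frame)) != (ssize_t)sizeof(frame)) {
        perror("write");
        return 1;
    }

    close(s);
    return 0;
}
```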

But we have electronics in a whole range of other areas within the car – the personal comfort options, such as environmental control and seat adjustment, and simple things like door locking and starting the engine, where the long-serving mechanical key is being replaced by a fob that stays in your pocket or handbag. (Tesla offers a door-opening smartphone app. This doesn’t use Bluetooth to communicate with the car but instead uses 3G mobile telephony – which is a bit of a nuisance if you are in an area where there is no 3G signal!)

A huge user of silicon is the infotainment area, which can include GPS/navigation, audio, and (hopefully just for back-seat passengers) video, together with communication for traffic information and, in some cases, the combination of traffic information with the navigation system to offer alternative routes around congestion.

A major concern is the link between the safety-critical functions and the rest of the car. At the lower levels of ADAS, this link need be only one-way – with the safety-critical domain passing information to the infotainment display, for example. With autonomous driving, however, channels that were previously purely for infotainment need to communicate in both directions. Good practice is to create separate domains with secured gateways between them but, according to an Infineon speaker at last year’s electronica conference, many manufacturers still have only a single domain – and the weakness of that approach was demonstrated by the researchers who remotely took control of a Jeep.
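To make the gateway idea concrete, here is a deliberately simplified, hypothetical sketch of a one-way filter between domains: only a short whitelist of message IDs from the safety-critical side is ever forwarded to the infotainment side, and nothing travels in the other direction. The IDs, structures, and forwarding stub are invented for illustration.

```c
/* Hypothetical one-way domain gateway: the safety-critical domain exposes a
   small whitelist of message IDs to infotainment, and nothing from
   infotainment is ever forwarded back. IDs and structures are illustrative. */
#include <stdbool.h>
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

typedef struct {
    uint32_t id;
    uint8_t  len;
    uint8_t  data[8];
} bus_msg_t;

/* Messages the safety domain is allowed to expose (e.g. speed, gear). */
static const uint32_t export_whitelist[] = { 0x100, 0x101, 0x1A0 };

static bool allowed_to_infotainment(uint32_t id)
{
    for (size_t i = 0; i < sizeof(export_whitelist) / sizeof(export_whitelist[0]); i++)
        if (export_whitelist[i] == id)
            return true;
    return false;
}

/* Called for every frame seen on the safety-critical bus. */
static void gateway_from_safety(const bus_msg_t *msg)
{
    if (allowed_to_infotainment(msg->id))
        printf("forward 0x%03X to infotainment bus\n", (unsigned)msg->id);
        /* placeholder for a real transmit onto the infotainment network */
}

/* Called for every frame seen on the infotainment bus: drop everything. */
static void gateway_from_infotainment(const bus_msg_t *msg)
{
    (void)msg;  /* nothing from this domain ever reaches the safety bus */
}

int main(void)
{
    bus_msg_t speed = { 0x100, 2, { 0x00, 0x5A } };
    bus_msg_t rogue = { 0x7FF, 1, { 0xFF } };
    gateway_from_safety(&speed);        /* forwarded                 */
    gateway_from_safety(&rogue);        /* silently dropped          */
    gateway_from_infotainment(&rogue);  /* never crosses the gateway */
    return 0;
}
```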

One issue with multiple domains is the need for multiple processing units, each dedicated to a specific function, with the associated costs and weight penalties. There are approaches that use multi-core processors, real or virtual, with hypervisor technology partitioning the different applications, but these can present serious challenges to developers. (As Bryon Moyer pointed out last week, multiple cores are also valuable in safety-critical applications: https://eejournal.com/archives/articles/20170413-synopsys-dual-core.)
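And, to illustrate what hypervisor partitioning looks like in principle, here is a purely hypothetical partition table assigning functions to cores and memory; the names, fields, and safety classifications are invented for the sketch and do not correspond to any real hypervisor’s configuration format.

```c
/* Purely illustrative sketch of a hypervisor partition table that keeps
   safety-critical work separate from infotainment. Names, fields, and
   safety levels are hypothetical, not any real hypervisor's format. */
#include <stdio.h>

typedef enum { ASIL_QM, ASIL_B, ASIL_D } safety_level_t;  /* ISO 26262 style */

typedef struct {
    const char    *name;
    unsigned       core_mask;  /* which physical cores the partition may use */
    unsigned       ram_mib;    /* statically reserved memory                 */
    safety_level_t level;
} partition_t;

static const partition_t partitions[] = {
    { "braking-and-steering", 0x3, 128,  ASIL_D },  /* cores 0-1, isolated    */
    { "adas-sensor-fusion",   0x4, 512,  ASIL_B },  /* core 2                 */
    { "infotainment-linux",   0x8, 2048, ASIL_QM }  /* core 3, no safety role */
};

int main(void)
{
    for (unsigned i = 0; i < sizeof(partitions) / sizeof(partitions[0]); i++)
        printf("%-22s cores=0x%X ram=%uMiB\n",
               partitions[i].name, partitions[i].core_mask, partitions[i].ram_mib);
    return 0;
}
```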

This ramble through a whole range of topics triggered by embedded world conversations brings us to one of the questions I regularly asked people at the show: “Are you looking forward to autonomous driving?” Since most of them were from automotive environments, there was a consensus that they were not, but also a consensus that it was a question they would not have to face for some time yet. There was also a strong body of opinion that, particularly in urban areas, mass car ownership will probably disappear. Instead, self-driving vehicles would be summoned as needed and then dismissed when the journey was over. This has a number of broad societal benefits: reduced use of natural resources to build and power the vehicles, better use of the large areas of valuable land currently devoted to parking cars that rarely move, and even increased personal spending. But it doesn’t sound like fun.

 
