There may never be one single most effective way to implement sensing technology for advanced driver-assistance systems (ADAS) and autonomous vehicles (AVs). The magic number might instead be six, as in six fundamental considerations that every automaker will address in its own way, leading each to a unique approach to integrating sensors in future vehicles. At the closing session of the AutoSens Brussels 2020 virtual conference, a panel of experts debated the right sensor mix and how to ensure that design never compromises safety, and vice versa.
The panelists were Patrick Denny, senior expert for vision systems and advanced driver-assistance systems at automotive embedded supplier Valeo; Paul-Henri Matha, technical leader at Volvo Car Corp.; Robert Stead, managing director at Sense Media Group; and Carsten Astheimer, director of design firm Astheimer Ltd. EE Times Europe also reached out to Pierrick Boulay, technology and market analyst at Yole Développement (Lyon, France), for Yole’s insights on the adoption and usage of various sensor types in automotive systems.
Getting the number right
More and more sensors are being deployed throughout the vehicle to address safety concerns proactively. How many sensors do we have in cars today, and how many do we need to progress to further levels of autonomy? “If we take into account sensors for ADAS — ultrasonic, radar, camera for sensing, camera for viewing, and LiDAR — we estimate that a vehicle has between 10 and 20 sensors, depending on the type of vehicle,” Yole’s Boulay told EE Times. Naturally, high-end vehicles embed more sensors than low-end vehicles or those in the middle of the performance and feature range.
Sensors will be pivotal for unlocking high automation levels, and the number and type of sensors are expected to increase. “We expect that 35 to 40 sensors will be implemented for these automation levels,” said Boulay. “Sensors will be more specific in that we will see sensors for short-, mid-, and long-range applications. One sensor will not be able to cover all applications. Each application or use case will have its specifications and requirements in terms of sensors.”
The increasing number of sensors is only the tip of the iceberg. Sensors generate enormous amounts of data, and systems are constrained by the processing power available to handle it. Moving forward, having enough computing power to process all the data generated by these sensors will be a key requirement, said Boulay. “While typical ADAS systems using Intel-Mobileye chips were making the leap from 0.25 TOPS [10× the performance of a high-end laptop] to 2.5 TOPS with the new EyeQ4 chip, robotic cars are already beyond 250 TOPS,” he said. Eventually, “the E/E [electrical/electronic] architecture of a vehicle will need to change from a distributed architecture to a centralized architecture with domain controllers able to manage the fusion of raw data coming from the sensors.”
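To make that architectural shift concrete, here is a minimal Python sketch of the centralized pattern Boulay describes: raw frames from many sensors flow into a single domain controller rather than being processed on separate per-sensor ECUs. The class and field names are illustrative assumptions, not any OEM's actual interface.

```python
# Minimal sketch of a centralized domain-controller loop, assuming
# hypothetical sensor interfaces; real E/E architectures differ per OEM.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class RawFrame:
    sensor_id: str       # e.g. "front_camera", "corner_lidar_fl"
    timestamp_us: int    # capture time in microseconds
    payload: bytes       # raw, unprocessed sensor data


class DomainController:
    """Collects raw frames from all sensors and fuses them centrally,
    instead of each sensor ECU producing its own object list."""

    def __init__(self) -> None:
        self._buffer: List[RawFrame] = []

    def ingest(self, frame: RawFrame) -> None:
        # In a production vehicle this would arrive over automotive Ethernet.
        self._buffer.append(frame)

    def fuse(self) -> Dict[str, List[int]]:
        # Placeholder fusion: group frames by sensor for one time slice.
        # A real stack would run detection and tracking on the raw data here.
        fused: Dict[str, List[int]] = {}
        for frame in self._buffer:
            fused.setdefault(frame.sensor_id, []).append(frame.timestamp_us)
        self._buffer.clear()
        return fused
```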
So the more sensors, the merrier? “Some may think this, but the number of sensors in cars will not increase indefinitely, for cost or integration reasons,” said Boulay, who expects the number of sensors for automation to plateau at some point. “The main difference will be at the software level and the capacity of companies to process the enormous quantity of data efficiently. Some OEMs, like Tesla, are still not using LiDAR and are betting on the combination of sensors and AI computing to achieve high automation levels.”
Objectively, “some OEMs will do better than others with fewer sensors, and the difference will be at the software and computing levels,” he added.
Optimizing the mix
A vehicle might be driving under a big blue sky one moment and through a rain shower the next. Sensors need to be constantly available to measure and monitor variables. An efficient way to enhance availability is by deploying redundant sensors to compensate for possible failures. “There has to be more than one way of looking at the environment,” Valeo’s Denny said during the panel session. “When you are in complete darkness or you have terrible weather conditions, you need a variety of modalities and functions to work together.”
Sensors help in situations where human vision is at a disadvantage, and the diversity of sensors is what makes the car reliable in all weather and light conditions. “Cameras are good during the daytime,” said Boulay, whereas at night or in the fog or rain, “other sensors will not be ‘blind’ [like cameras], and the vehicle will still be able to move, even if it is in a degraded mode.”
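As a rough illustration of that complementarity, the sketch below picks which modalities remain trustworthy for the current conditions and flags a degraded mode when too few are left. The sensor names, condition labels, and rules are hypothetical, not any OEM's logic.

```python
# Hypothetical modality-selection sketch: which sensors remain usable
# under the current conditions, and is the system in a degraded mode?
from typing import List

USABLE_IN = {
    "camera":     {"day_clear"},                        # struggles at night, in fog or rain
    "radar":      {"day_clear", "night", "fog", "rain"},
    "lidar":      {"day_clear", "night", "rain"},       # heavy fog degrades lidar
    "ultrasonic": {"day_clear", "night", "fog", "rain"},
}


def available_modalities(condition: str) -> List[str]:
    return [sensor for sensor, ok in USABLE_IN.items() if condition in ok]


def is_degraded(condition: str, minimum: int = 3) -> bool:
    # Fewer than `minimum` independent modalities -> fall back to degraded mode.
    return len(available_modalities(condition)) < minimum


print(available_modalities("fog"))   # -> ['radar', 'ultrasonic']
print(is_degraded("fog"))            # -> True
```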
Ensuring the right placement
Just like human senses, sensors must be strategically positioned to feed back information on the car’s surroundings continuously. But there are technical limitations to where the sensors can be placed. Condensation in a headlamp, for example, can prevent LiDARs from working. In snow or cold weather, frost can lead to a sensor malfunction. Infrared sensors cannot see through glass and cannot be put behind a windscreen. Similarly, painting over an ultrasonic sensor may alter its acoustic properties, said Denny.
The power consumption of sensors is also a key challenge, said Volvo’s Matha. “Each sensor consumes between 1 and 10 W. If you add all sensors for ADAS functionalities, you can reach 100 or 200 W, and up to 4 g of CO2. We have to reduce the power consumption. [For example], perhaps the functionality of the sensor will not be active all the time.”
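A quick back-of-the-envelope check of Matha's figures, using assumed (purely illustrative) sensor counts and per-sensor wattages, shows how a full ADAS suite drawing 1 to 10 W per sensor ends up in the 100 to 200 W range:

```python
# Illustrative sensor counts and per-sensor power draws (assumptions, not
# measured values) to sanity-check the 100-200 W figure quoted above.
sensors = {
    # name: (count, watts each)
    "camera":     (10, 5),
    "radar":      (6, 5),
    "ultrasonic": (12, 1),
    "lidar":      (2, 10),
}

total_watts = sum(count * watts for count, watts in sensors.values())
print(f"Continuous sensor load: {total_watts} W")  # -> 112 W, before compute
```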
Thermal management is another constraint to consider. Behind a windshield, temperature can reach 90°C, and appropriate sensors may not be available, said Matha. “If you put them in another area, in headlamps, for instance, we have some cooling systems, but it’s complex and expensive.”
Simulations and driving tests can help determine the best position for a sensor, the panelists said.
Above all, said Yole’s Boulay, “the position of sensors is closely related to the use cases targeted by OEMs. From what we see currently on vehicles that implement LiDAR for automated driving on highways, the LiDAR is in a central position, almost aligned with the ADAS camera and the long-range radar. For other use cases, like parking or city driving, the position of these LiDAR units will be different; they are expected to be on the sides or corners of vehicles.”
Integrating aesthetically
Volvo cars currently integrate 20 types of sensors, said Matha. Many of them are totally hidden. On the Volvo XC90, for instance, the forward parking camera is in the grille, while the side cameras are positioned in each door mirror and the backward-facing camera is fitted above the registration plate. “We can integrate sensors and make them look beautiful,” Matha said.
But do we necessarily need to hide the sensors? Can’t they be a feature?
For Astheimer, if the car is an intelligent product, it should look like it, and “everything shouldn’t be hidden away.” Sensors are now small enough to be completely integrated and almost unnoticeable. However, as we approach full autonomy, with cars driving themselves, some sensors “will need to be extremely prominent.” A 360° LiDAR needs an unobstructed view, and its position allows no compromise.
More important, Astheimer highlighted the need for designers and engineers to work together to make sensors fit the identity of the vehicle. The Deliver-E, an electric delivery vehicle prototype co-developed by WMG (Warwick Manufacturing Group) at the University of Warwick and Astheimer, integrates cameras into the sides of the vehicle and places the LiDAR prominently at the rear.
Asked about the pertinence of concentrating sensors in an external pod, Boulay cited Magneti Marelli’s Smart Corner, which can accommodate sensors such as LiDARs, radars, cameras, and ultrasonics, as well as LED-based lighting features like adaptive beam and digital light processing. “It could be easier for OEMs to integrate these pods during the manufacturing process, but in the case of an accident, the cost to repair or change these pods for insurance or consumers would be extremely high,” he said. “A balance will have to be found between integration, reparability, and cost.”
Reducing cognitive overload
The human-machine interface (HMI) not only bridges the driver and the car but also connects the driver with the outside world. The risk is that the driver gets distracted by all the functionalities and misses out on vital driving information.
Engaged in the design of Volta Trucks’ Zero electric delivery truck, Astheimer realized the importance of enhancing the driver’s vigilance. “In London, although heavy goods vehicles account for less than 4% of overall traffic, they are responsible for over 50% of vulnerable-road-user deaths, i.e., pedestrians and cyclists,” he said. There are two main reasons for that: the lack of direct visibility and cognitive overload.
“Cognitive overload is a massive issue,” said Astheimer. “We need to make sure ECUs [electronic control units] and CAN [controller area network] systems can read the right signals and display information in the clearest and most simplified way possible, whether it’s tactile, audio, or visual.”
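A toy example of that simplification principle might look like the sketch below, which surfaces only the highest-severity active warning to the driver. The signal names and severity ranking are invented for illustration and do not reflect any production CAN database or HMI.

```python
from typing import List, Optional

# Invented signal names and severity ranking, for illustration only.
SEVERITY = {
    "vulnerable_road_user_warning": 3,  # pedestrian or cyclist detected
    "forward_collision_warning": 3,
    "lane_departure_warning": 2,
    "low_washer_fluid": 1,
}


def select_alert(active_signals: List[str]) -> Optional[str]:
    """Surface only the single highest-severity active warning."""
    if not active_signals:
        return None
    return max(active_signals, key=lambda s: SEVERITY.get(s, 0))


print(select_alert(["low_washer_fluid", "forward_collision_warning"]))
# -> forward_collision_warning
```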
Making safety cool
Referencing a panel comment at an earlier AutoSens Conference, Stead asked the panelists whether “making safety cool” is the key to selling connected cars.
“We do business with safety,” said Matha. “Our customers want safety, and we can do safety only with sensors. So we need to make beautiful cars with sensors.”
There is another dimension to consider. Users need to understand the level of intelligence of their own cars to maintain their alertness to other road users and their surroundings. “By making the product safer and safer, you’re distancing the driver from what the vehicle is doing,” said Astheimer. “In adding levels of autonomy to the vehicle, you are aiding the driver with the simple things, but you are making the difficult things more difficult to do. The driver is no longer attentive as the vehicle does more and more.”
It is essential that the sensors, and the feedback from them, help the driver maintain awareness of what is going on “rather than just cocooning him from the outside world,” Astheimer said.