September 3, 2024 | On-board multi-sensor perception chain

Credit: CEA
Researchers at CEA-List demonstrated a distributed perception chain integrated into a Twizy electric vehicle on a smart parking use case. The research that led to the advance was part of the CORAM NeVeOS project.


In this use case, the vehicle had to reconcile data from several viewpoints, manage wireless network fluctuations, and, ultimately, perceive its environment in order to safely navigate a parking lot also equipped with sensors. Technologies from three laboratories were implemented to address these challenges.


1. Multi-sensor perception and fusion

A multi-sensor processing pipeline was built (using eMMOTEP) to detect obstacles and, especially, pedestrians, even in low light. Data from color and infrared cameras and a multi-slice LiDAR sensor are fused to create an occupancy map around the vehicle. Detected obstacles are classified, and their trajectories predicted (using SigmaFusion).
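
As a rough illustration of how evidence from several sensors can be combined into one occupancy map, the sketch below accumulates log-odds evidence per sensor and sums the grids. This is a minimal Python sketch, not the eMMOTEP or SigmaFusion code; every function name, parameter, and threshold here is illustrative.

```python
import numpy as np

def update_occupancy_grid(grid_logodds, points_xy, resolution=0.2,
                          origin=(-10.0, -10.0), l_occ=0.85):
    """Mark grid cells hit by sensor returns as more likely occupied.

    grid_logodds : 2D array of log-odds occupancy values
    points_xy    : (N, 2) obstacle points in the vehicle frame (meters)
    """
    h, w = grid_logodds.shape
    ix = ((points_xy[:, 0] - origin[0]) / resolution).astype(int)
    iy = ((points_xy[:, 1] - origin[1]) / resolution).astype(int)
    inside = (ix >= 0) & (ix < w) & (iy >= 0) & (iy < h)
    np.add.at(grid_logodds, (iy[inside], ix[inside]), l_occ)
    return grid_logodds

def fuse_grids(*grids):
    """Fuse per-sensor log-odds grids by summation (independent evidence)."""
    return np.sum(grids, axis=0)

# Example: fuse evidence from a LiDAR grid and a camera-derived grid
lidar_grid = update_occupancy_grid(np.zeros((100, 100)),
                                   np.array([[1.0, 2.0], [1.1, 2.0]]))
camera_grid = update_occupancy_grid(np.zeros((100, 100)),
                                    np.array([[1.0, 2.0]]))
fused = fuse_grids(lidar_grid, camera_grid)
prob = 1.0 / (1.0 + np.exp(-fused))  # log-odds back to probability
```

Summing log-odds is the standard way to merge independent occupancy evidence: cells seen as occupied by several sensors end up with high probability, while a single spurious return stays uncertain.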


2. Distributed multi-viewpoint perception

The parking lot’s sensors (cameras and LiDAR) also gather data, which is used to map the fixed (walls) and moving (pedestrians, vehicles, etc.) obstacles in the environment. Because this map is built from a different viewpoint, the vehicle has to realign it with its own: it adapts to its position in the parking lot and matches the obstacles in the parking lot’s map with those in the map generated by its on-board sensors.
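
Once obstacles have been matched between the two maps, the realignment step amounts to estimating a rigid transform between the two viewpoints. The sketch below uses the standard Kabsch least-squares method on already-matched 2D obstacle positions; the article does not detail the project's actual matching and alignment algorithms, so this is only an illustrative substitute.

```python
import numpy as np

def align_maps(vehicle_pts, parking_pts):
    """Estimate the rigid transform (R, t) that maps obstacle positions seen
    in the vehicle's frame onto matched positions in the parking lot's map.

    vehicle_pts, parking_pts : (N, 2) arrays of already-matched obstacles.
    Returns R, t such that parking ≈ R @ vehicle + t (least squares).
    """
    mu_v = vehicle_pts.mean(axis=0)
    mu_p = parking_pts.mean(axis=0)
    H = (vehicle_pts - mu_v).T @ (parking_pts - mu_p)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_p - R @ mu_v
    return R, t

# Example: recover a known 90-degree rotation + translation
theta = np.pi / 2
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
vehicle = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
parking = vehicle @ R_true.T + np.array([5.0, -3.0])
R, t = align_maps(vehicle, parking)
```

With the two maps expressed in a common frame, an obstacle reported by the infrastructure can be compared directly against the vehicle's own detections.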


3. Resilience to wireless network fluctuations by design

The quality of the wireless (Wi-Fi) connection between the vehicle and the parking lot fluctuates. A monitoring device was designed in PolyGraph to continuously measure communication quality in order to maintain predictable response times. If the connection is degraded or interrupted, the vehicle switches to a slower downgraded mode.
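
The switching logic can be sketched as a small link monitor that tracks recent round-trip times and falls back when the link is slow or silent. This is an illustrative Python sketch, not the PolyGraph design; the heartbeat mechanism, thresholds, and mode names are all assumptions.

```python
from collections import deque
import time

class LinkMonitor:
    """Track recent round-trip times over the Wi-Fi link and decide when the
    vehicle should fall back to a slower, self-contained degraded mode.
    """
    def __init__(self, window=10, rtt_limit_s=0.1, timeout_s=0.5):
        self.rtts = deque(maxlen=window)   # sliding window of round trips
        self.rtt_limit_s = rtt_limit_s     # max acceptable average RTT
        self.timeout_s = timeout_s         # silence treated as interruption
        self.last_seen = time.monotonic()

    def record_heartbeat(self, rtt_s):
        """Record one heartbeat exchange with the infrastructure."""
        self.rtts.append(rtt_s)
        self.last_seen = time.monotonic()

    def mode(self):
        if time.monotonic() - self.last_seen > self.timeout_s:
            return "degraded"   # link interrupted: on-board sensors only
        if self.rtts and sum(self.rtts) / len(self.rtts) > self.rtt_limit_s:
            return "degraded"   # link too slow for real-time map exchange
        return "nominal"

monitor = LinkMonitor()
monitor.record_heartbeat(0.02)   # fast round trip: stays in nominal mode
```

Keeping the decision on the vehicle side means response times stay predictable: the planner always knows which mode it is in, regardless of what the network does.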


4. Integration

CEA-List’s Twizy (via the Carnot Network’s Carnauto institute for automotive technologies) was equipped with the prototype. A mobile tower with NVIDIA processors was set up in the Nano-Innov parking lot in Palaiseau to test the system.


Rendering of obstacles and trajectory prediction
View of the in-vehicle display interface

Demos of NeVeOS were given at CEA-List Tech Days (June 6-7, 2023) and at the Journée Outils Logiciels Matériels pour la Recherche sur les Véhicules Terrestres Autonomes at ENS Paris-Saclay (October 5, 2023).

Learn more

The technology in use:

  • The EU Selfy project: robustness of multi-sensor/multi-viewpoint perception.

Main patents:

  • The NeVeOS demonstrator system leverages several pre-existing patents, including a formal communications supervision system (Paul Dubrulle, 2019) and an iterative method for estimating the motion of a material body by generating a filtered motion grid (Tiana Rakotovao Andriamahefa, 2021).

Integrating technologies from three different labs into a single vehicle is no mean feat, which makes us even prouder to present a demonstrator that works.

Rebecca Cabean

Etienne Hamelin

Project Manager — CEA-List
