
AutoSens

CEA-List will participate in AutoSens at the HELIAUS project booth and with the conference talk "Multimodal fusion: an assessment of LWIR camera data for automotive", given by Erwan Piriou [CEA-List].

From 12 to 14 September 2022

The best-in-class event bringing together ADAS and AV specialists to shape the future of vehicle perception.

The HELIAUS project is run by a consortium of 11 complementary European partners coordinated by Lynred. The EU aims for a 50% reduction in the number of people seriously injured on the roads between 2015 and 2025. The HELIAUS project will contribute to these objectives by providing an affordable, high-value thermal-based perception system for ADAS. The project's results could also be leveraged in other highly automated systems, such as industry, robotics, IoT or health.

As of today, for many applications such as autonomous vehicles, no single sensor enables robust and accurate detection in all conditions.

Some solutions based on multimodal sensor-data fusion combine the strengths of each sensor and enable redundancy. But multi-sensor integration and data fusion become complex and challenging when the electronic units must meet embedded design constraints (latency, power consumption, cost).

The CEA platform aims to integrate a complete multimodal perception chain on constrained computing units.

It demonstrates that robust and accurate perception systems are possible by mixing complementary sensors (depth sensors, vision, IR), while the environment model (object and map fusion) is generated under demanding constraints: high responsiveness and a limited computing and power budget, at low cost.

It offers the opportunity to investigate core components of multimodal perception chains, and to innovate in their implementation considering embedded software, hardware acceleration or smart-sensor approaches.

Our inputs:

  • LWIR and RGB inferences:
    • Use of LWIR raw data to reduce computing power
    • 40 fps within a 15 W power budget on an embedded NVIDIA GPU
  • Lidar: integer grid-fusion approach
  • Ultra-low-power grid fusion (under 2 W during execution)
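The integer grid-fusion approach listed above is not detailed in this announcement, but the general idea can be sketched. The snippet below is an illustrative example only (not the HELIAUS implementation): an occupancy grid stored as int8 log-odds, where each sensor hit or miss adds a fixed integer increment with saturation. The increment and bound values are assumptions chosen for the example; integer arithmetic of this kind is a common way to keep grid fusion cheap on low-power embedded targets.

```python
import numpy as np

# Hypothetical integer log-odds increments and saturation bounds.
HIT, MISS = 3, -1
L_MIN, L_MAX = -120, 120  # fit comfortably in int8

def update_grid(grid: np.ndarray, hits, misses) -> np.ndarray:
    """Fuse one sensor scan into an integer log-odds occupancy grid.

    hits/misses are (row, col) cell indices observed occupied / free.
    Values saturate at [L_MIN, L_MAX] so repeated evidence cannot overflow.
    """
    for r, c in hits:
        grid[r, c] = min(L_MAX, int(grid[r, c]) + HIT)
    for r, c in misses:
        grid[r, c] = max(L_MIN, int(grid[r, c]) + MISS)
    return grid

grid = np.zeros((4, 4), dtype=np.int8)
update_grid(grid, hits=[(1, 1), (1, 1)], misses=[(0, 0)])
# cell (1, 1) accumulates occupancy evidence; (0, 0) free-space evidence
```

A cell is then classified as occupied or free by thresholding its log-odds value, and the whole grid stays in a compact integer buffer suitable for a sub-2 W budget.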

To learn more about the project and its solutions, come meet the HELIAUS partners at booth 30.

Erwan Piriou [CEA-List] will also give the conference talk "Multimodal fusion: an assessment of LWIR camera data for automotive" on Tuesday 13 September.

Abstract:

We present a unified perception approach integrating data from a pair of stereo LWIR cameras and a LIDAR. It integrates a set of functionalities spanning from raw thermal IR data to a 2.5D representation.

The contribution of the IR camera is thus evaluated through:

  • Several dedicated lightweight CNN-based detectors for pedestrians and multiclass objects on LWIR images;
  • Assessment of stereo LWIR through point-cloud comparison with LIDAR, with a specific focus on a custom stereo-IR camera rig platform;
  • Fusion of 3D data from IR stereovision and semantic data from IR inference into a 2.5D occupancy grid, with clustering and object-dynamics analysis.
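The last step above, fusing 3D points into a 2.5D occupancy grid and clustering the occupied cells, can be sketched as follows. This is a minimal illustration under assumed parameters (cell size, height threshold, 4-connectivity), not the method presented in the talk: 3D points are binned into a ground-plane grid keeping the maximum height per cell, and connected occupied cells are grouped into object candidates.

```python
import numpy as np

RES = 0.5        # cell size in metres (assumed)
H_THRESH = 0.3   # minimum height for a cell to count as occupied (assumed)

def to_grid(points, size=8):
    """Bin (x, y, z) points into a 2.5D grid of max heights per cell."""
    grid = np.full((size, size), -np.inf)
    for x, y, z in points:
        r, c = int(x / RES), int(y / RES)
        if 0 <= r < size and 0 <= c < size:
            grid[r, c] = max(grid[r, c], z)
    return grid

def cluster(grid):
    """Label 4-connected occupied cells with a naive flood fill."""
    labels = np.zeros(grid.shape, dtype=int)
    n = 0
    for r in range(grid.shape[0]):
        for c in range(grid.shape[1]):
            if grid[r, c] > H_THRESH and labels[r, c] == 0:
                n += 1
                stack = [(r, c)]
                while stack:
                    i, j = stack.pop()
                    if (0 <= i < grid.shape[0] and 0 <= j < grid.shape[1]
                            and grid[i, j] > H_THRESH and labels[i, j] == 0):
                        labels[i, j] = n
                        stack += [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]
    return labels, n

# Two nearby points form one cluster; a distant point forms another.
pts = [(0.1, 0.1, 1.2), (0.6, 0.1, 0.9), (3.1, 3.1, 1.5)]
labels, n = cluster(to_grid(pts))
```

Each resulting cluster is an object candidate whose dynamics can then be tracked over successive grids, which is the role of the object-dynamics analysis mentioned above.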

In addition, a dedicated multi-sensor platform provides a reference baseline to address the issues of calibration and computational load balancing on embedded targets for the implemented functionalities.

To register: https://auto-sens.com/events/brussels/pass/
