
Industry 5.0 will be driven by smarter, more interactive robotics

One of the main challenges facing today’s manufacturing companies is how to rapidly reconfigure their production lines—a capability that is vital to staying competitive. CEA-List, a center for advanced research and development in robotics and artificial intelligence, introduced the Carnot Robot Compagnon cobotics project in 2021 to help make factories more flexible and agile. The project’s first deliverables are smart, interactive robots that are easy to program, use, and deploy on the factory floor.

Production systems will need to be flexible and agile to overcome the challenges facing the manufacturing industries. In addition to making the most of limited space and other resources, the factories of the future will have to respond to growing demand for mass customization by producing small batches—and even single units—of highly personalized products. This means having production lines, and especially robots, that any operator can reconfigure on the fly without special programming skills. Because operators—and their know-how—will be at the center of these production lines, the robotic systems deployed will have to be able to work alongside humans, rapidly learn new tasks to meet new production needs, and be easy to use. Around a dozen CEA-List labs are contributing to this cobotics project, gradually developing and assembling technology bricks that will be easy to integrate into future demonstrators.

Use case no. 1: industrial assembly tasks

Assembly tasks are very common on production lines. But because no two assembly tasks are exactly alike, even tasks that seem simple and repetitive can be hard to robotize: assembly procedures can involve similar-looking parts, or parts arranged randomly in bins, and the robot must stay within the tolerances for assembly parameters like tightening. The first demonstrator to come out of the project, a robotic arm designed in 2021, identifies and locates parts using cameras combined with 2D and 3D vision, based on both AI algorithms and more conventional object registration techniques. A command-control system enables the arm to follow the trajectory needed to reach a part and to deploy the robotic skills required for tasks like grasping and inserting it.
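As a rough sketch of how such a perception-to-action pipeline fits together (detect and locate the part, plan a trajectory, then trigger grasping and insertion skills), the Python example below walks through the steps in simplified form. All names and values (detect_part, Pose, the poses themselves) are hypothetical stand-ins, not the demonstrator's actual components.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Pose:
    """Hypothetical 3D pose of a detected part (position only, for brevity)."""
    x: float
    y: float
    z: float

def detect_part(image) -> Pose:
    """Stand-in for the 2D/3D vision stage (AI detection plus registration)."""
    # A real system would run a detector and refine the pose by registration.
    return Pose(x=0.42, y=-0.10, z=0.05)

def plan_trajectory(start: Pose, goal: Pose, steps: int = 5) -> List[Pose]:
    """Naive straight-line interpolation standing in for the motion planner."""
    return [
        Pose(
            start.x + (goal.x - start.x) * t / steps,
            start.y + (goal.y - start.y) * t / steps,
            start.z + (goal.z - start.z) * t / steps,
        )
        for t in range(1, steps + 1)
    ]

def grasp(pose: Pose) -> None:
    print(f"grasping part at ({pose.x:.2f}, {pose.y:.2f}, {pose.z:.2f})")

def insert(pose: Pose) -> None:
    print(f"inserting part at ({pose.x:.2f}, {pose.y:.2f}, {pose.z:.2f})")

if __name__ == "__main__":
    part = detect_part(image=None)        # vision: identify and locate the part
    home = Pose(0.0, 0.0, 0.3)
    for wp in plan_trajectory(home, part):
        print(f"tracking waypoint ({wp.x:.2f}, {wp.y:.2f}, {wp.z:.2f})")
    grasp(part)                           # robotic skill: grasping
    insert(Pose(0.50, 0.00, 0.02))        # robotic skill: insertion at the assembly pose
```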

The demonstrator was tested on a gear assembly task, where it successfully grasped the parts and inserted them within an assembly tolerance of 10 microns, a much higher degree of precision than the cobots used in factories today can deliver. Decoupling the robot's movements along different directions and controlling the force applied by its end effector are what make this precision possible.
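The article does not detail the controller, but the general technique it alludes to, position control on the lateral axes decoupled from force regulation along the insertion axis, can be illustrated with a simple admittance-style loop. The gains, the crude contact model, and all numbers below are illustrative assumptions, not values from the CEA-List demonstrator.

```python
# Minimal sketch of decoupled control: position control on x/y,
# force regulation along the insertion axis z. Gains and the contact
# model are illustrative, not taken from the demonstrator.

dt = 0.001                # control period [s]
kp_pos = 5.0              # proportional gain for the position-controlled axes
kf = 0.0005               # force-to-motion gain for the force-controlled axis
stiffness = 2000.0        # crude contact stiffness model [N/m]

x, y, z = 0.001, -0.002, 0.001      # current effector position [m]
x_ref, y_ref = 0.0, 0.0             # lateral alignment targets [m]
f_ref = 5.0                         # desired insertion force [N]

for _ in range(10000):
    # Decoupled lateral axes: simple proportional position control.
    x += kp_pos * (x_ref - x) * dt
    y += kp_pos * (y_ref - y) * dt

    # Insertion axis: advance so the measured contact force tracks f_ref.
    f_meas = max(0.0, -z) * stiffness     # contact starts once z goes below 0
    z -= kf * (f_ref - f_meas) * dt

print(f"final lateral error: {abs(x):.6f} m, {abs(y):.6f} m")
print(f"final contact force: {max(0.0, -z) * stiffness:.2f} N (target {f_ref} N)")
```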

A winning combination of expertise from a dozen CEA-List laboratories in robotics, AI, digital simulation, and software engineering

Another area in which CEA-List scientists and engineers excel is the digital twin. These virtual production-line replicas are updated as parts are detected and located, generating the optimal trajectories for the robot to move around without colliding with any obstacles in its path.
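One way to picture how a digital twin supports collision-free motion: the virtual scene is updated from each detection, and every candidate motion is checked against it before being sent to the robot. The toy two-dimensional sketch below, with hypothetical class and method names, illustrates that idea only; it is not the project's planner.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Box:
    """Axis-aligned 2D obstacle in the virtual replica of the cell."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

class DigitalTwin:
    """Toy scene model, updated as parts and obstacles are detected."""
    def __init__(self) -> None:
        self.obstacles: List[Box] = []

    def update_from_detection(self, box: Box) -> None:
        self.obstacles.append(box)

    def path_is_clear(self, start: Tuple[float, float], goal: Tuple[float, float],
                      samples: int = 100) -> bool:
        # Sample points along a straight-line motion and test each against the scene.
        for i in range(samples + 1):
            t = i / samples
            x = start[0] + t * (goal[0] - start[0])
            y = start[1] + t * (goal[1] - start[1])
            if any(b.contains(x, y) for b in self.obstacles):
                return False
        return True

twin = DigitalTwin()
twin.update_from_detection(Box(0.3, 0.5, -0.1, 0.1))    # a newly detected bin

print("direct path clear:", twin.path_is_clear((0.0, 0.0), (0.8, 0.0)))  # False
print("detour clear:", twin.path_is_clear((0.0, 0.0), (0.8, 0.4)))       # True
```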

The demonstrator also has a human-machine interface (HMI) that displays real-time information on the status of the task and alerts the human operator if needed. If the human operator intervenes, the vision algorithms analyze the operator’s behavior so the robot can learn from it.

The orchestration and engineering tools in CEA-List’s Papyrus software suite were instrumental in getting the multiple technologies integrated into the demonstrator to interact with each other so that the system functioned as a whole. The control system was designed to be modular and interoperable, so that the different technology bricks, components, and software applications can be switched in and out easily. This also makes it easier to add new components, such as tools for new tasks or even additional robots, which can then work together on tasks.
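Papyrus is a model-based engineering environment, so the sketch below does not reproduce it; it only illustrates, in plain Python, the kind of modular skill interface and registry that lets technology bricks be swapped in and out and new tools or robots be added. The Skill, GraspSkill, ScrewSkill, and Orchestrator names are hypothetical.

```python
from abc import ABC, abstractmethod
from typing import Dict, List

class Skill(ABC):
    """Common contract so that technology bricks can be swapped in and out."""
    @abstractmethod
    def execute(self, context: dict) -> None: ...

class GraspSkill(Skill):
    def execute(self, context: dict) -> None:
        print(f"[{context['robot']}] grasping {context['part']}")

class ScrewSkill(Skill):
    def execute(self, context: dict) -> None:
        print(f"[{context['robot']}] screwing {context['part']}")

class Orchestrator:
    """Runs a declared sequence of skills; new bricks register by name."""
    def __init__(self) -> None:
        self._skills: Dict[str, Skill] = {}

    def register(self, name: str, skill: Skill) -> None:
        self._skills[name] = skill

    def run(self, plan: List[dict]) -> None:
        for step in plan:
            self._skills[step["skill"]].execute(step)

orchestrator = Orchestrator()
orchestrator.register("grasp", GraspSkill())
orchestrator.register("screw", ScrewSkill())   # adding a new tool = registering a new brick

orchestrator.run([
    {"skill": "grasp", "robot": "robot_1", "part": "gear"},
    {"skill": "screw", "robot": "robot_2", "part": "cover"},
])
```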

A second demonstrator was built in 2022 to test two robots working together, one of them equipped with a gripper and a screwdriver. The tasks that make up the assembly process were parallelized, cutting total assembly time almost in half.
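The time savings come from running independent subtasks on the two robots at the same time. The sketch below illustrates the scheduling idea with simulated, made-up task durations rather than the project's measured figures.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_task(robot: str, task: str, duration_s: float) -> str:
    """Stand-in for a real robot executing one assembly subtask."""
    time.sleep(duration_s)              # simulated execution time (illustrative values)
    return f"{robot} finished {task}"

tasks_robot_1 = [("pick gear", 0.3), ("insert gear", 0.4)]
tasks_robot_2 = [("pick cover", 0.3), ("screw cover", 0.4)]

def run_sequence(robot: str, tasks) -> None:
    for name, duration in tasks:
        print(run_task(robot, name, duration))

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as pool:
    pool.submit(run_sequence, "robot_1", tasks_robot_1)
    pool.submit(run_sequence, "robot_2", tasks_robot_2)
parallel_time = time.perf_counter() - start

sequential_time = sum(d for _, d in tasks_robot_1 + tasks_robot_2)
print(f"parallel: {parallel_time:.2f}s vs sequential: {sequential_time:.2f}s")
```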

New use cases on the horizon

CEA-List also investigated use cases like inserting connectors or opening battery compartments to remove spent batteries. They are currently looking at disassembly processes that could be used in factories implementing circular economy principles.

The next step will be for operators with no specific programming skills to test the demonstrators. Proof-of-concept test cases have already been completed in which basic tasks were programmed by demonstration or imitation, and more complex tasks by giving instructions in natural language. Future iterations of the demonstrator will integrate not only these technologies but also reinforcement learning, both autonomous and guided by feedback from human evaluators, similar to the approach behind the latest conversational agents.
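Programming by demonstration generally amounts to recording the motions an operator shows the robot and replaying or generalizing them later, and resolving a natural-language instruction to a previously taught program follows a similar pattern. The deliberately naive sketch below, with hypothetical names throughout, only illustrates that record-and-replay idea.

```python
from typing import Dict, List, Tuple

Waypoint = Tuple[float, float, float]

class DemonstrationRecorder:
    """Records the poses an operator guides the robot through, then replays them."""
    def __init__(self) -> None:
        self.programs: Dict[str, List[Waypoint]] = {}

    def record(self, name: str, waypoints: List[Waypoint]) -> None:
        # In a real system the waypoints would come from hand-guiding the arm.
        self.programs[name] = waypoints

    def replay(self, name: str) -> None:
        for x, y, z in self.programs[name]:
            print(f"move to ({x:.2f}, {y:.2f}, {z:.2f})")

recorder = DemonstrationRecorder()
recorder.record("place_gear", [(0.3, 0.0, 0.2), (0.3, 0.0, 0.05), (0.3, 0.0, 0.2)])

# A natural-language request is resolved to a previously taught program.
instruction = "place the gear"
if "gear" in instruction:
    recorder.replay("place_gear")
```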

See also

Technology platforms

SMART interactive robotics platform

Improving robots’ capabilities and developing new ways of interacting with humans.
Research programs

Smart robotics

Creating smart interactive robots to serve humans.
Research programs

Robotic systems

To create smart robots, development work in mechatronics, control, computer science and perirobotics must be carried out in a coordinated and unified manner.
Challenges

The Factory of the Future

The pervasive use of digital technology in all areas of industry, from product design to manufacturing, is rapidly ushering in a new industrial era.