
Multisensory screens make the digital world more inclusive

CEA-List developed a touch screen that uses local vibrations to convey context-enriched information. Integrated into a tablet, the screen could bring people with visual impairment or blindness an enhanced digital experience.

Better access to digital for people with visual impairment or blindness

Digital technology is something we depend on for many of our activities—and something we have come to take for granted. For people with visual impairments, however, these everyday tools are far from optimal. Accessing digital content—especially when it includes visual elements like images or graphics—can be hard. Text-to-speech and other workarounds are not very sophisticated or interactive and do little to remove barriers for people who have both visual and hearing impairments.

CEA-List sensory and ambient interface experts designed an innovative display with haptic feedback that could fill this gap. Local vibrations, which the user can feel as they interact with or explore the surface of the screen, provide context-enriched information (i.e., information about the position of the user’s finger on the screen). The technology is also multi-touch, which means it can respond to more than one finger on the surface of the screen at the same time.

 


Left: prototype of the 10-inch screen.
Right: illustration of the color zones and the associated haptic effects. ©CEA


 

Prototype of a multi-touch display with local haptic feedback

The display was developed as part of the EU Ability project. A recently presented functional prototype—built with an OLED display for a 10-inch tablet—was used to demonstrate the feasibility of the multisensory technology for three use cases:

  • Data graphs: Numerical values are converted into vibrations of different intensities;
  • Geographical maps: Vibrations provide guidance along a route;
  • Image exploration: Vibrations are associated with different areas of an image, allowing the user to explore with several fingers to understand the image’s spatial organization.
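To make the data-graph use case concrete, here is a minimal illustrative sketch of how numerical values could be mapped to vibration intensities. The function name and the linear scaling are assumptions chosen for illustration, not CEA-List’s implementation:

```python
def value_to_intensity(value, v_min, v_max, i_min=0.1, i_max=1.0):
    """Linearly map a data value onto a normalized vibration intensity.

    The smallest value produces a faint vibration (i_min) and the
    largest the strongest one (i_max); values in between interpolate.
    """
    if v_max == v_min:
        return i_min
    t = (value - v_min) / (v_max - v_min)
    t = max(0.0, min(1.0, t))  # clamp out-of-range values
    return i_min + t * (i_max - i_min)

# A small data series rendered as per-bar vibration intensities:
# the minimum (120) maps to 0.1, the maximum (560) to 1.0
data = [120, 340, 560, 230]
intensities = [value_to_intensity(v, min(data), max(data)) for v in data]
```

A user sweeping a finger across the bars of a chart would then feel each bar’s relative magnitude as a stronger or weaker vibration.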

The tablet was presented at the Embedded World Exhibition and Conference in Nuremberg, Germany in April 2024.

During the demonstration, users were able to explore a geographical map with their fingers and locate points of interest. ©CEA

An algorithmic innovation

The device is made from a matrix of piezoelectric actuators glued behind a tablet display and coupled to the display to make it vibrate.

The real innovation lies in the matrix’s control algorithm, which can locate multiple inputs and respond to each with a separate local vibration at the same time. The algorithm is based on a proprietary reverse-filtering technique developed by CEA-List. It can handle up to ten fingers—spaced at least 15 mm apart—at the same time.
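CEA-List’s reverse-filtering technique is proprietary, but the general idea of inverse filtering can be sketched. If the complex response of each touched point to each actuator at the working frequency is known (a transfer matrix), a least-squares inverse yields drive signals that vibrate under one finger while keeping the other touched points still. Everything below—the matrix sizes, the random transfer matrix, the single frequency—is an illustrative assumption:

```python
import numpy as np

# Sketch of inverse filtering, NOT CEA-List's actual algorithm.
# H[i, j] is the (complex) vibration measured at finger position i
# when actuator j alone is driven at the working frequency.
rng = np.random.default_rng(0)
n_fingers, n_actuators = 3, 16  # e.g. three fingers, a 4x4 actuator matrix
H = rng.normal(size=(n_fingers, n_actuators)) \
    + 1j * rng.normal(size=(n_fingers, n_actuators))

# Goal: full-strength vibration under finger 0, silence under the others.
target = np.array([1.0, 0.0, 0.0])

# Least-squares pseudo-inverse gives one valid set of actuator drives.
drives = np.linalg.pinv(H) @ target

achieved = H @ drives  # vibration actually felt at the three fingers
```

With more actuators than touch points, the system is underdetermined, so the pseudo-inverse can reproduce the target response exactly while spreading the effort across all 16 actuators.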

The innovation can also be used in two other novel ways. First, in addition to the mechanical vibrations described above, the matrix can also emit sound vibrations (at higher frequencies) to produce a truly multisensory experience. This experience will be further enhanced by the algorithm’s ability to control the directivity—in other words, the perceived position—of an audio source.

The second, currently being prototyped, is contactless interaction with the display. These additional capabilities will target not only people with visual impairments, but also people who find touch interaction challenging, such as those with motor disabilities. The technology could also prove useful in situations where touch interaction is impractical, like car interiors and industrial workstations.

Here’s how it works: The piezoelectric matrix emits ultrasound waves (at frequencies above the audible range), then captures the waves that bounce back off the user. By analyzing this signal, the algorithm can interpret the user’s movements and generate the corresponding response.
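As a rough illustration of the sensing principle: the round-trip delay of an ultrasound echo translates directly into the distance of the reflecting hand. The function and the single-echo model below are simplifying assumptions for illustration, not the actual detection algorithm:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def echo_to_distance(echo_delay_s):
    """Convert the round-trip delay of an ultrasound echo into the
    distance to the reflecting object (e.g. the user's hand)."""
    # The wave travels to the object and back, hence the factor of 2.
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# A hand hovering ~10 cm above the screen returns an echo after ~0.58 ms
delay = 2 * 0.10 / SPEED_OF_SOUND
distance = echo_to_distance(delay)
```

Tracking how such distances change over time is what would let an algorithm infer hover position and movement without any contact.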

 


The matrix of 16 piezoelectric actuators glued behind the display. ©CEA

 

A wide range of use cases investigated

CEA-List is working closely with several other Ability project partners to fine-tune the technology and its planned use cases:

  • R&D is underway with France-based Insidevision, a manufacturer of tablets for people with visual impairments, to integrate the technology’s haptic feedback capabilities into the company’s products;
  • Insidevision is contributing its in-depth knowledge of accessibility software to help CEA-List develop the interpretation software layers needed for the display to be operational;
  • Additional studies have begun with OFFIS, a research institute in Germany, to utilize the technology in tandem with OFFIS’s automatic image recognition software. The goal is to enable tactile image discovery through feature extraction, object location, and other capabilities;
  • Sweden’s Lund University (ULUND), the French association for universal and inclusive design (H-lab), the Lithuanian Union for the Blind and Visually Impaired (LASS), and the German manufacturer Siemens are helping design and evaluate use cases;
  • Employees at Samsung’s UK subsidiary are developing AI algorithms for text analysis and predictive typing.

The EU Ability project

The purpose of the Ability project (Horizon Europe, 2022–2025) is to respond to the challenge of content accessibility for people with visual impairments, including those who also have hearing impairments, by designing a complete portable solution. The technologies developed include CEA-List’s multisensory tablet and a pin-based Braille display, also based on piezoelectric actuators.

CEA-List has produced a demonstrator of this second display. Compared to similar devices available on the market, CEA-List’s technology is based on an innovative concept that requires fewer actuators, making the system more economical for distribution to a wider audience.

In addition to CEA-List, the Ability project consortium includes a research institute (OFFIS), a manufacturer of equipment for people with visual impairments (Insidevision), two end-user organizations (H-lab and LASS), a university (ULUND), and two manufacturers (Samsung and Siemens).

Learn more about the EU Ability project

The main innovation behind this technology is the use of a single actuator—the piezo matrix—to enable different display features. This advantage will open up a larger market to absorb the additional cost of integrating the technology into the display, which will help make this reliable solution—and digital content—more readily available at a lower cost to people with visual impairments.

Charles Hudin

Research Engineer — CEA-List

Two-question interview with Sabrina Panëels, CEA-List research engineer

How did the Ability project come about?

It all started with the multi-point, localized haptic feedback display technology developed by our laboratory. We became aware of an EU call for projects around accessibility, and we thought our technology might be a good fit for this kind of multi-partner project!

 

The project will be completed by the end of 2025. Now that a prototype has been made, what’s left to do?

We are currently finishing up the hardware and integrating the software layers. The display will then be operational. We will be completing a series of user tests in 2025 to figure out the best haptic strategies. Next, we’ll be looking at how to integrate sound in a way that is complementary to the haptic features. The idea is to create a truly multisensory tablet.

Learn more

2024 Activity Report

ABILITY, a digital accessibility revolution for the visually impaired

To make digital content accessible, we’ve developed the world’s first multisensory tablet as part of the European ABILITY project, leveraging CEA-List’s innovative haptic technologies.