Our touchless evolution

When Elenium stepped into 2020, we were fully focused on developing solutions that would enable all passengers, regardless of their ability, to engage with self-service devices at airports.

This meant that kiosk and bag drop interactions, and the end-to-end transaction, needed to be performed by actions other than touching the screen.

Voice is an obvious alternative to touching screens, but we had concerns about noise in a busy airport environment. Our patent-pending solution, jointly developed with Amazon Web Services (AWS), is designed for loud environments, distinguishing voices in busy settings such as airports, hospitals and medical centers by combining an array of bi-directional microphones. When a person is detected within proximity of the device, the camera system starts to track the person's lips. The system then localizes which microphone in the array best aligns with the person's voice. That microphone becomes the primary audio source, while the remaining microphones in the array are used to cancel out background noise.
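As a rough illustration of that idea only, and not of our patented system, the sketch below shows how a primary microphone might be selected from an array based on the speaker direction reported by the camera tracker, with the remaining microphones treated as a noise reference. The function names, angles and the simple subtraction-based suppression are all illustrative assumptions.

```python
# A minimal sketch (not the production system) of choosing a primary microphone
# from an array based on where the camera says the speaker is, then using the
# remaining microphones as a background-noise reference.

import numpy as np

def pick_primary_mic(speaker_angle_deg: float, mic_angles_deg: np.ndarray) -> int:
    """Return the index of the microphone whose facing direction is closest
    to the speaker direction estimated by the camera / lip tracker."""
    diff = np.abs((mic_angles_deg - speaker_angle_deg + 180.0) % 360.0 - 180.0)
    return int(np.argmin(diff))

def suppress_background(mic_signals: np.ndarray, primary_idx: int,
                        noise_weight: float = 0.8) -> np.ndarray:
    """Very rough noise suppression: subtract a scaled average of the other
    microphones (treated as a noise reference) from the primary channel.
    A real system would use beamforming or adaptive filtering instead."""
    primary = mic_signals[primary_idx]
    others = np.delete(mic_signals, primary_idx, axis=0)
    noise_estimate = others.mean(axis=0)
    return primary - noise_weight * noise_estimate

# Example: 4 mics facing 0, 90, 180, 270 degrees; the camera places the speaker at ~80 degrees.
mic_angles = np.array([0.0, 90.0, 180.0, 270.0])
signals = np.random.randn(4, 16000)        # 1 second of placeholder audio per mic at 16 kHz
idx = pick_primary_mic(80.0, mic_angles)   # -> 1 (the 90-degree microphone)
clean = suppress_background(signals, idx)
```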

We realized voice would not suit all passengers, so we also developed our patent-pending head movement tracker, which lets a passenger interact with the self-service device by moving their head. This head control technology uses a machine learning classifier to find a point of interest on a person's face and use it as an anchor point; the cursor then follows the head movement of the person in front of the kiosk. The interaction is intuitive, with no training required. We chose the head instead of a hand to make the interaction more suitable for differently abled passengers who may not have full use of their arms. At the same time, our research showed that head movement is vastly more stable than hand movement.
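To give a feel for the concept (again, a simplified sketch rather than the actual patent-pending tracker), the snippet below maps a tracked facial anchor point, such as a nose tip reported by an upstream face-landmark model, to an on-screen cursor position with gain and smoothing. The class name and the gain and smoothing values are illustrative assumptions.

```python
# Sketch of turning a facial anchor point into a cursor position.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HeadCursor:
    screen_w: int
    screen_h: int
    gain: float = 4.0          # how far the cursor moves per unit of head movement
    smoothing: float = 0.8     # exponential smoothing to damp jitter
    _cx: float = 0.5           # current cursor position, normalised 0..1
    _cy: float = 0.5
    _ref_x: Optional[float] = None   # anchor position captured on the first frame
    _ref_y: Optional[float] = None

    def update(self, anchor_x: float, anchor_y: float):
        """anchor_x / anchor_y: the facial anchor point in normalised image
        coordinates (0..1), as produced by an upstream face-landmark tracker."""
        if self._ref_x is None:
            self._ref_x, self._ref_y = anchor_x, anchor_y   # first frame = neutral pose
        # Target position: screen centre plus an amplified head offset.
        tx = 0.5 + self.gain * (anchor_x - self._ref_x)
        ty = 0.5 + self.gain * (anchor_y - self._ref_y)
        # Smooth the movement and clamp it to the screen.
        self._cx = self.smoothing * self._cx + (1 - self.smoothing) * tx
        self._cy = self.smoothing * self._cy + (1 - self.smoothing) * ty
        self._cx = min(max(self._cx, 0.0), 1.0)
        self._cy = min(max(self._cy, 0.0), 1.0)
        return int(self._cx * self.screen_w), int(self._cy * self.screen_h)

cursor = HeadCursor(screen_w=1920, screen_h=1080)
print(cursor.update(0.50, 0.40))   # neutral pose -> roughly the centre of the screen
print(cursor.update(0.53, 0.40))   # a small head turn nudges the cursor to the right
```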

By February, the team had realized that the technology we had been developing to help differently abled travellers would have an important application for airports and other public environments in a new, more health-focused world. Touching surfaces carries the risk of spreading viral and bacterial germs, something we now want to avoid.

According to the WHO and many other sources, the main vital signs that indicate illness are body temperature, heart rate and respiration rate. Taking the touchless solution a step further, Elenium partnered with Amazon Web Services to enhance the Amazon Transcribe Medical machine learning technology and integrate it into self-service devices for use in airports and other public environments.

Temperature, heart rate and respiratory rate are read, processed and analyzed by the device. Travel history and related questions can be asked of the user to further understand the risk. Should the data indicate possible illness, the device flags a service agent, who can interact with the individual via video conference to further evaluate the situation and decide on next steps. This health screening cannot diagnose an illness, but it provides a good indication of whether an individual is likely to be unwell.
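Purely to illustrate the flow described above, the sketch below combines vital-sign readings and questionnaire answers into a simple "refer to an agent" flag. The thresholds and field names are placeholder assumptions, not clinical guidance and not the rules used in the actual product.

```python
# Illustrative only: combining vital-sign readings and questionnaire answers
# into a flag that asks a service agent to follow up by video conference.

from dataclasses import dataclass

@dataclass
class ScreeningResult:
    temperature_c: float            # body-temperature estimate
    heart_rate_bpm: float
    respiration_rate_bpm: float
    recent_travel_to_risk_area: bool
    reports_symptoms: bool

def needs_agent_review(r: ScreeningResult) -> bool:
    """Return True when readings or answers suggest the person may be unwell
    and a service agent should follow up. Thresholds are placeholders."""
    vitals_flags = [
        r.temperature_c >= 37.8,                  # elevated temperature
        not (50 <= r.heart_rate_bpm <= 110),      # resting heart rate out of range
        not (10 <= r.respiration_rate_bpm <= 22), # respiration rate out of range
    ]
    questionnaire_flags = [r.recent_travel_to_risk_area, r.reports_symptoms]
    # Any abnormal vital sign, or two or more questionnaire flags, triggers review.
    return any(vitals_flags) or sum(questionnaire_flags) >= 2

print(needs_agent_review(ScreeningResult(36.7, 72, 16, False, False)))  # False
print(needs_agent_review(ScreeningResult(38.2, 95, 18, False, False)))  # True
```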

Interest in our Touchless Health Screening is coming from the aviation sector and beyond as businesses seek ways to operate when recovery takes place. Health care facilities would like to screen people coming into hospitals and doctors' rooms, autonomously evaluating the vital signs of those who may be ill. Large events can screen visitors before admitting them, while offices and factories can screen staff before they enter buildings.

The solution can take various forms: a robust suitcase for dusty, water-prone environments like construction sites, factories or mining operations; stand-alone versions for office buildings and shopping malls; or a retrofit to an existing kiosk, ATM or check-out desk.

2020 hasn't gone as planned for any of us, but our hope is that we can move towards recovery together as soon as possible.
