What will they think of next?

The newsletter of the EinDfa Technical Committee of the International Ergonomics Association (IEA) has three examples of digital technology for inclusive and enabling design. While these topics aren't exactly universal design (UD), they interface with UD thinking and inclusive practice. The first is about audio sign language, the second is about aircraft seating, and the third is about our sense of touch. Below is a copy and paste from the newsletter:

Sign Language Ring is a device that detects sign language motion and "translates" it to voice by emitting audio through a speaker. Inspired by Buddhist prayer beads, according to its designers at Asia University, this wearable device includes a bracelet and a set of detachable rings worn on select fingers. It can also translate voice to text, transcribing spoken language picked up by a microphone into text displayed on the bracelet's screen.
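The designers haven't published the device's internals, but the core loop is easy to picture: sample motion from the rings, match the sample against known signs, and speak the result. Below is a minimal Python sketch of that pipeline; the sensor window shape, the sign templates and the nearest-neighbour matcher are all illustrative assumptions, and pyttsx3 simply stands in for whatever speech synthesis the device actually uses.

```python
# A minimal sketch of a motion -> sign -> speech pipeline, assuming per-finger
# motion readings and a tiny template library. Not the device's actual design.
import numpy as np
import pyttsx3  # off-the-shelf text-to-speech

# Hypothetical reference templates: each sign is a flattened window of
# per-finger motion readings (e.g. accelerometer magnitudes over time).
SIGN_TEMPLATES = {
    "hello":     np.array([0.1, 0.8, 0.9, 0.2, 0.1, 0.0]),
    "thank you": np.array([0.7, 0.2, 0.1, 0.6, 0.8, 0.3]),
}

def classify_sign(window: np.ndarray) -> str:
    """Match a motion window to the closest known sign (1-nearest neighbour)."""
    return min(SIGN_TEMPLATES, key=lambda s: np.linalg.norm(window - SIGN_TEMPLATES[s]))

def speak(text: str) -> None:
    """Emit the recognised sign as audio through the speaker."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

# Simulated sensor window standing in for real ring/bracelet readings.
reading = np.array([0.15, 0.75, 0.85, 0.25, 0.05, 0.1])
speak(classify_sign(reading))
```

A production system would replace the template matcher with a model trained on recorded signing, but the shape of the loop stays the same.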

Layer has developed a smart textile, currently in the prototype stage, for use in Airbus' economy class seating. Called Move, the system would allow passengers to monitor and control their seat conditions using their phone. Digitally knitted from a polyester wool blend with an integrated conductive yarn, the smart seat cover is connected to a series of sensors that detect both the passenger's body and the conditions of the chair, including temperature, seat tension, pressure and movement. The Move app analyses the data collected by the sensors and sends targeted messages telling the passenger how to improve their comfort. During the flight, the seat also adjusts itself automatically to the passenger's weight, size and movement by passing a current through the conductive yarn to change the seat tension.
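As described, Move is a closed loop: sensors in the knit report conditions, the app turns the readings into advice, and the seat actuates by changing yarn tension. The sketch below mocks that loop in Python; the reading fields, thresholds and tension rule are assumptions for illustration, not Layer's actual logic.

```python
from dataclasses import dataclass

@dataclass
class SeatReading:
    temperature_c: float   # seat surface temperature
    pressure_kpa: float    # load under the passenger
    tension: float         # current knit tension, 0 (slack) to 1 (taut)
    movement: float        # recent movement level, 0 (still) to 1 (restless)

def advise(reading: SeatReading) -> str:
    """Turn sensor data into the kind of targeted comfort message the app sends."""
    if reading.movement > 0.7:
        return "You seem restless: try softening the seat or stretching."
    if reading.temperature_c > 30:
        return "Your seat is warm: lowering the surface temperature."
    return "Seat conditions look comfortable."

def auto_adjust(reading: SeatReading) -> float:
    """Pick a new tension target; the real seat would realise this by passing
    a current through the conductive yarn."""
    # Heavier load or more movement -> slacken the knit for pressure relief.
    return max(0.0, reading.tension - 0.1 * (reading.pressure_kpa / 50 + reading.movement))

reading = SeatReading(temperature_c=31.0, pressure_kpa=40.0, tension=0.6, movement=0.8)
print(advise(reading))
print(f"new tension target: {auto_adjust(reading):.2f}")
```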

The Massachusetts Institute of Technology (MIT) has recently developed an inexpensive sensor glove designed to help artificial intelligence figure out how humans identify objects by touch. Called the Scalable TActile Glove (STAG), it uses 550 tiny pressure sensors to generate patterns that can inform the design of improved robotic manipulators and prosthetic hands. The project is ambitious: the researchers intend to replicate the human ability to work out what an object is by touch alone. Using the STAG glove's pressure sensors, MIT is gathering as much touch information as possible to build a database large enough to sustain a machine learning process, one that could yield a system able to deduce not only how a human hand identifies an object, but also how it estimates the object's weight, something robots and prosthetic limbs have trouble doing today.
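To see how a pressure-sensor array can drive both identification and weight estimation, here is a toy Python version of the idea: treat each grasp as a 550-value pressure frame and fit one model for the object's identity and another for its weight. The synthetic data and off-the-shelf models below are stand-ins for illustration; MIT's actual dataset and network differ.

```python
# Toy STAG-style pipeline: classify the object and regress its weight from a
# frame of 550 pressure readings. Data and models here are fabricated examples.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(0)
N_SENSORS = 550  # sensor count reported for the STAG glove

# Fabricated pressure "signatures": each object class gets its own mean map.
objects = ["mug", "ball", "scissors"]
X, y_label, y_weight = [], [], []
for i, (name, weight_g) in enumerate(zip(objects, [310.0, 57.0, 85.0])):
    base = rng.random(N_SENSORS)
    for _ in range(50):  # 50 noisy grasps per object
        X.append(base + 0.1 * rng.standard_normal(N_SENSORS))
        y_label.append(i)
        y_weight.append(weight_g + rng.normal(0, 5))
X = np.array(X)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y_label)
reg = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y_weight)

grasp = X[0] + 0.05 * rng.standard_normal(N_SENSORS)  # a new, unseen grasp
print("object:", objects[clf.predict([grasp])[0]])
print(f"estimated weight: {reg.predict([grasp])[0]:.0f} g")
```

The point of the exercise is that identity and weight are learned from the same pressure frames, which is exactly why a large touch database is the bottleneck the MIT team is working on.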

The newsletter is from the Technical Committee on Ergonomics in Design for All (EinDfa). Thanks to Isabella Tiziana Steffan for this content.

 
