FRAMOS’ glasses use algorithm-driven text and object recognition to translate visual impressions into haptic and audio information.
While the audio information relies on object and character recognition, the haptic feedback is delivered by a wristband fitted with vibration motors.
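FRAMOS has not disclosed which recognition or speech engines the glasses use, but the text-to-audio step can be illustrated with a minimal sketch, assuming a camera frame is already available as an OpenCV image and using pytesseract and pyttsx3 as stand-in OCR and text-to-speech libraries.

```python
# Hypothetical sketch of the character-recognition-to-audio step.
# pytesseract and pyttsx3 are assumed stand-ins, not FRAMOS' actual stack.
import cv2
import pytesseract
import pyttsx3

def speak_text_in_frame(frame):
    """Run OCR on a camera frame and read any recognized text aloud."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)    # OCR works best on grayscale
    text = pytesseract.image_to_string(gray).strip()  # character recognition
    if text:
        engine = pyttsx3.init()                       # local text-to-speech engine
        engine.say(text)                              # queue the recognized text
        engine.runAndWait()                           # play it through the speakers
    return text
```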
The prototype includes an Intel RealSense 3D camera and speakers for audio feedback. The setup is controlled by a processing hub with a GPS sensor and an LTE module for mobile data connection. Connected via Bluetooth, a micro-processing unit is said to translate visual data into haptic feedback through a 2D array of vibration motors. The location and movement of the vibration pattern on the arm tell the user the position and distance of objects in their surroundings.
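How depth data might be mapped onto the motor array can be sketched roughly as below: each cell of the grid covers one patch of the camera’s field of view, and closer objects drive stronger vibration at the corresponding motor. The 4x4 grid size and 2 m range are assumptions for illustration, not FRAMOS specifications.

```python
# Rough sketch: map a depth frame (metres) onto a 2D grid of vibration motors.
# Grid size and range are assumed values, not taken from the FRAMOS prototype.
import numpy as np

def depth_to_vibration(depth_m, grid_rows=4, grid_cols=4, max_range_m=2.0):
    """Map a depth image to per-motor vibration intensities in [0, 1]."""
    h, w = depth_m.shape
    intensities = np.zeros((grid_rows, grid_cols))
    for r in range(grid_rows):
        for c in range(grid_cols):
            # Each grid cell corresponds to one patch of the field of view.
            patch = depth_m[r * h // grid_rows:(r + 1) * h // grid_rows,
                            c * w // grid_cols:(c + 1) * w // grid_cols]
            valid = patch[patch > 0]          # zero means no depth reading
            if valid.size:
                nearest = valid.min()         # closest object in the patch
                # Nearer objects -> stronger vibration; beyond range -> off.
                intensities[r, c] = max(0.0, 1.0 - nearest / max_range_m)
    return intensities

# Example: a synthetic 480x640 depth frame with an obstacle 0.5 m away on the left.
frame = np.full((480, 640), 3.0)
frame[:, :160] = 0.5
print(depth_to_vibration(frame))
```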
The wearable has a voice-controlled interface, which FRAMOS hopes will make interaction easier, and is fitted with rechargeable batteries.