A wearable tech aid enabling those with sight loss to read faces and experience non-verbal communication in a new, yet intuitive way.
Around 83% of our understanding of the world is taken in visually. Without access to this information, an individual's ability to understand or participate in a situation can be heavily reduced. Non-verbal cues are a prime example: they provide significant insight and feedback in any social interaction, while also playing a pivotal role in building and developing friendships and relationships. A lack of awareness of these cues can be a great source of anxiety and can aggravate feelings of isolation.
Cue Sense aims to provide real-time feedback on non-verbal cues in social settings. Since its conception, Cue Glasses has won multiple competitions at AKQA, Deloitte and Bath Innovation Centre. It is now being developed in collaboration with the Design Council.
My role: Concept / design / engineering / co-founder and CTO of Cue Sense Ltd.
Visit cueglasses.com for more information.
Research & Development
With the kind support of BucksVision Charity, we have been able to meet with a number of visually impaired people and conduct interviews and preliminary experiments. They confirmed how much they struggle in social settings and were excited about a potential solution to this problem. We have realised that CUE should focus on identifying faces, detecting eye contact and recognising facial expressions.
The prototype (still under development) translates facial expressions into sounds which are played through bone-conducting headphones. When a known person is recognised, the user is told their name.
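The expression-to-sound translation described above can be sketched as a simple mapping layer between the vision pipeline and the audio output. This is a minimal illustration only: the expression labels, tone frequencies, and function names are hypothetical placeholders, not the prototype's actual implementation.

```python
from typing import Optional

# Hypothetical mapping from a detected facial expression to an audio cue,
# expressed here as a tone frequency (Hz) to be played through the
# bone-conducting headphones. The labels and frequencies are illustrative.
EXPRESSION_TONES = {
    "smile": 880,      # bright, high tone
    "frown": 220,      # low tone
    "surprise": 1320,  # sharp, very high tone
    "neutral": 440,    # reference tone
}

def feedback_for(expression: str, name: Optional[str] = None) -> dict:
    """Translate one detection event into an output instruction.

    Returns a dict describing what to play: a tone for the detected
    expression, plus spoken text when the person has been recognised
    (the user is told their name).
    """
    event = {"tone_hz": EXPRESSION_TONES.get(expression, EXPRESSION_TONES["neutral"])}
    if name is not None:
        event["speech"] = name
    return event
```

For example, a recognised friend who is smiling would produce both a high tone and their spoken name, while an unrecognised neutral face would produce only the reference tone.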
Future Academy 2016
Best Idea Winner.
Business Plan Competition