
NUS Computing’s Breakthrough Wearable Assistive Device Empowers Visually Impaired Individuals

In this post:

  • AiSee, a wearable AI device, helps visually impaired people recognize objects and delivers the results through a discreet bone conduction headphone.
  • It works by capturing images and answering user queries through advanced AI technology.
  • The device aims to promote independence and inclusivity for visually impaired individuals, with ongoing efforts to make it affordable and accessible.

In a groundbreaking development for assistive technology, researchers from the National University of Singapore’s School of Computing (NUS Computing) have introduced AiSee, an innovative wearable device designed to assist visually impaired individuals. With the power of artificial intelligence (AI), AiSee aims to enhance the daily lives of those facing the challenges of visual impairment.

For visually impaired people, performing everyday tasks like grocery shopping can be a daunting experience. Recognizing and identifying objects is crucial for making informed decisions; this is where AiSee steps in. Developed progressively over five years, AiSee offers a novel solution to this issue by harnessing state-of-the-art AI technologies.

Lead researcher of Project AiSee, Associate Professor Suranga Nanayakkara, from the Department of Information Systems and Analytics at NUS Computing, emphasized the importance of a user-centric approach in developing AiSee. Unlike traditional approaches that involve glasses augmented with a camera, AiSee takes an alternative route. The device incorporates a discreet bone conduction headphone, eliminating concerns about stigmatization associated with wearing glasses.

AiSee’s operation is simple and intuitive. Users only need to hold an object and activate the built-in camera to capture an image. With the assistance of AI, AiSee identifies the object and provides additional information when queried by the user.

Three key components

AiSee comprises three fundamental components:

The eye: Vision engine computer software

AiSee incorporates a micro-camera that captures the user’s field of view. Its software component, referred to as the ‘vision engine computer,’ extracts features such as text, logos, and labels from the captured images for processing.

The brain: AI-powered image processing unit and interactive Q&A system

After taking a photo of the object of interest, AiSee utilizes advanced cloud-based AI algorithms to process and analyze the images for object identification. Users can also pose various questions about the object. AiSee excels in interactive question-and-answer exchanges thanks to its powerful language model.

The speaker: Bone conduction sound system

AiSee’s headphone employs bone conduction technology, allowing sound transmission through the skull bones. This ensures visually impaired individuals receive auditory information while remaining aware of external sounds, such as conversations or traffic noise. Environmental sounds are essential for decision-making, especially in safety-critical situations.

Unlike many wearable assistive devices requiring smartphone pairing, AiSee is a self-contained system operating independently without additional devices.
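
To make the three-stage flow above concrete, here is a minimal, hypothetical Python sketch of how an AiSee-style pipeline could be wired together. The function names, stub responses, and overall structure are illustrative assumptions for this article, not NUS Computing’s actual implementation; in the real device, the vision and language-model steps run on cloud-based AI services.

```python
"""Hypothetical sketch of an AiSee-style pipeline (not the NUS implementation).

The three stages mirror the components described above:
  - the "eye":     capture an image and extract text, logos, and labels
  - the "brain":   identify the object and answer the user's question
  - the "speaker": read the answer aloud (bone conduction on the real device)
"""

from dataclasses import dataclass
from typing import List


@dataclass
class SceneFeatures:
    """Features the vision engine might extract from a captured image."""
    text: List[str]    # OCR'd text, e.g. words printed on a product label
    labels: List[str]  # detected object labels, e.g. "canned food"


def capture_image() -> bytes:
    """Stand-in for the micro-camera capture; returns raw image bytes."""
    return b"<jpeg bytes from the wearable camera>"


def run_vision_engine(image: bytes) -> SceneFeatures:
    """Stand-in for the cloud vision step (OCR, logo and label detection)."""
    return SceneFeatures(text=["Tomato Soup", "400 g"], labels=["canned food"])


def answer_question(features: SceneFeatures, question: str) -> str:
    """Stand-in for the language model behind the interactive Q&A.

    A real system would send the extracted features together with the
    user's spoken question to a large language model and return its reply.
    """
    context = ", ".join(features.text + features.labels)
    return f"It looks like a 400 g can of tomato soup (detected: {context})."


def speak(text: str) -> None:
    """Stand-in for text-to-speech played over the bone conduction headphone."""
    print(f"[audio] {text}")


if __name__ == "__main__":
    image = capture_image()                                   # the eye
    features = run_vision_engine(image)                       # the eye
    reply = answer_question(features, "What am I holding?")   # the brain
    speak(reply)                                              # the speaker
```

Since the article notes the device is self-contained, the orchestration sketched here would run on the wearable itself, with only the heavy vision and language-model inference offloaded to the cloud.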

Empowering the visually impaired

AiSee’s potential impact on the visually impaired community is significant. Individuals with visual impairment in Singapore currently lack access to assistive AI technology of this caliber, and AiSee has the potential to empower them to independently perform tasks that would typically require assistance. Ongoing efforts to make AiSee more affordable and accessible include ergonomic design enhancements and a faster processing unit.

Mark Myres, a visually impaired NUS student who tested AiSee, praised its inclusivity. He highlighted that AiSee strikes a balance, benefiting both visually impaired and blind individuals. The device’s versatility opens up new possibilities for a wide range of users.

Professor Suranga Nanayakkara’s team is currently collaborating with SG Enable in Singapore to conduct user testing with visually impaired individuals. The insights gained from this testing will help fine-tune AiSee’s features and performance.
