A series of workshops co-designed with target users, rapidly producing prototypes through 3D printing, deep learning models, UI design, and other methods
Sept 2023 - Apr 2024
--
User Experience Researcher & Designer
This project presents an assistive technology prototype designed to enhance the navigation experience of visually impaired (VI) individuals. Integrating computer vision with an ultrasonic sensor, the system detects and recognises a variety of objects in both indoor and outdoor environments.
Most existing convolutional neural network (CNN) models for road surface semantic segmentation are tailored to the perspective of cars, making them unsuitable for the walking-height viewpoint of visually impaired pedestrians.
To address this, a custom dataset was created specifically for visually impaired wayfinding, focusing on roadblocks, pedestrians, and other common obstacles. This dataset was then used to train a customised YOLOv8 model.
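As a rough illustration, fine-tuning with the Ultralytics YOLOv8 API could look like the sketch below; the dataset configuration name (`wayfinding.yaml`), class labels, and hyperparameters are assumptions for illustration, not the project's actual settings.

```python
from ultralytics import YOLO

# Start from a pretrained YOLOv8 nano checkpoint (small enough for rapid prototyping)
model = YOLO("yolov8n.pt")

# Fine-tune on the custom walking-view wayfinding dataset.
# "wayfinding.yaml" is a hypothetical dataset config listing image paths
# and classes (e.g. roadblock, pedestrian, bollard).
model.train(data="wayfinding.yaml", epochs=100, imgsz=640)

# Run inference on a sample walking-view image and inspect the detections
results = model("walkway_sample.jpg")
for box in results[0].boxes:
    print(model.names[int(box.cls)], float(box.conf))
```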
At the hardware level, the system comprises a microcontroller with an embedded camera (an ESP32 with an OV2640), an ultrasonic sensor (HC-SR04), Bluetooth bone conduction headphones, and an external battery.
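The snippet below is a minimal MicroPython sketch of how the ultrasonic-ranging step could be read on the ESP32; the GPIO pin assignments and distance threshold are illustrative assumptions, not the prototype's actual wiring.

```python
# MicroPython sketch for reading an HC-SR04 on an ESP32.
# Pin numbers are illustrative assumptions, not the prototype's wiring.
from machine import Pin, time_pulse_us
import time

trig = Pin(5, Pin.OUT)
echo = Pin(18, Pin.IN)

def read_distance_cm():
    # A 10 microsecond pulse on TRIG starts one measurement cycle
    trig.off()
    time.sleep_us(2)
    trig.on()
    time.sleep_us(10)
    trig.off()
    # ECHO stays high for a time proportional to the round-trip distance
    duration = time_pulse_us(echo, 1, 30000)  # timeout ~30 ms (~5 m range)
    if duration < 0:
        return None  # timed out or no echo received
    return duration / 58  # convert microseconds to centimetres

while True:
    distance = read_distance_cm()
    if distance is not None and distance < 100:
        print("Obstacle within", round(distance), "cm")  # would trigger an audio cue
    time.sleep(0.2)
```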
The prototype is low-cost and easy to wear, built from inexpensive off-the-shelf parts rather than specialised components.
Users can customise the experience to their individual preferences.
To enhance usability for visually impaired users, the interface adopts strategies such as enlarged fonts, prominent icons, and image-based buttons. All text components in the app are accompanied by voice prompts, ensuring the content remains accessible.
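As a conceptual sketch of pairing on-screen text with speech, the example below uses the pyttsx3 text-to-speech library to read a prompt aloud; the library choice and prompt wording are assumptions for illustration, not the app's actual implementation.

```python
import pyttsx3

# Hypothetical helper: read any on-screen text or detection result aloud.
engine = pyttsx3.init()
engine.setProperty("rate", 150)  # slightly slower speech for clarity

def announce(text: str) -> None:
    """Speak the given text, mirroring the content shown on screen."""
    engine.say(text)
    engine.runAndWait()

# Example: voice prompt accompanying a detection result shown in the UI
announce("Pedestrian ahead, about two metres away.")
```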