Project V.I.E.W

Capstone 2017


Humans primarily use their sense of sight for spatial awareness and navigation, but cannot readily employ their other senses as guidance systems. This report details the design of a wearable feedback device that makes visionless navigation possible. It has potential uses in low-light environments where artificial light is not an option, in situations where 360° awareness is important (such as riding a bike on a busy road), and in applications where a user is piloting a system remotely. Visually impaired persons could benefit from an affordable device that lets them sense their surroundings in greater detail than other solutions on the market today.

The feedback platform developed during this research has shown the ability to assist individuals with recognizing and avoiding walls. This has been tested in a virtual maze with a joystick controller and a virtual reality headset. In a majority of cases, users were able to successfully navigate an unknown maze without vision, using the wearable as their primary method for navigation.



The Unity game engine enabled the rendering of a 3D environment that could be navigated with either a controller or a full VR system (HTC VIVE). Signals were routed by Uniduino to the Arduino RF dongle over USB.

Each player's goal was first to navigate the maze with vision, then to navigate a randomly assigned maze blindfolded (visionless) using the VIEW system.


Altium was used to design the PCB for production. To view the schematic and board design, follow this link. We fabricated the PCB in China using PCBWAY, and assembly was done by hand back in Boston.

The system communicated wirelessly using the NRF24L01 module and was powered by a LiPo battery with voltage and charge protection.


OnShape was used to design the enclosure. To view the enclosure, follow this link.

Design considerations were as follows:

  • Snap-fit PCB
  • Space for the LiPo battery and connector
  • External power port
  • Programming port
  • Snap-fit enclosure top
  • Belt loop on the back of the enclosure

This enclosure was manufactured by MakeXYZ from a .step file.


Blindfolded users experienced drastic improvements navigating through the course with haptic feedback. While navigating through a two-dimensional space, users had difficulty monitoring their speed, so they were given audio feedback when moving. Subjects with more video game experience completed the two-dimensional tests with less difficulty than those who did not play games; however, across all subjects, the feedback system reduced collisions in the three-dimensional mazes, and subjects navigated the three-dimensional tests more comfortably. In the virtual test setting, users had a much stronger sense of speed and spatial awareness when walking in real life than when using a controller.


Real-world user navigating the maze with the VIEW system (without vision).

In-game footage of the same user navigating the maze (without vision).


1st Place

2017 Capstone Competition

Northeastern University

Competing against 12 other teams in a strong field, the VIEW team took first place overall.

Link to formal website


Augmented Cognition Laboratory

Northeastern University

A special thank you to Dr. Sarah Ostadabbas from NEU's ACLab. Without her mentorship, financial support, and institutional backing, this project would not have been possible.

Learn more about the ACLab