
AR for VIPs

Augmented Reality for Visually Impaired People uses augmented reality hardware (the Microsoft HoloLens) together with a combination of spatial audio cues and speech output to deliver information about a user's surroundings. The system emits spatial audio pings so users can tell where walls and obstacles are, and it reads out any text detected in the environment. Our team conducted a user study with 7 blind users and found promising results.
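The core idea behind the spatial audio pings can be illustrated with a small sketch. This is not the project's actual implementation (which runs in Unity on the HoloLens); it is a simplified Python illustration, with the function name, ranges, and falloff model all assumptions: an obstacle's distance controls ping volume, and its horizontal bearing relative to the listener controls left/right pan.

```python
import math

def ping_parameters(listener_pos, obstacle_pos, max_range=5.0):
    """Map an obstacle's 2D position (x, z) relative to the listener to
    simple spatial-audio ping parameters: volume falls off linearly with
    distance, and pan in [-1, 1] follows the horizontal bearing.
    Illustrative sketch only; real spatial audio uses HRTF rendering."""
    dx = obstacle_pos[0] - listener_pos[0]
    dz = obstacle_pos[1] - listener_pos[1]
    distance = math.hypot(dx, dz)
    # Closer obstacles ping louder; silent beyond max_range.
    volume = max(0.0, 1.0 - distance / max_range)
    # Bearing in radians from straight ahead; clamp pan to [-1, 1].
    bearing = math.atan2(dx, dz)
    pan = max(-1.0, min(1.0, bearing / (math.pi / 2)))
    return {"volume": volume, "pan": pan}
```

An obstacle one meter straight ahead would ping at 80% volume with centered pan, while one directly to the right would pan fully right.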

I worked on AR for VIPs during my semester abroad at UC Berkeley as part of the Extended Reality @ Berkeley group. My main responsibilities were object sonification, which involved automatically breaking down the spatial mesh generated by the HoloLens to distinguish individual obstacles, and text recognition. The project was presented at the Microsoft Reactor in San Francisco, and I later presented it as a poster at India HCI 2019. A demo video and a link to the project site can be found below. You can also check out my poster for India HCI here:
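One way to break a spatial mesh into obstacle candidates is to classify each triangle by the orientation of its surface normal, separating floor and ceiling from walls from everything else. The sketch below is an assumption about how such a step could look, not the project's actual pipeline; the function name and tolerance thresholds are illustrative.

```python
def classify_triangle(normal, up=(0.0, 1.0, 0.0), wall_tol=0.2, floor_tol=0.8):
    """Classify a spatial-mesh triangle by its unit surface normal:
    normals nearly parallel to 'up' belong to floor/ceiling, normals
    nearly perpendicular to 'up' belong to walls, and the rest are
    flagged as potential obstacles. Illustrative sketch only."""
    # Absolute dot product with the up vector measures verticality.
    dot = abs(sum(n * u for n, u in zip(normal, up)))
    if dot >= floor_tol:
        return "floor_or_ceiling"
    if dot <= wall_tol:
        return "wall"
    return "obstacle"
```

Triangles flagged as obstacles could then be grouped into connected components, each of which becomes a sonification target.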
