Project Echo: Auditory Augmented Reality
Project Echo is an auditory augmented reality project that tracks the user's spatial position and renders sound relative to where they are standing. Much like visual augmented reality, the audio produced can be anchored to a real-world position, making user experiences more immersive through augmented sound.
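The idea of rendering sound relative to the listener can be sketched with a small helper that computes a stereo pan and distance gain for a virtual source fixed in real-world space. This is a hypothetical illustration, not code from the project; the function name and the constant-power panning law are assumptions.

```python
import math

def render_cue(listener_xy, source_xy, ref_dist=1.0):
    """Stereo pan and gain for a virtual sound source anchored at a
    fixed real-world position (hypothetical sketch, not project code)."""
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    dist = math.hypot(dx, dy)
    azimuth = math.atan2(dx, dy)  # 0 rad = source straight ahead
    # Constant-power pan: map azimuth in [-pi/2, pi/2] to left/right gains.
    pan = max(-1.0, min(1.0, azimuth / (math.pi / 2)))
    left = math.cos((pan + 1) * math.pi / 4)
    right = math.sin((pan + 1) * math.pi / 4)
    # Inverse-distance attenuation keeps the source anchored in space:
    # as the tracked listener walks closer, the cue gets louder.
    gain = ref_dist / max(dist, ref_dist)
    return left * gain, right * gain
```

Because the gains are recomputed from the tracked position every frame, the source appears to stay put while the user moves, which is the core of the augmented-sound effect.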
A Microsoft Kinect camera captures the user's depth information, which is then fed into a digital 3D engine that renders sound within the digital space. The sound is translated back into the real world using the Echo speaker system, which provides environmental depth cues to the user, adapting its aim to create the ideal augmented audio environment.
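The depth-to-aim step of that pipeline might look like the following sketch: back-project one Kinect depth pixel to a 3D point, then derive pan/tilt angles for a speaker mounted at the camera origin. The intrinsics and both function names are assumptions for illustration, not values from the project.

```python
import math

# Hypothetical Kinect v1 depth-camera intrinsics (assumed values).
FX, FY = 594.2, 591.0      # focal lengths in pixels
CX, CY = 320.0, 240.0      # principal point

def depth_pixel_to_point(u, v, depth_mm):
    """Back-project one depth pixel to a 3D point in metres (camera frame)."""
    z = depth_mm / 1000.0
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return x, y, z

def speaker_aim(user_xyz):
    """Pan/tilt angles (degrees) to aim a speaker co-located with the
    camera toward the tracked user -- a sketch of the aiming step."""
    x, y, z = user_xyz
    pan = math.degrees(math.atan2(x, z))
    tilt = math.degrees(math.atan2(-y, math.hypot(x, z)))
    return pan, tilt
```

A user standing on the optical axis two metres away would yield pan and tilt of zero, i.e. the speaker fires straight ahead.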
This project explores the initial stages of developing a system that could eventually enable the visually impaired to navigate the world using augmented sound. The concept of sensory substitution allows the brain to interpret sound information as visual information, much like echolocation. In the future, this system could be used in public spaces or throughout the home to provide audiovisual cues, creating an immersive, co-operative multimedia user experience.
The speaker system is designed to sense the user, aiming and adapting sound to the user's position. This augmented audio approach could also be adapted to existing surround sound setups.
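One way such an adaptation to a fixed surround setup could work is pairwise amplitude panning: place the virtual source between the two nearest speakers of a ring. This is a simplified VBAP-style sketch under assumed speaker angles; nothing here comes from the project itself.

```python
import math

def surround_gains(speaker_angles_deg, source_angle_deg):
    """Distribute a virtual source across a ring of speakers by
    constant-power panning between the two adjacent speakers
    (simplified VBAP-style sketch; angles are assumptions)."""
    angles = sorted(a % 360 for a in speaker_angles_deg)
    src = source_angle_deg % 360
    n = len(angles)
    gains = [0.0] * n
    for i in range(n):
        a, b = angles[i], angles[(i + 1) % n]
        span = (b - a) % 360 or 360
        offset = (src - a) % 360
        if offset < span:
            t = offset / span
            # Constant-power crossfade between the adjacent pair.
            gains[i] = math.cos(t * math.pi / 2)
            gains[(i + 1) % n] = math.sin(t * math.pi / 2)
            break
    return dict(zip(angles, gains))
```

Feeding the tracked user position into `source_angle_deg` each frame would let an ordinary four- or five-speaker setup steer sound around the listener without any moving hardware.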