Neuro.Nav:
Perception-Driven Navigation


This project aims to digitize perception in navigation - using cognitive maps and environmental perception as the means of navigating virtual and real environments.

By tangibly quantifying extended-cognition phenomena with currently affordable hardware, we can digitally model perceptual tasks and processes for a range of practical use cases.

We programmed around the discrepancy between physical and cognitive stimuli - the intentionality of attentional focus.  Users can stay focused on a particular target while freely looking around, because focus is decoded from the cognitive response to the target in peripheral vision - something eye tracking cannot capture.
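
A minimal sketch of how this can be checked in Unity with the NextMind SDK (event names per the NextMind Unity SDK as we used it; the 0.8 confidence and 20-degree thresholds here are illustrative, not our tuned values):

  using UnityEngine;
  using NextMind.NeuroTags;

  // Detects covert (peripheral) attention: the NeuroTag decodes SSVEP focus
  // from visual-cortex activity, so it can fire even when the headset's
  // forward vector points well away from the target.
  public class CovertAttentionProbe : MonoBehaviour
  {
      [SerializeField] private NeuroTag targetTag;  // NeuroTag on the target
      [SerializeField] private Transform head;      // Quest 2 center-eye anchor

      private void OnEnable()  => targetTag.onConfidenceChanged.AddListener(OnConfidence);
      private void OnDisable() => targetTag.onConfidenceChanged.RemoveListener(OnConfidence);

      private void OnConfidence(float confidence)
      {
          Vector3 toTarget = (targetTag.transform.position - head.position).normalized;
          float offAxis = Vector3.Angle(head.forward, toTarget);

          // High decoding confidence while the target sits far off the visual
          // axis means the user is attending through peripheral vision.
          if (confidence > 0.8f && offAxis > 20f)
              Debug.Log($"Peripheral focus on {targetTag.name}, {offAxis:F0} deg off-axis");
      }
  }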

To build our Perception-Driven Navigation prototype, we treated space as a canvas for quantifying perception.  Our steps were:

  1. Map the 3D environment in real time
  2. Detect 3D objects in the mapped scene
  3. Identify nodal objects and apply an SSVEP NeuroTag to each (sketched below)
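
A sketch of step 3, assuming a detection step that hands us candidate GameObjects (attaching a NeuroTag at runtime is our assumption here; the SDK also supports configuring tags in the editor):

  using UnityEngine;
  using NextMind.NeuroTags;

  // Turns detected objects into selectable navigation nodes by ensuring each
  // carries a NeuroTag, so the SSVEP stimulus can be decoded when the user
  // focuses on it.
  public class NodalTagger : MonoBehaviour
  {
      // Called by our (hypothetical) detection step for each candidate object.
      public NeuroTag TagAsNode(GameObject detected)
      {
          var tag = detected.GetComponent<NeuroTag>();
          if (tag == null)
              tag = detected.AddComponent<NeuroTag>();  // runtime setup: assumed
          return tag;
      }
  }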

Using NeuroTags gave us three values with which to quantify perception:
  1. Triggered focus on an object
  2. Maintained focus on an object
  3. Confidence change in that focus

We then mapped these three values to the user's initial direction, acceleration (staying in motion), and speed, respectively.
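
A condensed sketch of that mapping on an XR rig (event names per the NextMind Unity SDK; the acceleration and speed constants are illustrative, not our tuned values):

  using UnityEngine;
  using NextMind.NeuroTags;

  // Perception-to-motion mapping:
  //   onTriggered             -> set the initial heading toward the focused node
  //   onMaintained/onReleased -> accelerate while focus is held, coast otherwise
  //   onConfidenceChanged     -> scale the effective speed
  public class PerceptionLocomotion : MonoBehaviour
  {
      [SerializeField] private NeuroTag focusTag;
      [SerializeField] private Transform rig;            // XR rig to move
      [SerializeField] private float acceleration = 1.5f;
      [SerializeField] private float maxSpeed = 3f;

      private Vector3 heading;
      private float speed;
      private float confidence;
      private bool focusHeld;

      private void OnEnable()
      {
          focusTag.onTriggered.AddListener(OnTriggered);
          focusTag.onMaintained.AddListener(OnMaintained);
          focusTag.onReleased.AddListener(OnReleased);
          focusTag.onConfidenceChanged.AddListener(OnConfidence);
      }

      private void OnTriggered()  { heading = (focusTag.transform.position - rig.position).normalized; }
      private void OnMaintained() { focusHeld = true; }
      private void OnReleased()   { focusHeld = false; }
      private void OnConfidence(float c) { confidence = c; }

      private void Update()
      {
          // Maintained focus keeps the user in motion; losing it coasts to a stop.
          speed = focusHeld
              ? Mathf.Min(speed + acceleration * Time.deltaTime, maxSpeed)
              : Mathf.MoveTowards(speed, 0f, acceleration * Time.deltaTime);

          // Confidence modulates how fast we actually travel.
          rig.position += heading * (speed * confidence) * Time.deltaTime;
      }
  }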

GitHub link: TBA



Duration:  September 2022 - December 2022 (3 months)
Role:  Prototyper
Team:  Kenny Kim, Vishal Vaidhyanathan
Tools Used:  Unity, C#, NextMind, Oculus Quest 2
Category:  MIT Fluid Interfaces | MAS.S61 - Extended Cognition
Instructors: Caitlin Morris, Abhinandan Jain, Pattie Maes









