V/Recall was created as part of the MIT Reality Hack. My role: Lead Developer.


Background


Alzheimer's disease affects the parts of the brain that control thought, memory, and language, often resulting in impaired recall, thinking, communication, and behavior. By 2050, the number of Americans over the age of 65 with Alzheimer's is projected to grow to 12.7 million. VR has the potential to be an extremely valuable tool in this field of research.

So, we created V/Recall! Changes in short-term memory – forgetting words, names, or familiar tasks – are among the first and most common signs of Alzheimer's. Our research-by-design VR solution has the potential to improve cognition through sensory and spatial exploration in a fun and immersive way.

Inspiration


Memory is essential to all of our lives. Remembering the intricate and unique experiences we share with the people around us is what makes life special. With a team member’s grandparent affected by Alzheimer's, we set out to create a solution that prioritizes cognitive stimulation as a means of slowing the progression of memory loss, using the sensory-immersion capabilities of virtual reality.


Virtual Reality Experience


The virtual reality experience of V/Recall combines physical embodiment, spatial audio, and haptic feedback, using a head-mounted display and hand controllers to simulate the movements of completing basic tasks around the household. Users are instructed to find, hold, and place objects around various scenes while ambient music plays in the background. Sensory-motor coupling is provided through controller vibration on touch. We aimed to stimulate not only the visual sense but also the physical and auditory senses in order to fully embody the user inside the simulation.
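As a rough illustration of how this kind of touch-to-vibration coupling can be wired up in Unity, the sketch below pulses the grabbing controller whenever an object is picked up. It assumes the XR Interaction Toolkit 2.x event API; the class and field names are illustrative, not the project's actual scripts.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative sketch: vibrate the controller whenever the user grabs this object,
// coupling the sense of touch with a haptic pulse.
[RequireComponent(typeof(XRGrabInteractable))]
public class GrabHaptics : MonoBehaviour
{
    [SerializeField] float amplitude = 0.5f; // vibration strength (0..1)
    [SerializeField] float duration = 0.1f;  // pulse length in seconds

    XRGrabInteractable interactable;

    void OnEnable()
    {
        interactable = GetComponent<XRGrabInteractable>();
        interactable.selectEntered.AddListener(OnGrabbed);
    }

    void OnDisable()
    {
        interactable.selectEntered.RemoveListener(OnGrabbed);
    }

    void OnGrabbed(SelectEnterEventArgs args)
    {
        // Find the controller behind the interactor that grabbed the object and pulse it.
        if (args.interactorObject is XRBaseControllerInteractor controllerInteractor)
        {
            controllerInteractor.xrController.SendHapticImpulse(amplitude, duration);
        }
    }
}
```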


Memory Recall


Following the simulation of household tasks, users are given a brief distractor task to clear their working memory. They are then prompted to answer questions about where and when they interacted with specific household items encountered in the VR simulation. If a user is unable to recall correctly, audio and visual prompts from the simulated scenes guide the memory-recall process: static images captured from the environments and replays of the ambient music heard during the task. By combining these sensory reminders with the procedural memory of having physically completed the associated actions, V/Recall is intended to improve the recall of episodic memories.
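One possible way to organize each recall question together with its escalating hints is a small data asset; the fields below mirror the flow described above (question, static scene image, ambient track), but the names are illustrative rather than taken from the actual project.

```csharp
using UnityEngine;

// Illustrative data asset: one recall question plus the sensory hints
// revealed when the user cannot answer correctly.
[CreateAssetMenu(menuName = "VRecall/RecallQuestion")]
public class RecallQuestion : ScriptableObject
{
    [TextArea] public string prompt;   // e.g. "Where did you place the kettle?"
    public string[] answerChoices;     // multiple-choice options shown to the user
    public int correctChoiceIndex;     // index of the correct option
    public Sprite sceneSnapshot;       // visual prompt: static image from the environment
    public AudioClip ambientTrack;     // audio prompt: music heard during the task
}
```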


How we built it


The virtual reality simulation of V/Recall was built in Unity and is composed of multiple scenes containing the rooms the user can interact with during the simulation. Unity's XR Interaction Toolkit provided the grabbing, placing, and haptic interactions that engage the user with sensory stimulation. Interaction events in the environment are triggered sequentially as the user completes each scripted task. The UI elements were prototyped and created in Figma. All assets appearing in the game are free assets with appropriate usage licenses.
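The sequential triggering described above can be sketched roughly as follows; this component is a simplified stand-in for the project's actual task scripts, with hypothetical names.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Illustrative sequencer: each scripted task reports completion,
// and the next interaction event in the chain is activated.
public class TaskSequencer : MonoBehaviour
{
    [SerializeField] GameObject[] tasks;  // one object per scripted task, in order
    public UnityEvent allTasksCompleted;  // e.g. transition to the recall phase

    int current = -1;

    void Start() => Advance();

    // Called by a task when the user has found, held, and placed its object
    // (for example, from a socket interactor's Select Entered event).
    public void Advance()
    {
        if (current >= 0) tasks[current].SetActive(false);
        current++;
        if (current < tasks.Length)
            tasks[current].SetActive(true);
        else
            allTasksCompleted.Invoke();
    }
}
```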