
Constructing Your Memory Palace in the Metaverse

In recent years, the concept of the metaverse has gained significant attention and excitement, thanks in part to tech giants like Meta (formerly Facebook) spearheading its development. The metaverse is envisioned as an immersive digital space where individuals gather as avatars to engage in activities much like they would in the physical world. However, despite the promise, current renditions of the metaverse often fall short of capturing the human experience, lacking the appeal and depth of the real world. This gap has spurred researchers to explore new ways of making the metaverse resonate with users of diverse backgrounds and experiences.

In this blog post, we delve into the innovative research by Immanuel Koh and Ashley Chen, who present a groundbreaking framework for the metaverse using EEG and AI technologies to create a deeply personalized and meaningful digital experience.

The Metaverse and Its Current Limitations

The metaverse represents a thrilling vision of a digital realm where people can work, socialize, and explore imaginary spaces as avatars. However, the current state of the metaverse often lacks the “genius loci” or sense of place that makes physical spaces meaningful and memorable. Current platforms struggle to hold users’ interest once the novelty wears off and leave many feeling unengaged. This is where the research of Koh and Chen comes into play.

Building a Personalized Memory Palace

The core objective of Koh and Chen’s research is to create a metaverse that is deeply personalized and meaningful for each user. They propose a framework that combines wearable technology and AI to capture and express human memory and perception in the digital realm. This approach aims to give users unique, immersive virtual spaces constructed from their own lived experiences.

Methodology: EEG, Eye Tracking, and Photogrammetry

The research relies on three key technologies:

  1. EEG (Electroencephalogram) Technology: EEG technology is typically used in the medical field to monitor brain activity. It records electrical signals from the brain in real-time, offering insights into emotional states and cognitive processes. By analyzing EEG data, researchers can identify specific markers associated with emotions and even distinguish between positive and negative emotional processing.
  2. Eye Tracking: Eye tracking technology detects and records eye movements, including gaze patterns, fixation time, and points of interest. By examining how users interact with visual stimuli, researchers can gain insights into their perception and interest levels. This information, when combined with EEG data, provides a more comprehensive understanding of the user’s emotional response to stimuli (a brief data-alignment sketch follows this list).
  3. Photogrammetry: Photogrammetry is a technique that extracts three-dimensional (3D) information from two-dimensional (2D) photographs. This process involves capturing images from multiple angles and converting them into 3D models through specialized algorithms. Photogrammetry allows researchers to recreate physical spaces and objects in a digital format, making it an invaluable tool for creating immersive metaverse environments.
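
Combining the first two of these streams is largely a matter of aligning them in time. The snippet below is a minimal, purely illustrative sketch (not from the paper) that pairs each eye-tracking fixation with the most recent EEG alpha-power estimate using pandas; all column names, units, and sampling rates are assumptions.

```python
# Illustrative sketch only: the paper does not publish code, so the column
# names, sampling rates, and device outputs below are assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Simulated EEG feature stream: one alpha-band power estimate per second.
eeg = pd.DataFrame({
    "timestamp": pd.to_datetime("2024-01-01") + pd.to_timedelta(np.arange(60), unit="s"),
    "alpha_power": rng.normal(5.0, 1.0, 60),          # hypothetical band power
})

# Simulated eye-tracking fixation events (onset time, duration, gaze position).
gaze = pd.DataFrame({
    "timestamp": pd.to_datetime("2024-01-01") + pd.to_timedelta(rng.integers(0, 60, 20), unit="s"),
    "fixation_ms": rng.integers(80, 600, 20),
    "gaze_x": rng.uniform(0, 1920, 20),
    "gaze_y": rng.uniform(0, 1080, 20),
}).sort_values("timestamp")

# Align each fixation with the nearest preceding EEG sample so that emotional
# state (alpha power) and visual attention (fixation) can be analyzed together.
fused = pd.merge_asof(gaze, eeg, on="timestamp", direction="backward")
print(fused.head())
```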

The Future of Wearable Technology

The success of Koh and Chen’s framework hinges on wearable technology becoming increasingly portable and widely adopted. Just as smartwatches have become everyday accessories, future wearable devices may capture and record users’ experiences, transforming the most significant of them into core memories. These wearables, equipped with EEG and eye-tracking capabilities, would continuously collect data on emotional responses, gaze patterns, and more.

This data would then be used to build a unique profile of each user’s perception and memories, which is integrated into the metaverse. As users interact with this digital world, their personal experiences and memories become integral to the environment, making it a dynamic and personalized space.
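
To make the idea of such a profile concrete, here is one hypothetical way the records might be structured in code. The schema, field names, and significance threshold are assumptions of ours; the authors do not publish a data model.

```python
# Hypothetical data structure for a per-user "perception profile"; every
# field and threshold here is an assumption, not the authors' design.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MemoryRecord:
    image_path: str          # photo or frame captured by the wearable
    alpha_power: float       # EEG alpha-band feature at capture time
    fixation_ms: float       # total fixation time on the scene
    significance: float      # combined emotional/attentional score in [0, 1]

@dataclass
class PerceptionProfile:
    user_id: str
    records: List[MemoryRecord] = field(default_factory=list)

    def core_memories(self, threshold: float = 0.7) -> List[MemoryRecord]:
        """Return only the experiences significant enough to seed the metaverse space."""
        return [r for r in self.records if r.significance >= threshold]

profile = PerceptionProfile("user-001")
profile.records.append(MemoryRecord("living_room.jpg", 5.8, 420.0, 0.82))
profile.records.append(MemoryRecord("hallway.jpg", 4.1, 95.0, 0.31))
print(len(profile.core_memories()))   # -> 1
```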

The Experimental Process

To validate their framework, Koh and Chen conducted an experiment using EEG and eye-tracking technology. They showed participants a series of images of interior spaces such as living rooms and bedrooms, resembling memories of home. During the experiment, EEG electrodes were placed on the participants’ foreheads and central scalp regions to measure the alpha frequency band (8-12 Hz), which is associated with emotional and cognitive processing. Eye-tracking technology captured gaze patterns and points of interest.
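
As a rough illustration of the alpha-band measurement described above, the following sketch estimates 8-12 Hz power from a synthetic single-channel signal using Welch’s method. The sampling rate, amplitudes, and channel layout are assumptions rather than the study’s actual settings.

```python
# Sketch of an alpha-band (8-12 Hz) power estimate on a synthetic signal;
# sampling rate, scaling, and electrode setup are assumptions.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 256                      # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)  # 10 seconds of data
rng = np.random.default_rng(1)

# Synthetic frontal-channel EEG: a 10 Hz alpha rhythm buried in noise.
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 10e-6 * rng.standard_normal(t.size)

# Welch power spectral density, then integrate over the 8-12 Hz alpha band.
freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
alpha_mask = (freqs >= 8) & (freqs <= 12)
alpha_power = trapezoid(psd[alpha_mask], freqs[alpha_mask])
print(f"alpha-band power: {alpha_power:.2e} V^2")
```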

The data collected from EEG and eye tracking was matched with the corresponding images and spectrograms. Two AI models, AI-1 and AI-2, were trained on this data: AI-1 focused on visual perception, while AI-2 aimed at the perceptual and subconscious aspects of the user’s mind.
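
The post does not detail the architectures of AI-1 and AI-2, so the sketch below stops at data preparation: it turns each EEG epoch into a log-scaled spectrogram and pairs it with the image shown at that time, the kind of (spectrogram, image) pair such models could be trained on. File names and epoch lengths are invented.

```python
# Sketch of pairing EEG spectrograms with the images shown to participants.
# The actual models (AI-1, AI-2) are not described here; this only shows how
# hypothetical (spectrogram, image) training pairs might be assembled.
import numpy as np
from scipy.signal import spectrogram

fs = 256
rng = np.random.default_rng(2)

def eeg_to_spectrogram(signal: np.ndarray) -> np.ndarray:
    """Time-frequency representation of one EEG epoch (assumed 256 Hz)."""
    _, _, sxx = spectrogram(signal, fs=fs, nperseg=128, noverlap=64)
    return np.log1p(sxx)      # log scaling keeps the dynamic range manageable

# Hypothetical experiment log: one EEG epoch per displayed interior image.
image_paths = ["living_room.jpg", "bedroom.jpg", "kitchen.jpg"]
epochs = [rng.standard_normal(fs * 5) for _ in image_paths]   # 5 s per image

training_pairs = [(eeg_to_spectrogram(e), path) for e, path in zip(epochs, image_paths)]
for spec, path in training_pairs:
    print(path, spec.shape)   # frequency bins x time bins
```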

The Results: From EEG and Eye Tracking to the Metaverse

The AI models, AI-1 and AI-2, produced intriguing results. They introduced spatial elements and qualities derived from individuals’ conscious and subconscious perceptions, defamiliarizing the original images through design operations. In essence, these AI models formed a connection between memory and emotion, generating a unique and personalized artificial brain capable of perceiving the world from an individual’s perspective.

Using the predictions from AI-1 and AI-2, Koh and Chen created a potential metaverse space. This space was represented as a point cloud, emphasizing the momentary and fleeting nature of memories and perceptions. Point clouds, with their density and resolution dependent on the angle and time of observation, capture the essence of memories that are often vague or incomplete.
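
As a loose illustration of this point-cloud representation, the snippet below converts a stand-in generated image into a sparse, jittered point cloud. The depth values and sampling ratio are invented for the sketch and do not reflect the authors’ actual pipeline.

```python
# Illustrative conversion of a generated image into a sparse point cloud,
# echoing the point-cloud representation described above; depth and sampling
# are invented stand-ins.
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for an AI-generated 64x64 RGB image (values in [0, 1]).
h, w = 64, 64
image = rng.uniform(0.0, 1.0, size=(h, w, 3))

# Sample a subset of pixels so the cloud stays sparse and "memory-like".
ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
keep = rng.uniform(size=(h, w)) < 0.2              # keep roughly 20% of pixels

# Fake depth: brighter pixels pushed slightly forward, plus jitter for the
# vague, incomplete quality the authors associate with memories.
depth = image.mean(axis=2) + rng.normal(0, 0.05, size=(h, w))

points = np.stack([xs[keep] / w, ys[keep] / h, depth[keep]], axis=1)  # N x 3 xyz
colors = image[keep]                                                  # N x 3 rgb
cloud = np.hstack([points, colors])
print(cloud.shape)   # (N, 6): x, y, z, r, g, b
```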

Future Work and Implications

Koh and Chen’s research opens the door to exciting possibilities in the development of the metaverse. As wearable technologies continue to evolve, researchers can explore additional data types beyond images, enriching our understanding of human perceptions.

The potential applications of this research extend beyond the metaverse. In industries like architecture and heritage preservation, this framework can help conserve historical buildings and landmarks in their original states, ensuring they are accessible to future generations. In healthcare, it could provide a valuable tool for improving the well-being of patients, particularly those with limited mobility or conditions like dementia.

Conclusion

The metaverse is an exciting frontier that promises to revolutionize how we interact with digital spaces. Immanuel Koh and Ashley Chen’s research offers a glimpse into the future of the metaverse, one that is deeply personalized and rooted in human experiences. As wearable technology like Enobio continues to advance, the possibilities for creating a metaverse that resonates with users on a profound level continue to grow. Koh and Chen’s work is a testament to the potential of human-machine collaboration in shaping the digital worlds of tomorrow. Your Memory Palace is no longer confined to your mind; it can now be constructed in the metaverse, with endless possibilities for personalization and exploration.

References: