For my final project, I want to explore the power of computation to recognize biofeedback patterns and present them to the user in a recognizable way.
I’ve been working with an EEG headset called the MindWave, a single-channel wireless EEG sensor. The headset outputs EEG power-band values: Delta, Theta, Alpha, Beta, and Gamma waves. I have successfully streamed these features into Wekinator, a machine learning application that performs discrete classification on input features and sends its results as OSC messages. With this, I have been able to train the system to recognize which color I am thinking of.
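As a sketch of what that first hop might look like, here is how the five band powers could be packed into an OSC message and sent to Wekinator. The address `/wek/inputs` and port 6448 are Wekinator's documented defaults, and the band values are made-up placeholders, not data from my headset:

```python
import struct

def osc_message(address, floats):
    """Encode a minimal OSC message: padded address, type-tag string, big-endian floats."""
    def pad(b):
        # OSC strings are null-terminated and padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode()) + pad(("," + "f" * len(floats)).encode())
    for v in floats:
        msg += struct.pack(">f", v)  # OSC floats are 32-bit big-endian
    return msg

# Five normalized EEG band powers (delta, theta, alpha, beta, gamma) -- example values
bands = [0.42, 0.18, 0.25, 0.10, 0.05]
packet = osc_message("/wek/inputs", bands)

# To actually send it, a UDP socket to Wekinator's default input port would do:
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 6448))
```

In practice a library like python-osc would handle the encoding, but spelling out the packet makes it clear how little plumbing sits between the headset and the classifier.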
What I would really like to do is map a person’s mental state, such as their emotions, to a social feed of memories. For example, I could display several images from a person’s Facebook feed to train the system. Then, as the user lets their thoughts roam, the application would pull up other related images, videos, and posts, so the user can visually see what they’re thinking.
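The return trip could work the same way: Wekinator sends its classification back as an OSC message, and the application maps the class to a memory to display. This sketch decodes a float-only OSC message and picks an image; the `/wek/outputs` address is Wekinator's documented default, the 1-indexed class convention is an assumption about its classifier output, and the filenames are hypothetical:

```python
import struct

def parse_osc_floats(packet):
    """Decode a simple OSC message whose arguments are all floats (e.g. Wekinator output)."""
    def read_string(i):
        end = packet.index(b"\x00", i)
        # skip the null terminator and padding to the next 4-byte boundary
        return packet[i:end].decode(), (end + 4) & ~3
    address, i = read_string(0)
    tags, i = read_string(i)
    floats = [struct.unpack(">f", packet[i + 4 * k : i + 4 * k + 4])[0]
              for k in range(tags.count("f"))]
    return address, floats

MEMORY_IMAGES = ["beach.jpg", "birthday.jpg", "graduation.jpg"]  # hypothetical filenames

# A hand-built example packet: class "2" arriving from Wekinator
packet = b"/wek/outputs\x00\x00\x00\x00" + b",f\x00\x00" + struct.pack(">f", 2.0)
addr, outputs = parse_osc_floats(packet)
cls = int(round(outputs[0])) - 1  # assuming Wekinator's classes are numbered from 1
image = MEMORY_IMAGES[cls]
```

The display layer then only has to show `image`; swapping in video or posts is just a matter of what the class index points at.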
Some questions about this:
- Does seeing these memories help you sustain those emotions?
- Can you control what you’re thinking?
- Can we build a 3D environment that fully immerses you in your “thoughts and memories”?
- What if you could invite others to “experience” what you’re thinking?