# OpenGalea
By Team Syncer
Contributors: Alif Jakir, Tsing Liu, Soo Bin Park, Yechan Ian Seo, Syed Hussain Ather
"We stuck together the Ultracortex with a Quest 3, but made space by giving the prefrontal cortex a lobotomy, and then we added a bunch of electrodes, and made a local multiplayer shared experience."
## Table of Contents
- Introduction
- Project Genesis: MIT Reality Hack
- Features
- Use Cases
- Bill of Materials (BOM)
- Hardware Setup
- Software Setup
- Machine Learning Model
- Running OpenGalea
- Colocation Implementation
- Troubleshooting
- Contributing
- License
- Acknowledgements
- Contact
- Inspirations
- Repositories
## Introduction
OpenGalea is an open-source project that merges neuroscience and mixed reality to create immersive, brain-controlled experiences. By combining an 8-channel EEG system (based on the OpenBCI Cyton board) with the Meta Quest 3, OpenGalea lets users interact with virtual environments using their brainwaves. The project aims to democratize neurotechnology, making it accessible and affordable for researchers, developers, and enthusiasts, and provides an open-source foundation for building valuable EEG datasets and closed-loop, brain-controlled visual and auditory experiences.
## Project Genesis: MIT Reality Hack
OpenGalea was conceived and developed in the hardware track at MIT Reality Hack, an annual hackathon that brings together innovators to push the boundaries of mixed reality, AI, hardware, software, and game development.
## Features
- Colocated Mixed Reality: Supports shared virtual environments where multiple users can interact in the same physical space.
- Brain-Computer Interface: Integrates an 8-channel EEG system for real-time brainwave analysis.
- Custom Machine Learning Model: Utilizes a trained ML model for accurate classification of mental states (e.g., Attention, Relaxation).
- UDP Communication: Facilitates seamless communication between EEG hardware, laptops, and the Quest 3 headset.
- Unity Development: Leverages the Unity platform and Meta XR SDK for creating immersive mixed reality experiences.
- Open-Source and Affordable: Provides a cost-effective alternative to expensive commercial neurotechnology solutions.
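The UDP pathway can be sketched in Python: the laptop-side script packages each classification result as JSON and sends it to the headset. The address, port, and message schema below are illustrative assumptions, not the repo's actual protocol — the Unity-side listener must simply agree on the same port and format.

```python
import json
import socket

# Hypothetical address/port for the Quest 3 on the local Wi-Fi network (assumptions).
QUEST_ADDR = ("192.168.1.42", 5005)

def send_state(sock: socket.socket, addr, label: str, confidence: float) -> None:
    """Serialize one classification result as JSON and send it over UDP."""
    payload = json.dumps({"state": label, "confidence": confidence}).encode("utf-8")
    sock.sendto(payload, addr)

# Example usage (one datagram per inference result):
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   send_state(sock, QUEST_ADDR, "Attention", 0.87)
```

UDP is a natural fit here: classification results are small, frequent, and stale results are worthless, so occasional packet loss is preferable to TCP's head-of-line blocking.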
## Use Cases
- Gaming: Develop brain-controlled game mechanics for more immersive experiences.
- Therapeutic Applications: Create neuroadaptive environments for meditation, relaxation, and mental wellness.
- Collaborative Training: Enhance team-based training through shared virtual simulations.
- Accessibility: Provide assistive technology for individuals with disabilities.
- Non-Verbal Communication: Let users share mental-state information without speaking.
## Bill of Materials (BOM)
A detailed BOM, including component sources and costs, is available [here](link to your BOM - Google Sheet or Markdown table).
Cost Comparison:
- OpenGalea: Approximately $1,900
- Commercial Equivalent (e.g., Galea): Approximately $30,000
At these prices, OpenGalea costs roughly one-sixteenth as much as comparable commercial systems.
## Hardware Setup
- Headset Components: The front and back components of the OpenGalea headset are designed for 3D printing. STL files are available in the `OpenGalea/3d-models` directory of the Hardware Repository.
- Recommended Materials: PLA or PETG
- Print Settings:
- Layer Height: 0.2mm
- Infill: 20-30%
- Supports: As needed
- Refer to your filament and printer documentation for specific temperature settings.
A detailed, step-by-step assembly guide with diagrams and photos is available in the Hardware Repository's HARDWARE.md.
Key Assembly Steps:
- Prepare all components (3D printed parts, OpenBCI Cyton board, electrodes, wiring, Velcro straps).
- Assemble the Ultracortex frame (refer to OpenBCI Ultracortex Mark IV documentation if needed).
- Integrate the Cyton board onto the 3D printed back component.
- Attach electrodes and route wiring.
- Mount the front component to the Ultracortex frame.
- Add weights and Velcro straps for balance and fit.
- Connect the assembled system to the Quest 3 headset.
## Software Setup
- Operating System: Windows 10 or 11 (for the OpenBCI GUI and model training)
- Unity: Version 2022.3 or later
- Meta XR SDK: Download and import into your Unity project
- OpenBCI GUI: Download from the OpenBCI website
- Python: Version 3.9 or later (for machine learning components)
- Python Libraries: `numpy`, `scipy`, `scikit-learn`, `joblib`, `pylsl` (for Lab Streaming Layer), and `websockets`. Install them with pip:
  pip install numpy scipy scikit-learn joblib pylsl websockets
- Visual Studio or Rider (for C# development in Unity)
- Blender (optional, for 3D model editing)
1. Clone the Repositories:
   - Software Repository (Unity, ML, BCI):
     git clone https://github.com/sbpark422/Syncer.git
     cd Syncer
   - Hardware Repository (Design Files + Hardware):
     git clone https://github.com/Caerii/OpenGalea.git
     cd OpenGalea
2. Set up the Unity Project:
   - Open Unity Hub and create a new project using Unity 2022.3 or later.
   - Import the Meta XR SDK.
   - Copy the contents of the `Assets` directory from the `Syncer` repository into your Unity project's `Assets` folder.
3. Install Python Dependencies:
   cd Syncer/ML  # Navigate to the ML directory within the Syncer repo
   pip install -r requirements.txt
   (Create a `requirements.txt` file listing all Python dependencies within the `Syncer/ML` directory.)
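Based on the libraries listed under the prerequisites above, a minimal `requirements.txt` might look like this (unpinned versions; pin them once a known-good set is established):

```text
numpy
scipy
scikit-learn
joblib
pylsl
websockets
```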
4. Install the OpenBCI GUI:
   - Download and install the OpenBCI GUI for your operating system from the OpenBCI website.
## Machine Learning Model
- EEG data was collected using the OpenBCI Cyton board and the provided Python script (`Syncer/ML/model_dev_Attention.py`).
- Data was collected for two states: "Attention" and "Relaxation."
- The dataset comprises 112,900 data points (498 seconds).
- Electrode Placement:
- Fp1: Left frontal lobe
- Fp2: Right frontal lobe
- C3: Left central region
- C4: Right central region
- T5: Left temporal lobe
- T6: Right temporal lobe
- O1: Left occipital lobe
- O2: Right occipital lobe
- A modified Random Forest algorithm was used for classification.
- Features: Raw EEG data, alpha wave band power, and beta wave band power.
- The model was trained on the collected EEG data and saved as `Syncer/ML/random_forest_model_Attention_AlphaBeta.joblib`.
- Training scripts and details are available in the `Syncer/ML` directory.
- The `Syncer/ML/main_Attention.py` script loads the trained model and performs real-time inference on incoming EEG data from the Cyton board via LSL.
- Inference runs on 1-second time windows, with results updated every second.
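As a rough illustration of the band-power features described above, the snippet below computes per-channel alpha and beta power from a 1-second window using Welch's method. The sampling rate, band edges, and feature layout are assumptions; the repo's actual pipeline (see `Syncer/ML`) also feeds raw EEG into the model.

```python
import numpy as np
from scipy.signal import welch

FS = 250  # assumed Cyton sampling rate

def band_power(channel: np.ndarray, band: tuple, fs: int = FS) -> float:
    """Mean power spectral density of one channel within [band[0], band[1]] Hz."""
    freqs, psd = welch(channel, fs=fs, nperseg=min(len(channel), fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(psd[mask].mean())

def extract_features(window: np.ndarray) -> np.ndarray:
    """Alpha (8-13 Hz) then beta (13-30 Hz) band power per column (channel)
    of a (samples, channels) window."""
    alpha = [band_power(window[:, ch], (8.0, 13.0)) for ch in range(window.shape[1])]
    beta = [band_power(window[:, ch], (13.0, 30.0)) for ch in range(window.shape[1])]
    return np.array(alpha + beta)

# A trained classifier would then consume these features, roughly:
#   model = joblib.load("random_forest_model_Attention_AlphaBeta.joblib")
#   label = model.predict(extract_features(window).reshape(1, -1))
```

Alpha power tends to rise during relaxed, eyes-closed states while beta activity accompanies focused attention, which is why the alpha/beta split is a common feature choice for this kind of two-state classifier.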
## Running OpenGalea
### Single-User Experience
- Hardware Setup: Assemble the OpenGalea headset and connect it to your Quest 3. Connect the OpenBCI Cyton board to your laptop.
- Software Setup:
- Launch the OpenBCI GUI and start streaming EEG data.
- Run the `main_Attention.py` script from the `Syncer/ML` directory to start the machine learning model and begin real-time inference.
- Open the OpenGalea Unity project and build/deploy it to your Quest 3.
- Start the Experience: Launch the OpenGalea app on your Quest 3.
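Conceptually, the real-time step above boils down to: accumulate streamed samples into 1-second windows, then classify each full window. A simplified, stream-agnostic sketch (the LSL inlet and model are stubbed out as assumptions):

```python
import numpy as np

FS = 250          # assumed Cyton sampling rate
N_CHANNELS = 8

class WindowBuffer:
    """Accumulates per-sample readings and emits non-overlapping 1-second windows."""
    def __init__(self, fs: int = FS, n_channels: int = N_CHANNELS):
        self.fs = fs
        self.n_channels = n_channels
        self._samples: list = []

    def push(self, sample):
        """Append one sample (sequence of n_channels floats).
        Returns a (fs, n_channels) array once a full second has accumulated, else None."""
        self._samples.append(sample)
        if len(self._samples) >= self.fs:
            window = np.asarray(self._samples[: self.fs])
            self._samples = self._samples[self.fs :]
            return window
        return None

# In main_Attention.py's role, the loop would look roughly like:
#   inlet = StreamInlet(...)            # pylsl inlet from the Cyton's LSL stream
#   buf = WindowBuffer()
#   while True:
#       sample, _ = inlet.pull_sample()
#       window = buf.push(sample)
#       if window is not None:
#           label = model.predict(...)  # classify the completed 1-second window
```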
### Multi-User (Colocated) Experience
- Hardware Setup: Each participant needs a fully assembled OpenGalea headset, a Quest 3, and a laptop.
- Software Setup:
- Ensure all devices are on the same Wi-Fi network.
- Launch the OpenBCI GUI on each laptop and start streaming EEG data.
- Run the `main_Attention.py` script on each laptop.
- Open the OpenGalea Unity project.
- Configure the `NetworkManager` in Unity to designate one device as the host and the others as clients.
- Build/deploy the app to all Quest 3 devices.
- Start the Experience: Launch the OpenGalea app on all Quest 3 devices. The host should initiate the shared experience.
## Colocation Implementation
OpenGalea utilizes Meta's Colocation and Shared Spatial Anchors APIs to create shared mixed reality experiences.
- Colocation Discovery Initialization
- Devices running OpenGalea utilize Bluetooth-based discovery to identify each other within a range of approximately 30 feet. The Colocation API allows devices to advertise their presence and discover nearby sessions.
- Shared Spatial Anchors
- Once devices are connected, shared spatial anchors ensure that virtual objects are consistently positioned within the physical space for all users.
- Synchronization
- Maintaining synchronization between devices is crucial for a seamless shared experience. OpenGalea employs various techniques to ensure that all users see and interact with the same virtual objects at the same time.
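One simple synchronization technique — a general pattern, not necessarily the one OpenGalea ships — is last-write-wins merging keyed on per-user sequence numbers, so late or out-of-order UDP packets cannot roll shared state backwards:

```python
def apply_update(state: dict, update: dict) -> dict:
    """Merge one incoming message of the (hypothetical) form
    {'user': str, 'seq': int, 'attention': float} into shared state,
    keeping only the newest update per user (last-write-wins by sequence number)."""
    current = state.get(update["user"])
    if current is None or update["seq"] > current["seq"]:
        state[update["user"]] = update
    return state
```

Because every headset applies the same deterministic rule, all peers converge on identical shared state regardless of packet arrival order.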
## Troubleshooting
Refer to the [Troubleshooting Guide](link to troubleshooting guide - can be a separate Markdown file or a section in the README) for solutions to common issues.
## Contributing
We welcome contributions to OpenGalea! Please see our [Contribution Guidelines](link to contributing guidelines - can be a CONTRIBUTING.md file) for details on how to get involved.
## License
This project is licensed under the MIT License.
## Contact
For questions or inquiries, please contact:
- [email protected] (Soo Bin)
- [email protected] (Tsing)
- [email protected] (Yechan)
- [email protected] (Hussain)
- [email protected] (Alif)
## Inspirations
- Pacific Rim
- Cerebro (X-Men)
- Neon Genesis Evangelion
## Repositories
- Software (Unity, ML, BCI): https://github.com/sbpark422/Syncer
- Hardware (Design Files): https://github.com/Caerii/OpenGalea