This app allows a user to control a slideshow of pictures and movies from a given file directory using muscle activity alone (e.g., opening and closing a hand). Signal acquisition and processing are handled by the OpenBCI GUI. Normalized EMG data is then streamed via OSC to a simple neural network in Max. A facilitator prepares the user and retains override control through a browser-based interface via MIRA.
- OpenBCI Ganglion or Cyton
- EMG/ECG Snap Electrode Cables
- EMG/ECG Foam Solid Gel Electrodes
- OpenBCI GUI
- Windows or Mac computer
- Dedicated WiFi router (for wireless iPad/tablet control)
- iPad or other mobile device with a web browser
Optional:
- Secondary display
- HDMI or other video cable to connect to secondary display
Install the OpenBCI GUI and connect it to the hardware, then stream the data to this app over Open Sound Control (OSC) using the Networking Widget.
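For reference, OSC messages like the ones the Networking Widget emits are small binary packets: a null-padded address string, a type-tag string, then big-endian float arguments. The sketch below encodes and decodes such a message using only the standard library; the `/openbci` address and three-channel payload are illustrative assumptions, not guaranteed defaults of the GUI.

```python
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    return b + b"\x00" * (4 - len(b) % 4)

def encode_osc(address: str, *values: float) -> bytes:
    """Build a minimal OSC message carrying float32 arguments."""
    msg = _pad(address.encode())
    msg += _pad(("," + "f" * len(values)).encode())  # type tags, e.g. ",fff"
    msg += b"".join(struct.pack(">f", v) for v in values)
    return msg

def decode_osc(packet: bytes):
    """Parse the address and float arguments back out of an OSC message."""
    addr_end = packet.index(b"\x00")
    address = packet[:addr_end].decode()
    offset = (addr_end + 4) & ~3                  # skip padding after address
    tags_end = packet.index(b"\x00", offset)
    tags = packet[offset + 1:tags_end].decode()   # drop the leading ","
    offset = (tags_end + 4) & ~3                  # skip padding after tags
    values = [struct.unpack_from(">f", packet, offset + 4 * i)[0]
              for i, t in enumerate(tags) if t == "f"]
    return address, values

# Round-trip a hypothetical normalized EMG frame for two channels plus a marker.
addr, vals = decode_osc(encode_osc("/openbci", 0.25, 0.5, 0.75))
```

In practice you would receive these packets on a UDP socket bound to the port configured in the Networking Widget and forward the decoded floats into your own processing chain.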
Use Channels 3 and 4 with either the Cyton or Ganglion; refer to the following illustration for details. With the hand palm up, attach electrodes at the indicated locations on the inner forearm, with one electrode on the inner lower bicep. Ideally, the user sits in a chair with armrests or lies down. Direct the user to rest their arm at their side with the palm facing up.
Using the Arm-Based approach and the suggested electrode placement, the user can control the slideshow with the following actions:
- Backward == Flex/Relax Right Arm
- Forward == Open/Close Right Hand
- Play/Pause == Close Hand and Flex Arm
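The arm-based mapping above boils down to thresholding two normalized EMG channels. The sketch below shows one way this could work; the channel roles and the 0.5 threshold are placeholder assumptions to be tuned per user, not values taken from the app.

```python
def classify_gesture(hand: float, arm: float, threshold: float = 0.5) -> str:
    """Map two normalized EMG levels (0.0-1.0) to a slideshow command.

    `hand` is the forearm channel (open/close hand) and `arm` is the
    bicep channel (flex/relax); 0.5 is an arbitrary starting threshold.
    """
    hand_on = hand >= threshold
    arm_on = arm >= threshold
    if hand_on and arm_on:
        return "play_pause"   # close hand and flex arm together
    if hand_on:
        return "forward"      # open/close right hand
    if arm_on:
        return "backward"     # flex/relax right arm
    return "idle"
```

A real pipeline would add smoothing and debouncing so a single gesture fires one command rather than a burst.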
Using the Head-Based approach with Cyton 8 Channel board and default 10-20 locations:
- Backward == Left Eye Blink
- Forward == Right Eye Blink
- Play/Pause == Jaw Clench
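The head-based mapping can be sketched the same way: blinks show up as strong deflections on the frontal channels (Fp1 on the left, Fp2 on the right in the 10-20 system), while a jaw clench contaminates channels broadly. The channel choices and thresholds below are illustrative assumptions, not the app's actual detector.

```python
def classify_head_action(fp1: float, fp2: float, temporal: float,
                         blink_th: float = 0.6, clench_th: float = 0.7) -> str:
    """Map normalized artifact levels (0.0-1.0) to a slideshow command.

    `fp1`/`fp2` are the left/right frontal channels; `temporal` stands in
    for a channel dominated by jaw-muscle activity. A clench is checked
    first because it bleeds into every channel. Thresholds are placeholders.
    """
    if temporal >= clench_th:
        return "play_pause"   # jaw clench
    if fp1 >= blink_th or fp2 >= blink_th:
        # The stronger frontal channel picks the blink side.
        return "backward" if fp1 >= fp2 else "forward"
    return "idle"
```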
- Max/MSP - Visual Programming Language
- Open Sound Control
- OpenBCI GUI
We use SemVer for versioning. For the versions available, see the tags on this repository.
This project is licensed under the MIT License - see the LICENSE.md file for details.
- ml.star (available via the Max Package Manager)
- CNMAT-odot
- MIRA (available via the Max Package Manager)