Demonstrating a brain-computer interface on stage

Hi everyone,

I’m looking for a way to demonstrate the power (or at least the potential) of BCI on stage for gifted children. My basic idea was to fit a Muse 2 device on a volunteer’s head and then tell whether they’re thinking “right” or “left” (or “happy” or “sad”, or any other pair of contrasting terms the kid can focus on strongly).

The problem is, I could only find one app that interprets Muse 2 readings, and that’s the official Muse app itself. It only gives you real-time interpretation in the form of pre-designed meditation sessions, and the biofeedback isn’t accurate at all.

I’m looking for other apps / ways to get data in real-time from the Muse. Maybe you can help? Any advice would be appreciated!

Thanks - it’s for the kids :slight_smile:

@Roey,
Welcome to this community.
I don’t know anything about this subject, but your project sounds worthwhile and interesting. Perhaps I can help with some specific element of the project that matches my expertise.

Thank you!
I’m still exploring things, and hope to get back to you soon with more details.

That’s interesting.

Looks like you could write your own app -

https://anushmutyala.medium.com/muse-101-how-to-start-developing-with-the-muse-2-right-now-a1b87119be5c
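If you do go the DIY route, the usual pipeline that article describes is to stream raw EEG from the Muse 2 over LSL with `muselsl` and read it in Python with `pylsl`. Here's a minimal sketch of the idea, assuming `muselsl stream` is already running in another terminal: pull a one-second window of samples and compute power in the alpha band (8–12 Hz), the kind of feature a simple relax-vs-concentrate stage demo could threshold on. The channel ordering, band choice, and streaming calls are my assumptions, not something I've tested on a Muse 2.

```python
import math

SAMPLE_RATE = 256  # Muse 2 EEG sample rate in Hz


def band_power(samples, fs, f_lo, f_hi):
    """Power of `samples` in the [f_lo, f_hi] Hz band, via a naive DFT.

    Slow but dependency-free; fine for a 1-second window at 256 Hz.
    """
    n = len(samples)
    total = 0.0
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(samples[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
            im = sum(samples[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
            total += (re * re + im * im) / (n * n)
    return total


def muse_alpha_power():
    """Pull one 1-second window from a running Muse LSL stream and
    return its alpha-band power.

    Assumes `pip install muselsl pylsl` and `muselsl stream` running
    in a separate terminal with the headband paired.
    """
    # Imported lazily so the rest of the file works without hardware.
    from pylsl import StreamInlet, resolve_byprop

    streams = resolve_byprop("type", "EEG", timeout=5)
    inlet = StreamInlet(streams[0])
    window = []
    while len(window) < SAMPLE_RATE:
        sample, _timestamp = inlet.pull_sample()
        window.append(sample[0])  # first channel (TP9 in muselsl's ordering)
    return band_power(window, SAMPLE_RATE, 8.0, 12.0)
```

For a left/right demo you'd compare features like this across channels and bands and train a small classifier, but even raw alpha power rising when the volunteer closes their eyes and relaxes makes a visible on-stage effect.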