How My Brain Waves Can Prevent Bruxism

Aryanna Gangani
14 min read · Jan 18, 2023


A breakdown of how your computer can detect, through BCIs, when you grind your teeth.

How my computer was able to detect my teeth grinding :)

But hold on, what’s Bruxism?

Basically, it’s a fancy medical-world term for grinding your teeth in your sleep. Grinding can leave teeth flattened, fractured, chipped, or loose, which leads to increased tooth pain and sensitivity. Lockjaw is also another common symptom of teeth grinding.

Okay, so you must be thinking, "How can brainwaves prevent teeth grinding?"

Well, that’s exactly why I created this amazing project — that can hopefully change lives.

To create this project, I used something called a BCI, or brain-computer interface. Brain-computer interfaces use brain or nervous system data to control computers or machines. Check out my other article here for a more detailed breakdown.

As I started getting more into how BCIs actually work, I came across a couple of projects that high school students had created. The one that really caught my eye was a program that lowered the computer’s volume every time the student blinked and raised it every time they moved their jaw. So this doesn’t apply to just jaw movements — blinking can be picked up too.

An example of a TKS Alumni’s project

These insanely cool projects inspired me to want to build my own... so I did 😏.

An important thing to remember — for this project, I used a MacBook. The instructions for a Windows computer may differ from the ones I have here.

Here’s all the hardware and software I used:

Huge shoutout to Carol Rong 😘. Go check her out on Medium as well.

Step 1: Purchasing the Muse headband

Before starting any part of this project, I needed to buy a Muse headband. Right now, the most up-to-date Muse headband is the Muse 3 (but depending on when you read this, it might be outdated). Buying the headband from Amazon would have cost $350 CAD; the best deal was straight from the Muse website:

Essentially, I paid for the app membership, and the headband was a free bonus.

They tend to ship in 3–5 business days, depending on where you live.

So once that was settled, I moved on to the more technical stuff. To be honest, this process was really long and pretty discouraging. But I promise you, it’s worth it, and you can do it 🫶🏼.

Step 2: Setting up the code

This project is done in Python, which, according to Google, “is arguably the easiest programming language for beginners to learn.” The first thing I did was go to the official Python website here and download the latest version of Python for macOS. Right now, the latest version is 3.11.1, so that’s the version I downloaded and used for this project.

Then I went to the terminal on my MacBook (⌘ command + space, then search “terminal”).

How to find the Terminal on Mac

Once I was in the terminal (should look a little something like this):

What the terminal should look like

I typed:

python3 --version

If this doesn’t work for you, type:

python --version

Running this in my terminal printed the version of Python I have installed. In my case, it’s version 3.11.1, which is the one I need.

Python Version

Next, I checked pip. Pip is a package management system for Python: it lets you install and manage packages that aren’t included in the Python standard library. It isn’t something you have to download separately — it comes bundled with Python, and you use it from the terminal.

In the terminal, I typed:

pip3 --version

Again, if the above didn’t work for you, try typing:

pip --version

The result I got was pip 22.3.1, which is the most updated version right now.

Both Python and Pip versions
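If you prefer, you can sanity-check both versions from inside Python itself. This little sketch isn’t part of the project — it just prints the same info as the two terminal commands above:

```python
import subprocess
import sys

# Same info as `python3 --version`, but from inside Python:
print(".".join(str(n) for n in sys.version_info[:3]))

# Ask pip for its version through the current interpreter
# (equivalent to `pip3 --version` in the terminal).
result = subprocess.run(
    [sys.executable, "-m", "pip", "--version"],
    capture_output=True, text=True,
)
print(result.stdout.strip())
```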

Step 3: Getting a code editor

The editor I used is called Visual Studio Code, which works for both Mac and Windows. Here is their website, where you can install the app. For me, the pop-up estimated the installation would take 175 hours… but in reality it took under an hour. I suggest letting it download and doing something else in the meantime.

That’s all the software I installed! Now, moving on to the headband… 🤓

Step 4: Connecting the Muse headband

For the headband to be able to communicate with my Mac, there had to be an outlet. The outlet, or “communicating app,” that I used was called Petal Metrics. This is their website, where I installed it from. Basically, Petal Metrics connected my Muse headband to my computer through Bluetooth — this is how the brainwaves and data were streamed. The problem I had with this was that my Mac wasn’t letting the app open, since it couldn’t run it through the “virus checking system”. One way to bypass this is to control + click the app, then choose “Open”, and it should work. If this works, you can move on to the next part.

This didn’t work for me though. So what I did was go to the Apple symbol in the top-left corner → then System Preferences → and then Security & Privacy. At the bottom, there should be a message about “Petal Metrics”. Click the lock to make changes, then allow the app to open. I promise, it’s not a scam or a virus — it just wasn’t able to go through the check that Apple has in place.

Once it’s been installed, it should look something like this:

Screenshot of the Petal Metrics app

Now we need to connect the Muse. All you need to do is turn your headband on, wait for the orange LED lights to start, then click the “Stream” button at the bottom of the Petal Metrics page. Once you’ve done that, scroll down, and it should look something like this:

Petal Metrics once Muse is streaming

And don’t worry — you don’t actually have to sign in for it to work. All this does is confirm that your Muse headband is connected to Petal Metrics, and you’re done connecting the headband!

Step 5: Getting the Python code into Visual Studio Code

When you think Python, you think learning to code. But learning Python isn’t that easy — it takes hours, days, and weeks of practice to get some components right. I started learning Python using Scrimba, and get this — it’s free! But there are 58–60 lessons, so how exactly would you know which code you need for this project specifically?

So, contrary to people telling you “you need to brush up on your Python or you’ll never be able to do the project”, you actually don’t. You should eventually. But you don’t need to for this project, so you don’t have to panic about learning it all in a couple of weeks. Thanks to Alexandre Barachant, who created a GitHub repository with all the code you’ll need, right here. It’s called muselsl, so keep that at the back of your mind for later.

All I did was click the green “Code” dropdown and click the “copy” icon beside the GitHub link, which should look something like this:


Photo credits: Carol Rong

Then, I went back to Visual Studio Code. On the side toolbar, I saw 5 icons: Explorer, Search, Source Control, Run and Debug, and Extensions (in that order). I clicked the first one, Explorer → then “Clone Repository” → then I pasted the link from above. A pop-up came up asking me to open it, and I clicked “Open”.

Photo credits: Carol Rong

A “Get Started” page opened up. The next thing I did was go to my top toolbar (Apple sign, Code, File, Edit, Selection…) and click the one that says “Terminal” → then the first option, “New Terminal”. This terminal is separate from the one I opened above — it’s built into Visual Studio Code.

Photo credits: Carol Rong

Remember muselsl from before? What I did next was install it, in this new terminal that I created.

So, in this new terminal, I typed:

pip3 install muselsl

And it gave me an error. Don’t worry, it should give you one too. In my case, the error was that my terminal didn’t recognize something called “pygatt”.

So, underneath the error, I typed:

pip3 install pygatt

And that didn’t work either 😑.

So, I scrolled up to the yellow section of the error, where the bolded, capitalized word was: DEPRECATION.

Closer to the end of the yellow section, it suggested an alternative — a way to overcome this problem. For me, it said to add “--use-pep517” to the end of the pip3 install command.

Which gives you this:

pip3 install pygatt --use-pep517

Instead of this:

pip3 install pygatt

Now, this might be different for you, so don’t just copy and paste what I typed above. All you have to do is add the alternative that the error message suggests (in the yellow text), and it should work. It’ll say “successfully installed”.

Step 6: Reading the brain waves!

Finally! We’re going to get to see the brainwaves 🥳!

All I did was put my headband on, type the following code into the terminal, and it worked:

muselsl view
Photo credits: Carol Rong

These are what the brainwaves should look like ⬆️. Isn’t it soooo cool? I can see my brainwaves literally milliseconds after I produce them — all in front of me 🤯.

Photo credits: Carol Rong

These are some of the commands I used to play around with the brainwave view.

Using the “zoom out” tool, I was able to see spikes and dips, like in the picture above. Can you guess what those spikes and dips are?

Actually, it’s kinda hard to guess… but that is jaw movement! Every time I did the “act-like-I’m-biting-something” movement or ground my teeth, it got recorded as a dip in the brainwaves. The picture above is an example of what blinking does to the brain waves — also a dip.

So, it’s good that I can see the jaw action, but how do I get my computer to recognize that?

Step 7: Getting my computer to recognize my jaw actions (Part 1)

Remember when I copied the GitHub link before? And remember how there were so many files on the left-hand side of the Explorer? One of those many folders was “examples”. So I clicked on the examples dropdown and clicked the file that read “neurofeedback”. This is the script that reads the signals:

Then, since I wasn’t automatically prompted, I went down to the 5th toolbar option, Extensions, where I then typed “python” and clicked “Install”. The picture below is what it looks like after I already installed the extension in Visual Studio Code:

After I installed Python, I went back to the Explorer sidebar option (the one at the top) and went back to the neurofeedback tab. In the top right corner, there is a little play button, which is the run button. So, I clicked it to run the already existing code.

Before I ran the code, I made sure to put my headband back on, and stream it using Petal Metrics, so that the code had current brain waves and signals to read off of.

When I ran the code, it gave me this:

The existing code only ran my Alpha waves. Basically, alpha waves “induce feelings of calm, increase creativity, and enhance your ability to absorb new information.”

If we look at the picture, Alpha waves sit somewhere in the middle of the different brainwave bands. Small movements like moving my jaw aren’t reliably picked up in the Alpha band. In other words, all the Alpha values that were printed above aren’t going to help me detect my jaw movements.

If we look to the bottom of the list though, we can see Delta waves.

Alpha brain waves weren’t able to pick up small movements like jaw motion or blinking. But Delta waves are the lowest-frequency band on the list, and big, slow artifacts from muscle movements — like jaw clenches and blinks — dump a lot of power into those low frequencies. So, if I got my computer to track my Delta waves instead of my Alpha ones, my jaw movements would show up clearly.
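To make the band idea concrete, here’s a minimal sketch — not the neurofeedback script itself, just a made-up one-second signal, the standard textbook band ranges, and the Muse’s 256 Hz sample rate:

```python
import numpy as np

# Standard EEG frequency bands, in Hz (approximate textbook ranges):
BANDS = {"Delta": (0.5, 4), "Theta": (4, 8), "Alpha": (8, 12), "Beta": (12, 30)}

def band_power(signal, fs, band):
    """Average FFT power of `signal` inside the (lo, hi) frequency band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    lo, hi = band
    return psd[(freqs >= lo) & (freqs < hi)].mean()

fs = 256                # the Muse samples EEG at 256 Hz
t = np.arange(fs) / fs  # one second of fake data

# A made-up signal: a big, slow 2 Hz "jaw clench" artifact on top of a
# smaller 10 Hz alpha rhythm. Real EEG is far messier, of course.
signal = 5 * np.sin(2 * np.pi * 2 * t) + np.sin(2 * np.pi * 10 * t)

delta = band_power(signal, fs, BANDS["Delta"])
alpha = band_power(signal, fs, BANDS["Alpha"])
print(delta > alpha)  # True — the slow artifact dominates the Delta band
```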

Essentially what I did was stop the Alpha waves from printing and have my computer start printing my Delta waves.

I added the following code underneath line #124. So the new line of code became line #125:

print("Delta: ", band_powers[Band.Delta])

And I added a “#” in front of line #137 now which reads:

#print('Alpha Relaxation: ', alpha_metric)

What the “#” did was comment out that line, so it stopped printing the Alpha waves it was showing before.

So now, every time I moved my jaw, whether it was “acting-like-I-was-biting” or grinding my teeth, the printed number became larger than 1.

In the picture above, whenever I ground my teeth, the number became larger than 1, but when I did the “biting” action, it went over 2.

I did this experiment one more time to see if what I had just found was right:

But after trying this multiple times, I realised that the first experiment was, well, technically…biased. I was sooo enthusiastic to try this out that I ended up “biting” WAY too hard. So, if you’re not getting numbers over 2, good job 👍. Because my teeth hurt for a long time after that 😔.

Okay, so now my computer could detect my jaw movements. But wouldn’t it be cool for it to tell me that I moved my jaw?

Step 7: Getting my computer to recognize my jaw actions (Part 2)

Ohhh yes it would.

And it sounds complicated, but to be extremely honest, it’s not.

All I had to do was put an “if” statement.

If the Delta wave goes over “1”, print “Jaw movement :)”

Just like how I printed my Delta waves before — remember?

So, I put the following lines of code after line #125:

if band_powers[Band.Delta] >= 1:
    print("Jaw movement :)")

And this is what I got:

Yes! It worked!

Wait…something’s not right.

Grinding my teeth once (quickly) resulted in 7 “Jaw movement :)” prints?

So what’s happening here is that my Delta waves are being read before I have time to finish my movement. The epochs are collected too quickly, resulting in multiple prints of “Jaw movement :)”.
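Just to make the problem concrete, here’s a tiny hypothetical sketch (not the fix I actually used) of a “rising edge” check, which would only announce a movement the moment the Delta power first crosses the threshold, instead of on every epoch above it:

```python
# Hypothetical sketch: only report a jaw movement when the Delta power
# CROSSES the threshold, not for every epoch that stays above it.
def detect_events(delta_values, threshold=1.0):
    events = []
    above = False
    for i, value in enumerate(delta_values):
        if value >= threshold and not above:
            events.append(i)  # a new jaw movement starts at this epoch
        above = value >= threshold
    return events

# Seven consecutive epochs above threshold = one grind, not seven.
print(detect_events([0.2, 1.3, 1.5, 1.4, 1.2, 1.1, 1.3, 1.0, 0.3]))  # [1]
```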

The easy way to fix this is by playing with the “experimental parameters”:

For clarification, an epoch is “an event or a time that begins a new period or development” — so a period of time — and overlap is “a part or amount which overlaps” — 🤔 so an overlap.

The difference between these two numbers is how long the code waits before printing a new EEG measurement. (An EEG, or electroencephalogram, measures the electrical activity of the brain.)

As of right now, the difference is (for all the math geniuses, what’s 1 − 0.8?) 0.2. So, every 0.2 seconds, it prints a new EEG measurement. But that’s too fast — a teeth grind takes longer than 0.2 seconds.

An easy way to change that is to think: if we keep the same epoch length, what’s the smallest overlap we could use without going into the negatives? (Again, for the math geniuses, it’s 0.)

So, the only thing I did was change the overlap length to — you guessed it — 0.
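Spelled out as code (the variable names here just mirror the script’s “experimental parameters” — the exact names in neurofeedback.py may differ):

```python
# Time between prints = epoch length minus overlap.
epoch_length = 1.0  # seconds of EEG per window
overlap = 0.8       # seconds shared between consecutive windows

shift = round(epoch_length - overlap, 2)
print(shift)  # 0.2 seconds between prints — too fast for a teeth grind

overlap = 0.0       # the fix: no overlap at all
shift = round(epoch_length - overlap, 2)
print(shift)  # 1.0 second between prints — slow enough to catch one grind once
```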

So, in line #42, I changed the overlap length to 0:

Now, let’s try testing it out again:

The jaw movements didn’t get carried on throughout multiple prints.

IT WORKED 🎉🎉🥳🥳! Party time!! We did it 😁!

Give yourself a pat on the back/hug/treat yourself to something good, because your program just recognized when you ground your teeth!

But wait, how does this solve Bruxism again?

Basically, through this project, I was able to get my computer to detect and tell me when I was grinding my teeth.

So, if someone wearing the Muse headband ground their teeth, their computer would be able to detect that and let them know, through the program we just made.

Check out my video here for a quick runthrough of my project:

My video :)

I couldn’t have done this without:

Check out both Karina and Carol here!

Hey! I hope that by reading this article you were able to set up your own BCI project as well — and possibly use this project to help you detect when/if you were grinding your teeth! If you liked this article, or would like to give me any feedback, feel free to reach out to my LinkedIn and be sure to follow me on Medium as well to check out my previous articles. If you would like to stay updated on my monthly goals/accomplishments, be sure to check out my newsletters here!

Thank you so much for reading my article, and high five for making it this far 🙏🏼 (apparently that’s a high five…) !

Stay tuned for my next article, where I go more in depth on a potentially viable solution that you could use to prevent Bruxism!



Aryanna Gangani

14 y/o Innovate at The Knowledge Society learning about the world! | BCIs 🧠