iBaby Monitoring — Paper of the Week — June 20th
The paper of the week can be found here: Neonatal EEG Interpretation and Decision Support Framework for Mobile Platforms
Note: For the first time, this paper is open access! Special thanks to arxiv.org for existing and making it possible for the average person to stay up to date on scientific research.
Coming soon to a hospital near you — The iBaby Monitor!
This week, we’re talking about baby monitoring. Specifically, on your phone. This is probably not the baby monitor that you grew up with (unless you were born in the last couple of months. If you were, you should probably alert an adult to the fact that you are an infant prodigy). It is also probably not one that you would use on your current/future children, unless you want to find a nursery rhyme for when you have to connect EEG electrodes to their heads. However, it is a very simple solution to a problem that most people don’t realize exists, and one that could become a real product without much effort, which is not something we can usually say about recently published research.
This paper, titled “Neonatal EEG Interpretation and Decision Support Framework for Mobile Platforms,” came out of the Irish Centre for Fetal and Neonatal Translational Research (INFANT) (I see what you did there (: ) and the School of Engineering at University College Cork. It details an Android app for mobile EEG monitoring of newborn babies in the Neonatal Intensive Care Unit (NICU). Sorry to all those iPhone users, although I’m sure this could be made into an iOS app too.
Why do babies need to be monitored? Well, as anyone who saw the viral video of a baby climbing up a closed and locked “unclimbable” pool ladder knows, babies can clearly get into some trouble if left unattended. In this case, the authors are trying to help nurses in hospitals who have to take care of babies in the NICU, where premature babies or newborns with other medical issues live temporarily while they get treatment. One of the ways that nurses keep an eye on the babies is using electroencephalography, or EEG, which records the electrical activity of the brain. EEG can be used to identify and treat things like epilepsy, which can otherwise be hard to predict.
Now, nurses are pretty smart, but they also have a bunch of patients to care for every day, and they may not be trained to identify concerning trends in EEG data. Here’s where the Android app comes into play. The EEG is recorded using electrodes, and the data is sent to the Android app, which is on a tablet, through Bluetooth. The EEG data is filtered to remove noise in the signal (click here for another blog on signal noise and filtering), and is displayed on the tablet. Interestingly, they use sound to help the nurses figure out whether the EEG signals are normal, with the app producing different sounds for different types of EEG signals. Finally, the app uses machine learning (specifically, a deep convolutional neural network) trained on seizure and non-seizure EEG data to tell nurses whether the EEG data from a baby shows signs of an impending seizure.
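To make the noise-filtering step concrete, here is a minimal sketch of removing mains interference from a single EEG channel. The sampling rate, noise frequency, and moving-average approach are my own assumptions for illustration, not details taken from the paper (which likely uses more sophisticated filters):

```python
import numpy as np

def smooth_eeg(eeg, window):
    """Suppress high-frequency noise with a simple moving average.

    A window spanning exactly one period of the noise (e.g. 5 samples
    for 50 Hz mains noise at a 250 Hz sampling rate) averages that
    component to zero while barely touching slow EEG rhythms.
    """
    kernel = np.ones(window) / window
    return np.convolve(eeg, kernel, mode="same")

# Synthetic example: a slow 2 Hz "brain" rhythm buried in 50 Hz mains noise.
fs = 250                                   # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
clean_rhythm = np.sin(2 * np.pi * 2 * t)   # the signal we want to keep
noisy = clean_rhythm + 0.5 * np.sin(2 * np.pi * 50 * t)

filtered = smooth_eeg(noisy, window=5)     # 5 samples = one 50 Hz period
```

After filtering, the 50 Hz component is almost entirely gone while the 2 Hz rhythm survives, which is the same idea (at toy scale) as cleaning up the Bluetooth-streamed EEG before displaying it on the tablet.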
So, why should we care? I mentioned this earlier, but in my experience, it is very uncommon for published research to be this close to real-world use. This app would make it easier for nurses to identify medical problems and monitor infants 24/7 without having to physically be there the whole time. Even better, it gives nurses an intuition for when EEG data looks normal and when something might be wrong. Best of all, it would not take a lot of work to give this kind of technology to hospitals if someone were to take this out of the lab. I’m looking forward to seeing (and hopefully doing) science that can easily move from the lab to the clinic in our near future.
Originally published at www.jordanharrod.com.