
So far, our Medium blog has covered stories on our exciting research findings, fun experiments, and new products. We decided it's time to introduce our Founding Partner George Pliev, a person whose ideas and commitment brought Neurodata Lab to where it is now. We ran a short Q&A session with our founder to discuss the foundation of our startup, his views on the future of AI, and whether a successful company can be built outside of Silicon Valley.

Q: When journalists ask startup founders about how they came up with the idea of their startups, many say they tried to create something they couldn’t find as consumers. Or that they wanted to refine a process or service. …



As you know, we monitor the market with great attention, and we've all noticed people saying the world has changed in the last several months. Many experts make forecasts related to the pandemic, freedom of movement, and economic growth. Yet there's one circumstance very few people talk about, even though it may define future consumption and labour practices.

In the near future, emotions will define markets and trends.

The coronavirus crisis and self-isolation have deprived us of the things that made us happy: live social interactions, positive experiences, and bright emotions. We can no longer meet with friends, take colleagues out for a beer, or enjoy a concert like we used to. What's more, the circumstances will probably remain like this for some time. …



Can you name a TV series that portrays your line of work? We definitely can. For us, that's Lie to Me. But despite the intriguing discoveries presented in it, in reality the series has very little in common with emotion recognition, from the questionable scientific methodology behind the plot to its real-life applications. Yet for quite some time we at Neurodata Lab have been wondering whether acts of deception really have physiological manifestations that can be captured by sensors.

But what is a lie to start with? According to the Merriam-Webster Dictionary, a lie may be defined as:

1. an assertion of something known or believed by the speaker or writer to be untrue with intent to…



Ads and creatives can elicit many emotions and reactions: they make us happy, upset, intrigued, curious, thoughtful, and much more. Some ads and products elicit stronger emotions, others weaker; some are engaging and some are not. Usually it's enough to ask respondents for their opinion on the ads they were shown to understand whether those ads are interesting or memorable, and that's what brands and marketing researchers do. We at Neurodata Lab decided to test this approach, as we believe that physiological data can be much more informative than words: it can give us much deeper insights into how people perceive the ads they see. …



Over the last couple of months we've all been watching the fast changes occurring due to the pandemic. Yet we believe that the most significant changes still lie ahead. Relief from the coronavirus will probably be replaced by a long recession or even a full-blown economic crisis. A majority of B2C businesses have already been hit by the pandemic and can no longer stay afloat, which, in turn, affects the B2B segment. Those who are lucky enough to continue operating are now cutting budgets to limit the damage as much as possible.

Our chats with partners and customers show us that one particular industry will be affected the most: focus-group marketing research. We already see these agencies suffering from budget cuts and an inability to reach customers. The brands that used to commission such research cannot afford any "excessive" spending, and anything not related to direct sales or the viability of the business itself is now considered excessive. …



We have developed a set of 8 DIY emotion recognition and human sensing technologies that can bring value to our customers’ businesses by giving them insights into client behavior and reactions they never thought were possible. All of these tools are now available for testing via our API.

However, our customers frequently ask how exactly our technologies work and what their technical requirements are. Since many of them first got acquainted with us by reading this blog, we decided to answer the most frequently asked questions here.

We see you’ve got a Sex&Age Detector. What are these technologies capable of? How do we apply them in our company to bring value to our business?

All of these tools are based on our face detector. In order to recognize an emotion or estimate age/sex of a person or measure their heart rate, our technology automatically detects a face in the frame first. Full face capture is the primary condition under which our more advanced tools work. …
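To make the "face first" requirement concrete, here is a minimal sketch of such a pipeline. It uses OpenCV's stock Haar cascade for detection; the analyze_face() call is a hypothetical placeholder for a downstream emotion, age/sex, or heart rate tool, not our actual API.

```python
# Minimal "face first" gating sketch: downstream analysis runs only
# when a face is actually captured in the frame.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def analyze_face(face_roi):
    # Hypothetical placeholder for an emotion/age/sex/HR analysis call.
    return {"face_found": True, "roi_size": face_roi.shape[:2]}

def analyze_frame(frame):
    """Detect a face; run further analysis only if one is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None  # no full face capture -> advanced tools cannot run
    x, y, w, h = faces[0]
    return analyze_face(frame[y:y + h, x:x + w])
```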



We never miss a chance to do new exciting research, so when we learned there were several amateur cyclists in our team we thought there just had to be a way we could combine the two. That’s how this sort-of-scientific experiment for a potential out-of-the-lab use-case was born. But before things got too complicated, we should explain some theory behind it.

Recently we published an article where we described the existing contact and contactless heart rate tracking methods from ECG to machine learning.

In a whole lot of industries there is a growing demand for contactless heart rate (HR) measurement tools. In many cases, using traditional methods such as ECG or ballistocardiography is impossible or inconvenient. For instance, it may be problematic to use heart rate and respiratory monitors if patients are newborns or suffer from skin conditions, or if they simply move too much during the procedure. …



You only get one chance to make a good first impression. Our clients always ask how exactly our emotion analysis tech works, so we’ve decided to conduct a little Christmas experiment and have a bit of fun ourselves.

In this article, we're going to look into how we built a virtual webcam that adds an augmented reality (AR) overlay. Now, when we Skype our clients, they can see our emotional data in real time as a live demonstration of how our products work. We'll briefly show you how you can build a virtual webcam of your own if you wish to impress your clients. Afterwards, you'll be able to use this virtual webcam with applications such as Skype, Zoom, etc. …
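For the curious, here is a rough sketch of the idea using the open-source pyvirtualcam package, which may differ from the exact setup we used; estimate_emotions() is a hypothetical stand-in for the analysis call, and a virtual camera driver (e.g. OBS) must be installed for pyvirtualcam to attach to.

```python
# Sketch: read the real webcam, draw an analysis overlay, and publish
# the annotated frames as a virtual camera that Skype/Zoom can select.
import cv2
import pyvirtualcam

def estimate_emotions(frame):
    # Hypothetical placeholder: return a label to draw on the frame.
    return "neutral: 0.87"

capture = cv2.VideoCapture(0)  # the physical webcam
with pyvirtualcam.Camera(width=640, height=480, fps=30) as cam:
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        frame = cv2.resize(frame, (640, 480))
        cv2.putText(frame, estimate_emotions(frame), (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
        # pyvirtualcam expects RGB; OpenCV delivers BGR.
        cam.send(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        cam.sleep_until_next_frame()
```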



When two people talk, they unconsciously detect and deconstruct each other's emotions so that the conversation goes smoothly for both of them. Now, imagine that you can only hear the voice of the person you are speaking to (just like during a phone call), or read a text that doesn't contain any emojis (take any book). If you know the person you're talking to, it becomes quite easy to understand which emotions are behind the text or voice. But if you're communicating with a stranger, things can get quite tricky.

The same goes for an artificial neural network (ANN). Almost* any person is a stranger to an ANN, and when we limit it to a single modality (say, show it a video without any sound), it might confuse similar-looking emotions while trying to recognize them: for instance, a surprised face and a happy face captured in a 3-second video can look very much alike. How do we train our ANN to see the difference between the two? …
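As a rough illustration of the multimodal idea (not our actual architecture), here is a minimal late-fusion sketch in PyTorch: two single-modality encoders whose features are concatenated before classification. All names and dimensions are illustrative.

```python
# Late fusion: the classifier sees both modalities at once, so cues
# absent from the face (e.g. a surprised vs. happy voice) can break ties.
import torch
import torch.nn as nn

class LateFusionEmotionNet(nn.Module):
    def __init__(self, video_dim=512, audio_dim=128, n_emotions=7):
        super().__init__()
        self.video_encoder = nn.Sequential(nn.Linear(video_dim, 256), nn.ReLU())
        self.audio_encoder = nn.Sequential(nn.Linear(audio_dim, 256), nn.ReLU())
        self.classifier = nn.Linear(256 + 256, n_emotions)

    def forward(self, video_feats, audio_feats):
        v = self.video_encoder(video_feats)
        a = self.audio_encoder(audio_feats)
        return self.classifier(torch.cat([v, a], dim=-1))

# Toy usage with random tensors standing in for real embeddings.
model = LateFusionEmotionNet()
logits = model(torch.randn(4, 512), torch.randn(4, 128))
print(logits.shape)  # torch.Size([4, 7])
```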



People have a hard time believing that a person’s heart rate (HR) can be measured by analyzing a video of them. Indeed, if we’ve already trained computers to calculate the HR using a video then why haven’t we learnt to do this ourselves in the course of our evolution? After all, somehow we’ve learnt to interpret tiny movements of facial muscles or detect emotions in each other’s voices. That would be a great skill, especially for people whose work involves measuring the HR — such as cardiologists, sports coaches, and even reporters.

Yet even when we look closely at someone, we cannot estimate their HR by sight or tell whether they're suffering from a cardiac irregularity without special equipment. However, we are getting closer to being able to measure HR just by looking at someone. New machine learning-based methods, combining computer vision and time series analysis, can extract data on someone's HR even from a webcam video. In this article we'll talk about how these methods work and how you can develop your own algorithm for HR estimation. …
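As a preview, here is a bare-bones sketch of one classic remote photoplethysmography (rPPG) recipe, not necessarily the exact method we use: average the green channel over a face region in each frame, band-pass the resulting signal around plausible heart rates, and read off the dominant frequency. The face_frames input is assumed to be a list of BGR face crops sampled at a known frame rate.

```python
# rPPG sketch: pixel-intensity signal -> band-pass filter -> FFT peak.
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_bpm(face_frames, fps=30.0):
    # 1) One sample per frame: mean green intensity over the face ROI.
    signal = np.array([frame[:, :, 1].mean() for frame in face_frames])
    signal = signal - signal.mean()

    # 2) Band-pass 0.7-4.0 Hz (roughly 42-240 beats per minute).
    nyquist = fps / 2.0
    b, a = butter(3, [0.7 / nyquist, 4.0 / nyquist], btype="band")
    filtered = filtfilt(b, a, signal)

    # 3) Dominant frequency of the filtered signal -> heart rate.
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(filtered))
    return freqs[np.argmax(spectrum)] * 60.0  # beats per minute
```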

About

Neurodata Lab

We create multi-modal systems for emotion recognition and develop non-contact methods of physiological signal processing. Reach us at contact@neurodatalab.com
