“Experiential Analytics: Understanding Intention”
Date: Wednesday, March 15, 2017
Place: City View Metreon; San Francisco, CA
Sean Captain: Freelancer at Fast Company
Ramses Alcaide: President and CEO of Neurable
Peter Hartzbach: Founder and CEO of iMotions
Charles Nduka: CSO for Emteq (Facial Paralysis)
Matteo Lai: CEO of Empatica
New technologies such as voice analysis can capture data.
Facial muscles lack muscle spindles, so gestures must be sensed externally. Helps patients in rehab, paired with adaptive VR.
Contact vs. non-contact sensing of gestures.
Emotion and character recognition to help a user choose what item to select; inferring user intention from brain signals.
Measure electrical activity in response to stimuli: the brain's reaction to specific situation(s).
Then create the action associated with that reaction.
Consumer-based device with a single electrode; prefrontal asymmetry.
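Prefrontal asymmetry is commonly computed as the difference in log alpha-band (8–12 Hz) power between right and left frontal electrodes. A minimal sketch, assuming that alpha-band definition and FFT-based band power; function names are illustrative, not from any vendor SDK:

```python
import numpy as np

def bandpower(signal, fs, band=(8.0, 12.0)):
    """Average power of `signal` within `band` (Hz), from an FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def frontal_alpha_asymmetry(left, right, fs):
    """log(right alpha power) - log(left alpha power); positive values are
    conventionally read as relatively greater left-frontal activity."""
    return np.log(bandpower(right, fs)) - np.log(bandpower(left, fs))

# Synthetic demo: stronger 10 Hz alpha on the left channel.
fs = 256
t = np.arange(fs * 4) / fs
rng = np.random.default_rng(0)
left = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
right = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
print(frontal_alpha_asymmetry(left, right, fs))  # negative: more alpha on the left
```

With a single-electrode consumer headset only one side is available, so such devices typically track band power over time rather than a true left/right asymmetry.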
Voluntary and automatic control with facial tracking.
Similar to that of Google Glass
Have to augment existing products with this technology to add value.
Biometrics for data.
Lessons in the wearable space
Helps identify when people have a seizure through a stress response.
EEG used to detect when a seizure is occurring.
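As a toy illustration of flagging a seizure-like stress response from wearable data: a crude threshold on windowed electrodermal activity. This is a sketch only, not Empatica's actual algorithm; the threshold, window length, and sampling rate are assumptions.

```python
import numpy as np

def detect_events(eda, fs, threshold=2.0, window_s=5.0):
    """Return start times (s) of windows whose mean electrodermal activity
    (EDA, microsiemens) exceeds `threshold` -- a crude stress-response flag."""
    win = int(fs * window_s)
    n = len(eda) // win
    means = eda[: n * win].reshape(n, win).mean(axis=1)
    return [i * window_s for i, m in enumerate(means) if m > threshold]

# Synthetic demo: ~1 uS baseline with a surge between 20 s and 30 s.
fs = 4  # Hz, a typical wristband EDA sampling rate
signal = np.ones(fs * 60)
signal[20 * fs : 30 * fs] += 3.0
print(detect_events(signal, fs))  # [20.0, 25.0]
```

A real detector would combine EDA with motion and heart-rate features and be validated clinically; the point here is only the windowed-feature-plus-threshold shape of the problem.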
Medical application for the tech.
Use for people with: ALS, MS, CP …
Non-invasive methods as a future technology, likened to a hearing aid.
Sensors must be very small, but do people want to be tracked? Do they have a choice?
Privacy issues from consumers and consent.
You can manage what you can measure, but we have poor tools.
“Can you opt out?”
EULA — Disconnects from the rest of the world!
It'll get embedded into wearables that people wear consistently.
Latency for EEG readings into an app:
- Depends on the use case
- VR into Unity