NYC Media Lab awards $25,000 in prizes for emerging media technology prototypes and startups

Winning faculty and student teams from Columbia University, The New School, NYU, CUNY, SVA and Cornell Tech demonstrated at NYC Media Lab’s Annual Summit.

NYC Media Lab’s annual Summit on Thursday, September 28th, 2017, was host to over 150 interactive demos of emerging media and technology from NYC Media Lab’s consortium of universities. Entrepreneurs, engineers, creative technologists, product designers, data scientists and makers convened at The New School for the event’s Demo Expo—a “science-fair” showcase that brings together faculty and students from a wide range of disciplines.

Demo participants presented their startups, research, and prototypes to a crowd of more than 1,000 attendees, including thought leaders and fellow technologists from leading digital media, technology, and communications companies.

$25,000 in prizes was awarded by NYC Media Lab to projects that represent the creativity, technical depth and potential impact of the ideas emerging from faculty and students across NYC universities. This year, demo awards were split into four central categories, representing areas of interest for NYC Media Lab and the greater NYC-based innovation community: VR/AR, Data Science, Entrepreneurial, and Creative Technology.

“This year there was an incredible array of projects. Virtual and augmented reality seemed to be in the most demand amongst attendees, while other projects stood out for their applications of data science and creative technology,” said Justin Hendrix, Executive Director of NYC Media Lab. “These prizes are intended as a gesture of support for the teams that shared their work at the Summit, and as encouragement for the continued development of their projects, prototypes and research.”

Read on to learn about the winners, or browse the demo expo here.

Grand Prize: $10,000

Winning team members from Columbia University’s Computer Graphics and User Interfaces Lab

Travel in Large-Scale Head-Worn VR

Columbia University SEAS Computer Science
The team demonstrates a 3D interaction technique that allows a user wearing a VR head-worn display to point at a world-in-miniature representation of a city-scale virtual environment and perform efficient and precise teleportation by pre-orienting an avatar. Team Members: Carmine Elvezio, Mengu Sukan, Barbara Tversky, Steven Feiner. Website.


First Prizes: $2,000 to each

M3diate

NYU ITP; Category: VR/AR.
M3diate is a multi-user virtual reality plugin that encourages users to interact in immersive worlds. Team Members: Kyle Greenberg, Theodore Lee, Christian Grewell, Bas in het Veld. Website.

Searching the Web with Neural Networks

NYU Computer Science; Category: Data Science.
Search engines play an important role in our everyday lives by assisting us in finding the information we need. When we input a complex query, however, results are often far from satisfactory. This demo shows neural networks that introduce a new paradigm for web search. Team Members: Rodrigo Frassetto Nogueira, Kyunghyun Cho. Website.

Ovee

Parsons School of Design; Category: Entrepreneurial.
Ovee is an application designed to help young women take control of their sexual and reproductive health. Incorporating artificial intelligence and augmented reality, Ovee creates a safe space for women to ask and answer questions, learn about their bodies in a private setting, and engage with their health. Team Members: Courtney Snavely, Jane Mitchell. Website.

Touching The Void

The New School; Category: Creative Technology.
Touching The Void displays virtual objects on a physical pedestal and uses vibration gloves to create haptic sensations when the audience interacts with the virtual objects. Team Members: Danli Hu. Website.

Second Prizes: $1,000 to each

Calling Thunder: The Unsung History of Manhattan

School of Visual Arts; Category: VR/AR.
Calling Thunder explores the unsung history of New York through spatial audio and mobile VR. Users are transported through a series of interactive soundscapes that compare today’s urban cacophony to the vibrant ecosystems Henry Hudson would have encountered in 1609. Team Members: David Al-Ibrahim. Website.

Localization System for Pedestrian Safety

Columbia SEAS; Category: Data Science.
With the prevalence of smartphones, pedestrians and joggers often walk or run while listening to music. Deprived of the auditory cues that would warn them of danger, they are at much greater risk of being hit by cars or other vehicles. This demo presents an ultra-low-power sound-source localization system that can be built into wearable devices for pedestrian safety. Team Members: Daniel de Godoy, Peter Kinget, Fred Jiang. Website.

Speech Up

Parsons School of Design and Cornell Tech; Category: Data Science.
Speech Up is a mobile speech therapy app for kids that uses machine learning to provide real-time pronunciation feedback and self-directed personalized gameplay. Team Members: David Cheng, Eliza Bruce, Luis Serota. Website.

XTH Sense: The World’s First Biocreative Instrument

CUNY City Tech; Category: Entrepreneurial.
Participants harness signals from their bodies (muscles, blood flow, heartbeat, body temperature, and spatial data) to interact with connected devices, musical and audio software, games, and virtual and augmented reality. Team Members: Heidi Boisvert. Website.

Untitled Realities

NYU Tandon; Category: Creative Technology.
Many theorists believe that identity is largely shaped by experiences. If this is true, what are the implications of creating immersive experiences using VR technology, and how is the formation of human identity influenced by it? Untitled Realities is a Virtual Reality installation that examines this quandary. Team Members: Najma Dawood-McCarthy, Gabriella Cammarata, Chun-Fang Huang. Website.

Third Prizes: $500 to each

From NYC Media Lab 17 VR/AR Pavilion

Remote Collaboration in AR and VR Using Virtual Replicas

Columbia University; Category: VR/AR.
The team presents a collaborative AR and VR system for remote task assistance, in which attendees wear tracked head-worn displays. A remote expert creates virtual replicas of tracked objects and demonstrates actions on them in VR to guide a local user performing a task with those physical objects in AR. Team Members: Carmine Elvezio, Mengu Sukan, Ohan Oda, Barbara Tversky, Steven Feiner. Website.

Citygram

NYU; Category: Data Science.
Citygram is an urban noise-mapping project that makes sensor-network scaling practicable through a “plug-and-sense” design that turns any device with a microphone and Wi-Fi into an AI noise sensor. Team Members: Tae Park. Website.

Auditory Bubbles Game

CUNY Brooklyn College; Category: Creative Technology.
Auditory Bubbles is a game-with-a-purpose about identifying speech in noise; it will help researchers identify the cues that matter most for understanding individual words. Team Members: Michael Mandel, Eugene Chen, Shelby Ahmed. Website.


For more information, contact Alexis Avedisian (alexis@nycmedialab.org)
