NYC Media Lab Annual Summit 2015

NYC Media Lab
Sep 29, 2015 · 3 min read

Award Announcements

NYC Media Lab’s Annual Summit and demo showcase on Friday, September 25th, was host to over one hundred creative coders, product designers, data scientists and makers from New York City’s universities and beyond.

Demo participants presented their research and prototypes to a crowd of 800 attendees, including thought leaders and fellow technologists from leading digital media, technology, and communications companies.

Prizes were awarded by the NYC Media Lab to projects that represent the creativity, technical depth and potential impact of the ideas emerging from faculty and students across NYC universities.

“It’s very difficult to pick winners in a demo pool as broad and deep as this one,” said Justin Hendrix, Executive Director of NYC Media Lab. “These prizes are intended as a gesture of support for the teams that shared their work at the Summit, and as encouragement for the continued development of their projects, prototypes and research.”

While projects varied in form and content, they shared imaginative reach, creative execution, and promise for shaping future technologies.

We’re pleased to announce the following winners:

Ultra-low cost sensor platform and integration
Ioannis Kymissis and Shyuan Yang
Columbia University Engineering
An ultra-low-cost sensor platform with customizable sensors. The sensors can be integrated with radio communication for a variety of applications, such as plugSTRATE, a “wireless monitoring network system specifically designed to address the needs of the rapidly expanding energy audit market.”

Kinemetagraph
Tyler Henry
Parsons The New School for Design, MFA Design + Technology
Kinemetagraph is an interactive projection that reflects the movement of the visitor with a matching pose from the history of cinema.

All in Pieces
Shuangshuang Huo, @muyewubian
Parsons The New School for Design, MFA Design + Technology
An immersive projection installation that explores the experience of information fragmentation by transforming real-time Tweets into sound frequencies.
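The installation's actual audio pipeline isn't described here, but as a rough illustration, one simple way to turn tweet text into sound frequencies might look like the following sketch (the base pitch and mapping are assumptions, not the project's real design):

```python
# Sketch: map each character of a tweet onto an audible frequency ladder.
# The base frequency (220 Hz) and step size are illustrative assumptions.

def tweet_to_frequencies(tweet, base=220.0, step=8.0):
    """Map each character's code point to a frequency in Hz."""
    return [base + (ord(ch) % 48) * step for ch in tweet]

freqs = tweet_to_frequencies("All in Pieces")
```

A real-time version would feed these frequencies to an oscillator as tweets arrive from a streaming API.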

AMuSe — Adaptive Multicast Services
A joint project by Columbia University and Bell Labs
Varun Gupta, Craig Gutterman, and Gil Zussman, Electrical Engineering, Columbia University; Raphael Norwitz and Savvas Petridis, Computer Science, Columbia University; Yigal Bejerano, Bell Labs, Alcatel-Lucent

Interactive web-based performance demonstration of a system for multimedia content delivery to large crowds via WiFi multicast.
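AMuSe's adaptive protocol isn't detailed here, but the underlying delivery mechanism is standard UDP multicast. As a minimal sketch (the group address and port are hypothetical, and this is not AMuSe's actual code), a sender socket for WiFi multicast can be configured like this:

```python
import socket

# Hypothetical multicast group and port for illustration.
GROUP, PORT = "224.1.1.1", 5007

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
# TTL of 1 keeps multicast traffic on the local network segment,
# which suits delivery to a crowd on one WiFi network.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
ttl = sock.getsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL)

# A real sender would then loop over media chunks:
#   sock.sendto(chunk, (GROUP, PORT))
sock.close()
```

The appeal of multicast for large crowds is that one transmission reaches every listener, rather than one unicast stream per client.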

Sarcastic or Not: Word Embeddings to Predict the Literal or Sarcastic Meaning of Words

Debanjan Ghosh, School of Communication and Information, Rutgers University, Weiwei Guo, Computer Science Department, Columbia University, Smaranda Muresan, Center for Computational Learning Systems, Columbia University

A distributional-semantics approach to detecting sarcasm in social media.
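The intuition behind the distributional approach can be sketched with toy vectors (these embeddings and the similarity threshold are invented for illustration, not taken from the paper): when a positive word like "great" is used sarcastically, its contexts drift away from its literal neighbors, which cosine similarity over embeddings can expose.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy, hand-made embeddings (hypothetical, 3-dimensional for clarity).
literal_great = [0.9, 0.1, 0.0]    # "great" in literal, positive contexts
sarcastic_great = [0.1, 0.2, 0.9]  # "great" in sarcastic contexts
wonderful = [0.85, 0.15, 0.05]     # a literal positive neighbor

lit_sim = cosine(literal_great, wonderful)
sarc_sim = cosine(sarcastic_great, wonderful)
# The sarcastic usage is less similar to the literal neighbor.
```

Real systems learn such vectors from large corpora rather than specifying them by hand.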

Dibuja Tu Casa (Draw Your House)
Sharon De La Cruz, @miss163
A redesign of the intake forms used to gather information from unaccompanied minors crossing the border. Using visual language inspired by art therapy activities, the redesigned form gathers nuanced responses more sensitively.

Shakti Mb, @shakti_mb
Parsons The New School for Design, MFA Design + Technology
A two-person virtual chatroom where you can hear the heartbeat of the person you are chatting with in real time, an experiment that explores the effect additional feedback could have in a virtual environment.

Anneka Goss
NYU Polytechnic School of Engineering, Integrated Digital Media
An interactive projected installation that visualizes global warming as a code-generated virtual ocean and sky.

The Story Discovery Engine
Meredith Broussard
NYU Digital Journalism
A new type of software that reporters can use to accelerate the process of finding investigative story ideas on public affairs beats such as education, transportation or campaign finance.

On Broadway
Agustin Indaco
CUNY Graduate Center
An interactive installation representing life in the 21st-century city, using 40 million images and data points collected from 13 miles of Broadway.

AR-APM (Augmented Reality Anti-Personnel Mines)
Carlos Bautista
NYU Polytechnic School of Engineering, Integrated Digital Media
An Augmented Reality application that enables civilians and the military to detect and deactivate anti-personnel mines.

Thanks to everyone who came out for the demo showcase. We look forward to seeing you at future NYC Media Lab events.

Weren’t able to make it? Check out a few sample demos in the video below.

Sign up for the 2016 summit here.
