Vis 147A Midterm Project Proposal

For my midterm project, I plan to build a hybrid software and hardware project. The software part will consist of a live video stream from a webcam, phone, or camera. I will write a program that performs a live mesh analysis of the incoming frame stream and applies a generative transform to the mesh created from the screen's pixels. I am still working out the details of the generative mesh algorithm, but below are my rough sketches of what I hope to achieve.

Notes I made while forming the idea.

The hardware part of the project will involve a touchless 3D tracking system that can track the movement of a person's hand in 3D space. The tracker will do this by exploiting the natural electrical properties of our bodies and the way they influence a DIY capacitor, read by an Arduino. The 3D tracker will send the position of the user's hand to the system, allowing the user to pan around the live 3D mesh with their hand.
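To make the tracking idea concrete, here is a minimal Python sketch of the position-mapping step, assuming a common DIY design with three foil plates (one per axis) whose RC charge-time readings the Arduino would report over serial. The function name, the calibration constants, and the assumption that a nearer hand raises a plate's reading are all illustrative, not finalized parts of the project.

```python
def plate_readings_to_position(rx, ry, rz, r_min=100.0, r_max=1000.0):
    """Map three capacitive-plate charge-time readings to a position in the
    unit cube. Hypothetical model: a nearer hand increases a plate's
    capacitance and therefore its charge-time reading, so each axis is just
    that plate's reading normalized between calibration bounds.
    r_min/r_max would be measured per physical setup."""
    def norm(r):
        # clamp the raw reading, then scale it to the 0..1 range
        r = max(r_min, min(r_max, r))
        return (r - r_min) / (r_max - r_min)
    return (norm(rx), norm(ry), norm(rz))

# example: hand far from the x plate, midway on y, close to the z plate
pos = plate_readings_to_position(100.0, 550.0, 1000.0)
```

In practice the calibration bounds drift with humidity and wiring, so they would likely need to be re-measured (or auto-calibrated) each session.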

As an example of the kind of mesh generation I can do, here is a screenshot of a mesh generator I wrote today:

Original Image
Generated Mesh

Note how the mesh exists in 3D space. In this demo, the user can also pan around in 3D with the mouse. The generative algorithm simply selects the pixels whose brightness exceeds a pre-defined threshold and generates a mesh between them; each pixel's brightness defines its extrusion along the z-axis.
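The thresholding-and-extrusion step described above can be sketched in a few lines of Python with NumPy. The function name, the threshold value, and the z-scale factor here are illustrative choices of mine, not the exact values used in the screenshot.

```python
import numpy as np

def generate_mesh_vertices(image, threshold=200, z_scale=0.5):
    """Select pixels brighter than `threshold` and extrude them along z.

    image: 2D array of grayscale values in 0..255.
    Returns an (N, 3) float array of vertices (x, y, z), where z is the
    pixel's brightness scaled by z_scale.
    """
    ys, xs = np.nonzero(image > threshold)   # pixels above the brightness cutoff
    zs = image[ys, xs] * z_scale             # brightness drives the z extrusion
    return np.column_stack([xs, ys, zs]).astype(float)

# tiny 3x3 test frame: only two pixels exceed the threshold
frame = np.array([[ 10,  50, 255],
                  [100, 150,  90],
                  [ 30, 220,  40]])
verts = generate_mesh_vertices(frame, threshold=200)
```

A full implementation would then connect these vertices into faces, for example with a Delaunay triangulation, to produce the mesh shown in the screenshot.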
