Planting a Seed

Adam Sauer
6 min read · Oct 16, 2016

An idea for rendering file systems as 3D trees based on directory layout, started at MIT's recent Reality, Virtually Hackathon.

Introduction

Last weekend MIT hosted the world's largest hackathon for Virtual and Augmented Reality projects. After being wait-listed, writing the organizers to ask for last-minute attendance, and finally receiving confirmation, I drove over from Columbus, OH with two other VR developers I know through our VR Columbus Meetup group. On the drive we listened to an audiobook on the emerging field of data science, and how one of the most challenging problems in the field is to interact with and understand large databases of information quickly and intuitively.

Thinking about how data visualization can extend our cognitive faculties, I tossed out an idea about building 3D structures from information. I thought I was a genius, but quickly found out that several other people had similar ideas, some of whom had worked on them decades ago. So I teamed up with several like-minded individuals (Sarthak Giri, Aravind Elangovan, Ethan Anderson, and Tim Besard) and went to work.

Basic Idea

Our starting idea was to build a basic tree-like structure reflecting the file pathways of a directory. The trunk would represent your C: drive, each iteration of branches would represent the sub-folders within that directory, and the files would be the leaves of the tree. Every file would have its own root running all the way down to the trunk, colored by file type (blue for Word documents, yellow for images, etc.). This is a picture of what we built that weekend.
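
To make that mapping concrete, here is a minimal sketch of the scan in Python (our weekend build used Unity, and the function names and color scheme here are hypothetical): it walks a directory, turns each sub-folder into a branch, and records every file as a leaf colored by its extension.

```python
import os

# Hypothetical color scheme: maps file extensions to the color of the
# "root" that runs from the leaf back down to the trunk.
EXTENSION_COLORS = {
    ".docx": "blue",
    ".jpg": "yellow",
    ".png": "yellow",
}
DEFAULT_COLOR = "gray"

def build_tree(root_path):
    """Walk root_path and return a nested dict: sub-folders become
    branches, files become leaves annotated with a color and size."""
    tree = {"name": os.path.basename(root_path) or root_path,
            "branches": [], "leaves": []}
    for entry in sorted(os.listdir(root_path)):
        full = os.path.join(root_path, entry)
        if os.path.isdir(full):
            tree["branches"].append(build_tree(full))    # sub-folder -> branch
        else:
            ext = os.path.splitext(entry)[1].lower()
            tree["leaves"].append({
                "name": entry,
                "color": EXTENSION_COLORS.get(ext, DEFAULT_COLOR),
                "size": os.path.getsize(full),           # could scale the leaf
            })
    return tree

if __name__ == "__main__":
    import json
    # Peek at the structure for the current directory.
    print(json.dumps(build_tree("."), indent=2)[:1000])
```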

Once Augmented Reality headsets become an integrated part of our daily lives, we will need a three-dimensional way to represent and navigate information; research suggests that realistic, immersive experiences engage the mind far more than our current 2D interfaces.

While building on our original idea and exploring what directions to take it, we discussed many use cases and extensions. We want to continue that conversation and share the ideas for others to ponder and build upon. So, following Simon Sinek's method for communicating an idea, here are the "Why", "How", and "What" of the Bonsai Directory Tree Modeler (name pending).

The “Why”

Artificial Intelligence (AI) has a history of going in a pretty weird direction. A lot of sci-fi stories portray AI as something that will ultimately become more intelligent than humans and take over civilization. Fortunately, many early pioneers (Ada Lovelace, Doug Engelbart, and others) envisioned it as something that would grow with humans and become an extension of our own cognitive abilities. It's clear today that computers play a crucial role in how we understand and relate to one another as a global community.

With the recent growth of the Virtual and Augmented Reality industries, a lot of ideas have sprouted up for sharing realistic experiences that integrate cultures, perspectives, and situations from around the world. One thing we haven't seen enough of, and what brought our team together, is data visualization. Plots, graphs, charts, and other representations of information have empowered us to communicate trends in economics, physics, and many other areas, and they are critically useful to research and business alike.

With immersive technologies, however, we will be able to move around a virtual environment using intuitive spatial navigation. We will be able to interact with large sets of information and derive more meaning and value from them. A 3D space offers more dimensions for encoding values than a 2D screen, which lets us wrap our minds around the data being communicated more quickly.

This is exactly our motivation: to improve our ability to derive meaning from information.

The “How”

We have seen some pretty cool interfaces in Iron Man, Minority Report, and Avatar, to name a few. There are even a few creations along these lines used by experts in various industries, yet their licensing and infrastructure requirements are too hefty for them to be accessible to the public.

Using an open-source approach, we want to build something that brings everyday information into Virtual and Augmented Reality for the everyday consumer. We want to create something that doesn't require a programming background to render one's data in virtual reality. We want to build a product that new users can intuitively understand and use to model and interact with their data.

Unity is a cross-platform game engine that has been paving the way in the Virtual and Augmented Reality industries, letting developers build immersive experiences for many platforms in one place. It can even build web-based experiences that interact with data the same way any other website can, so experiences can be built to parse bank accounts, social profiles, business assets, art portfolios, and other databases in new and innovative ways.

The “What”

Our starting goal is to create a simple program that can be downloaded and installed on a computer (or headset) and is fully functional upon opening. Similar to WinDirStat, KDirStat, or any of the other directory visualization tools currently on the market, this could be a simple .exe file, or an app available on Steam and the Android and iOS stores (I'm thinking too big here, I know; start small).

This program would scan through your local folders and build a three-dimensional structure based on the pathways to every file. While developing the idea we found several other programs that already do this, but they seem to lack something critical: an intuitive, interactive user interface with a shallow learning curve. We decided to look to nature to understand how channels of resources lead to different formations in living organisms, and settled on the trunk and root systems of trees and plants (among other organisms): a familiar structure perfected by millions of years of evolution.

A Standardized Structure

There is an existing method of modeling information called force-directed graphing. It works by feeding data points into the system along with links between them. The data points could represent people (with the links being personal connections) or files and folders (with the links being their respective pathways). Force-directed graphs then plot the information, assigning coordinates based on the sizes and connections within the dataset. The result looks much like the way the branches of a natural tree grow when viewed from above.
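
As a rough illustration, here is a minimal Python sketch of that kind of layout in the style of the classic spring-and-repulsion (Fruchterman-Reingold) approach; the node names and constants below are just placeholders. Linked nodes pull toward each other while all nodes push apart, and the positions settle into a tree-like spread.

```python
import math, random

def force_directed_layout(nodes, edges, iterations=200, k=1.0, dt=0.02):
    """Tiny force-directed layout: edges act as springs pulling linked
    nodes together, every pair of nodes repels. Returns node -> (x, y)."""
    pos = {n: (random.uniform(-1, 1), random.uniform(-1, 1)) for n in nodes}
    for _ in range(iterations):
        disp = {n: [0.0, 0.0] for n in nodes}
        # Repulsion between every pair of nodes.
        for i, a in enumerate(nodes):
            for b in nodes[i + 1:]:
                dx, dy = pos[a][0] - pos[b][0], pos[a][1] - pos[b][1]
                dist = math.hypot(dx, dy) or 1e-6
                f = (k * k) / dist
                disp[a][0] += f * dx / dist; disp[a][1] += f * dy / dist
                disp[b][0] -= f * dx / dist; disp[b][1] -= f * dy / dist
        # Attraction along links (e.g. folder -> sub-folder or file).
        for a, b in edges:
            dx, dy = pos[a][0] - pos[b][0], pos[a][1] - pos[b][1]
            dist = math.hypot(dx, dy) or 1e-6
            f = (dist * dist) / k
            disp[a][0] -= f * dx / dist; disp[a][1] -= f * dy / dist
            disp[b][0] += f * dx / dist; disp[b][1] += f * dy / dist
        for n in nodes:
            pos[n] = (pos[n][0] + dt * disp[n][0], pos[n][1] + dt * disp[n][1])
    return pos

# Toy example: a trunk with two branches and three files.
nodes = ["C:", "Docs", "Pics", "a.docx", "b.jpg", "c.png"]
edges = [("C:", "Docs"), ("C:", "Pics"),
         ("Docs", "a.docx"), ("Pics", "b.jpg"), ("Pics", "c.png")]
print(force_directed_layout(nodes, edges))
```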

Applied in a spherical coordinate system, this model could determine the size and direction of every iteration of branches. The key difference in our design (just to reiterate) is that the root of every pathway is planted solidly in the ground, like a real-life tree.
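
A small sketch of what that might look like, with made-up angles and lengths: each folder's children are spread evenly in azimuth around the parent branch tip, leaning further from the vertical trunk axis as the depth increases.

```python
import math

def branch_direction(theta, phi, length):
    """Convert spherical coordinates (polar angle theta from the vertical
    trunk axis, azimuth phi around it) into a Cartesian growth vector."""
    x = length * math.sin(theta) * math.cos(phi)
    y = length * math.cos(theta)               # y is "up", along the trunk
    z = length * math.sin(theta) * math.sin(phi)
    return (x, y, z)

def place_children(parent_tip, children, depth):
    """Spread a folder's children evenly around the parent branch tip.
    Deeper branches get shorter and lean further from the vertical."""
    placed = []
    n = max(len(children), 1)
    for i, child in enumerate(children):
        phi = 2 * math.pi * i / n                     # evenly spaced azimuths
        theta = min(math.pi / 3, 0.3 + 0.2 * depth)   # lean outward with depth
        length = 1.0 / (1 + depth)                    # shorter toward the leaves
        dx, dy, dz = branch_direction(theta, phi, length)
        tip = (parent_tip[0] + dx, parent_tip[1] + dy, parent_tip[2] + dz)
        placed.append((child, tip))
    return placed

print(place_children((0.0, 1.0, 0.0), ["Docs", "Pics", "Games"], depth=1))
```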

Concluding Paragraph, because this is way too long

This is our first glance at this idea. In future posts we will explore use cases and possible growth and extensions of the idea. In the meantime, we agreed it would be best to build discussion around the idea and gather immediate feedback. If nobody wants it, or understands it, what's the point?

So let us know your thoughts. Are you curious what your personal hard drive looks like when mapped out as a tree? As we begin transitioning to Virtual and Augmented Reality displays, should information be visualized in these ways, or should we stick with the 2D charts we are used to?

…Also, if you want to help out, this thing is open source. Jump in!

