Animated Lamp — a technical exploration for a motion-emotive IoT AI

Marisa Lu
Lumar Process Documentation
Apr 17, 2017

Voice UIs are a very popular way to humanize interaction with AI and other such software systems. But in certain contexts, say one tied to a specific environment, can another form be more delightful?

When we interact with anyone in person, a lot of our communication goes beyond the literal meaning of what is verbalized. While Albert Mehrabian’s infamous 7% rule is arguably more pernicious urban myth than science, his research at least attests to the potency of nonverbal communication and body language for silent messaging. In a collaboration between me, Maayan Albert, and Max Maguire as the final project for ‘Building User-centered Sensing Systems,’ we explored the sensing technologies and built systems that would go into making a sort of IoT assistant that is physically emotive and communicative.

Extra special thanks to Lucas Ochoa for his robotics expertise and mechanical consults!

Final Presentation

Max and I presented the following slide deck along with the live end product: an animated lamp that reacts to environmental cues and people, and can be remotely controlled through a scalable backend with room for other possible features.

Final System

The digital-physical system that went into making the working demo

Remote lamp control interaction

Quick demo:

http://marisa.lu/analamp/

Final paper

https://drive.google.com/open?id=1U4smSK3NVHdpDebqDmiRy8sMFNsAr_hC

Process Documentation

Physical Mechanics:

Servo motors with custom-modeled joints, Raspberry Pi modifications + case

A. A servo arm bent and screwed into the lamp head, so that attaching the servo motor is easier and the mount is pre-made. The original approach was to custom model and 3D print an attachment head, but the printed piece ended up not being as nice a fit as simply warping a servo arm.

B & C. 3D-printed servo holder and rotation axle

Progress Report:

Physical mechanics and custom parts:

  1. Found the best lamp to buy, tried to get a last-minute art grant (the application was past due), and compromised on a smaller lamp with plastic joints
  2. Figuring out the mechanics of the lamp took a little longer than expected. We did as follows:
  3. 3D model and print custom lamp joints
  4. Repeat the step above until the structure works
  5. Prototyped with miniature servos (we only had immediate access to small servos)
  6. The ideal servo is a $35 turbo max. Not good, too expensive; we tried different servos
  7. Found a turbo max to play with, and it works better, so we definitely still need turbo max servos… um… permanently ‘borrowed’ two turbo servos from a lab. (Ask no questions and we will tell no lies)
  8. Installed the servos with the new plastic joint structures, with new holes drilled into the metal lamp to motorize the joints.

— — — — — — -

side note: I learned how to solder! Exciting. We soldered all the GPIO pins that didn’t come with the servo HAT:

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Two-axis rotational head joint and servo holder (pseudo-gimbal)

Step one: think through your ideas by visualizing.

Step two: model the basic gear head.

Step three: measure the available bolt length and adjust the depth of the gear accordingly.

Step four: 3D print… and then discover that the tolerance wasn’t made generous enough to fit the servo.

If you have to reprint, might as well make some design changes.

Iterations 1 and 2 of another design

The motion study above shows the jerkiness of the servo movement, even when the lamp is moving one degree at a time. Some of it can be fixed by tightening an axis, but mostly it’s that the pulse width modulation’s rate of change needs to be gradual, easing both in and out so there are fewer sudden starts and stops.

I had a lot of trouble trying to figure out ways of easing.

I tried “anglePWM = 0.95*anglePWM + 0.05*targetAnglePWM” to enable easing in, because I’ve used it before.

— — — — — — — — — — — — — — —

In the end, what smoothed the motion down a little more was just decreasing the step size for each servo.
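A minimal sketch of both easing approaches in Python, for reference; the update rate, step size, and set_servo_angle call are placeholders rather than our exact servo HAT code:

```python
import time

def ease_toward(angle, target, smoothing=0.95):
    """Exponential easing: keep 95% of the current angle, blend in 5% of the target."""
    return smoothing * angle + (1 - smoothing) * target

def step_toward(angle, target, step=0.5):
    """Fixed-step easing: move at most `step` degrees per update."""
    if abs(target - angle) <= step:
        return target
    return angle + step if target > angle else angle - step

# hypothetical drive loop for one joint
angle, target = 90.0, 45.0
while abs(angle - target) > 0.5:
    angle = step_toward(angle, target)  # or ease_toward(angle, target)
    # set_servo_angle(channel, angle)   # placeholder for the servo HAT call
    time.sleep(0.02)                    # ~50 Hz update rate
```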

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

FINISHING TOUCHES:

We needed a reliable, sturdy, and self-contained way of having the Pi connected to the lamp. I created a case for it, prototyping first through sketches, then cut foam core, and finally laser cutting out of acrylic.

Laser cut box template made in Illustrator and cut out of clear acrylic. The design pivots around the lamp base and swings the Pi underneath the table and back around when needed.

Digital system mechanics:

A quick UI to prototype the lamp’s physical animations with, a client/user-facing UI to control and access lamp features with, and the backend to communicate between the systems

Rapid prototyping/motion/GUI

  1. Maayan made a basic Python GUI to help us visually debug and calibrate servo movement. We referred to servo pivot points as joints: the head of the lamp is the wrist, which currently rotates around; the middle joint is the elbow, whose movement is raising up and down; and the shoulder joint is the foot of the lamp that clamps onto the base. A custom clamp base will have to be built for rotational movement. Roughly how that naming maps onto servo channels is sketched below.
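(The channel numbers and angle limits in this sketch are hypothetical, not our calibrated values.)

```python
# joint naming convention from the GUI; channels and limits are illustrative
JOINTS = {
    "wrist":    {"channel": 0, "min": 0,  "max": 180},  # lamp head, rotates around
    "elbow":    {"channel": 1, "min": 30, "max": 150},  # middle joint, raises up and down
    "shoulder": {"channel": 2, "min": 0,  "max": 180},  # foot of the lamp, clamped to the base
}

def clamp_angle(joint, angle):
    """Keep a requested angle inside the joint's calibrated range."""
    limits = JOINTS[joint]
    return max(limits["min"], min(limits["max"], angle))
```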

Client-side mobile interface for remote control of the lamp:

(lol, this isn’t the actual one hooked up to a backend, but you can still play with the interactions!)

Progress update:

“wow, lots and lots of possibilities for different ways of making this remote backend work. So far I’ve heard of wireless APs, REST APIs, PHP web servers (Apache + PHP or Nginx + PHP) with some relation to dynamic DNS services, as well as various possible Node.js implementations.

I might have lost a great deal of time trying to figure out how to get a commercial dynamic web server/mutable database(?) through my Amazon Web Services account, only to later discover that the year of paid services I won at a hackathon had expired. Trying to figure out how to make one myself led me in circles around Apache + PHP and then to Node.js implementations before I finally decided to cut losses and try another route.”

We are currently exploring making the Raspberry Pi a web host and accessing the HTML/CSS ‘website’ on there with just the IP address. That would have the added benefit of allowing the scripts to write directly to a local file that the Python script running the servos could parse.
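A rough sketch of what that local-file handoff could look like on the servo side; the file path and JSON shape here are hypothetical:

```python
# poll a local file the web scripts would write angle targets into
import json
import time

ANGLE_FILE = "/home/pi/analamp/angles.json"  # hypothetical path

def poll_angles():
    """Re-read the shared file and yield each new set of angle targets."""
    last = None
    while True:
        try:
            with open(ANGLE_FILE) as f:
                angles = json.load(f)
        except (FileNotFoundError, json.JSONDecodeError):
            angles = None  # file not written yet, or caught mid-write
        if angles and angles != last:
            last = angles
            yield angles  # e.g. {"head": 45, "body": 90}
        time.sleep(0.1)
```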

— — — — — — — — — — what we ended up doing — — — — — — — — —

So I did try to get the Raspberry Pi set up as a web server, but having the website hosted there would mean it was accessible only through the unique IP address, and only if the phone (or computer) was on the same network as the Pi, which it won’t/can’t always be. So after exploring that option, I moved on to looking at dynamic IoT backends/pipelines. I knew Particle Photon had something, so I started there, before settling on implementing the site through shiftr.io.

The web page on the left is the client-side mobile interface. It generated the angles of the head and the body at the bottom of the page, which were then taken via jQuery and published to shiftr’s unique namespace. The visualization of the information transfer is on the right.
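The actual client was jQuery in the browser publishing over WebSockets, but the publish side can be sketched in Python for testing; the topic name and credentials below are placeholders, not our real namespace keys:

```python
import json
import paho.mqtt.client as mqtt  # paho-mqtt 1.x style client

client = mqtt.Client()
client.username_pw_set("namespace-key", "namespace-secret")  # shiftr.io key/secret (placeholders)
client.connect("broker.shiftr.io", 1883)  # shiftr's public broker at the time

# publish the head and body angles the UI generates
client.publish("analamp/angles", json.dumps({"head": 45, "body": 90}))
client.disconnect()
```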

Now I just need to figure out how best to pipeline the info to the Python script… currently looking into Paho as a Python library for this…

Yup! So Paho was great. I wrote a quick Python placeholder script to test the receiving end, and ran it through the terminal.
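A sketch along the lines of that placeholder receiver, assuming the same hypothetical topic and credentials as the publish side above:

```python
import paho.mqtt.client as mqtt  # paho-mqtt 1.x style client

def on_connect(client, userdata, flags, rc):
    print("connected, rc =", rc)
    client.subscribe("analamp/angles")  # hypothetical topic from the publish side

def on_message(client, userdata, msg):
    print(msg.topic, msg.payload.decode())
    # parse the angles here and hand them to the servo-driving loop

client = mqtt.Client()
client.username_pw_set("namespace-key", "namespace-secret")  # shiftr.io key/secret (placeholders)
client.on_connect = on_connect
client.on_message = on_message
client.connect("broker.shiftr.io", 1883)
client.loop_forever()
```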

https://shiftr.io/Lumar/analamp

Now, to figure out an efficient app/file structure to organize everything…yikes, ok.

___________________________________________________________________

Bulb:

  1. Bought a Philips Hue bulb to play with. I saw that there was a Philips Hue API and a ready-made app to connect to the bulb via Bluetooth, so I thought it wouldn’t be hard to figure out.

a) Issues:

  • well… the bulb wouldn’t connect to my phone with the commercial app. Either the Bluetooth broke… or there was some sort of network issue.
  • the API has terrible documentation

2. Max hunted down a tutorial for light bulb control specifically for hackers.

Mi-Light light bulbs look just like normal LED light bulbs and are available at a similar price, but they include a 2.4GHz RF radio link. This can be used with an RF remote control to switch lights on and off.

— — — we tried it….

Unfortunately, the required software or hardware didn’t mesh with our system. This was the radio transmitter for the bulb:

___________________________________________________________________

CV/Sensing:

  1. Developed a framework to stream video from the Raspberry Pi camera to a local server for mobile access.
  2. Used Flask to display Motion JPEGs.
  3. Implemented a face detection feature using a Haar cascade classifier (a sketch of the pipeline follows this list).
  4. Working to improve the hand/finger detection feature…
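A rough sketch of that streaming and face detection pipeline, assuming OpenCV and Flask; the camera index and cascade path are placeholders that may differ on the Pi:

```python
import cv2
from flask import Flask, Response

app = Flask(__name__)
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def gen_frames():
    cam = cv2.VideoCapture(0)  # Pi camera; the device index may differ
    while True:
        ok, frame = cam.read()
        if not ok:
            break
        # detect faces on a grayscale copy and draw boxes on the frame
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        ok, jpeg = cv2.imencode(".jpg", frame)
        # Motion JPEG: emit each frame as one part of a multipart response
        yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n"
               + jpeg.tobytes() + b"\r\n")

@app.route("/video")
def video():
    return Response(gen_frames(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```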

To do: figure out how to set up an infrastructure to connect the phone to programs running on the Pi. What should we Google for this? Do we stick with Android Studio? Not really looking forward to figuring this part out…

To do: make preset animations

Would like to do: … I want a mini projector in the lamp now. Wouldn’t it be so cool to have a projected keyboard on your desk whenever you turned on the lamp? Or what if it acted like a projection of what would’ve been on the latest MacBook Touch Bar? Fun… The CV through the little camera would enable the keys to work… though honestly, CV for finger recognition sounds daunting, because the essays we read were rather hard to grasp…

Would like to do: a sound-reactive lamp? Grab a microphone and analyze the pattern of what clapping sounds like? Is there a way to detect location from two microphones that happen to be close to each other?

AGREED ON A WEB APP!!! (prototyping of the UI for remote control of the lamp can start)
