Hyper Lapse Robot: Week 2

Shane Griffin
CSE 468/568 Robotic Algorithms
Dec 10, 2019 · 5 min read


Goals

My goals for this week were to finalize the hardware design & implement a user interface. This included routing the wires, deciding on batteries, improving the traction of the wheels, and providing a way for the bot to be controlled by a smartphone.

Final Design

Finalizing Hardware

Batteries

The two batteries I decided to use are Onn 3350 mAh power banks. Each power bank can provide up to 1 A, which is enough for the Pi Zero W or for the motors, but not for both. The reason for using separate batteries for the Pi and the motors is to keep the Pi from being damaged by not being able to draw enough current. Each MG90S servo can draw 250 mA when moving, and up to 700 mA if it stalls. Each 28BYJ-48 stepper, along with its ULN2003 driver board, draws around 200 mA when active. In total, the two servos and two steppers can draw roughly 900 mA (2 × 250 mA + 2 × 200 mA). The Zero W has been shown to draw 230 mA under load, so on a shared 1 A supply the Zero W could be starved of current, and potentially damaged, if either servo stalls.

Wheels

Initial tests showed that the printed PLA wheels did not provide enough friction to move the bot, even with the additional weight of the batteries and GoPro. To solve this, the wheels were first wrapped in masking tape, which increased the traction just enough for the bot to move. Later, the wheels were wrapped in non-slip furniture pads, increasing the friction further.

Circuits

The jumper cables were routed under the frame of the bot, keeping the robot’s look organized and clean. After some deliberation, I decided not to permanently solder the jumpers. Being able to quickly swap motors in & out and change the design was necessary for fast prototyping. The downside is that this greatly increased the length of wire used, since I didn’t have jumpers of exactly the right length and didn’t want to splice wires. Future, more complete designs will use wires cut to length and soldered in place. The batteries were mounted under the frame with zip ties, hiding the jumble of cable.

Final design circuit diagram

User Interface

System design

Web Server

In order to command the bot externally, a web application & API are hosted on Nginx. The API provides all the hooks needed to control the bot and read its state, such as ‘goto state’, ‘start hyperlapse’, ‘stop hyperlapse’, and ‘get status’.
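As a rough sketch, the snippet below shows what an API like this could look like if it were implemented with a Flask backend proxied by Nginx. The framework choice, route names, and fields are assumptions for illustration; the post only specifies the hooks themselves.

# Hypothetical sketch of the control API, assuming a Flask backend behind Nginx.
# Route names and fields are illustrative, not the project's actual endpoints.
from flask import Flask, jsonify, request

app = Flask(__name__)
state = {"running": False}            # assumed in-memory status

@app.route("/status", methods=["GET"])
def get_status():
    # 'get status' hook: report whether a hyperlapse is running
    return jsonify(state)

@app.route("/hyperlapse/start", methods=["POST"])
def start_hyperlapse():
    # 'start hyperlapse' hook: accept the curve, pan/tilt, and settings
    params = request.get_json()
    state["running"] = True
    # ... hand params off to the motor-control loop ...
    return jsonify(ok=True)

@app.route("/hyperlapse/stop", methods=["POST"])
def stop_hyperlapse():
    # 'stop hyperlapse' hook
    state["running"] = False
    return jsonify(ok=True)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)  # Nginx reverse-proxies to this port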

The web application provides three different tabs for defining a hyperlapse. They are used for setting the path the bot will take, the pan & tilt transformation, and the hyperlapse settings. The web app was written using HTML/JS/CSS along with the Bootstrap and jQuery libraries.

Creating a path

Path

The curves tab lets the user define a Bézier curve representing the path the robot will take. This is done through four draggable points: the first two are the curve’s anchor points, and the other two are the control points that shape the curve. The distance between the anchor points is shown, allowing the user to define the curve with a real-world frame of reference.
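For reference, a cubic Bézier curve defined by two anchor points and two control points can be evaluated as in the sketch below; the coordinates are placeholders in meters, not values from the app.

# Sketch of how four draggable points define a cubic Bézier path.
# p0 and p3 are the anchor points; p1 and p2 are the control points.

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate the curve at t in [0, 1], returning an (x, y) position."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return x, y

# Example: a gentle arc whose anchor points are 1.5 m apart
points = [(0.0, 0.0), (0.5, 0.4), (1.0, 0.4), (1.5, 0.0)]
path = [cubic_bezier(*points, i / 50) for i in range(51)]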

Defining a camera translation

Camera

The camera’s movement is defined by dragging two rectangles, a start rectangle and an end rectangle. These rectangles represent the approximate framing relative to the center frame, which corresponds to a tilt and pan of zero degrees. The change in tilt and pan, in degrees, is also shown at the top of the screen. The bot will linearly transition between the start and end frames over the course of the hyperlapse.
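In other words, the pan and tilt are linearly interpolated from the start angles to the end angles as the hyperlapse progresses; the sketch below uses placeholder angles, not values from the app.

# Sketch of the linear pan/tilt transition between the start and end frames.
# Angles are in degrees; zero means the camera points at the center frame.

def lerp(a, b, t):
    return a + (b - a) * t

start = {"pan": -20.0, "tilt": 5.0}   # framing set by the start rectangle
end = {"pan": 15.0, "tilt": -10.0}    # framing set by the end rectangle

def frame_angles(t):
    """Pan and tilt at progress t in [0, 1] through the hyperlapse."""
    return lerp(start["pan"], end["pan"], t), lerp(start["tilt"], end["tilt"], t)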

Additional settings

Settings

The settings tab contains the additional parameters for the hyperlapse. This includes the length in minutes, as well as the option for an exponential transformation instead of a linear one. Currently, only the length setting is implemented. Once all the parameters are set, the data is sent to the Pi, kicking off the hyperlapse.
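To make the hand-off concrete, the payload sent from the web app to the Pi might look something like the example below; the field names and values are hypothetical, not the app’s actual schema.

# Hypothetical example of the data the web app could send to the Pi.
import json

hyperlapse_request = {
    "curve": {                        # from the path tab (meters)
        "anchors": [[0.0, 0.0], [1.5, 0.0]],
        "controls": [[0.5, 0.4], [1.0, 0.4]],
    },
    "camera": {                       # from the camera tab (degrees)
        "start": {"pan": -20.0, "tilt": 5.0},
        "end": {"pan": 15.0, "tilt": -10.0},
    },
    "settings": {
        "length_minutes": 30,         # the only setting currently implemented
        "easing": "linear",           # exponential is planned but not wired up
    },
}

print(json.dumps(hyperlapse_request, indent=2))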

Control

Once the hyperlapse Bézier curve and camera transformation are sent to the Pi, the data is converted into control outputs for the GPIO pins. To do this, the pigpio library is used. pigpio is a C library designed to interface with the Pi’s GPIO pins. It was the ideal choice because it allows hardware-timed PWM pulses to be sent on any GPIO pin. When controlling servo motors, the PWM signal sent to the servo needs to be consistent; software-timed PWM on the Pi can produce jitter in the servo movement, which is why interfacing directly with the hardware through pigpio is the ideal solution.
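As a rough sketch of what driving the pan and tilt servos looks like, the example below uses pigpio’s Python bindings for brevity; the GPIO pin numbers and the angle-to-pulse mapping are assumptions, not the project’s actual values.

# Sketch of jitter-free servo control via pigpio's hardware-timed pulses.
# Pin numbers and the angle-to-pulse mapping below are assumptions.
import time
import pigpio

PAN_GPIO, TILT_GPIO = 17, 18           # assumed BCM pin numbers

def angle_to_pulse(deg):
    """Map -90..+90 degrees onto the MG90S's ~500-2500 us pulse range."""
    return 1500 + (deg / 90.0) * 1000

pi = pigpio.pi()                       # connect to the pigpiod daemon
if not pi.connected:
    raise RuntimeError("pigpiod is not running")

def point_camera(pan_deg, tilt_deg):
    # Hardware-timed servo pulses keep the signal consistent (no jitter)
    pi.set_servo_pulsewidth(PAN_GPIO, angle_to_pulse(pan_deg))
    pi.set_servo_pulsewidth(TILT_GPIO, angle_to_pulse(tilt_deg))

point_camera(-20.0, 5.0)               # e.g. move to the start framing
time.sleep(1.0)
pi.set_servo_pulsewidth(PAN_GPIO, 0)   # a pulse width of 0 turns the servo off
pi.set_servo_pulsewidth(TILT_GPIO, 0)
pi.stop()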

Results & Future Goals

The final robot is portable and provides an easy-to-use interface for defining hyperlapses. Because all the processing and power are self-contained, you don’t need a computer to instruct the bot; all you need is a smartphone to set the hyperlapse parameters. This means the bot can be used in real-world situations.

Many of my initial goals for this project were not met because more time than expected went into getting the hardware right. Unexpected issues, such as not being able to print a usable caster ball wheel and not getting ROS running on the Pi, took time away from other features I had planned. Here is a list of future objectives I have in mind:

  • Connect Pi directly to GoPro for automated control
  • Use camera feed from GoPro for object tracking + localization
  • Redesign frame to be smaller + more rugged
  • Multiple keyframes for path & pan/tilt
