The playful side of physics

How we built Simple Machines, an App Store Best of 2015

Tinybop Inc.
Jan 6, 2016 · 7 min read

Members of Tinybop’s development team walk through some highlights and learnings from our iOS app Simple Machines. The app was named an App Store Best of 2015 around the globe; it also won a Parents’ Choice Gold Award.


Every app in our series poses unique challenges. For the series’ fourth app, Simple Machines, we had to make a seemingly esoteric topic (even though simple machines are all around us, every day) engaging for our young audience. We had to create physics simulations that were scientifically accurate and also fun.

Here’s how we did it in Simple Machines.

Rapid prototyping

When building any interactive media, prototyping is key. Ideas on paper are quickly tested as interactions at your fingertips. This was especially true for Simple Machines. We didn’t have art assets when we started building, so we kicked off prototyping with test assets: simple squares and circles. This allowed us to focus purely on the physics and the interaction flow.

These quick prototypes gave us the opportunity to touch and play with the interactions almost every day early in the development process. This provided two major benefits: one, we were able to put something in front of kids to play-test and give us crucial feedback; two, our research team could keep a close eye on the feel of the physics simulations to make sure we maintained scientific integrity.

Gesture manager

We designed an input system with a focus on multi-touch and different types of gestures. We have a touch-and-zoom gesture, with support for an object “capturing” a touch. We wrapped Unity’s Touch struct and created a spoofed version that works with a mouse. This gives us rich information like phase, number of taps, start time, and mean values.
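A minimal sketch of that idea, with hypothetical names (GestureTouch, GestureInput rather than our actual classes): wrap the touch data in our own struct so the same gesture code can be fed by real touches on a device or by a single spoofed touch driven by the mouse in the editor.

```csharp
using UnityEngine;

// Illustrative wrapper around Unity's Touch data.
public struct GestureTouch
{
    public int fingerId;
    public Vector2 position;
    public TouchPhase phase;
    public float startTime;
}

public static class GestureInput
{
    static float mouseStartTime;

    // One "touch" is spoofed from the mouse when no real touches exist.
    public static int TouchCount
    {
        get
        {
            if (Input.touchCount > 0) return Input.touchCount;
            return (Input.GetMouseButton(0) || Input.GetMouseButtonUp(0)) ? 1 : 0;
        }
    }

    public static GestureTouch GetTouch(int index)
    {
        if (Input.touchCount > 0)
        {
            Touch t = Input.GetTouch(index);
            return new GestureTouch { fingerId = t.fingerId, position = t.position, phase = t.phase };
        }

        // Spoofed mouse touch so the same gesture code runs in the editor.
        TouchPhase phase = Input.GetMouseButtonDown(0) ? TouchPhase.Began
                         : Input.GetMouseButtonUp(0) ? TouchPhase.Ended
                         : TouchPhase.Moved;
        if (phase == TouchPhase.Began) mouseStartTime = Time.time;

        return new GestureTouch
        {
            fingerId = 0,
            position = Input.mousePosition,
            phase = phase,
            startTime = mouseStartTime
        };
    }
}
```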

To interact with a scene, we raycast on the first frame of a touch, then switch to a trigger that is dragged around the scene with the touch, using physics events for input. This means the initial response is instant (physics events generally take a frame to process), while the ongoing input stays lightweight: moving one extra trigger is cheap, whereas raycasting every frame is comparatively expensive.
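Here’s a rough sketch of the pattern, assuming 2D physics; DragProbe and its methods are illustrative rather than our actual code. Raycast once when the touch begins, then just move a small trigger collider with the finger and let OnTriggerEnter2D/OnTriggerExit2D on scene objects provide the ongoing input events.

```csharp
using UnityEngine;

public class DragProbe : MonoBehaviour
{
    public CircleCollider2D trigger;  // small kinematic trigger that follows the finger
    Collider2D captured;              // object that captured this touch, if any

    public void BeginTouch(Vector2 worldPos)
    {
        // First frame only: an immediate raycast so the response feels instant.
        RaycastHit2D hit = Physics2D.Raycast(worldPos, Vector2.zero);
        if (hit.collider != null)
            captured = hit.collider;

        trigger.transform.position = worldPos;
    }

    public void MoveTouch(Vector2 worldPos)
    {
        // Subsequent frames: just move the trigger; physics trigger events on
        // scene objects do the rest, with no further raycasts.
        trigger.transform.position = worldPos;
    }
}
```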

Events are sent to objects (via their Collider2D) using SendMessage, although we add a layer of type safety on top by using interfaces. In most cases, each SendMessage call has a matching interface.
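A hedged sketch of the pattern (ITouchDownHandler, TouchEvents, and Crate are illustrative names): the interface keeps receiver signatures honest while SendMessage does the delivery.

```csharp
using UnityEngine;

// Receivers implement this, so the compiler checks the method signature.
public interface ITouchDownHandler
{
    void OnTouchDown(Vector2 worldPos);
}

public class Crate : MonoBehaviour, ITouchDownHandler
{
    public void OnTouchDown(Vector2 worldPos)
    {
        Debug.Log("Crate touched at " + worldPos);
    }
}

public static class TouchEvents
{
    public static void SendTouchDown(Collider2D target, Vector2 worldPos)
    {
        // SendMessage itself is stringly typed; the interface keeps implementations honest.
        target.SendMessage("OnTouchDown", worldPos, SendMessageOptions.DontRequireReceiver);
    }
}
```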

As a side note, these interfaces look and function very much like Unity’s canvas event interfaces. We try to go with the grain and, where possible, code with the same mindset as Unity: those features are generally better supported, and it’s simpler to have fewer styles in the codebase.

Effect graphics

We used a lot of what we call “effect graphics,” in which we combine a physics simulation with graphics effects. A good example is the pulley: the physics and the graphics are simulated separately.

The pulley is one of six simple machines in the app. The physics and graphics in the scene are simulated separately.

Another example is the castle you can destroy with a lever. In this scene, the castle bends back and forth slightly, but the actual physics objects do not move inside Unity. We use a custom vertex shader to move the vertices right before they are rendered. Since the objects aren’t moving inside Unity, this is very fast. Meshes in Unity are made of vertices, and a vertex shader can animate those vertices directly in a simple way, so we can do waving, bending, ripples, morphing, and so on.

The lever is one of six simple machines in the app. The physics and graphics in the scene are simulated separately.
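As a sketch of how a scene script might drive such an effect graphic: the castle’s colliders never move, and gameplay code only feeds a shader property that the vertex shader uses to sway the rendered geometry. The script name and the “_BendAmount” property are hypothetical, not our actual shader.

```csharp
using UnityEngine;

public class CastleSway : MonoBehaviour
{
    public Renderer castleRenderer;
    public float decay = 2f;   // how quickly the sway settles
    float bend;                // current sway amount

    // Called by gameplay code, e.g. when the lever launches a projectile at the castle.
    public void AddImpact(float strength)
    {
        bend += strength;
    }

    void Update()
    {
        bend = Mathf.Lerp(bend, 0f, decay * Time.deltaTime);
        // The vertex shader reads this value and offsets vertices at render time.
        castleRenderer.material.SetFloat("_BendAmount", Mathf.Sin(Time.time * 6f) * bend);
    }
}
```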

Automatic graphics sorting

We are always striving to improve our processes. We had some issues with sprite sorting in the app we released before Simple Machines. For instance, if you interacted with an item, it would show up in front of other items when it was expected to be behind them. So, for Simple Machines, we created a system that sorts graphic objects based on their order in the scene hierarchy, using editor scripts. Functionally, this works much like layer ordering in Photoshop or in Unity’s canvas: the farther down objects are in the hierarchy, the higher their sorting order. The sorting can be overridden when special sorting is needed. This system greatly reduced our production artists’ workload as they placed assets in the scene.
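Conceptually, the editor script does something like the following sketch (HierarchySorter is an illustrative name, not our actual tool): walk the hierarchy depth-first and hand out increasing sorting orders to every SpriteRenderer it finds.

```csharp
using UnityEngine;
using UnityEditor;

public static class HierarchySorter
{
    [MenuItem("Tools/Sort Sprites By Hierarchy")]
    static void SortScene()
    {
        int order = 0;
        foreach (GameObject root in UnityEngine.SceneManagement.SceneManager.GetActiveScene().GetRootGameObjects())
            Visit(root.transform, ref order);
    }

    static void Visit(Transform t, ref int order)
    {
        var sprite = t.GetComponent<SpriteRenderer>();
        if (sprite != null)
        {
            // Lower in the hierarchy = drawn later = appears in front.
            sprite.sortingOrder = order++;
            EditorUtility.SetDirty(sprite);
        }
        foreach (Transform child in t)
            Visit(child, ref order);
    }
}
```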

Dynamic mesh generation

In Simple Machines, you drag a slider to see an alternate view of the scene. When you interact with machines in this view, arrows illustrating force help you understand the mechanical advantage of the machine. To have these arrows stretch and bend dynamically with each machine and each touch, we created a system to generate meshes at run time so we could modify their geometry on the fly.

A slider reveals the physics at work behind each machine in Simple Machines.
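A minimal sketch of runtime mesh generation in this spirit, assuming a simple stretched quad standing in for a force arrow; ForceArrow and its fields are illustrative.

```csharp
using UnityEngine;

[RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
public class ForceArrow : MonoBehaviour
{
    Mesh mesh;

    void Awake()
    {
        mesh = new Mesh();
        GetComponent<MeshFilter>().mesh = mesh;
    }

    // Rebuild the quad so it spans from the machine to the tip of the force vector.
    public void SetForce(Vector2 origin, Vector2 force, float width)
    {
        Vector2 dir = force.normalized;
        Vector2 side = new Vector2(-dir.y, dir.x) * width * 0.5f;
        Vector2 tip = origin + force;

        mesh.vertices = new Vector3[]
        {
            origin - side, origin + side, tip - side, tip + side
        };
        mesh.triangles = new int[] { 0, 2, 1, 1, 2, 3 };
        mesh.RecalculateBounds();
    }
}
```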

The editor tools we created also became useful for free-form deformation of meshes. For example, in the wheel and axle scene, the scarf worn by the cyclist is an animated mesh whose geometry is modified while it floats in the wind.

Illusion of life

We have some systems that add an illusion of life to objects to make them look more dynamic. For example, in the lever scene, a ball stretches and contracts when it collides with other objects — like a rubber ball.

We created this by moving the graphic to a child GameObject. We then added one extra layer in the GameObject structure that rotates in the direction the object is moving, and scales based on how fast the object is moving.

This allowed us to create illusion-of-life effects using only Transforms and Sprite Renderers, which are simple and very compatible.
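A sketch of that structure, with illustrative names: an intermediate “effect layer” transform rotates toward the velocity and stretches along it, while the rigidbody and colliders stay untouched.

```csharp
using UnityEngine;

[RequireComponent(typeof(Rigidbody2D))]
public class SquashStretch : MonoBehaviour
{
    public Transform effectLayer;       // intermediate layer; the SpriteRenderer lives on its child
    public float stretchPerSpeed = 0.05f;
    public float maxStretch = 0.4f;

    Rigidbody2D body;

    void Awake()
    {
        body = GetComponent<Rigidbody2D>();
    }

    void LateUpdate()
    {
        Vector2 v = body.velocity;
        float amount = Mathf.Min(v.magnitude * stretchPerSpeed, maxStretch);

        // Rotate the effect layer to face the direction of travel...
        float angle = Mathf.Atan2(v.y, v.x) * Mathf.Rad2Deg;
        effectLayer.localRotation = Quaternion.Euler(0f, 0f, angle);

        // ...then stretch along that direction and squash the perpendicular axis.
        effectLayer.localScale = new Vector3(1f + amount, 1f - amount, 1f);
    }
}
```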

Some more advanced effects were created with dynamic mesh generation, using a somewhat complex base class (we call it Visual) to which we pass points via IEnumerator&lt;T&gt;.

The triangular fulcrum below is an example of an “anticipation” effect, as your finger drags it left and right.

In the lever scene, you can drag the fulcrum to change the machine.

Mixing animation with physics

Simple Machines relies on physics for most interactions, but we also used custom animations to give the app more character. We found that combining animations with physics can be very effective. For example, in the wheel and axle scene, the cyclist’s body relies on physics while his facial expressions are a product of Unity frame animations activated through standard Unity Mecanim state machines. Instead of creating a “riding” animation, we moved the feet “targets” in code to follow the rotation of a kid’s finger; the cyclist’s calf and thigh naturally follow with inverse kinematics. When the cyclist raises his hands in the air, we animate the hand targets and the arms follow with a simple IK calculation. Ragdoll physics are employed when the cyclist falls off the bike, so the fall varies with the obstacles and the velocity of the bike.
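As a rough illustration of moving the foot targets in code (the IK solving itself is handled elsewhere), a hypothetical script like this could place the two pedal targets on opposite sides of the crank as the finger rotates the wheel.

```csharp
using UnityEngine;

public class PedalTargets : MonoBehaviour
{
    public Transform crankCenter;
    public Transform leftFootTarget;
    public Transform rightFootTarget;
    public float crankRadius = 0.3f;

    float crankAngle;   // radians, driven by the finger's rotation around the wheel

    public void SetCrankAngle(float angleRadians)
    {
        crankAngle = angleRadians;
    }

    void Update()
    {
        // Opposite pedals sit 180 degrees apart; IK makes the calves and thighs follow.
        Vector3 offset = new Vector3(Mathf.Cos(crankAngle), Mathf.Sin(crankAngle), 0f) * crankRadius;
        leftFootTarget.position = crankCenter.position + offset;
        rightFootTarget.position = crankCenter.position - offset;
    }
}
```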

Sprites atlas generation

We have found that working with sprite atlases in Unity can be painful. Setting up sprite atlases manually is tedious and error-prone, and removing a texture from a scene does not automatically remove it from the atlas.

We have a system that automatically sets sprite atlas names based on the level. Textures detected in the level are added to the sprite atlas using editor scripts.

In Simple Machines, this was automated for the most part. If a folder or texture name was prefixed with an underscore, it was not placed in a sprite atlas.
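An editor-script sketch of the idea, using the legacy Packing Tag workflow; the menu item, naming rule, and underscore check are illustrative rather than our exact implementation. Every sprite found in the open scene gets an atlas name derived from the scene, unless its folder or file name starts with an underscore.

```csharp
using UnityEngine;
using UnityEditor;
using UnityEngine.SceneManagement;

public static class AtlasTagger
{
    [MenuItem("Tools/Assign Sprite Atlas Tags")]
    static void AssignTags()
    {
        string atlasName = SceneManager.GetActiveScene().name;

        foreach (var renderer in Object.FindObjectsOfType<SpriteRenderer>())
        {
            if (renderer.sprite == null) continue;

            string path = AssetDatabase.GetAssetPath(renderer.sprite.texture);
            if (string.IsNullOrEmpty(path) || path.Contains("/_")) continue; // underscore opt-out

            var importer = AssetImporter.GetAtPath(path) as TextureImporter;
            if (importer != null && importer.spritePackingTag != atlasName)
            {
                importer.spritePackingTag = atlasName;
                importer.SaveAndReimport();
            }
        }
    }
}
```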

We have scripts that run in Play Mode in the editor and throw errors if textures from other sprite atlases are detected in the scene. We used a lot of this kind of soft state validation in the app.
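A sketch of that soft validation, again assuming the packing-tag workflow from the previous example; it simply logs an error for any sprite whose atlas tag doesn’t match the active scene.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;
#if UNITY_EDITOR
using UnityEditor;
#endif

public class AtlasValidator : MonoBehaviour
{
#if UNITY_EDITOR
    void Start()
    {
        string expected = SceneManager.GetActiveScene().name;
        foreach (var renderer in FindObjectsOfType<SpriteRenderer>())
        {
            if (renderer.sprite == null) continue;
            string path = AssetDatabase.GetAssetPath(renderer.sprite.texture);
            var importer = AssetImporter.GetAtPath(path) as TextureImporter;
            if (importer != null && importer.spritePackingTag != "" && importer.spritePackingTag != expected)
                Debug.LogError("Texture from another sprite atlas detected in this scene: " + path, renderer);
        }
    }
#endif
}
```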

When we want to opt out of this system, we can add [IGNORE] to a folder name.

Sound design process

For this app, we wanted to give sound design complete control over audio effects, making sure they could edit audio directly inside the Unity project.

We created a system to separate audio from the scene into a specific prefab, which could be fully assigned to Brian Jacobs, our master of sounds. Triggers are activated in the scene using what we call “global events,” which can be tweaked and recombined inside the Audio Prefab.
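A minimal sketch of what a “global event” bus like this could look like, assuming string-keyed events; the names and the audio-side usage in the comments are illustrative.

```csharp
using System;
using System.Collections.Generic;

public static class GlobalEvents
{
    static readonly Dictionary<string, Action> listeners = new Dictionary<string, Action>();

    // The Audio Prefab subscribes to the events it cares about.
    public static void Subscribe(string eventName, Action handler)
    {
        if (listeners.ContainsKey(eventName)) listeners[eventName] += handler;
        else listeners[eventName] = handler;
    }

    // Scene code raises events; it doesn't know or care what sounds result.
    public static void Raise(string eventName)
    {
        Action handler;
        if (listeners.TryGetValue(eventName, out handler) && handler != null)
            handler();
    }
}

// Scene code:      GlobalEvents.Raise("PulleyLifted");
// Audio prefab:    GlobalEvents.Subscribe("PulleyLifted", () => PlayClip(pulleyCreak));
```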

One exception was the generative audio in the inclined plane scene. The audio effects there are a result of collaboration between sound design and the dev team.

New apps, new learnings

While what we talked about here mostly applies to Simple Machines, it overlaps a bit with other apps in the Explorer’s Library series. But our Digital Toys apps have very different engineering problems and development processes.

Every new app we make gives us the opportunity to evolve our processes, solve new problems, and learn new things.


Written by Tinybop Inc.

We’re a Brooklyn-based creative studio building educational apps for curious, creative, and kind kids.
