Raindrop, Through Microinteractions

Francesca Diana
Apple Developer Academy | Federico II
5 min read · Apr 17, 2020

Microinteractions represent a vast world, one that goes far beyond animations. According to Carrie Cousins, a microinteraction is any single task-based engagement with a device.

Any element, animated or not, that serves a specific purpose within an interface can be defined as a microinteraction.

My goal for this project was to build a user interface based on sound UX criteria, to identify and study its most important microinteractions and, eventually, to animate them.

What is Raindrop
Raindrop is a prototype I designed a couple of years ago for my last university exam, web design.

Check out the project: https://www.behance.net/gallery/87407013/Raindrop

It is a cloud storage service dedicated to electronic music composers; its main value lies in its division into two macro sections: My Compositions and My Sounds. It also lets you record directly into the cloud through a Record section.

Every electronic composer or producer faces endless archives of sounds to manipulate: Raindrop lets users carry their archive, and the corresponding finished compositions, with them at all times, in an orderly way. Each sound and each composition is directly connected to Ableton Live, an audio manipulation software, through a dedicated icon.

Web to Mobile
How could I bring all those actions into a mobile app without creating noise? Continuing without re-studying the UX and without racking my brain over new wireframes would have been a failure.

I started from scratch by identifying the key values of the website, then translated them into the iOS idiom while trying to preserve the strong graphic identity. I therefore transformed the platform's macro sections into a customized segmented control and its sections (home, favorites, shared and record) into a customized tab bar.

Raindrop mobile UI.
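As a rough illustration of that structure, here is how it could be sketched in SwiftUI; the view name, section labels and icons are my assumptions, not the actual Raindrop code.

```swift
import SwiftUI

// Hypothetical sketch: the two macro sections become a segmented control,
// the remaining sections become a tab bar.
struct RaindropHome: View {
    @State private var macroSection = 0   // 0 = My Comps, 1 = My Sounds

    var body: some View {
        TabView {
            VStack {
                Picker("Macro section", selection: $macroSection) {
                    Text("My Comps").tag(0)
                    Text("My Sounds").tag(1)
                }
                .pickerStyle(.segmented)
                .padding()

                Spacer() // the list of compositions or sounds would go here
            }
            .tabItem { Label("Home", systemImage: "house") }

            Text("Favorites").tabItem { Label("Favorites", systemImage: "heart") }
            Text("Shared").tabItem { Label("Shared", systemImage: "person.2") }
            Text("Record").tabItem { Label("Record", systemImage: "mic") }
        }
    }
}
```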

What about the icons of each audio file (share, edit, trash and favourite)?
I decided to hide them behind the audio file itself, taking advantage of the long press that iPhone users have grown accustomed to: the icons appear with the gesture, which also allows the user to move the file or edit its name.
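A minimal sketch of that gesture, assuming a SwiftUI implementation (the row layout and icon choices are mine, not the prototype's):

```swift
import SwiftUI

struct AudioFileRow: View {
    let fileName: String
    @State private var showActions = false   // icons stay hidden until a long press

    var body: some View {
        VStack(alignment: .leading) {
            Text(fileName)
                .padding()
                .onLongPressGesture {
                    withAnimation(.spring()) { showActions.toggle() }
                }

            if showActions {
                HStack(spacing: 24) {
                    Image(systemName: "square.and.arrow.up")   // share
                    Image(systemName: "pencil")                // edit / rename
                    Image(systemName: "heart")                 // favourite
                    Image(systemName: "trash")                 // trash
                }
                .transition(.scale.combined(with: .opacity))   // icons "come out"
            }
        }
    }
}
```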

Even so, I decided to place a menu at the top right, containing the trash, a filter for recent files, the personal profile details and the percentage of storage used.

After arranging all the elements, I focused on the existing microinteractions and on which ones were worth animating.
This is where Dan Saffer's book Microinteractions came to my aid.

I mapped out the microinteractions of the UI I had prepared and focused on Dan Saffer's principles: according to him, every microinteraction has a trigger and rules, produces feedback, and is composed of loops and modes.

I chose to focus mainly on feedback.
Some microinteractions in fact require a certain type of animation so that the feedback, essential to signal a successful action to the user, arrives as clearly as possible, and so that the user recreates a sense of spatiality within the digital space, memorizing its map and becoming comfortable using it.

Animated video about Raindrop Microinteractions.

I animated the launch screen as a matter of identity: the logo represents decibels and also a box.

For the upload at the center of the tab bar, inspired by the Bear app, I opted for the feeling of something growing, which gives the idea of a space filling up, simulating a sort of surface tension that resolves into a tick bouncing out with fireworks. The user perceives that the action has succeeded and experiences a slight feeling of satisfaction, thanks also to the audio feedback.
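A rough SwiftUI sketch of that kind of feedback, under my own assumptions (the fireworks and the audio feedback are left out):

```swift
import SwiftUI

struct UploadFeedback: View {
    @State private var progress: CGFloat = 0   // the "space that fills up"
    @State private var finished = false        // shows the bouncing tick

    var body: some View {
        ZStack {
            Circle()
                .trim(from: 0, to: progress)
                .stroke(Color.blue, lineWidth: 6)
                .frame(width: 72, height: 72)

            if finished {
                Image(systemName: "checkmark")
                    .font(.title)
                    .transition(.scale)        // the tick bounces in…
            }
        }
        .onTapGesture {
            withAnimation(.easeInOut(duration: 1.2)) { progress = 1 }
            // …thanks to a springy, under-damped animation after the fill completes.
            withAnimation(.spring(response: 0.4, dampingFraction: 0.4).delay(1.2)) {
                finished = true
            }
        }
    }
}
```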

I chose to animate the play/pause transition for a more aesthetic reason: the user does not need it, but since a light animation would not affect the comprehensibility of the action, I decided to add movement to stay coherent with the dynamic mood and the modern style.
The icons bouncing out of the audio file on long press, on the other hand, seem linked to it by a thread.

The same goes for the menu: the jelly effect was not strictly necessary, but it does not create unnecessary noise and it contributes to the dynamic character of the user experience.

The record case
I replaced the iconic red button with a heavily revisited on/off switch.
The switch, completely white, grows larger and gives off an intermittent halo when it is on, like a signal, which indicates, together with the elapsing seconds, that the recording is in progress.
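As a sketch of how that recording state could be animated in SwiftUI (sizes, colors and timings are my assumptions, and the drag that actually flips the switch is replaced here by a tap):

```swift
import SwiftUI

struct RecordSwitch: View {
    @State private var isRecording = false
    @State private var pulsing = false

    var body: some View {
        ZStack {
            if isRecording {
                Circle()   // the intermittent halo, pulsing outward like a signal
                    .stroke(Color.white.opacity(0.5), lineWidth: 4)
                    .frame(width: 90, height: 90)
                    .scaleEffect(pulsing ? 1.4 : 1.0)
                    .opacity(pulsing ? 0 : 1)
                    .animation(.easeOut(duration: 1).repeatForever(autoreverses: false),
                               value: pulsing)
            }

            Circle()       // the all-white switch that grows while recording
                .fill(Color.white)
                .frame(width: isRecording ? 70 : 56,
                       height: isRecording ? 70 : 56)
        }
        .frame(width: 140, height: 140)
        .background(Color.black)
        .onTapGesture {
            withAnimation(.spring()) { isRecording.toggle() }
            pulsing = isRecording
        }
    }
}
```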

We are all used to seeing recording represented by a red button: is it therefore correct to go against the principles of affordance?

Mine could be called a gamble.

The record section is one of the value propositions of the service: it sits on the tab bar, on the front line, and represents the project together with the two macro areas (My Sounds and My Comps).
Since the user easily understands what to do to start recording, I decided to experiment with an iconic, different solution: the switch, operated by the movement of the finger, provides a more interactive and more satisfying experience than the tap the user is accustomed to.

Dan Saffer speaks of the Signature Moment as an animation that becomes a symbol of a service: what if this is the case?
