How we created a product vision movie without having a product

Eynat Pikman
HPE Design
9 min read · Jun 24, 2015

Background

The HP Safe City product vision movie tells the story of Big Data and how UX is essential to making it usable. The Safe City film imagines the command center of a “smart city” and demonstrates how a city can be managed more effectively by leveraging HP’s Big Data technologies. Our goal was to envision a real product that shows how these technologies can be used. But before we even started thinking about the product, we had to come up with a story: a good story that connects all the fragments into one holistic tale that delivers the message. Once we had that story, we started thinking about the final deliverable. What would it be? A presentation? An interactive prototype? A movie? We had many dilemmas, but in the end we chose to make a movie, since, in our opinion, it was the quickest and easiest way to make the audience grasp the message.

Safe City product vision movie

There were a few challenges, though:

a. No such HP product existed. We had to invent it, create use cases, screens, and UI all from scratch.

b. The movie had to be ready for the big biannual HP Discover event, which meant we had only two and a half months to work on it.

c. None of us in our studio had ever created a movie before.

d. Last but not least, we had no budget.

So how did we do it?

Inventing the Safe City product

Since our story was about helping city leaders gain insight into various aspects of city management, both in routine operations and in times of crisis, the natural key players of our futuristic product were the city’s underlying agencies: emergency services, the police, the power supply company, and more.

City underlying agencies

Most of these service agencies share the same basic organizational hierarchy, as well as the same critical need to be aware of, and in control of, what is happening in the city. This led us to a design strategy of unifying their UIs rather than creating a unique solution for each agency, with adjustments for each agency’s special needs.

Next, we needed to figure out which personas would use the Safe City product. Since time was short, we collected information any way we could: we visited a police station control room, we read web interviews with command center operators and policemen working in the field, and we even “discovered” an HP employee, a police volunteer, who helped shed light on a day in the life of a traffic policeman. All this was done in the hope of clarifying how things really work. Having said that, we naturally made many assumptions and used a lot of common sense (which is not always right) to fill in the gaps. Eventually we focused on three personas: the control room operator, the control room shift leader, and the onsite worker.

The UI was adapted to the device each persona would most likely use in their use case: several large screens for the control room, and a variety of smart mobile devices for the people in the field.

Two monitors for the operator

Trying to envision how new smart devices might come into play led us to new solutions, like a smart watch that alerts the policeman to nearby events, or an augmented reality device that gives the power engineer extra information in the street.

We aimed to craft a novel approach that would allow a more natural flow of information and insights between all the personas.

UX strategy

Several guidelines helped us focus on our message:

The first guideline was that the UI should embody the idea of “getting the correct information to the correct person in the correct context”. This guideline is most important when the application is used in times of crisis and pressure. For example, the Safe City app determines that, due to damaged power cables, children from a nearby school who are about to finish their school day and leave the premises may be in imminent danger. The app sends an alert to the operator about this additional threat and recommends how to act: it automatically takes the operator to the street view of the area and shows the phone number of the responsible contact person at the school.

Threat identification over street view

So, with one tap, the operator can warn the school not to release the children. All this without the operator needing to identify the problematic situation on the map by himself, search for the contact person in his collaboration lists, and so on.
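The routing behavior described above can be sketched in a few lines. This is a purely illustrative sketch, not the real Safe City implementation; all class and field names (`Alert`, `OperatorConsole`, etc.) are hypothetical. The point it demonstrates is that the app, not the operator, does the work of focusing the map and surfacing the right contact.

```python
# Illustrative sketch of "correct information, correct person, correct
# context". All names here are hypothetical, not the actual product API.
from dataclasses import dataclass, field

@dataclass
class Alert:
    threat: str            # e.g. children near damaged power cables
    location: str          # map area the UI should focus on
    contact: str           # responsible contact person's phone number
    recommendation: str    # suggested action for the operator

@dataclass
class OperatorConsole:
    focused_location: str = ""
    shown: list = field(default_factory=list)

    def receive(self, alert: Alert) -> None:
        # The app focuses the map on the affected area and surfaces the
        # contact and the recommendation, so the operator acts in one tap.
        self.focused_location = alert.location
        self.shown.append((alert.threat, alert.contact, alert.recommendation))

alert = Alert(
    threat="children leaving school near damaged power cables",
    location="street view: school block",
    contact="+1-555-0100",
    recommendation="call the school; delay release of students",
)
console = OperatorConsole()
console.receive(alert)
print(console.focused_location)  # street view: school block
```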

Timeline component

The second guideline was to create a unified UI that could support all service agencies (for example, the power supply company and the police would use the same UI). One of the UI elements that supports this is the timeline, which synchronizes all agencies’ occurrences in real time. Any update or action taken by one command center is logged and shown in all agencies’ command centers. For example, the police operator can see that the operator of the power supply company has already handled the school threat, so he knows he can move on to the next action item.
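The synchronized timeline amounts to a single shared event stream that every command center reads from. Here is a minimal sketch under that assumption; the names (`SharedTimeline`, `TimelineEntry`) are hypothetical and real-time transport is omitted.

```python
# Hypothetical sketch of the unified timeline: every agency logs to one
# shared event stream, and every command center sees all entries.
from dataclasses import dataclass
from typing import List

@dataclass
class TimelineEntry:
    agency: str      # which agency acted, e.g. "power" or "police"
    action: str      # what was done
    handled: bool    # whether the item is resolved

class SharedTimeline:
    """One timeline synchronized across all agencies' command centers."""

    def __init__(self) -> None:
        self._entries: List[TimelineEntry] = []

    def log(self, agency: str, action: str, handled: bool = False) -> None:
        self._entries.append(TimelineEntry(agency, action, handled))

    def view(self) -> List[TimelineEntry]:
        # Every command center sees the same, complete timeline.
        return list(self._entries)

timeline = SharedTimeline()
timeline.log("power", "school threat: power rerouted, area safe", handled=True)
# The police operator sees the power company already handled the threat:
assert any(e.handled and e.agency == "power" for e in timeline.view())
```

In a real system the log would be pushed to each center over the network; the design point is simply that there is one timeline, not one per agency.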

The third guideline was to demonstrate that it is all one story, but with many contexts of use. We achieved that by showing the application on various devices in different situations: a tablet in a police car, the policeman’s smart watch, the power engineer’s smartphone, and the power engineer’s augmented glasses. It was important to establish each device’s UI within a strong, genuine use case. For example, the use case for the policeman’s smart watch: on some occasions the policeman is occupied with something else, perhaps writing a traffic violation ticket, and is therefore unable to look at his smartphone, but he can take a quick peek at his watch when it alerts him about events that need his attention.

Policeman’s smart watch

UI design

The common practice is UX first and UI after, but in this project, due to time constraints, the UX and UI work started at the same time. The UI designer was involved right from the beginning as an important part of creating the story and the UX discussions. This created an unusual challenge: starting a UI concept without any details, before the UX had matured enough; and an unusual opportunity: giving the UI designer the chance to influence both the story and the content.

So how do you give look and feel design to a product that does not yet exist?

First, the UI designer’s efforts were focused on the operator’s screen. At that point, it wasn’t even clear how many screens the operator needed. The only certain thing was that there would be a full-size map to support the user’s situational awareness and serve as the core of the Safe City application. We figured there might be different map modes — geographic, street, and satellite — each probably overlaid with color strips carrying content.

Preliminary map design
Preliminary map design
Final map design

As the work evolved and there were enough UX mock-ups for the UI designer to work from, the UX and UI tracks were split apart.

The data sources, action items, and timeline components also evolved over time; they stretched and became borderless in order to use the maximum screen space and show more content with less visual clutter.

Final design

So by now, we had a concept of content layers on blurred glass.

Final design

The look and feel of blurred layers also served the augmented glasses’ mask.

Power engineer’s augmented glasses design

The final challenge was to grant all devices a similar look and feel that spoke the same language as the operator’s screen. By bonding them all together and enhancing the sense of continuity, they all become extensions of the same Safe City application.

Power engineer’s smartphone design
Policeman’s tablet design

From story in slides into a movie

To build the story, we used the storyboard feature of Indigo Studio (a UX prototyping tool). It helped us keep the story coherent and stay focused on the Big Data values we wanted to convey. When all the materials were ready, it was time to pack everything into a movie. We wanted to give the movie a high-end feel, but we didn’t have the time for a full-scale video production, and since our studio does not specialize in video editing, we wondered how best to tackle it in such a short time. How were we going to take a bunch of slides and pictures and create a movie out of them?

We chose to keep it simple. We had a live prototype, created in Indigo Studio, that included all the important animations. So we recorded the live prototype and used the recording as the base for the video, editing it in Adobe Premiere.

On top of it, using Adobe After Effects, we added the elements that explained the story and gave the right atmosphere.

Metropolis 1927

One example is the intertitles used in silent movies to help the audience make sense of the chain of events and narrate the story points. We used the same trick in our movie.

Using intertitles in the Safe City movie
Storm footage

We added footage to help the viewer get into the mood of the stormy weather.

Using filter in transition between slides

We used filters in transitions between slides.

We also used basic animation techniques to play with the slide content: text animations, blur effects, and zoom effects.

Closing notes

The end result of this project was a product vision movie, but in essence we tried to design the Safe City application as if we were going to start developing it as soon as the video was completed, building an end-to-end UI that takes all the important considerations into account. All that in two and a half months.

After we finished the video, the project received amazingly positive feedback from across the company, including from major stakeholders and customers.

Telling a good story, thinking about the end user all the time, and remembering the value you want to show are the three key points I took from this project.

Studio members who worked hard on the project :-)

UX: Eynat Pikman, David Ismailov & Oded Klimer

UI: Shiri Gottlieb & Mor Goldstein

Eynat Pikman is a User Experience expert @HP Software, with broad experience in both user experience and development, trying to make apps and the world a better place.