A tool to reconcile HR with the use of data

Dataveyes
Published in Dataveyes Stories · Apr 17, 2020
This project was made for an HR department from June 2012 to December 2012.

In 2012, we dedicated over six months to the conception and execution of a tool visualizing human resources data for a leading French corporation. The tool greatly facilitated the design, implementation and monitoring of talent management strategies within the company.

THE TRUE REASON BEHIND OUR FIRST MEETING

Back in December 2011, we met for the first time with two of the main HR managers of the French corporation. Their mission consists of attracting and nurturing talent within the group, on an international scale. This repeatedly led them to question the way they approached data.

  1. What type of information can one rely on to initiate a relevant strategy?

A robust plan of action, consistent with the reality of HR tasks, is required to effectively implement any strategy within a global group.

In order to build such a robust plan, one must gain a sharp, reliable view of all HR situations and scenarios met across hierarchical structures.

When your group has hundreds of thousands of employees, you can’t obtain such a reliable view by relying on human knowledge alone. To master information at such a degree of granularity, you also need to draw knowledge from data.

2. How to communicate and assess the implementation of this strategy?

Understanding the data and elaborating a strategy is only one step of the way. Implementation issues must also be tackled and assessed so that each branch of the group can expect to benefit from it in a measurable way.

This led to a second challenge: how to clearly communicate about an operational strategy and measure its success?

Of course, managing data is one part of the solution, and the HR stakeholders had already made significant use of the data to start answering those two key questions.

In this context, we were approached precisely to go faster and go further.

Great ideas cannot settle for average, limited tools that would hinder their natural progression.

HR managers within the group longed for a tool serving their vision, that would empower them to clearly formulate their strategy and communicate about it; in a word, a tool to help them work better.

As a result, we created a bespoke tool to visualize the company’s data.

Over the course of this mission, we quickly realized that focusing on exploring the data was not enough; we had to design a whole process clarifying how to work with the data.

REVEALING MORE IN-DEPTH NEEDS

Our contacts had access to three distinct sets of internal data describing the employee populations and their mobility across the group, at each level of its hierarchical structure.

HR managers within the company had for some time been using Excel to analyze the data, and PowerPoint to present the compiled insight.

At the time of our first meeting, our contacts were particularly concerned with how data insight would be presented and shared:

“What is the point of elaborating a sophisticated strategy if, upon sending graphs presenting the analysis results, our stakeholders do not fully grasp the insight behind the data and fail to take the right decision?”

That questioning marked the beginning of our mission.

We convinced our contacts to work further with us, by sharing examples of how we could visually translate concepts such as branch maturity or the evolution of good recruiting grounds.

Later on, we understood that their needs far exceeded the framework of information representation…

Following the method with which we approach each of our missions, we started with the immersion phase. We carried out a series of interviews with our contacts. They shared their views and insight on their jobs and vision, and how they were used to working with their data and software.

With their help, we managed to surface more of their needs:

  • Need to save time: reduce the high number of days spent manipulating data in Excel and Powerpoint.
  • Need to simplify the monitoring and presentation tasks: to facilitate data interpretation.
  • Need to unify the information being presented: reach the same level of depth and relevance for every indicator, country and hierarchical level.
  • Need to identify long term trends: establish comparisons, analyze results in an interface that doesn’t require users to do complex calculations.
  • Need to provide a genuine steering tool: beyond the presentation features.
  • Need to boost a network of operational managers: through compelling presentations that translate a vision and tell stories that guide decision making.
  • Need to inspire good practices around data use: in order to promote a culture of internal reporting and help implement new objective-focused management techniques.

This initial immersion phase led up to a true summary deliverable, listing the identified objectives the tool would enable the company to reach, and how we intended to proceed.

EXTRACT INFORMATION FROM DATA

We then started working with the data. This is always a sensitive stage in our job. Datasets are in many ways similar to the goldmines of 19th century America: they can prove to be extremely rich, but they require digging, extraction from existing silos, some maneuvering and patience, in order to yield the necessary raw material.

In our case, we also needed to handle two main issues:

Data was very heterogeneous:

  • The data comes from three distinct sets from different work tools. Those three tools are not implemented by the same people, nor are they updated at the same time. The hierarchy of the group actually varies from one tool to another. Some of those tools are in English, others in French, which further complicates any attempt at creating a consistent classification by country. Some tools display month-by-month cumulative figures, while others show absolute data, etc.
  • Despite those issues, we decided to build indicators at the crossroads of all those data points, to be as relevant as possible to our stakeholders’ needs.
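To make the cross-dataset unification concrete, here is a minimal sketch of two of the harmonization steps described above: mapping French country labels onto English ones, and converting a cumulative month-by-month series into per-month figures. The mapping entries and function names are hypothetical illustrations, not the actual implementation.

```python
# Hypothetical sketch: harmonizing two of the source quirks described above.
# The real tool used a fuller reference table; these entries are examples.
COUNTRY_ALIASES = {
    "Allemagne": "Germany",
    "Espagne": "Spain",
    "Etats-Unis": "United States",
}

def normalize_country(label: str) -> str:
    """Map a French country label onto its English equivalent."""
    return COUNTRY_ALIASES.get(label, label)

def cumulative_to_monthly(values: list[float]) -> list[float]:
    """Turn a cumulative month-by-month series into per-month figures."""
    monthly = []
    previous = 0.0
    for value in values:
        monthly.append(value - previous)
        previous = value
    return monthly
```

For instance, a tool reporting cumulative hires of `[10, 25, 25, 40]` would be normalized to the per-month figures `[10, 15, 0, 15]`, making it comparable with sources that report absolute data.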

The chronological dimension of the tool was a brain teaser:

  • How to display the history of data for a specific Business Unit which changed its name several times over the last few years?
  • Similarly, what happens from a data standpoint when a business unit is absorbed by another; which historical data should then be displayed?
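One common way to handle such renames and absorptions, sketched below under assumptions of our own (the identifiers and mapping are hypothetical), is to resolve every historical business-unit name to a canonical identifier, then gather history along that lineage:

```python
# Hypothetical sketch: preserving history across business-unit renames and
# mergers by resolving every historical identifier to its current successor.
SUCCESSOR = {
    "BU_Telecom": "BU_Digital",  # renamed in a reorganization
    "BU_Mobile": "BU_Digital",   # absorbed by another unit
}

def canonical_unit(unit_id: str) -> str:
    """Follow the successor chain until reaching the current unit."""
    seen = set()
    while unit_id in SUCCESSOR and unit_id not in seen:
        seen.add(unit_id)  # guard against accidental cycles in the mapping
        unit_id = SUCCESSOR[unit_id]
    return unit_id

def unit_history(records: list[dict], unit_id: str) -> list[dict]:
    """Collect all historical records belonging to a unit's lineage."""
    target = canonical_unit(unit_id)
    return [r for r in records if canonical_unit(r["unit"]) == target]
```

With such a mapping, a query on the current unit transparently includes the records filed under its former names, which is one answer to the "which historical data should be displayed" question.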

Those questions greatly helped lay down the framework of our mission, revolving around the following steps:

  1. Defining new indicators based on our understanding of the data and the client’s professional environment

We always approach new datasets from the angle of what insight they contain and how they can be used in a professional environment.

Spending a long time listening to our contacts allowed us to describe which type of information was needed to support HR decisions across the company.

Consequently, a further analysis of the content enabled us to pinpoint which indicators should be highlighted at the crossroads of datasets, and for which use.

2. Guaranteeing consistency and temporal continuity through the most appropriate architecture

We have described above the issues relating to data architecture and data quality as of when the mission started.

Acting on this observation, we conceived a new data architecture that unifies the three core datasets, in order to ensure chronological continuity.

3. Automating data treatment and storage

We developed a script that receives the source data, processes and extracts a clean dataset which is specifically tailored to provide the insight required by the HR function.

4. Improving data quality thanks to curation standards in line with the client’s mission

Along with our client, we also defined quality standards for the data, as well as curation rules:

— What are the data ranges for each indicator?

— How should the tool behave when confronted with values that exceed the indicated ranges?

For example, what to do when an employee is listed as being 134 years old in the source database?

By establishing data quality standards, the tool can identify that the value is abnormal.

— Should the tool then delete all information related to this employee, to avoid the risk of skewing the relevance of other potential indicators?

— Or should the tool simply ignore the age of this element, and conserve the other information points related to this employee?

As we agreed with the client, this decision essentially comes down to what the mistake tells us about how the data was collected. In this particular instance, it was by combining knowledge of the HR job with an analysis of the data quality that we managed to establish data management rules.
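As a minimal sketch of such a curation rule (the field names and bounds below are assumptions, not the client's actual standards): when a value falls outside its plausible range, the field is blanked and flagged, while the rest of the employee's record is conserved.

```python
# Hypothetical sketch of a range-based curation rule: flag abnormal values
# (e.g. an employee aged 134) and blank the field, keeping the record.
RANGES = {"age": (16, 80), "seniority_years": (0, 50)}  # assumed bounds

def curate(record: dict) -> tuple[dict, list[str]]:
    """Return a cleaned copy of the record plus a list of anomaly messages."""
    cleaned = dict(record)
    anomalies = []
    for field, (low, high) in RANGES.items():
        value = record.get(field)
        if value is not None and not (low <= value <= high):
            anomalies.append(f"{field}={value} outside [{low}, {high}]")
            cleaned[field] = None  # keep the employee, discard the bad field
    return cleaned, anomalies
```

Applied to the 134-year-old employee above, this rule would null the age, record one anomaly, and leave the other indicators computed from that record unskewed.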

5. Facilitating the correction of data entry mistakes in source files

Our data treatment script was able to detect various mistakes in the initial values collected by HR collaborators: badly completed organisational charts, missing values, incoherent sub-totals, etc.

This way, the tool offers the administrator the option to download an error log to help them correct the data sources.
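Such an error log can be serialized in a format administrators already work with. A minimal sketch, assuming a simple row/field/message structure for each detected error (the column names are our own illustration):

```python
# Hypothetical sketch: serializing detected anomalies as a downloadable CSV
# error log so administrators can correct the source files.
import csv
import io

def build_error_log(errors: list[dict]) -> str:
    """Serialize detected errors (row, field, message) as CSV text."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=["row", "field", "message"])
    writer.writeheader()
    writer.writerows(errors)
    return buffer.getvalue()
```

Each log line points the administrator to the exact row and field of the source file where the correction is needed.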

In parallel to our work with data, we focused on the graphical and interactive embodiment of the key indicators.

EMBODYING THE MEANING OF VALUES

Our client had initially asked us to formulate a first sketch on paper, early in the project — just after taking in the data specificities and the user needs. The purpose was of course not about having a clear view of what the final tool would look like, but rather showing how we could come up with innovative ideas to visually represent data.

We seized this opportunity to test the match between new visual forms and the meaning of the data, by sketching many small data visualization modules.

We then put our sketched modules to the test, by mapping them against actual data points. We then kept the modules that appeared to be easiest to read and most information-rich.

How to, for example, represent the ratio between two manager populations — juniors and seniors? Our understanding of the client’s work allowed us to see that the key insight often resides in the links between two populations: one is often intended to replace the other, with an expected loss ratio.

We knew here the ‘target’ value of the ratio, which guarantees that senior management positions will be renewed as the junior population evolves. We thus imagined a graphical representation that materializes this mutation from one population to another via a ‘projection cone’. If the origin and destination populations join up with the two bases of the cone, then the ratio is close to its ideal value. If either population leaves the cone, the ratio drifts away from the desired values, and renewing the population of senior managers could prove problematic.

This type of representation allows us to visualize the true meaning and implications of the data, with regard to the good practices within the client’s work.
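The logic behind the projection cone can be sketched in a few lines. All figures below (loss ratio, tolerance band) are illustrative assumptions, not the client's actual parameters:

```python
# Hypothetical sketch of the 'projection cone' check: given an expected loss
# ratio, is the junior population on track to renew the senior one?
def within_cone(juniors: int, seniors: int,
                loss_ratio: float = 0.3, tolerance: float = 0.15) -> bool:
    """True if the projected junior survivors land near the senior headcount."""
    projected = juniors * (1 - loss_ratio)  # juniors expected to remain
    lower = seniors * (1 - tolerance)       # bottom edge of the cone
    upper = seniors * (1 + tolerance)       # top edge of the cone
    return lower <= projected <= upper
```

With these assumed parameters, 150 juniors projected against 100 seniors fall inside the cone (150 × 0.7 = 105 survivors), while 80 juniors fall below it: exactly the at-a-glance reading the visual cone gives its users.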

The purpose was then to find out how to assemble those modules so they would generate meaning as a whole inside the interface.

SETTING OUT INFORMATION ARCHITECTURE

We then initiated a phase dedicated to information architecture.

Our objective here was to optimize the interface to deliver the most relevant insight from the data, so that this information would be pleasant to browse and easy to find. The goal is to make the returning user comfortable.

The first stages of conception felt like a 1000-piece puzzle.

We started working with dozens of different indicators, all of them with their own logic and themes, and offering very different ways of filtering.

Regardless, we worked towards grouping the data and identifying a coherent interface structure among a thousand possibilities.

A few days and post-it notes later, we had derived a solid architecture, where each indicator was in place and in synergy with the others. The architecture was made of several coherent tables, arranged along a natural progression.

We sketched this first architecture using zoning and shared it with our client.

This allowed us to avoid an interaction issue: sometimes a user clicks on a button and doesn’t see what changed on screen, because the change occurred quite far away from the button they clicked.

We obtained their approval which, in addition to our work on the data, gave us some solid ground to go further.

BRINGING THE APPLICATION TO LIFE

At this project stage, we had brought together an analysis of data and bespoke information architecture.

For the rest, we had to focus on the details to make it work.

We moved from the initial sketches on to wireframes.

We added more matter to the first skeleton and determined how the navigation should unfold for each action: filtering, zooming in on hierarchical levels, picking a country, changing the time period, etc.

All the options given to a user must be displayed in a simple, transparent way within the interface: how does the interface behave or change at each move of the user’s mouse?

We programmed this interface so that it may address the needs users have, in the most fluid and natural way possible. We attempted to anticipate their issues, logical thinking and their preferred points of reference, to build a navigation path they would be comfortable with.

From updating the datasets to printing a graph, including interpreting the data and navigating through the company’s hierarchical levels, we tried to anticipate everything. Being a step ahead of the users would help us give them the information they need, before they even look for it.

REVEALING THE INTERFACE

Once those wireframes were validated, we were able to add a layer of visual design.

In the meantime, we started building the technical architecture. It was intended to support the entire process from data collection to processing, including complex situations that involved chart restructuring or error spotting.

This backend then delivered refined data to the frontend via APIs. This is where the Dataveyes magic operated: our wizard developers structured the application interface and turned the data’s movements into images.

In less poetic terms, that means writing lines of JavaScript code. A lot of them.

Those tasks kept us busy for about 4 months and accounted for over half the time spent on this mission (6 months in total).

In the end, we had a finished, tangible product to show our client.

Very quickly, the stakeholders on our client’s side proved to be very comfortable with it: they understood the information and performed interactions seamlessly. They were eager to dive deeper and start telling stories from the data.

A FEW MONTHS LATER

  1. The first data updates enabled us to carry out final adjustments.
  2. Our stakeholders now feel comfortable enough to run some of their internal strategy meetings using the new tool rather than the old Excel/PowerPoint combination.
  3. The first version of the tool received overwhelmingly good feedback, beyond the circle of the stakeholders we were directly in contact with.

Looking back, our contacts initially asked us to create a tool that would draw their colleagues’ attention to the importance of data. Six months down the line, their colleagues are now proactively asking to access the tool.

As a result, the second phase of the project will now revolve around deploying the tool across a larger number of users, in order to have it benefit more managers around the group.

CONCLUSION

6 months of work
12 deliverables
15 client meetings
100 work sessions

The mission lasted six months, during which we carried out 15 meetings and catch-ups with the client, and close to 100 workshops and scrums within our internal team.

In addition to the first version of the monitoring tool, we also provided the client with 12 deliverables, describing our methodology step by step, as well as recommendations on how to maintain the tool.

Want to know more? Get ready for our next articles by subscribing here on Medium.

You can also follow us on Twitter or discover our latest use cases on our website.
