Implementing the Tableau training program at inDriver

Hey everyone! I’d like to share some news about how the BI platform at inDriver is evolving. I already wrote a piece about selecting a platform, but today I’m going to tell you how we implemented the Tableau training program across the company in order to develop our self-service analytics and decentralize the data analysis process.

Dimitriy Sherbenko
inDrive.Tech
11 min read · May 31, 2022

Before launching into the story of our thorny path of trial, error, and success, a quick digression to explain the major building blocks that make up our BI platform. There are three of them: architecture, self-service, and reporting. I will talk more about each of them in this article and the ones that follow.

How it all began

Ever since efforts to develop the BI platform began, the plan has been to implement self-service across the company. Back then, though, we did not have enough resources to meet the existing needs, and the ones we did have all went to the top-priority tasks: developing badly needed reports and building processes and the BI architecture.

In fact, I taught only one introductory class to show how convenient Tableau is for self-service analytics. Although the feedback was good, no further lessons followed, for a number of reasons. Time was the number-one constraint:

  • It was impossible to find a time slot for lectures that would work for everyone involved, and learning new material in the format of one-hour videos can be difficult.
  • There was no time to check the homework, prepare answers to students’ questions, or draw up lecture materials and teaching datasets (which also had to meet certain standards).
  • Different students could set aside different amounts of time for studying and doing their homework. Each of them had their own workload to deal with, shaped by a number of factors.

I will point out here that different students needed different amounts of knowledge to use Tableau. Some of them just wanted to know how Tableau Server works, while others were eager to delve into the functionality of Tableau Desktop in great detail.

I offered help to anyone who wanted to use Tableau: picking training materials, consultations, and getting a license key up and running. That said, they were all expected to work through the learning materials on their own. To be honest, only a few people stepped forward, but their results surpassed my expectations.

The reason these guys made such an effort is that they needed to provide their customers with interactive, multifunctional reporting in large volumes and on short time-frames. Over time, the area where one of these early users (and, by now, his team) publishes reports has become one of the top five most-visited on our Tableau Server, and I am, of course, very happy about that.

Conclusions:

  • Try to communicate the value of the product to users as quickly as possible. If its value can be conveyed to team leaders, that’s an additional advantage.
  • Put no limitations on yourself and do your utmost to support colleagues in their desire to develop themselves and to apply new approaches and solutions in their work.
  • Identify the audience most in need of the product offered and give them everything they need in order to study and implement the product in their work environment independently.

I think this kind of experience will be most useful for those who are just starting to implement BI within the company, as well as for small teams that deal with a large number of tasks. Now let’s move on to the part where we implement a full-fledged training program in the company.

Amat victoria curam (victory loves preparation)

By the summer of 2021, we already had enough people and resources to get down to developing the dream curriculum. The first iteration produced a general list of requirements for what it should be like:

  • Relevant — the training is conducted on data that is relevant to the trainees’ workflow.
  • Convenient — the training can take place at a time that suits the student, and it requires no instructor feedback on homework, so the student never has to wait for a review.
  • Multi-level — the curriculum has several levels, based on how much knowledge is needed to solve the problems at hand.

Once the requirements for the curriculum were defined, we moved on to its content. First of all, we defined the expected outcome of the training: an employee who has completed it should be able to build a dashboard and use Tableau Server as a reporting user.

There are plenty of training courses for Tableau and similar tools on the Internet. They all consist of blocks describing the application’s functionality, with examples of how to interact with it. The teaching methodology moves from the general to the particular. On top of that, the basics of data visualization and dashboard layout techniques are covered, and sometimes there are exercises that involve analyzing data with the charts that have been built.

The trick is deciding how much time to allocate to each of these blocks and in what order to arrange them. We set ourselves three objectives:

  • Help the user realize, as quickly as possible, the value of our program and of using the tool at work (that famed “a-ha!” moment).
  • Teach users how to develop dashboards or use Tableau Server in as little time as possible. Even if a user does not complete the training program to the end, they should still have enough knowledge under their belt to cover their needs for data analysis and automated reporting with the tool.
  • Make sure that, in terms of content and structure, the training program also works as a knowledge base, so that users can always quickly and conveniently find the necessary materials, review them, and use them to solve the problem at hand.

We test-ran the first two items on volunteers from the operations analytics team. Some volunteers came forward, despite the fact that the training took place during working hours. I’d like to thank them very much for agreeing to help us out.

The training was conducted in the format of several one-hour lectures. In addition to positive feedback, the success criterion for us was defined as the ability to build a dashboard for the team without getting any help from BI analysts. In the end, both conditions were met.

The third objective was addressed by recording video tutorials of no more than five minutes each, along with a short text description for each of them.

Next on the list was choosing a platform to host our content, run workshops, and collect analytics on students. Our first choice was Google Classroom: its API and overall capabilities were sufficient for the tasks at hand. Then, however, our in-house People Growth team approached us and suggested we use their Academy Ocean platform.

Frankly speaking, at first we were reluctant and pushed back, because we had a tactic and we wanted to stick to it. But then we decided to give Academy Ocean a test run, and we liked the solution. The main advantage was having a team of hotshot experts behind us as we developed our product.

Ultimately, we realized what was important to us:

  • Draw up a list of requirements that our training program has to meet.
  • Understand the problem that the program addresses and the result of its use.
  • Have the ability to analyze how users are learning.
  • Find like-minded people who are interested in developing employees within the company, and help each other in every possible way.

Let’s take a deep dive into the details

Once the platform had been chosen and the structure of the programs prepared, we proceeded to implement the plan. First, we developed a training dataset. Its structure was similar to that of the tables we use as data sources for our dashboards, and all the metrics were based on our business.

For one thing, this was done to make the transition from the training dataset to real-life data marts as seamless as possible: the metrics and structure would already be familiar to users, so they would have less difficulty mastering them.

For another, data drawn from the company’s own business piques trainees’ interest in the study material. The program clearly shows how you can handle your daily tasks using a convenient, easy-to-learn tool.
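
To give a feel for this, here is a minimal sketch of how such a training dataset could be put together. The column names, cities, and value ranges below are hypothetical placeholders for illustration, not our actual data mart schema.

```python
# Minimal sketch of a ride-hailing-style training dataset (hypothetical columns
# and values, not the real data mart schema). The point is simply to mirror the
# structure and metrics trainees will later meet in real dashboards.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 1_000

orders = pd.DataFrame({
    "order_date": pd.to_datetime("2022-01-01")
                  + pd.to_timedelta(rng.integers(0, 90, size=n), unit="D"),
    "city": rng.choice(["City A", "City B", "City C"], size=n),
    "rides_requested": rng.integers(1, 20, size=n),
    "gross_revenue_usd": rng.uniform(1.0, 15.0, size=n).round(2),
})
# Completed rides can never exceed requested rides.
orders["rides_completed"] = rng.integers(0, orders["rides_requested"].to_numpy() + 1)

# Save as a flat file that trainees connect to from Tableau.
orders.to_csv("tableau_training_dataset.csv", index=False)
```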

Next, we proceeded to create content: writing texts, recording videos, and developing homework assignments. As the volume was large, we had to outsource some of the work to a contractor. Developing such a unique product, with tailor-made programs, exercises, and content, was a real challenge for all of us, and I am really happy that we rose to it.

After a few test videos, it became clear that the quality is much worse when audio and video are recorded together than when they are recorded separately. Thus, the workload doubled: a short bullet outline for a video block turned into a full-blown script. At the same time, the quality of the material improved significantly, which more than made up for the extra time and work.

Additionally, we set one condition: the dataset we had prepared had to be used everywhere to demonstrate real-life examples of how to apply the tool to analyze and visualize data and to solve work-related problems.

For homework assignments, we opted for quiz-format tests. If a student answers incorrectly, incidentally, they are not allowed to continue until they give the correct answer. Sure, the right answer can be arrived at by a lucky guess, but we sincerely hope that our course-takers are learning in good faith.
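
The gating itself is handled by the learning platform; purely to illustrate the “no correct answer, no progress” rule, a toy sketch could look like this (a hypothetical helper, not platform code):

```python
# Toy illustration of the quiz gate: the student cannot move on until the
# correct option is chosen (the real gating is done by the learning platform).
def ask_until_correct(question: str, options: list[str], correct_index: int) -> int:
    """Keep asking a multiple-choice question until the right option is picked.

    Returns the number of attempts the student needed.
    """
    attempts = 0
    while True:
        print(question)
        for i, option in enumerate(options, start=1):
            print(f"  {i}. {option}")
        attempts += 1
        answer = input("Your answer (number): ").strip()
        if answer == str(correct_index + 1):
            print("Correct! You can move on to the next block.")
            return attempts
        print("Not quite. Try again before continuing.")
```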

Our outputs included:

  • Three training programs, matching the three types of Tableau licenses (Creator, Explorer, Viewer). This division is based on the fact that different users have different needs when working with the tool (from viewing reports to the full-fledged development of a report, including creating a data source). Plus, the licenses vary in cost. With this in mind, we issue Creator licenses to analysts, Explorer to anyone who wants to use Tableau to develop dashboards, and Viewer to those who work with ready-to-use reporting (a minimal sketch of this mapping follows the list).
  • 15 topics, 45 assignments, and 150 videos across the three programs. This material can easily be rearranged, adapted, or combined to suit our needs. Course-takers can study at their own pace, at a convenient time, and from anywhere in the world. Furthermore, since the learning material is posted on our corporate portal, colleagues can use it to build their own onboarding and internal training programs and improve their teams’ skills. The portal also provides ready-to-use analytics on training results, which helps us a lot in assessing the program’s effectiveness.
  • Additionally, we integrated our Style Guide into the training program — a set of recommendations for visualizing data in Tableau, along with ready-made templates for dashboards and charts and videos on how to build them. This made the tool easier to master: you can download a ready-to-use template for a dashboard or visualization and use it in your work right away, without spending time laying out visualizations nicely on the dashboard or coming up with a chart to render the data. We will tell you more about our Style Guide in a separate article.
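
As a toy illustration of the role-to-license logic above (the role names and helper below are hypothetical, not our actual provisioning code):

```python
# Hypothetical sketch of the role-to-license mapping described above.
LICENSE_BY_ROLE = {
    "analyst": "Creator",               # builds data sources and dashboards end to end
    "dashboard_developer": "Explorer",  # develops dashboards from published sources
    "report_consumer": "Viewer",        # works with ready-to-use reporting
}

def training_program_for(role: str) -> str:
    """Return the training program (named after the license) for a given role."""
    if role not in LICENSE_BY_ROLE:
        raise ValueError(f"Unknown role: {role!r}; expected one of {sorted(LICENSE_BY_ROLE)}")
    return LICENSE_BY_ROLE[role]

print(training_program_for("analyst"))  # -> Creator
```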

Getting the result

Once the work was done, we were ready to test-run our training program. A separate Tableau Server site was launched for the Explorer program, while Creator trainees were advised to download Tableau Desktop Public Edition (the free version of the Tableau desktop application).
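
For what it’s worth, spinning up a dedicated site like this can also be scripted. Below is a minimal sketch using the tableauserverclient library, with a hypothetical server URL and access token; it illustrates the idea rather than the exact procedure we followed.

```python
# Illustrative sketch: creating a dedicated training site on Tableau Server
# with the tableauserverclient library (hypothetical URL and token).
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth("admin-token", "TOKEN_VALUE", site_id="")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    training_site = TSC.SiteItem(name="Tableau Training", content_url="training")
    server.sites.create(training_site)
    # Adding trainees with Explorer roles would follow the same pattern
    # (sign in to the new site, then server.users.add with TSC.UserItem).
```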

We assembled two groups of volunteers and asked them to complete the training at their own pace. Once the training was finished and the feedback collected, we made adjustments and improvements, the most important one being a final homework assignment: trainees had to build a dashboard visualizing the specified metrics for each of the semantic blocks.

Selecting a course to take as part of our training program

We had to settle for a compromise: users were still expected to cover all the material on their own, but our involvement was now required once the program had been completed, to review the final assignment. In our opinion, the only potential downside of this change is that it might slightly slow down issuing a license, and that is not a critical issue.

Next, we had a brainstorming session in which we clearly defined how we were going to implement our training program:

  • Post a one-pager on our corporate portal with links to the Academy Ocean training program.
  • Add a link to the Academy Ocean training program on the Confluence page.
  • Announce the program in the chats, with a link to the one-pager on the corporate portal.
  • Insert a link to the one-pager into the request form for Tableau Server access.
  • Add more information to Tableau Server itself, with a link to the same one-pager.
  • Additionally, order unique merch for those who complete the Explorer and Creator programs.

Funnel of user engagement for training

A separate chat room was set up for communicating with students. It has also become one of the main entry points for building a visual data analytics community within the company: users share experiences, discuss what they’ve learned with their peers, and get support from us.

The launch took place at the end of March of this year. We advertised the event heavily and held spot presentations with the heads of business areas and product. More than 80 students joined the program in the first month alone. Forty are in training now, and 22 have already completed one of the programs.

Making plans for the future

We know that we can go to “infinity…and beyond,” and we are in full agreement with Mr. Buzz Lightyear on this point. We will continue our efforts to improve our training program, supplement the Style Guide, and develop the community within the company.

Together with the People Growth team, we developed a plan for rolling the programs out to our teams. Interestingly, it will differ for “newbies” and “old-timers.” The main reason is that “old-timers” have established experience and habits that are difficult to change, so a clear demonstration of the program’s value for work-related tasks will go a long way here; we will be collecting real problems and solving them with Tableau in workshops. For “newbies,” it is part of onboarding, which can include both mandatory and optional courses.

Plan for promoting the training program

We will keep “test-driving,” gaining valuable experience, and improving our product to make it more convenient for users to solve their problems. And, of course, we will keep sharing the experience and knowledge we gain.
