Analyzing and Visualizing Insights From A Freelance Project (Part 1)

Gbóláhàn Adébáyò
6 min read · Jul 18, 2022


A picture showing freelance on a laptop

This article is divided into two parts. The first part covers collation, cleaning, and modeling of the dataset with Microsoft Excel while the second part covers the querying of data with SQL and visualization of insights within a dashboard built using Tableau.

I am very excited for two reasons. First, I got my first freelance client on Fiverr after many months of longing for one. Second, I was able to work on the project, get paid, and receive a wonderful review from the client. I am also happy to say the client gave me full permission to document this project.

After successfully completing and submitting the project for review, I thought it would be a perfect idea to document the entire process I undertook while working on this project so that data analysts out there could learn something new or add to their knowledge.

I will be explaining the entire process of completing the project from meeting the client to submitting the final project for review in several steps. It’s going to be a fun ride.

So, let’s go!

1.) Meeting the Client:

I created my Fiverr account in 2021 and, after a short while of being unable to land gigs, became inactive on the platform. However, earlier this year, after several unsuccessful attempts to get a job, I decided to try my hand at freelancing once again. I edited my gigs to showcase some of my best work and reorganized my entire profile. Luckily for me, a client reached out and asked if I could help with some exploratory analysis of a dataset he had and then visualize information that could inform the right decisions to solve the problems the dataset revealed. I told him I would be willing to take on the project, we agreed on a price, and I started work.

2.) Asking questions:

The first step for every data analyst is to ask questions, and that was exactly my next step. I made sure to ask questions in order to understand the context of the problem we were trying to solve. The client then briefed me on my task and what he expected from the exploratory analysis.

For context, I would like to give some background on the (fictitiously named) company whose data we are analyzing.

HappyPhones Ltd is a company that delivers mobile communications services and is looking for ways to improve its customer service and knowledge base. Recently, its customers have been finding it difficult to reach the company, lodge complaints, and get solutions to their problems. My goal as a data analyst was to ask the right questions that could give insights into why this was happening. My task was to properly collate and clean the data, write queries that could answer the questions raised during analysis, and suggest tables that could be added to the data model to help us understand the schema better, while visualizing actionable insights and information to spur data-driven decision-making from the stakeholders.

Database Schema for HappyPhones Ltd

3.) Collating, cleaning, and preparing the dataset

After asking questions, an analyst's next step is to get access to the data, clean it, and prepare it for analysis. After discussing my questions with the client and understanding the problems I would be solving, it was time to get access to the data and really dig in to understand it.

However, there was a little challenge. The client accessed the data through Azure Data Studio, and access had to be authorized with a PIN, which made it difficult for him to share the data with me. So, we had to improvise a way to work around it. I told him to create a Tableau workbook, connect it to the data, and then create an extract of the workbook for me to work with. Once he had done this successfully, he sent the Tableau extract to me for cleaning and preparation.

The extracted workbook in Tableau

I was then able to load the Tableau workbook and view the tables in the dataset. To clean the data, I had to check the data types and replace the null values in some columns with zero. This is because Tableau does not automatically treat nulls as zero, so one has to be careful when creating calculations. Replacing these values in Tableau was not easy, so I copied the data and pasted the several tables in the dataset into a new worksheet I had created in Microsoft Excel.

The created worksheet in Microsoft Excel

I used Excel Power Query to change the data types and replace the null values during the data cleaning and preparation process.
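The Power Query step boils down to two transformations: coercing column types and replacing nulls with zero. Here is a rough sketch of that same logic in Python; the column names are hypothetical placeholders, not the actual HappyPhones schema:

```python
# Sketch of the cleaning step: coerce numeric columns and replace
# nulls (None or empty strings) with zero, mirroring Power Query's
# "Change Type" and "Replace Values" steps.
rows = [
    {"CustomerID": "101", "CallsMade": "5", "MinutesUsed": ""},
    {"CustomerID": "102", "CallsMade": None, "MinutesUsed": "37.5"},
]

def clean(row):
    cleaned = {"CustomerID": int(row["CustomerID"])}
    for col in ("CallsMade", "MinutesUsed"):
        value = row[col]
        # A null or empty cell becomes 0 so later calculations don't break.
        cleaned[col] = float(value) if value not in (None, "") else 0.0
    return cleaned

cleaned_rows = [clean(r) for r in rows]
print(cleaned_rows)
```

Doing the replacement during preparation, rather than inside every Tableau calculation, keeps the downstream measures simple.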

4.) Modelling the dataset and answering questions:

The next step was to model the tables according to the schema of the database using Power Pivot in Microsoft Excel. I achieved this by relating the pasted tables through their primary and foreign keys. As the image below shows, I like to place my fact table below my dimension tables in the diagram view, as this makes it easier to tell them apart and to visualize the relationships between them.
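The idea behind this modeling step can be sketched in code. Below is a hypothetical miniature of the same structure, built with Python's built-in sqlite3: a fact table linked to a dimension table through a primary-key/foreign-key pair, just as Power Pivot relates tables in the diagram view. The table and column names are illustrative, not the real schema:

```python
import sqlite3

# One dimension table (customers) and one fact table (complaints),
# related through CustomerID as a primary-key/foreign-key pair.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE CustomerDim (
    CustomerID   INTEGER PRIMARY KEY,
    CustomerName TEXT)""")
con.execute("""CREATE TABLE ComplaintsFact (
    ComplaintID INTEGER PRIMARY KEY,
    CustomerID  INTEGER REFERENCES CustomerDim(CustomerID),
    Channel     TEXT)""")
con.execute("INSERT INTO CustomerDim VALUES (101, 'Ada')")
con.execute("INSERT INTO ComplaintsFact VALUES (1, 101, 'Phone')")

# The relationship is what lets us slice fact rows by dimension attributes.
row = con.execute("""
    SELECT c.CustomerName, f.Channel
    FROM ComplaintsFact AS f
    JOIN CustomerDim AS c ON c.CustomerID = f.CustomerID
""").fetchone()
print(row)  # → ('Ada', 'Phone')
```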

The next task was to modify the given schema by suggesting dimension tables that would provide insights we wished we had in the database. These tables were also meant to help improve the company's knowledge base.

Solving this task was one of the most difficult parts of the entire project. These tables had to connect to either the fact table or one of the dimension tables in the schema. I set up a new meeting with the client to ask questions and generate new ideas. After a lot of brain-racking, I came up with two tables: TechnicianDim and InactiveUsersDim. The TechnicianDim table holds records of the technicians the company sends out to fix customer complaints that require physical attention, while the InactiveUsersDim table contains records of customers who have stopped using the company's services and the reasons why they stopped.
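To make the suggestion concrete, here is a sketch of what the two new dimension tables might look like as table definitions. The columns shown are my own illustrative assumptions; the actual columns were agreed with the client and are not reproduced here:

```python
import sqlite3

# Hypothetical layouts for the two suggested dimension tables.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE TechnicianDim (
    TechnicianID   INTEGER PRIMARY KEY,
    TechnicianName TEXT,
    Specialty      TEXT)""")
con.execute("""CREATE TABLE InactiveUsersDim (
    CustomerID  INTEGER PRIMARY KEY,  -- links back to the customer dimension
    ChurnDate   TEXT,
    ChurnReason TEXT)""")

tables = [r[0] for r in con.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]
print(tables)  # → ['InactiveUsersDim', 'TechnicianDim']
```

Because InactiveUsersDim keys on CustomerID, it can join straight back to the existing customer dimension, which satisfies the requirement that each new table connect to the schema.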

The second and final part of this article will cover the remaining steps I took to complete the project. I am excited to read your comments and answer any questions you might have so far. It has been a fun ride, and I hope you have enjoyed the read and learned something new! Follow me for Part 2.

Thank you for reading!

Gbolahan is a passionate, creative, and goal-oriented analyst. He is a strong advocate of using data to derive insights and help clients and companies make better data-driven decisions.

Let us connect on Twitter and LinkedIn.

You can check out my past projects at my website or take a look at my Fiverr gigs.
