Decoding Data: A Glimpse into my Life as a Senior Data Analyst at Inato

Barthelemy Chopin
Published in inato · Jun 26, 2023 · 7 min read

Hi, I’m Barth, and this is a day in my life.

This is not my workplace, although it could be | Yosemite Valley by Bailey Zindel

I can confidently say I enjoy my job. And I should.
After all, this is what I spend most of my day doing.

I can also attest that it has not always been the case, nor is it the case for everyone I know.

Some of it has to do with the position itself and the tasks it entails. But a lot actually has to do with the company I work for and the people I work with.

Let me walk you through what makes my job so enjoyable by showcasing a day in my life as a Senior Data Analyst at Inato.

7 AM: I wake up

I wish I could say I wake up easily by myself, but parenting means I have to force myself out of bed.

My dad life takes over for a couple of hours until the morning kid routine is sorted.

9 AM: I start my workday

I am usually well caffeinated by then so I start straight away. I’m in my most productive hours in the morning so I try to make the most of it. I also know what I have to work on because my projects are planned well in advance based on business team priorities.

Part of a project could lead me to answer a business question with data.
For example:

How well are we doing with our marketplace user engagement? Are our engagement campaigns useful for disengaged users?

I take time to document my analysis scope in Notion ahead of doing the actual “data” work:

What am I trying to analyze? What for? What are the hypotheses to test? What will I not cover?

Then I think about the right tool for the use case. For this type of task I would use Husprey, a SQL notebook. It allows me to build agile data analyses that are easily refreshed as context evolves and that produce easy-to-digest insights. It also pairs well with Notion, which means my work is documented right as I produce it.
If I were to code an analysis in Python instead, I would pick Google Colab.

An example of output from Husprey
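To give a flavor of the engagement question above, here is a toy sketch in plain Python. Every user ID and session count is invented for illustration; the real analysis would run as SQL against our warehouse, not on a hardcoded list.

```python
# Hypothetical records: (user_id, received_campaign, sessions_after_campaign).
# All values are made up; this is not Inato data.
events = [
    (1, True, 3), (2, True, 0), (3, True, 2),
    (4, False, 1), (5, False, 0), (6, False, 0),
]

def reengagement_rate(records, targeted):
    """Share of users in the given group who came back at least once."""
    group = [r for r in records if r[1] is targeted]
    return sum(1 for r in group if r[2] > 0) / len(group)

print(f"targeted: {reengagement_rate(events, True):.0%}")
print(f"untargeted: {reengagement_rate(events, False):.0%}")
```

Comparing the two rates side by side is what tells you whether the campaign moved disengaged users at all, before digging into segments or confounders.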

This is deep, hard work: being conscious of biases in your data set, knowing when to stop the analysis and take a step back, interpreting results.

It is one of the most satisfying feelings to complete an analysis and provide actionable insights that unlock decision making.

This is not something I have always done right. As a matter of fact, this is something I almost never did right until I joined Inato and learned from my peers.

Which leads me to the next item on my agenda for today.

11 AM: Our Data Retro

It’s now time to gather with the team and reflect on the past week of work.

What are we proud of?

What happened that we wish would never happen again?

What are the opportunities ahead of us?

I have gone through countless team meetings over my career. In my opinion, having a meeting objective that does not solely serve the team’s manager is key to its success.

At Inato, pragmatism is a key value we strive to honor: sharing for the sake of sharing is a waste of time.

Rather, retros are made to share things that make us feel proud of our work, or things that should be avoided at all costs.

Then, when taking time to reflect on opportunities, we fuel our future projects with tips and tricks that help us continually improve. We log actions that help us prevent the bad stuff from happening again and create more occasions to work on things that make us proud.

This is not a manager’s meeting about their team’s progress. It’s everyone’s meeting.

The Miro Board we use for our Retros

12 PM: Lunch Time

I have the luxury to work from home thanks to Inato’s remote policy.

I say luxury because I feel no difference in inclusion, whether I spend my work day at home or at the office.

So I’m cooking. Today is shakshuka. Yesterday was fried rice.
I like cooking, so it’s a real break for me, not just a matter of forcing nutrients into my body to make it work.

1:30 PM: Pairing

My coworker Hugo reached out on Slack earlier today. He’s working on one of our data models and is unsure about the best way to adapt the model structure to account for a new array field.

A model in our staging layer where a number of nested structs and arrays are re-modeled

I don’t necessarily have the answer for him, but we pair on Tuple: he shares his progress, we go through his reasoning, and I can interact directly with his screen.
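To illustrate the kind of re-modeling Hugo was working on: the actual model is SQL in dbt, but this Python sketch (with invented field names, not our real schema) shows the general shape of flattening a nested struct-and-array record into one row per array element, the way a staging layer would unnest it.

```python
# Hypothetical nested record, similar in shape to what a staging model receives.
# All field names and values are invented for illustration.
trial = {
    "trial_id": "T-001",
    "sponsor": {"name": "Acme Pharma", "country": "FR"},
    "sites": [  # the new array field to account for
        {"site_id": "S-1", "city": "Paris"},
        {"site_id": "S-2", "city": "Lyon"},
    ],
}

# Flatten: one output row per array element, struct fields promoted to columns.
rows = [
    {
        "trial_id": trial["trial_id"],
        "sponsor_name": trial["sponsor"]["name"],
        "site_id": site["site_id"],
        "site_city": site["city"],
    }
    for site in trial["sites"]
]

for row in rows:
    print(row)
```

The design question in the staging model is exactly this: which fields stay at the parent grain, and which move to a finer grain once the array is exploded.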

1:45 PM: Communication Catch-up

I go to Notion and check my backlog, then Slack for any messages or discussions that require my attention.
I make sure my stakeholders are informed of my latest progress, redirecting them to findings that need their attention or to questions I have for them.

Today, I found that one of our campaigns was not producing conclusive results. I reach out to our customer success team to get their point of view on my findings. I rely heavily on asynchronous interactions to avoid packing my agenda with meetings to the detriment of analysis work.

Not everything can be done asynchronously though, and that’s okay.

2:30 PM: Meeting my Project Stakeholders

We discuss requirements for a tool I am building for them. I made a process diagram and a prototype in Excalidraw.

A process diagram I sketched in Excalidraw

We go over the expected value of the project (here it’s about efficiency, so a dollar equivalent of the expected time saved): it’s important to keep in mind, as much as possible, a quantitative estimate of what the data team provides.

Support roles like ours struggle to justify their impact on the business; a clear value per project is a perfect way to tackle precisely that.
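A dollar equivalent of time saved can be a simple back-of-the-envelope calculation. Every number below is invented purely to show the arithmetic, not an actual Inato estimate:

```python
# Back-of-the-envelope value estimate; all figures are illustrative assumptions.
hours_saved_per_week = 2.0    # per user of the tool
users = 5                     # people who will use it
loaded_hourly_cost = 50.0     # rough fully-loaded cost of an hour, in dollars
weeks_per_year = 46           # working weeks

annual_value = hours_saved_per_week * users * loaded_hourly_cost * weeks_per_year
print(f"Estimated annual value: ${annual_value:,.0f}")
```

Even a rough figure like this makes it much easier to prioritize one data project against another.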

We also go over business metric definitions and exactly how we will calculate them. I then store this in a single documentation page that is easily accessible to everyone in Notion. This avoids confusion down the line about how figures are computed.

3:30 PM: Wrap-up and coffee time

I summarize the call’s outcome and next steps in my project’s Notion ticket while I sip my afternoon espresso.

4 PM: Auditing our Data Model and our Team Organization

Logging code refactoring and lineage simplification opportunities.

We update or add new code frequently. Taking the time to assess the relevance of past choices in light of today’s business state is key.

This is usually the time for me to take a look at what other companies are doing. I look online and here on Medium for ways to tackle the different problems we face at Inato. It’s an important part of continuous improvement. Not all ideas come from within, and thankfully, most of the time other brilliant people have already found a way to fix exactly the problem you are facing.

This is also the moment to do peer reviews on code from my coworkers in the data team.

A PR review example on our inato/dbt GitHub repo

5 PM: Delivery Work

Working on dashboards or data apps that are either in progress or already in production and may need tweaking.
This could be in either Looker Studio or Retool.

6:30 PM: My Job Day is Done…

… and my dad life takes over again.

Because I’m working from home, I’m already available to enjoy some quality time with my daughters, do some exercise and the daily chores.

11 PM: Bed Time

That coffee from earlier did nothing to keep me up, and I quickly fall asleep, ready to start a new day again soon.

Hope you enjoyed this day in my life. If you did, check out the day in the life of our Product Data Analyst or our Lead Data Analyst.

Not all days are exactly like this, but this is a pretty good depiction of the average day.

If you too would like to know what it’s like to work on Inato’s data team, we’re hiring!

Have a look at Inato’s career page.


Senior Data Analyst who used to work as a project manager. Went from banking to tech (for good). Enjoys everything data, food, and cocktails.