Quantifying the impact of remote work on the work-life balance

Arik Friedman
Published in Data at Atlassian · 9 min read · Nov 18, 2020


Recently, we published findings on the impact of COVID-19 and remote work on work-life balance on “Work Life by Atlassian”. This post is an extended version of that write-up, providing more detail and a behind-the-scenes look at the analysis methods.

On March 12, 2020, the staff at the Atlassian office in Sydney got a message on Slack announcing the office would close until further notice due to COVID-19. This was shortly before all of Australia went into lockdown. Within a few days, we made the decision to temporarily close all our offices globally and transition to a company where everyone works from home. More recently, after several months of company-wide remote work, Atlassian announced that staff could opt to continue working from home indefinitely.

Atlassians meet in the San Francisco office and remotely (2019)

Throughout this transition, I’ve felt that working from home blurred the boundaries between work and life. But was that really the case?

Atlassian makes team collaboration tools, and in my role as a Principal Data Scientist, I spend a lot of time looking at how our customers use these tools. As users interact with our products, we record anonymized analytics events that capture those interactions. Those analytics events help us understand the user experience and guide product improvements. Our products are an integral part of daily work for millions of people across the world. With such a large data set, I wondered whether user activity patterns could illuminate changes in our collective work habits with the transition to remote work. Are we starting work earlier? Stopping later? Has taking a break for lunch been canceled?

Recent research from NBER analyzed email and calendar data to put a spotlight on the blurring boundaries between home and work. In the GitHub Octoverse spotlight, GitHub analyzed patterns in developer activity and concluded that developers’ workdays have indeed gotten longer, by up to an hour per day. Inspired by those analyses, during a recent ShipIt hackathon day I joined forces with fellow data scientists Pedro Queiroz, Praveen Bysani, and Shahin Elliin to investigate what our product analytics events can reveal about the changes in our work patterns with the transition to remote work.

Using product analytics for fun and fact-finding

Analytics events allow us to understand interaction patterns, identify navigation flows, discover friction points in the user experience, and guide product improvements. Currently, we organize our analytics events into four categories:

  • Track events capture state changes, and provide visibility into the lifecycle of product entities (like Jira issues, Bitbucket pull requests, Confluence pages) and the context in which they go from one state to another.
  • Screen events capture user flows in the product, and provide visibility into user journeys in the product and navigation changes.
  • User Interaction (UI) events capture interactions with the product user interface, like clicking buttons and operating menus. In some cases user actions can trigger multiple events — for example, creating a page in Confluence can fire both a UI event (button clicked — publish) and a track event (page created).
  • Operational events are a catch-all for things that don’t fall into the other categories. They typically capture system events, like performance indicators and experiment flags.

To understand work patterns, we looked at analytics events from Jira, Confluence, and Bitbucket. To reduce the chances of capturing activity from automated scripts or system events that happen while a user isn’t active, we focused on the subset of UI events, which signal direct interactions with the user interface. Although only part of everybody’s work takes place in these products, the aggregated data was still sufficient to identify patterns in the workday.
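To make that filtering step concrete, here is a minimal sketch of how such events could be represented and filtered down to direct interactions. The record fields, values, and function names are illustrative assumptions, not Atlassian’s actual event schema:

```python
from dataclasses import dataclass

# Hypothetical event record; field names are illustrative, not Atlassian's schema.
@dataclass
class AnalyticsEvent:
    event_type: str  # "track", "screen", "ui", or "operational"
    product: str     # e.g. "jira", "confluence", "bitbucket"
    user_id: str
    timestamp: str   # ISO-8601 string, kept as text for simplicity

def is_direct_interaction(event):
    """Keep only UI events from the three products, dropping event types
    that may fire without a user actively present at the keyboard."""
    return event.event_type == "ui" and event.product in {"jira", "confluence", "bitbucket"}

events = [
    AnalyticsEvent("ui", "jira", "u1", "2020-04-01T09:05:00Z"),
    AnalyticsEvent("operational", "jira", "u1", "2020-04-01T03:00:00Z"),  # e.g. a background job
    AnalyticsEvent("track", "confluence", "u2", "2020-04-01T10:00:00Z"),  # state change, not a click
]
direct = [e for e in events if is_direct_interaction(e)]
```

Only the first event survives the filter; the operational and track events are excluded even though they belong to the same products.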

Every analysis starts with data prep

It is a fact of life that a data scientist in possession of a dataset must spend 80%+ of their effort getting it into shape before they can generate any insights. This work was no exception.

To assess trends and changes over time, we looked at the data from two perspectives. One approach we took was to track overall trends over 2019 and 2020, week by week. The other was to home in on the impact of the pandemic by comparing aggregated data from before and after lockdowns were announced.

According to Wikipedia’s National Responses to the COVID-19 pandemic page, most countries applied lockdowns at some point during March 2020. We defined the before period as activity between January and February 2020, when most knowledge workers were still predominantly working from physical offices. The after period consists of activity between April and May 2020, by which point most knowledge workers worldwide were working from home. We only considered customers who were with Atlassian throughout the whole period (January 2019 to July 2020), so that comparisons are not affected by changes in customer mix over time.
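The before/after labeling and the stable-customer filter described above can be sketched as follows; the helper names are illustrative assumptions, while the period boundaries come from the post:

```python
from datetime import date

# Period boundaries from the post.
BEFORE = (date(2020, 1, 1), date(2020, 2, 29))  # mostly office-based
AFTER = (date(2020, 4, 1), date(2020, 5, 31))   # mostly working from home

def label_period(day):
    """Assign a date to the before/after period, or None if it falls outside both."""
    if BEFORE[0] <= day <= BEFORE[1]:
        return "before"
    if AFTER[0] <= day <= AFTER[1]:
        return "after"
    return None

def stable_customers(first_seen, last_seen):
    """Keep customers active across the whole January 2019 to July 2020 window,
    so the comparison is not affected by customers joining or churning."""
    window_start, window_end = date(2019, 1, 1), date(2020, 7, 31)
    return {c for c in first_seen
            if first_seen[c] <= window_start and last_seen.get(c, window_start) >= window_end}

first = {"acme": date(2018, 6, 1), "newco": date(2020, 2, 1)}
last = {"acme": date(2020, 12, 1), "newco": date(2020, 12, 1)}
cohort = stable_customers(first, last)  # "newco" joined too late to be included
```

Dates from March 2020 deliberately map to neither period, which excludes the transition weeks from the comparison.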

A first look at a before/after user activity heatmap (by user count) of our own teams, before working from home (pre-WFH) and after (WFH), told us we were onto something, as the notion of blurred work-life boundaries became quite literal:*

A heatmap representing user activity of Atlassian staff before/after working from home. The “after” period looks blurry.
* This visualization was created during an internal hackathon (called ‘ShipIt’). We were not able to reproduce such a stark visual contrast after further data clean-up and code restructuring, so it did not make it into the Work Life post. Nonetheless, it was a great motivator towards the insights that followed.

We used the first and last times of user activity — Monday through Friday for most countries (but not all!), rounded to 5-minute intervals — as indicators of the start and end of the workday. The time between these endpoints approximates the length of the workday. There are seasonal patterns across the days of the week (Tuesdays are longer, Fridays are shorter), so to keep things simple, we captured the average start and end times for each user in each period (before/after lockdowns, or for each week). We then calculated the average start and end times in each location across the user base.
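A minimal sketch of that computation (first and last activity per user per day, rounded to 5-minute intervals) might look like this; the function names are illustrative, and rounding down rather than to the nearest interval is an assumption:

```python
from datetime import date, datetime

def round_down_5min(ts):
    """Round a timestamp down to the start of its 5-minute interval."""
    return ts.replace(minute=ts.minute - ts.minute % 5, second=0, microsecond=0)

def daily_bounds(ui_events):
    """ui_events: iterable of (user_id, datetime) pairs.
    Returns {(user_id, date): (first_activity, last_activity)}."""
    bounds = {}
    for user, ts in ui_events:
        key = (user, ts.date())
        ts = round_down_5min(ts)
        lo, hi = bounds.get(key, (ts, ts))
        bounds[key] = (min(lo, ts), max(hi, ts))
    return bounds

def workday_length_hours(first, last):
    """Approximate workday length as the gap between first and last activity."""
    return (last - first).total_seconds() / 3600

events = [
    ("u1", datetime(2020, 4, 1, 9, 3)),    # first interaction of the day
    ("u1", datetime(2020, 4, 1, 12, 30)),
    ("u1", datetime(2020, 4, 1, 17, 58)),  # last interaction of the day
]
first, last = daily_bounds(events)[("u1", date(2020, 4, 1))]
```

Here the day is bounded at 9:00 and 17:55, for an approximate workday length of just under nine hours.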

A note on aggregation.

The chart above, as well as the ones below, look at averages. We went through multiple iterations refining how we handle aggregation, landing on two options:

  1. Calculating the average start/end of day across all users: averages are intuitive and maintain nice arithmetic properties: for example, average day length = (average end of day) - (average start of day). This also makes it easier to think about total work time as the average x user count.
    The main downsides are: (a) averages are sensitive to outliers (long-working users skew them upward), and (b) averages and standard deviations are most meaningful for normal distributions, which may not be the case here.
  2. Assessing medians and percentiles: medians and the 25th/75th percentiles provide some useful ways to interpret the data. For example, given the median start time in a given week, we know that half of the users began their average workday before that time and half after it; half of the users’ average start times fall within the range defined by the 25th/75th percentiles; and if the 75th percentile for the end of day is around 9pm, a quarter of the users in that office finish their day (or at least interact with our products in some capacity) later than that. The main downside is that, unlike averages, medians don’t lend themselves to arithmetic: for example, the median day length won’t generally equal the median end of day minus the median start of day.
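The arithmetic point in the two options above is easy to illustrate with a toy example: means compose (mean day length equals mean end minus mean start), while medians generally do not. The numbers below are invented purely for illustration:

```python
import statistics

# Toy start/end times (in hours) for five users' average workdays.
starts = [8.0, 8.5, 9.0, 9.5, 11.0]
ends = [16.0, 18.0, 17.0, 19.5, 21.0]
lengths = [e - s for s, e in zip(starts, ends)]

# Means compose: average day length equals average end minus average start.
mean_gap = statistics.mean(ends) - statistics.mean(starts)

# Medians generally do not: the median day length can differ from
# (median end of day) - (median start of day).
median_gap = statistics.median(ends) - statistics.median(starts)
```

In this example the median day length is 9.5 hours, but subtracting the median start from the median end gives 9.0 hours, while the mean-based figures agree exactly.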

The impact of lockdowns

The impact of lockdowns on our customers was consistent with the patterns we observed amongst Atlassian staff. A striking change took place in March, with workdays becoming significantly longer by April.

A chart showing the average length of the work day week by week in the United States, Italy, Australia and Korea.

Beyond looking at the total length of the workday, we wanted to assess typical start and end times, before and after lockdowns.

A chart comparing the average start, end and length of the workday in 13 countries before and after the lockdowns in March.

We also compared the distribution of working hours for our customers. Because we looked at the distribution of work throughout the day, the data captures the relative proportion of activity at each hour, not absolute volumes. Think of it this way: working an extra hour in the evening doesn’t necessarily mean you’re working less in the morning; it’s just that mornings now make up a smaller relative proportion of the workday. In most countries, working from home had the effect of extending the workday.
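This relative-proportion effect can be demonstrated with invented numbers: in the sketch below, the “after” day adds evening activity without reducing absolute morning activity, yet the morning’s share of the day still shrinks:

```python
from collections import Counter

def hourly_proportions(event_hours):
    """Normalize hourly event counts into each hour's share of the day's activity."""
    counts = Counter(event_hours)
    total = sum(counts.values())
    return {h: counts[h] / total for h in range(24)}

# Invented event counts: "after" keeps the same absolute morning activity
# (30 events at 9am) but adds 20 evening events at 9pm.
before = hourly_proportions([9] * 30 + [14] * 30 + [17] * 20)
after = hourly_proportions([9] * 30 + [14] * 30 + [17] * 20 + [21] * 20)

# Positive delta: that hour's share of activity grew post-lockdown.
delta = {h: after[h] - before[h] for h in range(24)}
```

The 9am share drops from 37.5% to 30% even though the absolute count is unchanged, because the proportions in each period must sum to one.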

A chart comparing the distribution of the user activity in 8 countries, before and after the lockdowns in March 2020.

For easier comparison between the before and after periods, we can zoom in and see the hours for which the proportion of user activity increased post lockdown (in blue) versus the hours in which the proportion of user activity decreased post lockdown (in orange).

A chart showing the proportion of changes in user activity in 8 countries, before and after the lockdowns in March 2020.

A note on the limits of analytics events.

The chart above looks at the distribution of user activity throughout the day. However, an increase in relative activity doesn’t tell us whether there was an absolute increase in activity. One particular question that came up was whether the longer hours are traded off for more breaks throughout the day, and that’s not something the distribution of work in each period can tell us.

There are two main challenges when comparing absolute levels of activity across different periods:

  1. Changes in the user base — over time, new users onboard onto our products, other users churn, and some users may change their interaction patterns (for example, due to a role change).
  2. Changes in the product and the analytics events — our products evolve all the time, and new features are added. Analytics events evolve with them, and over time we may also increase analytics coverage in parts of the product that weren’t instrumented before. The further apart the periods we compare, the more changes we’ll see in instrumentation. We don’t want to confuse increase in analytics coverage with increase in user activity.

While the first challenge can be somewhat controlled for by focusing on specific user cohorts, the second may be harder to address, especially when aggregating data from multiple products. While additional internal analysis we’ve done suggests that, in aggregate, the late-hours overtime was not actually traded off for extra breaks throughout the day, the challenges above limit our confidence in these findings.

What does it all mean?

The lockdowns due to COVID-19 have pushed many industries to embrace remote work, temporarily or permanently. This poses a challenge for employees, who are trying to adjust to work in new settings, as well as for employers, who need to keep their businesses running throughout this big change.

The patterns we observed suggest that the boundaries between home and office blur, and companies and employees need to adjust their practices to this new world. This was further confirmed by qualitative data from a study commissioned by Atlassian and conducted by research firm Paper Giant in July 2020. Even those without caregiving responsibilities reported struggling to delineate between work time and personal time, and were prone to working long hours without pausing for a break. Over half of respondents said it is harder now to maintain work-life boundaries than before the pandemic, and 23% reported thinking about work during their off-hours more than they used to.

In other words, we need to find effective ways of “switching off” or risk burning out. Because team leads and executives set the tone for their organizations, they have a special role to play. First, they can devote a portion of 1-on-1 meetings to checking in on their team members’ well-being and chatting about what’s going on outside of work. This helps build the rapport needed for employees to feel valued and heard. Leaders can also encourage team members to set reminders on their calendars for when it is time to take a break or stop work for the day. And most importantly, they can set an example by muting notifications on their phones after hours and taking time off to recharge.

Learn more about Data at Atlassian

Stay in touch with the Analytics & Data Science team and learn more about careers at Atlassian here.



Principal Data Scientist at Atlassian. Studying how data from products like Jira and Bitbucket can help us understand how software teams work.