The Evolution of Why We Work
To understand our relationship with work, we need to start from the very beginning.
By Lawrence Yeo
There’s a reason why the ever-present question, “What do you do?” is shorthand for “What do you do for work?”
It’s because work is universal; it’s a concept that has unified humanity ever since the dawn of our existence. So asking what someone does for a living is a reliable anchor point of dialogue, one that will produce a response instead of an overwhelming silence.
But long, long before crowded bars, networking events, and millions of questions about the companies we work for, we were all nomadic beings, roaming around this planet gathering food and taking up temporary shelter in the harshest of climates.
During this time, work was a universal phenomenon across our entire species for a very simple reason:
If you didn’t work, you just kinda… died.
Work was a purely functional endeavor, as our survival directly depended upon the food we hunted and gathered for that very next meal. A lucrative career had nothing to do with coveted promotions, bigger houses, or higher salaries. Instead, it meant that you were adept at hunting game, gathering fruits and nuts, hand-building shelter, harnessing the power of fire, and nurturing your young.
In other words, you were a successful human simply if you stayed alive.
Survival and reproduction were the only items on our collective job descriptions for a really, really long time, but things started to shift around 10,000 BC. In what is now known as the Agricultural Revolution, farming communities began to spring up across the world, with each region cultivating crops that were specific to their natural environments.
As agriculture spread and farming became the primary means of food production, people were finally able to settle down and stay in one area instead of roaming around from one place to another. And with this change, the nature of work fundamentally shifted as well.
While the nomadic lifestyle viewed work as a generalized endeavor, the agricultural society viewed work as something entirely different:
A series of specialized tasks.
This specialization of labor would eventually carry into the Industrial Revolution… but with a pretty significant twist. Whereas a farming community in 2000 BC gauged the progress of work by the tasks performed that day, a factory in the nineteenth century tracked productivity by the number of hours its employees worked.
This introduction of time into the workforce was a significant development, and hours worked remain a leading measure of productivity at most companies today. To understand how this happened, let’s rewind to the fourteenth century and place ourselves in the middle of a large European city.
Now picture a massive clock tower looming over the town square, because that thing is the main character of the point I’m about to make.
According to English historian E.P. Thompson, the local merchant guilds of these major European towns funded the construction of these clock towers so their citizens would view time the same way they did: as a valuable resource that could be counted, budgeted, and sold. The ability to slice time into seconds, minutes, and hours changed the way people perceived it, and pretty soon time was viewed as a commodity that could be bought and sold, just like money.
It’s no coincidence that domestic clocks and pocket watches spread widely in the late 1700s, which was also when the Industrial Revolution began and 10-to-16-hour workdays soon became the norm. Since everyone was on this unified, fixed grid of time, factories could tell their workers to show up at a very specific time, and everyone would reliably be able to do so. As employees punched the clock upon entering and leaving, they were quite literally leasing their time for money, as their wages were pegged to the hour.
This trend continued into the twentieth century, but with a slight revision whose echoes can still be heard in the conventional wisdom of today. In the 1920s, Henry Ford introduced the 5-day work week, with the requirement that people work 40 hours across five consecutive days, and rest up for two.
With this move, the culture of consumerism entered the workforce: people sold their time for money, then spent that money during the newfound leisure of the two-day weekend. The sale of one’s time via the 40-hour week became the way we defined our work, a massive departure from any of the ways we had viewed work in the past.
As the years went by, fewer and fewer people worked in factories, but the mentality of selling our time for money remained. Access to information continued to be controlled by only a few companies (for example, just three channels dominated television broadcasting in the U.S. for most of the twentieth century), and the production of goods remained in the hands of corporate monopolies and those with access to capital.
We traded in our workbenches for cubicles, but the work philosophies of the industrial age continued their reign.
But as society pressed onward, the seeds of yet another revolution were being planted, one that would fundamentally alter the way humans interact and build things with one another. In the late 1970s, a new kind of machine was quietly making its way into the walls of our homes: the personal computer.
Mass-produced home computers began to populate the United States, but for the most part, the people who owned them were simply trying to understand how they worked. In addition to doing some basic programming on these intriguing machines, people played games, used them for word processing, and created rudimentary spreadsheets. They were great for personal productivity and fun, but widespread information dissemination wasn’t their strong suit at the time.
But then of course… the World Wide Web came along and changed everything.
The Web layered a system of linked pages on top of the internet’s existing architecture, turning each computer into an online node that could receive and distribute information to all other nodes in this growing web of knowledge, which became cheaper and cheaper to access as well. As more convenient devices such as smartphones added more nodes to this web, our ability to communicate with one another grew at an unprecedented rate, fundamentally changing the way we exchanged ideas and collaborated.
Through the internet, what was once in the grubby hands of the few became accessible to the masses. As the bottleneck of information broke wide open, it became easier for us to learn about new ideas and discover platforms to share our own insights as well.
While industrialized society focused on brute productivity with the intention to scale, today’s connected society thrives when we create things that make an impact. The culture of innovation is not about leasing someone’s time and forcing them into a rigidly defined job position; it’s about collaborating with the best people you can find and building a role that brings out their talents, interests, and abilities in the best possible light.
When you combine passionate, hard-working people with a unified belief in what they are building, the culture of innovation allows that work to be cultivated and spread amongst the masses. Creativity and innovation have steadily made their way to the forefront of why we work, and for the first time in our history, we can gauge the purpose of our work by the favorable impact it has on others rather than the means of survival it provides for ourselves.
However, there is a common belief that viewing work through the lens of its meaning and impact is to do so with a sense of “entitlement.” People who believe this often tell others that they should just be happy to have a job, regardless of how mundane or purposeless it may be. One must learn to spend their time doing things they don’t really care for in order to make money — after all, that’s “just the way the world works.”
While I understand where they are coming from, this argument fails for a simple reason:
When the landscape of “how” we work shifts, the “why” naturally follows as well.
Here’s a quick thought experiment:
If a caveman from 20,000 BC were transported to the industrial age and saw that he could now settle in one place without the fear of being eaten alive, work a set number of hours each week to make money, buy groceries with that money to provide for his family, and even enjoy some leisure time to just chill out and relax, would it make sense for him to call this society “entitled” and go back to hunting animals on the street to show everyone what’s up?
No, that would be ridiculous. His environmental landscape has fundamentally shifted, so he no longer needs to view work as a sole means of staying alive and passing on his genes as soon as possible. Because a new way to work has been introduced, his adaptation to that fact will shift the reason why he works as well.
On its surface, our environment today may not look like it has changed much: we still have cities, roads, power lines, and other things that can be traced directly to the industrial age. But under the hood, the whole structure of human connectivity looks nothing like it did a hundred years ago.
We now live in a world where the pace of change is unprecedented — technological progress is moving at a rate where our predictions of the future sound more like science fiction than reality. In the twentieth century, it was okay to spend the first twenty-some-odd years of your life in school, and then dedicate the rest of your life to a set career path leading to predictable retirement benefits. Multiple decades’ worth of uniform, 40-hour workweeks in a single line of work was the formula for individual success.
But in today’s world, we cannot afford such stagnation.
We need to use the tools of creativity and curiosity to constantly iterate on what we know, and to reinvent ourselves in the process. The technological revolution has shifted our work’s purpose toward creating things that positively impact the lives of others, and the only way to do this is to continuously update our understanding of the world.
In his latest book, 21 Lessons for the 21st Century, Yuval Noah Harari sums up this point quite nicely:
“[I]n the twenty-first century, you can’t afford stability. If you try to hold on to some stable identity, job, or worldview, you risk being left behind as the world flies by you with a whoosh. Given that life expectancy is likely to increase, you might subsequently have to spend many decades as a clueless fossil. To stay relevant — not just economically but above all socially — you will need the ability to constantly learn and to reinvent yourself.”
The frustrating thing about conventional wisdom is that it lurks in the cultures of even the most forward-thinking companies today. Centuries-old work philosophies like standardized work hours, boss-to-worker relationships, and rigid corporate ladders are still prevalent in the technology giants of the modern world.
So regardless of where you work, it’s important to press pause during the workday and ask yourself a few questions.
Are you in an environment where self-directed, curiosity-driven work is highly sought after… or frowned upon?
Do you have the autonomy to work on the things that highlight your unique abilities, or are you forced to make something that fits the “company mold”?
Is creativity viewed as a cornerstone value necessary for growth, or is it seen as a disposable luxury that gets in the way of profit maximization?
As you take the time to ponder these questions, it’s important to see how your work life may or may not be aligned with the reality of what is possible today.
No one really knows what the world will look like a hundred years from now, which is why a quick, hundred-thousand-year zoom-out on the past is a good way of understanding where we are today. The pillars of curiosity, creativity, and collaboration have been major players through most of our collective existence, and they are the only things that can successfully navigate us through the ever-shifting landscape in front of us now. History’s greatest lesson is that change is the only constant, and these pillars will allow us to be on the right side of it as it happens.
So I leave you with a final question:
Are you in an environment that encourages and fosters those values?
If so, that’s awesome — keep pushing forward.
If not, then I encourage you to reflect on that. Perhaps now is a good time to think about what it means to work on something that brings out the best you have to offer.
Since how we work has fundamentally shifted, now is a good time to adjust why we do it as well.
Just remember that necessity — not entitlement — is the fuel behind this desire for meaningful and impactful change.
Note from the author, Lawrence Yeo: This post was created in collaboration with Unsplash. A few months ago I came across the thinking behind their wonderful hiring page, and the insightful ideas in it inspired the creation of this story.
Note from Mikael Cho, Co-Founder/CEO at Unsplash: Lawrence is an incredibly thoughtful creator. As soon as I saw his work, I was hooked. You can see what got us all excited about Lawrence on his site More To That. We hope you enjoyed this story.