AI Can Help Workers and Create Better Jobs

Q&A With MIT IDE co-director Andrew McAfee

By Dan Bigman, ChiefExecutive.net

The last time I talked with Andrew McAfee, a cofounder and codirector of the MIT Initiative on the Digital Economy at the MIT Sloan School of Management, it was just before Covid. McAfee, author of the bestsellers More From Less, Machine, Platform, Crowd and The Second Machine Age, didn’t predict that the whole world would be shut down by a virus just months after our conversation, but he did nail a lot of other issues, many of which were accelerated by the bull-rush of technology change during the last three years.

Top among them: the explosive growth of artificial intelligence, which fuels a dizzying number of interactions between customers and suppliers, and bosses and employees. While so much of the coverage around technology and the workplace focuses on doomsday scenarios, McAfee, based on a lot of consulting inside actual companies, sees things very differently. He sees a huge, under-discussed opportunity to help humans do work meant for humans. He also sees the chance for companies to truly understand the potential and needs of all of their workers and expand their skills, their pay and their career aspirations.

As an example, he worked with a big e-tailer recently to map the underlying skills of key jobs and then developed algorithms to scan for people at other companies with very different jobs and titles who share the same skill sets, unearthing candidates hiding in plain sight. This kind of effort, scaled and unleashed across business sectors, will prove a boon for employers and workers alike. “You’ll increase their wages and give them a path that feels better to them,” he says. The following conversation was edited for length and clarity.
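(McAfee doesn’t describe the e-tailer’s actual system, but a skills-based match of the kind he sketches could look, in rough outline, like the toy Python below. Every job title, skill and candidate in it is invented for illustration, and the overlap score is a simple stand-in for whatever a production system would use.)

```python
# Illustrative sketch only: rank candidates for a role by underlying skills,
# not by job title. All data below is invented for the example.

def jaccard(a: set, b: set) -> float:
    """Overlap between two skill sets (0 = nothing shared, 1 = identical)."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Skills required by the role we are trying to fill.
target_role = {"inventory forecasting", "sql", "vendor negotiation", "excel"}

# Candidates whose titles look unrelated but whose skills may not be.
candidates = {
    "restaurant shift manager": {"scheduling", "vendor negotiation",
                                 "inventory forecasting", "cash handling"},
    "junior data analyst":      {"sql", "excel", "dashboarding"},
    "retail floor associate":   {"cash handling", "customer service"},
}

# Rank candidates by how much of the target skill set they already cover.
ranked = sorted(candidates.items(),
                key=lambda kv: jaccard(kv[1], target_role),
                reverse=True)

for title, skills in ranked:
    print(f"{title:28s} match = {jaccard(skills, target_role):.2f}")
```

On invented data like this, the shift manager outranks candidates with more conventional-sounding titles, which is the “hiding in plain sight” effect McAfee is pointing to.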


Q: So often what we hear about the interface between technology and the workforce is very negative: that AI is going to put tens of thousands of people out of jobs and that we’re violating privacy. But you say we’re really underselling and underestimating the potential upside of technology and how it can help the workforce and help business do things better together. Is that right?

A: I’m completely saying that. Measurement is not the enemy on either side of that social contract, right? The better I understand what you are actually good at, the better I can make use of those skills. And if I combine that with some information about where you might want to take your career or your next job, that’s fantastic. Am I demonstrably underpaying you? Yeah, in some cases, I absolutely am. We only know that by measuring things pretty well and putting that information in a digestible format.

So, if measurement is intrusive, or surreptitious, or inappropriate for what we’re trying to accomplish, then there are problems with the measurement.

But the idea that measurement of what people do on the job is somehow bad doesn’t make sense for a minute.

Q: So, what’s the gap? Where are we, and where do we need to be so that technology can play a better, more useful role when it comes to the talent in the workforce?

A: We need to build up our capabilities to handle people’s professional information; to get that data as refined as people’s social information. So, the social Web 2.0 is what, 15 years old now? Holy cow, we are good at assessing all the things you do on platforms, all the things I do on platforms, all of our clicks, all of that to serve very tailored ads. That is a huge, well-developed industry! We can argue ethics. We can argue pros and cons. But that is a big and very, very refined industry.

We need to do as well with our professional activity measurement for the simple reason that it’s very hard to manage what you can only measure poorly.

The better we get at measuring things like your skills, your trajectory, and where you want to take your next job and your career, the better we can help you with that.

The thing I don’t like about a lot of the current [media] coverage is it hearkens back to this robber-baron era of thinking about business where there are just evil bosses trying to exploit helpless employees. Man, that’s a tired narrative. That doesn’t really work anymore. I guarantee you every company that I’ve worked with really wants to prevent its talent from walking out the door.

So, they want to know who those people are, why they’re being underpaid, what is the right market wage for their skills, taking into account local conditions, taking into account inflation. A simple question to answer is, what’s the right wage to pay you to minimize the chances you’re going to walk out the door because you’re unhappy with your salary? If we can help companies answer that, both sides will be happier.

Q: So if we get this right, where could we be going? What is the goal?

A: The instant I talk to business leaders about their merit cycle process — what we used to call performance reviews or bonuses — they do some combination of hitting their head against the nearest hard object and curling up into a fetal position. They’re saying, “Look, it’s this long drawn-out process.” Who are we most at risk of losing? Who’s about to walk out the door? The problem is, we don’t really know. We don’t really know the fair wage.

So we wind up in this long series of meetings where there’s some poor person trying to manage an Excel spreadsheet capturing the results of the discussion, where we take the merit pool and try to allocate it intelligently across a group of people.

One thing we can do pretty quickly [with technology] is remove a huge amount of the headache there and say, “Okay, you’ve got $X million of merit pool to allocate across all the people in the marketing department.” Boom. Here’s the first pass at that. Taking into account everything we know about job titles, wages and prevailing regional wages, and things like that, boom, here’s a plan.

Now, you can tweak it: “We’ve got to give Dan more for all these reasons we can talk about. We’ve got to rejigger the rest of the numbers.” And give people a merit cycle process that is relatively painless as opposed to relatively painful, one that involves a decent amount of data and a decent amount of optimization as opposed to a huge amount of guesswork. That’s what we can do.
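(Again, this is not McAfee’s system, just a rough sketch of what a “first pass” allocation could look like in Python: spread a fixed merit pool in proportion to each person’s gap to an estimated market wage, then let managers tweak the result. Every name, salary, benchmark and pool size below is invented.)

```python
# Illustrative sketch only: first-pass merit-pool allocation that nudges the
# most underpaid people (relative to a market benchmark) hardest.

MERIT_POOL = 50_000  # total dollars available for the department (invented)

# name -> (current salary, estimated market wage for the role in this region)
team = {
    "Dan":   (90_000, 105_000),
    "Priya": (80_000,  82_000),
    "Lee":   (70_000,  88_000),
}

# Size of each person's gap to market; never negative.
gaps = {name: max(market - current, 0) for name, (current, market) in team.items()}
total_gap = sum(gaps.values()) or 1  # avoid dividing by zero if nobody is underpaid

# First-pass plan: share the pool in proportion to each person's gap.
plan = {name: round(MERIT_POOL * gap / total_gap, 2) for name, gap in gaps.items()}

for name, raise_amount in plan.items():
    current, market = team[name]
    print(f"{name}: +${raise_amount:,.2f} (current ${current:,}, market ${market:,})")
```

The point of a sketch like this isn’t the formula; it’s that the spreadsheet-wrangling step McAfee describes becomes a starting proposal managers adjust, rather than a blank sheet they argue over.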

And that’s a beachhead to all kinds of other things. What’s the training plan? What’s the upskilling path? How do I, as a boss, feel like I’ve got a decent cockpit or a decent dashboard for the people in my organization instead of the current list of headcount by department? Let’s get a lot better than that.

If you’re going to build a cutting-edge factory today, every machine in that factory is instrumented and you have a very precise dashboard of how well the machines in your factory are doing. The idea that we’re many years behind that with the people, that doesn’t make a ton of sense to me. We need to fix that.

Q: Of course, there’s the potential here for that Orwellian future; look at what China does with social credit scores. How are we not falling into that in this context?

A: It’s a justifiable worry. What’s going to keep us from tipping into one side or the other? At a high level, it’s our values as a society, and then the laws and institutions that come down from that.

We have decided as a liberal democracy that we do not want to be an authoritarian state.

We have this 200-plus year tradition of thinking a lot and then trying to carefully limit the powers of government, for example. So, I ain’t voting for anybody who thinks the social credit score is a good idea. And I’m pretty sure that it would get struck down by the courts as they’re currently configured for a long time to come in the same way that the cops can’t just, you know, beat down your door without a search warrant, right?

On the corporate side, the other big force that we have is competition; employees can take their human capital and go elsewhere with it, which is the huge advantage that a market-based system has over a centrally planned economy.

[It’s so important] to give people valuable skills that they can take elsewhere if their current situation seems really Orwellian to them. There need to be requirements. For example, your employer has to make it clear what is being monitored and not. I don’t know if that’s the current state of the law. It’s the kind of thing we need to think about.

I always believe that sunlight is the best disinfectant. If you or I take a job and the employer says, “Look, here’s what we’re going to do, and if we change that, we are going to tell you about it,” good. I believe that adults, once you give them information, tend to make informed decisions.

There’s this deep split in how we think about people in this technologically sophisticated environment. One view is that they’re kind of clueless and helpless and we must make choices for them. We must say, for example, you can’t keep user information, or you can’t use AI to place people in a job, or to admit them to school.

I find it very paternalistic, this idea that, for example, people aren’t aware of what Facebook is doing. I really think people are aware of what Facebook is doing and how it makes money.

I don’t think that’s one of the great mysteries of our time. And the fact that Facebook has billions of users, to me, means that people are kind of aware of the contract and they’re signing up for it.

Now again, do we need transparency? Do we need clarity? Yeah, I think we could do with some more of that. But I’m in the camp that believes, especially with adults, that if you make sure to give them information, we can trust them to look out for themselves. I’m on the side that says, “Wow, I believe we can fundamentally trust people to make decent decisions if we give them information.”

Q: When trust is violated, people can step away from the job, or from Facebook, or from whatever it is. That’s when they feel they’ve been misused.

A: Absolutely. And that can happen if the employer, for example, wasn’t clear about monitoring, about surveillance. But people working in an Amazon warehouse, are they aware that their activities are being tracked? I certainly think so. There can’t be any big mystery. You’re a warehouse worker at Amazon. You think [Amazon] just doesn’t care what you do for eight hours? I don’t love the idea of that as some kind of violation.

CEO Editor’s Note: Andrew McAfee will keynote the Sept. 28 CEO Talent Summit (hosted virtually) along with former Medtronic CEO Bill George and Verizon CHRO Samantha Hammock.

Originally published at https://chiefexecutive.net on September 20, 2022.
