CTO Corner #4: Know What Problem You’re Solving
We’re back with the CTO Corner! I’m Leo, a software engineer at Jebbit, and I’ve been interviewing heads of engineering to get their take on running a team, working with other departments, and the future of tech. This week, I talked with Aaron Olmstead, the head of engineering and product at Liquid. We discussed the difference between engineering at a large company and a startup, what makes an ad-tech leader, and when to build things in-house versus outsourcing.
How did you get into engineering?
I did a couple of programming workshops when I was a kid. Those were pretty neat and I got the bug to want my own computer. I got a Commodore 64 when I was eight or nine, and when I got to high school I took some computer science courses. Then I majored in that in college and here I am.
For the past eight years I've been in ad tech, mostly at startups. Publishers Clearing House (PCH) is a slightly different situation: I work for Liquid, which is the digital advertising part of PCH. It's a great opportunity to have a startup culture with the stability a bigger organization gives you.
I’ve been at a bunch of startups and a few larger companies. It’s been useful to see both sides of that. They have very different ways of operating. It’s good to be able to adapt to that change.
Is being head of engineering more difficult at a startup or at a large company?
After Fiksu I had my own startup that I tried to get off the ground for a couple of years. That was a tremendous learning experience, getting exposed to the parts of the business that you don't see as an engineer. I learned how to write contracts, what an LLC means in the state of Massachusetts, and how to do sales (though not very well!). That's something I don't get as much at a company like PCH, where I work more with my team and HR and develop growth plans. But it's not like you're learning more at one than at the other; you're just learning different things.
What are the overall challenges of being a department head at a tech company?
There's stuff that transcends the size of the company. Once you're more than three or four people, you run into tough things like getting the business side to engage with the technology and getting the business aligned with upfront engineering costs. Unless you're working for someone who came straight out of engineering, you run into the issue that once a platform is built, they think you're done. But really, once you build a piece of software, you're married to it.
Then there’s stuff unique to ad tech in particular. This transcends big companies and small companies. This industry is dominated by Google and Apple and Facebook — any one of them can make a decision that completely changes the competitive landscape overnight. As a CTO you’re always balancing between building out particular business relationships and the technology stack, versus diversifying, to prepare you for these changes.
What would you say are major changes you made to positively impact Liquid?
I'm cynical and experienced enough that I'm less excited about writing code for the sake of writing code. I try to get us to think more deliberately about things like when it's time to build a solution versus buy it off the shelf. When I came on, we were about to build a giant software stack that did all the same stuff as all the other ad stacks I've worked on. So I asked: what do we do at PCH that's unique to us? Is this huge initiative worth it, if it's going to eat up a bunch of resources without building anything truly unique? Or do we want to build something based on our unique longstanding data capabilities, granting sweepstakes rewards, and things like that which only PCH can offer? We built services to handle the unique PCH business rules and integrated those with commodity ad-serving and mediation platforms. That means we built a lot less software, but it's far more robust and scalable than what we had before, and has far more unique offerings than anything we could have gotten off the shelf.
Was it hard for your team to make the change of stepping back and realizing things didn’t need to be built in-house?
There was friction, but I think it helps to be able to frame decisions like this on a revenue and cost basis. So you're saying: here's what it takes the engineering team to maintain this stack we've built, here's what we'd need to hire to build this other thing, and here's the opportunity cost of it all. You go around and gather support one person at a time and try to paint the picture that it's better for the overall business.
As you build new products and features, how do you get feedback from account management, sales, marketing, and other teams back to the engineering department?
That's a really good question. It can be hard to get engineers to give a damn about anything that's not code. One thing you can do is make sure that engineers actually see the results of the changes they make: data about campaign performance, revenue, impression volumes, margins, and all that stuff. Engineers tend to like games, so if you give them a tight feedback loop between the changes they make to their systems and how those changes impact monetization, with visualizations of that information, it becomes a game. You engage the engineers that way, and they tend to like seeing it and pay attention.
One other thing, just to be clear: I don't think it's possible to transfer all the knowledge from ad ops, sales, and marketing to the engineering team. Sometimes there can be a chip on the shoulder of engineers, in that they think they can know everything, but there's so much knowledge in the heads of ad ops people and sales people that you could never fit all of it, plus all the systems knowledge they already have, into the engineers' brains! We should work with ad ops and understand that their job is really hard. We're setting up an ad ops boot camp for new ad ops hires, but also for engineers and product, so they can sit in the ad ops chair and learn how people use the tools that my teams build. It's easy for us to think we're the only ones at the company with hard jobs because we have to get up in the middle of the night to fix servers that are melting down, but getting your hands dirty and operating an ad campaign helps. It shows that different teams are all really smart.
To make that communication happen effectively and reliably, everyone needs to have the respect of their peers. You can’t have engineers saying we’re the smartest people and ad ops saying engineers never build the tools we want — that can spiral. That’s why I make sure our teams are all talking to one another.
What would you say makes an innovative leader in mar-tech and ad-tech?
What I've come to learn over my time doing this is that a lot of it, as simple as it sounds, is about paying attention to the market: using data that's relevant to your business, tracking it obsessively, and using those data to adjust what you're doing. A lot of people come in with an ideological agenda and use that to drive decisions in the face of all the evidence, trying to predict where the market will be in three years. Instead, you need to experiment, iterate, and correct your strategies. You've got to figure out not just where the opportunities are, but also what kinds of business problems automation is really effective at solving. What can we build that lets people get their work done faster?
We're coming out of a long period in ad-tech where people assumed that hiring lots of CS PhDs and building out big machine learning systems was going to be the answer to everyone's problems. Now we see some of those companies crashing because they invested more in the notion of building really exciting software than in offering customers a useful service at a good value. There's a growing understanding that machine learning on its own is not a panacea, especially in a space that changes as quickly as this one. The kinds of things humans are good at, like using intuition and knowledge aggregated from a ton of different domains to make quick changes without a lot of data, are complementary to a lot of what software can do, but they're not the kinds of skills that AI can come close to replacing.
Do you think the rapid increase of new companies is healthy for the ad-tech space?
I think competition is always healthy for the space. One thing we’re seeing in ad-tech in particular is that venture money has become more reluctant to invest heavily in ad-tech because it’s become overcapitalized. You’re seeing some bigger and more longstanding companies refocus from rapid growth to sustained profitability, and some people from those companies go out and do their own thing. The more competition you see, the more innovation you’re going to see, because there are zillions of different ways to solve the challenges in the space, and everyone comes at it with different backgrounds and skillsets. Someone with a background building DSPs is going to approach a problem very differently than someone with a background choosing ad networks to integrate, versus someone with a background running campaigns.
Do you have any advice for founders?
That's the thing that's hardest about starting a tech company: you need to very quickly figure out all of those other parts of the business. You need a crash course in different business models, different kinds of contracts, figuring out how to sell if you haven't before, and how to negotiate. You have to recognize when you know enough about running ad campaigns and when you don't, and then you need to find people who complement that. You have to get people with really specific skills.
I highly recommend that anybody interested in the business aspect of whatever tech space they’re in try and start their own company at some point. Mine wasn’t successful but what I learned was invaluable.
Where do you think tech is heading?
That’s kind of the zillion dollar question, right? One of the things I’ve learned in my career is that no one knows. We’re getting better about figuring out the spaces where automation, crowdsourcing, and AI are good at solving problems.
I would have been flabbergasted if someone had told me five years ago that we'd see a proliferation of self-driving cars on the roads way before AI got good at targeting ads. But that's where I think we are now. Tech innovation can be very hard to predict.
So you see things like the spread of facial recognition in social media, and the potential for leaking PII in ad targeting and big data systems, where people can end up using those tools and techniques to do things that, depending on your point of view, might not make the world a better place. We as technologists have a ton of responsibilities that we tend to wash our hands of, saying, it’s not my decision, I just make my code and don’t tell people what to do with it. But without us, some of these issues couldn’t happen. I’d like to see us try harder to avoid hubris, to be aware of what people could potentially do with our tools, and to ask questions like “can I build my ad targeting system in a way that it’s just not possible to leak PII?” Rather than assuming that if some bad actor uses our tools to do sketchy stuff it’s not our problem, can we bake things into the tech itself to ensure it can only be used on the up-and-up? (PCH does some of this with our audience systems, by the way, which I’m really proud of.) It may be a stretch to think about Asimov’s Three Laws of Robotics in the context of ad targeting, but I think we as engineers can do more than we do, and I’d like to see us as a profession think about how our tools might get used a lot more than we do.