There are lots of articles out there telling you how to be a good manager. This isn’t one of them. This post is specifically directed at people who manage search teams, as well as at the people who hire and manage them.
Managing search requires both product and engineering strengths.
What does it mean to manage a search team? Are we talking about an engineering manager or a product manager? The short answer is yes.
In search, you cannot cleanly separate product from engineering. A search product manager needs to understand a lot about engineering, from how architectural choices impact scale and performance to how search user experience decisions affect the design of machine learning pipelines. Conversely, a search engineering manager needs to understand a lot about the product, since so many technical decisions about indexing, retrieval, ranking, and other aspects of search have subtle product implications.
If a single person leads search, that person needs to possess a robust combination of product and engineering strengths. If there are parallel leaders for product and engineering, it is critical that each leans into the other’s role. To some extent that’s true for all software development managers, but it’s especially true for search, where so many key decisions involve a combination of product and engineering trade-offs.
Everyone on a search team needs to be data-informed.
One of the advantages of working on a search engine is that you get robust and rapid quantitative feedback from users. It is fairly straightforward to measure searches, clicks, and conversions. Search success can be tricky to define in a way that best aligns with searcher and business value, but in general it is highly measurable. There’s no excuse for not measuring search success and using that data to drive product development decisions.
But metrics are not just the responsibility of the analysts and data scientists supporting the search team.
It’s critical that search product managers use data to drive their decisions. That goes beyond making launch decisions through A/B testing (which has unique challenges for search). It also means using analysis to prioritize development efforts based on the expected return on investment. Data should drive every aspect of the product development life cycle.
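As a concrete illustration of a data-driven launch decision, here is a minimal sketch of a two-proportion z-test comparing conversion rates between a control and a variant. The function name and the traffic numbers are hypothetical, and a real search A/B test would involve more than a single significance check (as the caveat about search's unique challenges suggests).

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does variant B's conversion rate
    differ significantly from control A's? (Illustrative only.)"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 50k searches per arm.
z, p = two_proportion_z(conv_a=1200, n_a=50_000, conv_b=1320, n_b=50_000)
```

The point is not the statistics per se, but that a launch call should rest on a number like `p`, not on a hunch.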
Engineers need to think about data too, whether they are focused on infrastructure concerns like scale and performance or quality concerns like precision and recall. Engineers invest a lot of effort into their work, and they should be aware of and optimize for measurable return on that investment.
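Quality metrics like precision and recall are straightforward to compute once you have relevance judgments. A minimal sketch, with made-up document ids and a hand-labeled relevant set:

```python
def precision_at_k(ranked_ids, relevant_ids, k):
    """Fraction of the top-k results that are relevant."""
    hits = sum(1 for doc in ranked_ids[:k] if doc in relevant_ids)
    return hits / k

def recall_at_k(ranked_ids, relevant_ids, k):
    """Fraction of all relevant documents that appear in the top k."""
    hits = sum(1 for doc in ranked_ids[:k] if doc in relevant_ids)
    return hits / len(relevant_ids)

# Hypothetical ranked results and relevance judgments.
ranked = ["d1", "d7", "d3", "d9", "d2"]
relevant = {"d1", "d2", "d4"}

p5 = precision_at_k(ranked, relevant, 5)  # 2 of the top 5 are relevant
r5 = recall_at_k(ranked, relevant, 5)     # 2 of the 3 relevant docs found
```

Even a simple harness like this, run against a held-out set of judged queries, gives engineers a measurable return signal for their ranking work.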
Search is fundamentally a data product. Everyone on the team needs to continually look at how searchers behave and what impact that has throughout the search stack, in order to understand which problems need to be solved and how best to solve them.
Search teams should prioritize efforts for incremental innovation.
Full disclosure: I’m a strong proponent of incremental innovation. But part of the reason I’ve gone all-in on incremental innovation is that I’ve spent most of my career working on search, where it’s almost always the right approach.
There are two main reasons to prioritize incremental innovation in search.
The first is that a search product involves a large number of interconnected components, each requiring its own design choices. You make choices about how to represent documents in the index, how to interpret and rewrite search queries, how to determine which documents to retrieve, how to rank and diversify them, how to organize and present them, etc. By innovating on each of these components independently, you can deliver continuous, incremental improvement. In contrast, changing multiple components at once can cause as much harm as good. And even if a complex change yields a net improvement, you won’t know which parts were good and which parts were bad — which leaves gains on the table. Incremental innovation avoids this problem.
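One lightweight way to enforce this one-component-at-a-time discipline is to make each component a named, swappable configuration flag, so every experiment varies exactly one component against the baseline. The component names and variant labels below are hypothetical, purely to sketch the idea:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SearchConfig:
    # One flag per stack component; each experiment changes one flag.
    # All names here are illustrative, not a real system's API.
    query_rewriter: str = "baseline"
    retriever: str = "baseline"
    ranker: str = "baseline"

def experiment_variants():
    """Yield the baseline plus one-component-at-a-time variants,
    so any measured delta is attributable to a single change."""
    yield SearchConfig()
    yield SearchConfig(query_rewriter="synonyms_v2")
    yield SearchConfig(ranker="ltr_v3")

variants = list(experiment_variants())
```

Because every non-baseline variant differs from the control in exactly one field, you always know which component earned (or lost) the metric movement.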
The second is that search generally breaks down into a collection or taxonomy of use cases. Sometimes you can improve search for all of these use cases at once, but such opportunities for across-the-board improvement tend to be hard to discover and even harder to address. More often you’ll observe — or your users will observe — problems that affect only one or a handful of use cases. Incremental innovation allows you to quickly address these problems with targeted improvements, obtaining rapid return on small investments.
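Spotting the use cases that need targeted work usually starts with segmenting your metrics. A minimal sketch, assuming query logs already carry a (hypothetical) use-case label and a clicked/not-clicked outcome:

```python
from collections import defaultdict

def ctr_by_use_case(log_rows):
    """Compute click-through rate per use-case segment.
    Each row is (use_case, clicked) with clicked in {0, 1}."""
    searches = defaultdict(int)
    clicks = defaultdict(int)
    for use_case, clicked in log_rows:
        searches[use_case] += 1
        clicks[use_case] += clicked
    return {uc: clicks[uc] / searches[uc] for uc in searches}

# Hypothetical labeled log rows.
rows = [("navigational", 1), ("navigational", 1),
        ("exploratory", 0), ("exploratory", 1),
        ("known-item", 0), ("known-item", 0)]

rates = ctr_by_use_case(rows)
```

A breakdown like this surfaces the underperforming segment (here, the invented "known-item" case) as a candidate for a targeted, incremental fix.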
In short: dream big, but execute incrementally.
Search is not just infrastructure, and it’s not just machine learning.
One of the mistakes I’ve seen search teams make is failing to embrace the holistic challenge of search. Specifically, I’ve seen many search teams focus mostly on either infrastructure or machine learning.
A search infrastructure team may end up achieving great reliability and latency, but it’s unlikely to address user challenges. Making content findable and discoverable requires more than optimizing search infrastructure.
Machine learning — or AI if you prefer — offers a great tool set, but it still isn’t enough to deliver great search. A machine learning team may be able to build a robust ranking model, but it’s likely to neglect other aspects of search, such as ways to rewrite search queries or organize results.
A search team needs to think about search as a holistic set of engineering and product challenges. Fortunately, all of us have some experience with search as end-users. The trick is finding people who can understand those challenges from both the outside and the inside.
Search is hard!
A short summary of the above is that managing a search team is hard! If you’ve read this far, you probably knew that already. If you are managing a search team, then I congratulate you on your successes.
Meanwhile, I hope you see each of these areas as opportunities for you and your team to improve. And if you’re hiring people to work on search, I encourage you to look at this set of interview questions. Ultimately, your search will only be as good as the people who work on it.