How to Achieve Perfection by Sweating the Details

Tommi Forsström
8 min read · Aug 8, 2017


Quite often when we talk about product management, we focus on innovation. Big launches, moonshots, hockey stick inflection points and other overnight success stories. What if I told you that the saying “there is no big data, just a whole lot of small data” also applies to products?

To really care for the details, you need tools suited for the finesse you’re shooting for.

I have a confession to make: I’m actually not a big ideas guy. I don’t thrive on chasing after the next revolutionary concept.

I actually think faster horses are a perfectly valid idea most of the time.

A lot of times when we observe hockey stick moments in a company’s lifecycle, we think it’s due to some profound single realization. A “eureka” moment that was preceded by a long string of failure, and followed by nothing but winning.

What we fail to see without zooming in is that quite often that inflection point was actually a ton of tiny little realizations clicking together into one huge landslide. Compound interest. A whisper triggering an avalanche.

Projects vs. Products

This kid is clearly launching and forgetting. Duh. Iterate, kid. Iterate.

While I didn’t include it in my Agile faux-pas list, one tell-tale sign of Agile gone awry is when a team is constantly working on “projects”. Bigger pieces of finite work with a clear beginning and an even clearer end, lasting multiple sprints (or scaramuccis).

While that’s not a total anti-Agile move, it’s quite often a slippery slope down a path where things are launched and forgotten, as the team moves on to the next project.

To graduate from shipping one-off projects to really working on a long-term product, a team and its product person need to have space to sweat out the details. They need to be able to truly embrace the feedback cycle that’s fundamental to Agile. Building, launching, measuring, learning… and repeating ad nauseam.

Big projects are hostile to details. The bigger the project, the less you’ll have the ability to really grind on every little nook and cranny of it.

When you really work on products, you get to drill in. You get to know every single funnel and metric. You know all the levers you can pull to tweak the balance. You understand where a 0.5% increase in conversion will give you $1MM more annually — and where 50% more will barely yield $5.

Devil Is In The Details

One of the most gratifying times of my career was the 1.5 years I spent with Greatist. On the surface it’s not really anything super exciting for a product person. They create content and deliver it to their readers online, monetizing through 3rd parties who are willing to pay for the attention and brand equity Greatist has. In short: a content website.

The thing is, their content, brand and tone of voice are exceptionally good. It deserves to be delivered on a platter worthy of the main course.

During my time leading tech and product at Greatist, we were able to drive our monthly uniques from 4M to 10M. A pretty nice 150% increase with 0 paid traffic acquisition.

What made my job particularly exciting was that the only magic trick we had was a collective commitment to never stop experimenting and to tweak every single detail we could.

While the content team was obsessing over headline formats, different article types, content lengths and social sharing tactics, our platform team (tech, product & design) was making sure we understood every single pixel’s role on our article pages.

We did this through an endless array of tests on literally everything:

  • Placement, size, design, number of social sharing tools, recirculation elements, modals, ads and other retention devices
  • Article font size and column width
  • Machine curated vs. human curated article recommendations
  • Different criteria for recirculation element topics and content
  • Personalization features

…and many more.

Data In, Ego Out


I have another confession: I can have a pretty big ego sometimes. At least I used to. Especially around the ten-year mark of my career, I went through a phase of "I've been around the block, son. I know how this works."

The best part about my time at Greatist was that I realized how wrong my gut often was. I kept making assumptions and having real world users prove how horribly wrong I was. They talked to me through the magic of data.

As long as you understand what it's telling you, data doesn't lie. It's the ultimate argument resolver. At Greatist, I learned to love being wrong.

When you resolve disagreements with data, you don't harbor resentment that your great idea was pooped on. The winner of arguments doesn't win by pulling rank, yelling louder or using bigger words. They win because their point of view was proven right.

It also really helps you learn something from being wrong. So basically you can only win. Either you’re right, or you get to tweak your view of the world for the better. Sweet!

How to Rock your A/Bs Right

The best way to squeeze out information about your users, and about how you can impact their behavior, is A/B testing. Here are a few simple tips on how to make sure you're getting the most out of A/B testing, and on how to figure out whether it even makes sense in your context in the first place.

1. Make Sure You’ve Got the Numbers

The first really unfortunate caveat: to get any mileage out of your tests, you need enough traffic to reach statistical significance. When you're just trying to get your startup out the gate, or when you're running a B2B tool, reaching significance on any test will probably take you months, making the process uselessly slow.

When you’re still operating on low volumes, you have no choice but to lean in on the qualitative sides of our craft: user research, interviews… and your gut!

2. Know What You’re Testing

Whenever you run a test, make sure you understand to a ridiculous level of detail what you’re actually testing for. Tests can only give you accurate readouts if you can truly isolate the change in behavior to the test you’re running.

If you change the color of a button, don't try to observe the change in top-line numbers like revenue. Make sure you observe the interactions with that button that are indisputably driven by the test.

Also make sure you have a strong hypothesis on the negative effects of the test. If, for example, you’re testing new larger recirculation elements that make the sidebar of an article more prominent, don’t just track the click-through of the elements. Make sure you track completion rates of reading through the whole article and time spent on the page as well. You need to understand what you’re trading in for the gains you’re receiving.
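
To make that concrete, here's a minimal sketch of reading a test out against one primary metric and one guardrail; the event counts are invented and the statsmodels z-test is my choice of tool, not a description of Greatist's actual stack.

```python
# Hypothetical readout for a "bigger recirculation sidebar" test.
from statsmodels.stats.proportion import proportions_ztest

results = {
    "recirc_clicks":       {"control": (1_800, 60_000), "variant": (2_300, 60_000)},   # primary metric
    "article_completions": {"control": (27_000, 60_000), "variant": (24_500, 60_000)},  # guardrail
}

for metric, arms in results.items():
    counts = [arms["variant"][0], arms["control"][0]]
    nobs = [arms["variant"][1], arms["control"][1]]
    _, p_value = proportions_ztest(counts, nobs)
    lift = counts[0] / nobs[0] - counts[1] / nobs[1]
    print(f"{metric}: lift = {lift:+.2%}, p = {p_value:.4f}")
```

If recirculation clicks go up but completions drop significantly, you know exactly what you traded away for the extra click-through.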

3. Don’t “A/B” Until you Have an “A”

This is a tough one. A/B testing is an absolute waste of time when the baseline version of your product or feature is clearly faulty. Why bother testing something that is already intuitively completely wrong?

This also holds true when you don't really understand how your baseline works. Before running headfirst into testing, make sure you understand the current product and all its details first.

Only when you're comfortable and familiar with your control can you make hypotheses about how to improve it and truly figure out whether you were right.

4. Don’t Micro-Optimize

Grinding the details can spiral out of control unless you understand the value of that grinding. A 0.05pp increase on a critical sharing function or conversion flow can totally be worth it, while gaining huge improvements on something peripheral can still be an absolute waste of time.

Always understand the big picture impact of whatever detail you’re working to improve.
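
A quick impact-sizing calculation before you even design a test helps keep that big picture in view. A minimal sketch with invented traffic and revenue numbers:

```python
# Back-of-envelope sizing of a potential win, with made-up numbers.

def annual_impact(monthly_visitors, lift_pp, value_per_conversion):
    """Extra annual revenue from an absolute (percentage-point) conversion lift."""
    extra_conversions_per_month = monthly_visitors * (lift_pp / 100)
    return extra_conversions_per_month * value_per_conversion * 12

# +0.05pp on a critical conversion flow seen by 4M visitors a month,
# at $40 per conversion:
print(f"${annual_impact(4_000_000, 0.05, 40):,.0f} / year")   # ~$960,000

# +5pp on a peripheral page seen by 2,000 visitors a month,
# at $0.10 of value per conversion:
print(f"${annual_impact(2_000, 5, 0.10):,.0f} / year")        # ~$120
```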

5. Don’t Forget to Segment

If the tools at your disposal allow it, make sure you understand the impact of your tests on different segments.

At Greatist, we usually broke most tests down by these segments, among others:

  • Traffic source: FB, Pinterest, other social, search, direct
  • Article type (recipe, workout of the day, listicle, etc.)
  • Device class (mobile, desktop, tablet)

It was really useful to see how some changes impacted, say, mobile users coming from Facebook to a listicle far more than desktop users coming from search to a recipe. This helped immensely with understanding the different usage patterns between user types.
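
If your analytics export is something you can load into pandas, a segmented readout can be as simple as a groupby. A minimal sketch; the column names and the tiny inline sample are assumptions for illustration, not Greatist's actual schema.

```python
import pandas as pd

# Tiny invented sample; in practice this would come from your analytics export.
events = pd.DataFrame({
    "variant":           ["control", "variant", "variant", "control", "variant"],
    "traffic_source":    ["facebook", "facebook", "search", "search", "pinterest"],
    "article_type":      ["listicle", "listicle", "recipe", "recipe", "workout"],
    "device_class":      ["mobile", "mobile", "desktop", "desktop", "mobile"],
    "user_id":           ["u1", "u2", "u3", "u4", "u5"],
    "clicked_recirc":    [0, 1, 0, 0, 1],
    "completed_article": [1, 1, 0, 1, 0],
})

segmented = (
    events
    .groupby(["variant", "traffic_source", "article_type", "device_class"])
    .agg(pageviews=("user_id", "count"),
         recirc_ctr=("clicked_recirc", "mean"),
         completion_rate=("completed_article", "mean"))
    .reset_index()
)

# A change that looks flat overall can be a clear win for one segment
# and a clear loss for another.
print(segmented.sort_values("pageviews", ascending=False))
```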

6. Be Serious About Data

Captain Obvious here: none of this is possible without data being a top priority from day one. If your company doesn't already have its act together with data, the time to start was yesterday. Without data, you're flying blind and you're missing out.

Every interaction a user has before you're gathering data is valuable information that's lost forever.

That isn’t to say more data is better. Being serious about data also means making sure you’re tracking the right things in the right format and without ambiguity.

Hopefully this post will inspire you to find satisfaction in details and data. Don’t get me wrong, to be a really advanced product person, you still need to ace the big picture stuff and have a strong qualitative side too.

I’ve just found that in the trenches of the real world, this attention to detail and appreciation of compounding small wins rarely gets the recognition it deserves. This is my shout-out to all the product brothers and sisters out there milking out the tiny multipliers!

Nailed the details, kid. Great job. The growth of our lute business will be off the chain. Full-on hockey-stick.

Like What You Read?

  • Join my Email Gang on TinyLetter! One weekly email, no spam, no ads.
  • Please join the discussion in the comments below!
  • Read more about why I’m writing these posts:



Tommi Forsström

VP of Product at Teachable. Ex-Shutterstock, Splice & Produx Labs / Insight Partners. Lives in NYC, originally from Helsinki, Finland. http://forssto.com/blog