Getting traction for Traction book: 12,328 copies by the numbers

Traction — a book that Justin and I self-published — launched on Aug 26, 2014. Within two months we met our initial goal of 10,000 copies sold (12,328 when this post was first published, now 37,617). Below are all the details of how we met that goal.

Getting Traction for Traction

We used our own Bullseye framework and the other traction strategies and tactics detailed in our book to get traction for Traction. We first established our initial traction goal of selling 10,000 copies, chosen because it would achieve meaningful distribution of our ideas in the tech startup community and put the project at roughly break-even. Next we decided that we would initially target tech startup founders and marketers, as opposed to the wider entrepreneurial or small business audiences.

We chose this niche because we thought we could reach our initial traction goal within it and because our personal connections and brands are strongest there. Once chosen, we systematically compiled all the places (online and offline) where these people (ourselves included!) hang out. Then we brainstormed every traction channel and chose three to test as part of our middle ring, based on the strength of our testing ideas: email marketing, targeting blogs (and podcasts), and existing platforms (e.g. Amazon).

Designing Middle Ring Tests

Testing email marketing was something of a no-brainer since we'd been growing our list passively for five years. However, there was some question whether it would work, since we had sent the list almost nothing for most of those years (not the best email marketing strategy, I know). We decided to test "warming up" the list by asking people to apply to be early readers, which was also extremely useful for our last round of editing. We then started sending some traction tips, announced pre-orders, and ultimately sent a "We're Live!" email at launch.

Launch Test Results

Our "list warming" email marketing test went well. Our "early reader application" email went to 3,500 subscribers and had a 62% open rate and a 30% click-through rate. Our "pre-order" email went to 4,300 subscribers with a similar 32% click-through rate. And our launch "we're live" email went to about 5,000 subscribers and also performed well. The growth in subscriber counts came from people joining our list between messages.
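As a sanity check, here is the back-of-the-envelope funnel math behind those emails. The open and click rates are from above; the click-to-purchase rate is an assumed placeholder, not a figure from our data:

```python
# Back-of-the-envelope email funnel math. Open/click rates are from the post;
# buy_rate (click -> purchase) is an assumed placeholder for illustration.
def funnel(subscribers, open_rate, ctr, buy_rate=0.10):
    """Estimate opens, clicks, and sales from a single email blast.

    ctr is treated as click-throughs per subscriber (the post does not say
    whether its rates are measured against opens or the whole list).
    """
    opens = subscribers * open_rate
    clicks = subscribers * ctr
    sales = clicks * buy_rate
    return round(opens), round(clicks), round(sales)

# "Early reader" email: 3,500 subscribers, 62% opens, 30% CTR
print(funnel(3500, 0.62, 0.30))  # -> (2170, 1050, 105)
```

Even with a generous assumed purchase rate, a single blast to a few thousand subscribers moves at most hundreds of copies, which is why repeated sends and list growth mattered.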

Our general testing strategy for these three middle ring channels was a bit different from what we would normally advise, in that we did more extensive testing from the get-go. We did this for a couple of reasons. First, one channel test we thought was especially promising was getting high rankings on Amazon, and testing that effectively required a lot of sales in a short amount of time. Second, we had studied successful book launches, and there is a phenomenon of saturating a niche: sales multiply if the target audience hears about the book "everywhere" within a few days or weeks. We wanted to make the most of our launch and trigger this effect if possible, at least among startup folks focused on growth.

To figure out what was and wasn't working, we started using Amazon affiliate IDs, since you can create a different ID for each link. (Author tip: you can get some extra money back from Amazon this way and increase your effective royalty rate.)

Different IDs had drastically different volumes and conversion rates. (Note: not all of the items purchased through our Amazon affiliate IDs are for our book as we don’t control what people buy after they click our link. And this is US only.)
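For anyone wanting to replicate this tracking, here is a minimal sketch of how per-channel affiliate links can be built, one Associates tag per placement. The ASIN and tag names below are made-up placeholders, not our real ones:

```python
from urllib.parse import urlencode

# Sketch of per-channel affiliate tracking: a distinct Amazon Associates tag
# for every place a link appears, so sales can be attributed per channel.
# The ASIN ("B00EXAMPLE") and tag names are hypothetical placeholders.
def affiliate_link(asin, tag):
    return f"https://www.amazon.com/dp/{asin}?{urlencode({'tag': tag})}"

channels = ["guestpost-01", "twitter-01", "email-launch"]
links = {c: affiliate_link("B00EXAMPLE", c) for c in channels}
for channel, url in links.items():
    print(channel, url)
```

Because each tag shows up separately in the Associates reports, comparing clicks and conversions per tag gives a rough per-channel scoreboard.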

From this data we decided that guest posting was not the best strategy compared to the others. Our high-profile guest posts were producing sales in the hundreds of copies at best. That strategy was not going to get us to 10,000 copies, especially since we had already hit several of the major outlets in our chosen niche. I think the main reason is that no one wants a blatant ad for a book on their blog, so the posts, while good in terms of content, did not have strong calls to action and therefore didn't convert into many book sales.

The same was true of our Twitter, YouTube, and Slideshare existing-platform tests. We tried building up followers on @tractionbook by tweeting out traction tips, but this test really didn't move the needle. It looks better in the chart than it actually was, since most of those conversions came from just two RTs by @pmarca and @davemcclure. Our book trailer still has fewer than 2,500 views despite our efforts. And our slides, while featured on the Slideshare homepage with over 35,000 views, just didn't translate into sales.

After a few days we backed off these per-link IDs because we realized we had an international audience and so couldn't reasonably send people directly to Amazon.com (as opposed to Amazon.de, etc. — 25% of our Kindle sales are international). So we started sending people directly to tractionbook.com instead, and from then on all of those clicks went into the main site ID.
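One way a site like ours can handle this is a simple storefront redirect: detect the visitor's country and route them to the matching Amazon site. A minimal sketch, with a partial and purely illustrative country mapping and a placeholder ASIN:

```python
# Sketch of routing international visitors to their local Amazon storefront
# instead of always linking to Amazon.com. The mapping is partial and the
# ASIN is a hypothetical placeholder.
AMAZON_DOMAINS = {
    "US": "amazon.com",
    "DE": "amazon.de",
    "GB": "amazon.co.uk",
    "CA": "amazon.ca",
}

def storefront(country_code, asin="B00EXAMPLE"):
    """Return the product URL on the visitor's local storefront,
    falling back to Amazon.com for unmapped countries."""
    domain = AMAZON_DOMAINS.get(country_code.upper(), "amazon.com")
    return f"https://www.{domain}/dp/{asin}"

print(storefront("de"))  # -> https://www.amazon.de/dp/B00EXAMPLE
```

The trade-off we hit is visible in the code: once you redirect through your own site, per-link affiliate attribution collapses into one ID.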

You can tell from the website traffic stats above that the Product Hunt and Hacker News tests went very well and contributed significantly to our initial traction goal and launch success, but we decided they aren't very repeatable over the long term. Also, t.co (Twitter) is misleading, since this is more organic Twitter spread encouraged by our email marketing tests than the explicit Twitter existing-platforms test explained above. Here's a more bird's-eye view of traffic sources on the site:

"Links" (24%) includes Product Hunt and "Social media" (19%) includes Hacker News. That means direct traffic (42%) roughly matched the two of them combined (43%). Some of that is probably mis-counted and some of it is certainly podcasts (more on that below), but it is also clear that organic spread was very high during our launch period, with search (14%) being another indication of this. In previous periods, search and direct traffic were minimal. For comparison, here is what the same-length period looked like earlier in the year:

That covers all the initial tests except Amazon and podcasts. The third way we could tell what was and wasn't working was our bonus strategy, where we gave away special bonuses for buying early and more bonuses for picking up multiple copies. When people redeemed the bonuses, we asked them where they had heard of us, which became a rough proxy for successful tests (with the large caveat that it covered only the subset of people who heard about the bonuses by visiting our site or being on our list). From these emails it became clear that podcasts were really working: the initial launch podcasts we had lined up generated a lot of bonus mentions despite their audiences not being enormous.

On Amazon, at launch our book shot to the top of all sorts of sub-genres (like Startups and Web Marketing), consistently jumped in and out of the top 20 in the major Business & Money category, and topped the new releases list in that section for a while. However, we saw very little evidence that this actually led to many additional sales. My guess is that not many people actually browse these category lists. It may have had a "social proof" effect, because Amazon shows best-seller badges next to books, but those badges appear even in very narrow sub-genres, so you don't need many sales to earn them.

Finally, we also ended up running a few natural experiments in other channels. I hosted a launch event at Villanova for the Philly startup community (a speaking engagements test). We set up all our book resources on a discussion forum that is linked at the end of every chapter (a community building test). We also started encouraging Bullseye meetups (an offline events test). And I reached out to a few reporters I know as a quick PR test. None of these proved nearly as fruitful as our middle ring tests, which is a good indication that we actually applied Bullseye correctly!

Focusing After Launch

From the initial launch tests we decided to focus immediately on podcasts, since the ROI was clearly high and the path was easy and clear. We quickly recorded or scheduled twenty podcast appearances aimed at our target audience. Here is what that looked like:

We could tell pretty quickly that this podcast strategy would hit diminishing returns unless we widened our target audience, so we switched our focus to email marketing. As mentioned above, one tactic we layered on was the bonus strategy, where we offered our list incentive bonuses for buying Traction in the first few weeks. The main bonus was a special e-book (ten awesome traction research interviews that we transcribed), but you could get even more cool stuff for buying multiple copies. 622 people took us up on the first-level bonus, 30 on the second, one on the third, and none after that.

We also set up a simple email sequence for new subscribers, designed to engage them and, we hope, eventually convince them to pick up the book. Email marketing is clearly the most sustainable channel that emerged from our testing phase and is therefore arguably the channel we should continue to focus on now. However, for various reasons, including determining our next traction goal and our personal situations, we may not (more on that at the end).

Our final email marketing play was a Kindle Countdown Deal, where we discounted the Kindle version of Traction by 70% to $2.99 (from $9.99). When the deal went live we emailed our whole list, hoping to pick up people who had been on the fence and to help them spread the deal to their startup friends. You can see the results in our Kindle unit sales graph over the whole period the book has been live. The deal period is the huge spike, dwarfing our initial Kindle launch. I believe most of these sales came from the email blast, both because of the steep drop-off afterward (despite the book still ranking highly on Amazon lists) and because we have the affiliate ID data.

If we had continued to focus on email marketing, we would have needed to find a steady stream of great content to send to our list — like this post! (That is, if you like this post, you probably want to sign up for the list.)

Traction Channels Summary

It is hard to break down exactly how many sales each channel contributed, but here is our best guesstimate:

  • 33% Email marketing
  • 25% Targeting podcasts
  • 25% Viral marketing (completely organic)
  • 10% Targeting blogs/aggregators (Hacker News, Product Hunt)
  • 3% Amazon rankings
  • 2% Speaking engagements
  • 2% Other

We could account for about 4,500 books through the Amazon affiliate IDs, which means more than half of the sales happened directly on Amazon. It is hard to tease out how much of that half was directly organic (people we didn't influence recommending the book to others) versus indirectly organic (recommendations from people on our list or who heard us on a podcast). In other words, these numbers have decently high error bars but are probably order-of-magnitude correct.
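In copy terms, applying those guesstimate percentages to the 12,328 copies sold at the time looks roughly like this (shares are from the list above; the arithmetic is purely illustrative):

```python
# Converting the channel-share guesstimates into approximate copy counts,
# using the 12,328 total from when this post was first published.
TOTAL = 12328
SHARES = {
    "Email marketing": 0.33,
    "Targeting podcasts": 0.25,
    "Viral marketing": 0.25,
    "Blogs/aggregators": 0.10,
    "Amazon rankings": 0.03,
    "Speaking engagements": 0.02,
    "Other": 0.02,
}

estimates = {channel: round(TOTAL * share) for channel, share in SHARES.items()}
print(estimates)

tracked = 4500  # copies attributable via affiliate IDs
print(f"untracked (bought directly on Amazon): {TOTAL - tracked}")  # -> 7828
```

So email marketing alone accounts for roughly 4,000 copies under this guesstimate, which lines up with the ~4,500 copies the affiliate IDs could track.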

Also, I should mention two other factors, unrelated to traction channels, that contributed to and took away from our traction. First, we asked readers to review the book on Amazon, because we've heard reviews really increase conversions when people are deciding whether to buy, and they have some impact on rankings and recommendations. (Related note: if you read the book and liked it, please post a review!) On the flip side, the hardcover went out of stock on Amazon for several weeks from mid-September to early October, which probably significantly hurt conversions, since Amazon was saying "ships within 1–2 months."

Going from 10,000 to What?

We are very proud of our book launch and that applying Bullseye got us to our initial traction goal of selling 10,000 copies. This put us in an interesting situation. First, Justin and I are running our own startups and don't have time to treat the Traction book as a second startup. Many authors do this successfully by building a business on the "backend" of their works through speaking, consulting, etc.

Second, we are unsure what a reasonable upper bound is for Traction copies sold. We are fairly certain we could get to 50,000 copies if we tried hard and pressed forward with email marketing (we're at 37,000 now). But 100,000? 250,000? Beyond? This is unclear, and it somewhat depends on whether the book appeals to a broader entrepreneur/small business audience or is really a tech-startup founder book.

There are some good signs:

Update: Ultimately what we decided is to set a new traction goal of 100,000 copies and re-run Bullseye in that context. After doing a new set of middle ring tests (email marketing, content marketing, business development), we decided to focus on business development and partnered with Penguin to release a second edition of Traction, re-launching October 6th, 2015. Please wish us luck on the re-launch!

Update 2: If you’re curious about the profitability of Traction, check out 37,617 copies of Traction book: how much they cost and what we made.

Gabriel Weinberg, @yegg
CEO & Founder, DuckDuckGo
Co-author, Traction


Re-edited September 13, 2015. Originally published on October 23, 2014.