The history of fanart.tv: Part 2 — The troublesome middle years

Part 1 of the series covered the time before we had a dedicated domain. Part 2 will cover the 7 years between launching the site on its new domain and now.

The last 7 years have been interesting. There have been many ups and downs. People have come and gone: admins, moderators and contributors alike. But nothing has stopped the site's onward march. As the article published a few days ago showed, since 2012 an average of 184 images have been approved every single day.

A tale of two sites

When fanart.tv went live there was another site that also provided custom image types. While it didn't provide ClearLogos or ClearArt, many of xbmcstuff's members were also our members. Unfortunately xbmcstuff's owner had been MIA for a long time and the domain was coming up for renewal. I was approached and asked to import the images onto fanart.tv to ensure they wouldn't be lost. After a lot of consideration I agreed, and that was the start of our image support expansion.

A fresh new look

On March 17th 2012 the new design for the site was launched, and it has been tweaked over the last 5 years.

The new site brought with it a new API, and before long its popularity increased significantly. While gratifying, this did bring with it challenges. Our shared hosting couldn't handle the load, so we had to start looking for a dedicated solution. Over the last 5 years our infrastructure has resided on OVH, SYS, Online.net or a combination, depending on cost and value for money. At the time of writing, as detailed in this post, all our infrastructure is located on Online.net.

Financial meltdown

Around July 2012 I started to get a bit worried. Recurring donations only accounted for about 1/6 of the cost of running the site. I was in the process of buying my first house, and having to make up the difference was hard. I put out a plea for ideas to the community, as while I didn't want to shut down the site, it was definitely on the cards.

We received a lot of ideas, some of which were implemented; others are scheduled for the new site. But the biggest thing that got us out of that hole was xbmcnerds (now kodinerds) and Passion-XBMC (now Media-Passion). They each held a collection among their members and ended up donating £304.43 and £458.65 respectively. Over time, they and other communities donated enough, along with AdSense and VIP memberships, to enable the site to support itself.

So, about those API requests…

I wrote in part 1 about how amazed I was by 130,000 API requests a month in 2010. By the end of 2014 we had processed 1.7 billion requests, an average of 143 million requests a month. This average is skewed by the fact that hits increased month on month. What was interesting was that Plex appeared to be the major player missing from the top 10. We posted a tweet asking where they were; by April 2016 they were our biggest user, with over 134 million requests a month. In addition, API requests across the board were up, and we were processing over 300 million requests a month.

Not long after, Google started politely reminding us that the cap for Analytics is 10 million hits a month. They kindly invited us to move to an enterprise account (a bargain £100k a year, if I remember correctly). This effectively ended any useful tracking data for the API, a situation that will be fixed with the new site.

From there the only usage tracking we had was provided by CloudFlare via our Pro account. It was an overview only, but still useful. There was no way to split out requests for images (much higher) from API requests, but it was nice to have some kind of overview at least. You might be able to guess what came next.

In May 2016, Daniel Carrillo from CloudFlare contacted me, inviting me to upgrade to an enterprise plan for just $2,500 a month. A bargain compared to the $4,000 a month other CDNs would charge me, I was assured. The problem at the time was that they were caching 59TB a month out of a total of 83TB a month. $2,500 a month was more than we received in donations in a year. The solution, it turned out, was a $16 a month dedicated server from Online.net and disabling caching. That got CloudFlare off my back for a few months.

In May 2017, Kelby Balson from CloudFlare contacted me, again inviting me to upgrade to an enterprise plan.

"we want to offer you the opportunity to upgrade to our Enterprise tier product at the discounted pricing below"

"Below" what, I have no idea, as no pricing was specified. As it turns out, $1,000 a month is what they considered about the right level. What was the issue now? They were no longer caching assets for the API, which was now at 353TB a month. So what was the problem? Proxying over 16 billion requests a month, apparently.

We were proxying the data through CloudFlare for two reasons. First, to easily enable SSL on everything. Second, so we could track how much bandwidth was being used and how many requests were being made. At $1,000 a month we only had one option, and that, surprisingly, wasn't to start paying $1,000 a month.

Well, that concludes part 2. Look out for part 3 tomorrow to find out where the site is heading in the future.