Engineering at Imgur

Brian Kassouf
Imgur Engineering
Sep 15, 2015

Imgur started in 2009 as an image host, and over the years we have become the go-to image host for many sites across the internet. We did this so well that we were likened to the electricity of the internet.

Electricity is something that everyone needs to live comfortably in their homes. More often than not, we take it for granted and use it without thinking. As people see the latest viral gifs and memes around the Internet, they’re using Imgur and not even thinking about it. — Amanda Dodge

But in a few short years, Imgur evolved beyond an image host. Today Imgur is a community of image sharers — a destination where people come to discover, share and enjoy awesome images on the Internet. This community around images is where our most interesting engineering challenges are introduced. Let’s take a look at some of the problems we are trying to tackle.

Scale

Imgur operates at a huge scale. We are one of the most trafficked websites in the world, with 150 million monthly active users generating more than 60 billion image views every month. Our code has to be as efficient as possible to cut down on load times and server costs, and we want Imgur to be bulletproof: fault-tolerant and scalable.

We work toward this through many initiatives, including designing for failure, building in redundancy and fault recovery, and supporting graceful degradation in the form of a “big red switch”.
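To make the “big red switch” idea concrete, here is a minimal Go sketch of a process-wide kill switch with a cheap cached fallback. The names and the atomic-flag backing are illustrative assumptions; in production the flag would be driven by shared config or a store like Redis so operators can flip it fleet-wide.

```go
package main

import (
	"fmt"
	"sync/atomic"
)

// BigRedSwitch is a process-wide kill switch. Here it is just an atomic
// flag; in a real deployment the value would come from a shared store or
// config push so it can be flipped across the whole fleet at once.
type BigRedSwitch struct {
	degraded atomic.Bool
}

func (s *BigRedSwitch) Flip(on bool)   { s.degraded.Store(on) }
func (s *BigRedSwitch) Degraded() bool { return s.degraded.Load() }

// ServeGallery degrades gracefully: when the switch is flipped, it skips
// expensive personalization and serves a cached, pre-rendered gallery.
func ServeGallery(s *BigRedSwitch, userID string) string {
	if s.Degraded() {
		return "cached popular gallery" // cheap fallback path
	}
	return "personalized gallery for " + userID // full experience
}

func main() {
	var s BigRedSwitch
	fmt.Println(ServeGallery(&s, "alice")) // normal operation
	s.Flip(true)                           // incident: flip the big red switch
	fmt.Println(ServeGallery(&s, "alice")) // degraded, but still serving
}
```

The point is that every expensive code path has a cheap answer to fall back to, so an incident degrades the experience instead of taking the site down.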

Data & Analytics

The best way to improve our platform is to understand it. We are already logging a lot of metrics, and every day we add more. We have only scratched the surface of what we can learn from this data and how it can transform our product decisions. We run A/B tests to help us experiment and verify new products. Going forward we want to be able to answer any question we might have with both intuition and data.
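As an illustration of how sticky experiment assignment can work, here is a short Go sketch that buckets users deterministically by hashing the experiment name and user ID together. This is a generic pattern, not necessarily our exact implementation.

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// Variant deterministically assigns a user to an experiment bucket.
// Hashing experiment+user keeps the assignment sticky across sessions
// and independent across experiments.
func Variant(experiment, userID string, variants []string) string {
	h := fnv.New32a()
	h.Write([]byte(experiment + ":" + userID))
	return variants[h.Sum32()%uint32(len(variants))]
}

func main() {
	for _, u := range []string{"alice", "bob", "carol"} {
		fmt.Println(u, "->", Variant("new-upload-flow", u, []string{"control", "treatment"}))
	}
}
```

Because the assignment is a pure function of its inputs, a user sees the same variant on every visit without any lookup table.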

A problem we recently solved was building the pipeline that moves and processes our massive volume of image-view logs. Another engineer, Carlos, and I got to learn technologies like Flume, Kafka, and Storm, and we can now count every view in near real time.
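The heart of that pipeline is a simple idea: accumulate per-image counts and flush them to a store on a short interval. The Go sketch below stands in for the real thing, with a channel playing the role of the Kafka topic and the flush loop playing the role of the Storm aggregation; all names are illustrative.

```go
package main

import (
	"fmt"
	"time"
)

// countViews accumulates per-image view counts and flushes them on a
// short interval, so totals stay near real time without writing to the
// store on every single event.
func countViews(events <-chan string, flushEvery time.Duration) {
	counts := make(map[string]int)
	ticker := time.NewTicker(flushEvery)
	defer ticker.Stop()
	for {
		select {
		case img, ok := <-events:
			if !ok {
				return // input closed; stop counting
			}
			counts[img]++ // one view event per message
		case <-ticker.C:
			for img, n := range counts {
				fmt.Printf("flush: %s +%d views\n", img, n) // write deltas to the store
			}
			counts = make(map[string]int) // start the next window
		}
	}
}

func main() {
	events := make(chan string)
	go countViews(events, 500*time.Millisecond)
	for _, img := range []string{"cat1", "cat1", "dog7", "cat1"} {
		events <- img
	}
	time.Sleep(time.Second) // let a flush happen before exiting
	close(events)
}
```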

Recommendations

We have always been great at surfacing the best of the best content on Imgur. We have honed the algorithms that determine which images will be globally appealing, and we compute a “virality score” that tells us which images are popular this very second.

However, how do we determine what images one specific person would be interested in? How do we balance favoring recency vs. content? How do we show a variety of posts that are interesting to the user, without narrowing the scope of the content too much? After all, there is a limit to how many cat pictures someone can view in a row…or is there? And finally, how do we get all the data to know these things?
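To give a flavor of the ranking side, here is a toy popularity-with-decay score in Go. Our actual virality formula is not shown here; this sketch only illustrates the general shape, where engagement pushes a post up and age pulls it down.

```go
package main

import (
	"fmt"
	"math"
	"time"
)

// viralityScore is a hypothetical popularity-with-decay score, not
// Imgur's real formula. Engagement (points, views) pushes a post up,
// and exponential decay on age pulls it down, so "popular this very
// second" beats "popular yesterday".
func viralityScore(points, views int, postedAt time.Time) float64 {
	ageHours := time.Since(postedAt).Hours()
	engagement := float64(points) + 0.01*float64(views) // weight views lightly
	return engagement * math.Exp(-ageHours/6)           // ~6-hour decay constant
}

func main() {
	now := time.Now()
	fmt.Println(viralityScore(500, 20000, now.Add(-1*time.Hour)))  // fresh hit
	fmt.Println(viralityScore(500, 20000, now.Add(-24*time.Hour))) // same post, a day old
}
```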

Mobile

We think there is so much potential for Imgur on mobile. Flipping through the gallery on the bus is the perfect use case: images are quick to consume, perfect for on the go.

We are constantly iterating on our iPhone and Android apps, and we strive for them to deliver a seamless and beautiful interface for digesting images. Mobile poses difficult challenges for us: preloading images so there aren't long waits, handling image compression in a smart way to save on bandwidth, failing over gracefully when a device is offline, keeping animations quick and sleek, displaying arbitrarily nested comments in a performant way (sketched below), and more.
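As a taste of that last problem, one common approach is to flatten the comment tree into rows tagged with a depth, so the UI renders a plain scrolling list instead of a recursive view hierarchy. Here is a Go sketch of the idea; it illustrates the data transformation, not the apps' actual code.

```go
package main

import "fmt"

// Comment is an arbitrarily nested comment tree as an API might deliver it.
type Comment struct {
	Author  string
	Body    string
	Replies []Comment
}

// Row is what a mobile list view actually wants: a flat entry plus an
// indent level, so rendering is a simple scrollable list.
type Row struct {
	Depth int
	Text  string
}

// flatten walks the tree depth-first, emitting one Row per comment.
func flatten(c Comment, depth int, out []Row) []Row {
	out = append(out, Row{depth, c.Author + ": " + c.Body})
	for _, r := range c.Replies {
		out = flatten(r, depth+1, out)
	}
	return out
}

func main() {
	thread := Comment{"ana", "nice cat", []Comment{
		{"bo", "agreed", []Comment{{"cy", "so fluffy", nil}}},
	}}
	for _, row := range flatten(thread, 0, nil) {
		fmt.Printf("%*s%s\n", row.Depth*2, "", row.Text) // indent by depth
	}
}
```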

Content Creation

We want to provide a platform for not only sharing the best images, but creating them. We already have a meme generator and a gif creator, but how do we take this a step further? We want to be the best place on the internet to create and share memes, gifs, and long-form image stories, and to give users the best tools for self-expression through images.

Technology

Our stack is awesome, and constantly getting better. We use PHP for most of our web app. We use a few data stores: MySQL, HBase, Redis, Elasticsearch, and memcached. We write modular client-side JavaScript using React. Nginx is our reverse proxy, and HAProxy powers our load balancers.

We are starting to use Go for many of our backend services: Incus, our real-time messaging service, powers all websocket connections and mobile push notifications for our platform. Mandible powers a subset of our thumbnail generation. We also have a Go service that converts gifs to various video formats.
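Converting a gif to MP4 is, at its core, a careful ffmpeg invocation. Here is a minimal Go sketch of that idea, not the service's actual code; the flags shown are the standard ones for broad player compatibility.

```go
package main

import (
	"log"
	"os/exec"
)

// gifToMP4 shells out to ffmpeg to convert a gif into an MP4.
// yuv420p and even dimensions are needed for broad player support,
// and faststart moves metadata to the front of the file so playback
// can begin before the download finishes.
func gifToMP4(in, out string) error {
	cmd := exec.Command("ffmpeg",
		"-i", in,
		"-movflags", "faststart",
		"-pix_fmt", "yuv420p",
		"-vf", "scale=trunc(iw/2)*2:trunc(ih/2)*2", // round dimensions down to even
		out,
	)
	return cmd.Run()
}

func main() {
	if err := gifToMP4("in.gif", "out.mp4"); err != nil {
		log.Fatal(err)
	}
}
```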

We use Hadoop, Storm, Flume, and Kafka for our analytics pipeline and real-time processing.

All of this is built on AWS.

Team

Our engineering team is filled with people who are intelligent and humble; people who learn together, support one another, and have fun together.

If joining our team and helping us solve these problems sounds interesting to you, send me a note at brian@imgur.com or check out http://imgur.com/jobs.
