Creating engaging and custom sharing images for your web content

Sinisa Grubor
8 min read · May 4, 2016


I work as a frontend/web engineer @Onefootball. Like any other product company, we aim to bring the best, most engaging content to our users. We want them to engage with that content so much that they feel the urge to share it with their friends.

We all know how hard it can be to make the user press that share button. This is not an article about that. It’s about what happens after. When a user shares your content, a new world of problems arises (this also applies to content you share yourself on company accounts). How are you supposed to make your content stand out in this sea of information? Just take a look at your Facebook/Twitter/…/VK/Google+ news feed: so many articles, images, inspirational quotes (I really do not like those, especially if they are not one-liners), funny cat videos, not-so-funny cat videos, baby videos, etc. Well… you get the point.

You have probably seen the most common approach some news publishers use (I hope you hate it as much as I do): blowing the title completely out of proportion, so that after reading the article, you feel it has absolutely nothing to do with the piece of content you just clicked on. Hell, I’ve even found myself closing a tab, looking back at my social stream and thinking “what did I just click on?”.

I think that in the long run this approach can really backfire: you drive your user base away, you associate your brand with a negative experience, and sooner or later the social platforms will penalize you with smaller reach for using this or similar dirty tricks (edit: Facebook is taking active steps against that).

So, what did we do to tackle this problem?

Well, before answering that, let’s quickly look at the characteristics of what I consider well-defined content meta-data.

  1. They communicate clearly what a user can expect from a piece of content.
  2. They give a user additional value just by seeing them in their social stream.
  3. They enrich your content with different media (images, video, …) and make it more attractive.

Users today do not want to work for content. They like things delivered to them. And they should! Demanding users force us to improve our services. This is why it’s important not to force them to make an additional click to get value from your content. If they click and you don’t deliver on your promise, they will not appreciate it.

I know what you are thinking: if I expose too much info in the meta-data, users have no need to click on the shared piece of content, and I will lose a potential visitor. That might be true, but then you are just deflecting the origin of the problem: bad content. It would manifest itself anyway, just in a different way: higher bounce rate, short average session duration, etc. By offering good meta-data (and good content, of course), users who are interested in what you’re offering will click on it. And even if they don’t, they might still get something useful just by seeing your content meta-data, and they can join the conversation on the social networks more quickly. This still helps spread your brand awareness and associates your brand with a positive feeling. Isn't that nice?

Match share on Facebook

Highly customizable share images

So that is exactly what we did. We work in the football sphere, so it’s vital for us that our meta-data are always up to date with the content on the landing page. I guess that applies to most live content out there. For us, a good example is the match page. Until the final whistle is blown, it’s a live ecosystem with its own events (goals, cards, substitutions, etc.), and we want our meta-data to always reflect the actual, up-to-date data. So we need a setup that allows us to easily adapt the meta-data to the events in the match.

So we defined the basic goals that our sharing system should achieve:

  1. Meta-data responses should be fast. We set an upper limit of 4s from the time a user clicks share until they see a share preview (although we reached much lower times than that; I know 4s sounds high, but stick with me and you will see why it’s that way).
  2. Meta-data should always represent the up-to-date version of the shared page content.
  3. All shares should give the user enough information to know what your content is about and what to expect when clicking on it.
  4. Meta-data should be easy to modify.
  5. All shares should be visually appealing and easy to modify.
  6. All shares should be branded.

Let me just tell you that our web application is written in AngularJS. Sometimes this is not so much fun, and sharing content on social media falls into the not-so-fun category. Social crawlers do not execute JavaScript, so you have to do it for them. That can be quite a lengthy process, which can cause a long feedback loop for sharing previews, random timeouts from social networks (which make shares look broken), etc. There are a few ways to tackle this problem, but I won’t describe all of them. Instead, I’ll just sum up what we did. We can discuss other approaches in the comment section below; I’m happy to hear your thoughts.

Match prediction share on Twitter

Micro-services to the rescue

If you read Facebook’s sharing documentation, it mentions that you can serve only meta-data to their crawler to improve performance. Luckily, all social crawlers identify themselves with an appropriate user agent (UA) string. So we ended up writing two micro-services that now contain all our sharing logic.
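To give a rough idea of that first step, the crawler check can be as simple as matching the user agent against the known bot strings. The sketch below is illustrative only, assuming an Express-style Node server in front of the app; the bot list is not exhaustive and PEREGRINE_URL is a made-up placeholder, not our actual setup.

    // Hypothetical Express middleware: send known social crawlers to the
    // meta-data service and everyone else to the regular AngularJS app.
    const express = require('express');
    const app = express();

    const PEREGRINE_URL = 'https://peregrine.example.com'; // placeholder host
    const SOCIAL_BOTS = /facebookexternalhit|Facebot|Twitterbot|vkShare|LinkedInBot|Pinterest/i;

    app.use((req, res, next) => {
      const userAgent = req.headers['user-agent'] || '';
      if (SOCIAL_BOTS.test(userAgent)) {
        // Crawlers get redirected to Peregrine, keeping the original path.
        return res.redirect(302, PEREGRINE_URL + req.originalUrl);
      }
      next(); // regular users get the normal app
    });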

Content micro-service (we call it Peregrine)

The first micro-service contains all the content that we want to expose as meta-data. How does it work? First, we identify when a social crawler is requesting our content and redirect it to Peregrine. There we build only the meta-data required for that piece of content and send it straight back. This happens extremely fast. Everything is written in NodeJS: there is no JavaScript to execute on the client side, we just populate the template on the server and return the response.
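Here is a minimal sketch of what such a handler could look like. The route, field names and the fetchMatch helper are made up for illustration; the real service reads live match data from our own APIs and covers far more content types.

    // Minimal sketch of a Peregrine-style meta-data handler (illustrative names only).
    const express = require('express');
    const app = express();

    // Placeholder: in reality this would call the live football data API.
    async function fetchMatch(id) {
      return { id, homeTeam: 'Home', awayTeam: 'Away', score: '1:0', minute: 67 };
    }

    app.get('/match/:id', async (req, res) => {
      const match = await fetchMatch(req.params.id);
      // The crawler only needs the meta tags, so a bare HTML shell is enough.
      res.send(`<!DOCTYPE html>
    <html>
      <head>
        <meta property="og:title" content="${match.homeTeam} ${match.score} ${match.awayTeam}" />
        <meta property="og:description" content="Live, minute ${match.minute}: follow every goal, card and substitution." />
        <meta property="og:image" content="https://photobooth.example.com/match/${match.id}.png" />
        <meta name="twitter:card" content="summary_large_image" />
      </head>
      <body></body>
    </html>`);
    });

    app.listen(3000);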

Photobooth micro-service

The second micro-service is an image-creating service based on PhantomJS. It’s responsible for creating the media content that enhances our shared meta-data. When Peregrine responds with meta-data, the image URL we return actually goes through Photobooth, so it effectively serves as an image proxy. Peregrine contains the HTML templates of the visuals we share; Photobooth fetches the content there, takes a photo of it and responds with an image. We decided to publish Photobooth as an open source project.
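For a rough idea of the rendering step, here is a small sketch using the phantom package mentioned below. The template URL, image size and output path are illustrative; the real Photobooth adds queueing, error handling and caching on top.

    // Minimal sketch: render an HTML share template to an image with the
    // `phantom` npm package (2.x). URL and dimensions are illustrative only.
    const phantom = require('phantom');

    async function renderShareImage(templateUrl, outputPath) {
      const instance = await phantom.create();
      try {
        const page = await instance.createPage();
        // 1200x630 is the size recommended for Facebook link previews.
        await page.property('viewportSize', { width: 1200, height: 630 });
        const status = await page.open(templateUrl);
        if (status !== 'success') {
          throw new Error('Failed to open template: ' + templateUrl);
        }
        await page.render(outputPath); // file type is inferred from the extension
      } finally {
        await instance.exit(); // always shut the PhantomJS process down
      }
    }

    // Usage:
    // renderShareImage('https://peregrine.example.com/templates/match/123', '/tmp/match-123.png');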

Your favorite team share on Google+

The Good

One clear benefit is that we have encapsulated our sharing logic in two dedicated micro-services. That makes it much easier for us to change them, and we have a clear separation of concerns.

In addition to that, a share no longer needs to go through the AngularJS app. This has improved our performance drastically.

Our image output is now created from HTML templates, making it very easy to modify. Even better, it can be modified by designers who know a bit of HTML and CSS. That’s how we make sure our visuals are always stunning. Combine that with the micro-service advantages, and design changes can be deployed without touching our core AngularJS app.

We have a good cache-breaking strategy, which ensures that our meta-data are always up to date and correspond to the appropriate view in our Angular app. Discussing our cache-breaking strategy is outside the scope of this article, but I might write a new one to elaborate on it.

Player profile share on VK

The Bad

PhantomJS can be quite resource-consuming and isn't super fast, so you need to be careful about what kind of machine you run it on. There is also no straightforward way for NodeJS-based APIs to communicate with PhantomJS, so we use a great package called phantom (thanks Amir20 and the rest of the community) that does that part for us. Since version 2.0 it has been very stable, but before that it had a lot of issues with spawned processes that were never killed, so it could eat up your resources with zombie processes in a matter of seconds. It was also not very fast, which is why we allowed 4 seconds to respond, as I mentioned above. These days it behaves much better, so we managed to get the response time down to ~1s.

You are also limited to some extent by PhantomJS’s CSS support, but I have to say that this is not really an issue with version 2.x.

Creating an image with PhantomJS is probably much slower than using dedicated software, but we think the speed is sufficient, and the benefit of designers being able to quickly modify HTML templates outweighs a small delay in response.

We address most of these issues with proper image caching, so that each image is created only once. The next time, we serve it from an S3 bucket, we have a CDN in front of all of this, and we leverage social network caching. All in all, we do not find a lot of drawbacks in using this setup.
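As a sketch of that “render once, then serve from the bucket” flow: the bucket name, key scheme and CDN host below are placeholders, and in reality the key also carries our cache-breaking data.

    // Illustrative "render once, serve from cache" flow with the AWS SDK (v2).
    const AWS = require('aws-sdk');
    const fs = require('fs');
    const s3 = new AWS.S3();
    const BUCKET = 'share-images-example'; // placeholder bucket

    async function getOrCreateShareImage(key, templateUrl) {
      try {
        await s3.headObject({ Bucket: BUCKET, Key: key }).promise();
        // Already rendered once: serve the cached object via the CDN in front of S3.
        return `https://cdn.example.com/${key}`;
      } catch (err) {
        if (err.code !== 'NotFound') throw err;
      }
      const localPath = `/tmp/${key.replace(/\//g, '_')}`;
      await renderShareImage(templateUrl, localPath); // from the Photobooth sketch above
      await s3.upload({
        Bucket: BUCKET,
        Key: key,
        Body: fs.createReadStream(localPath),
        ContentType: 'image/png',
      }).promise();
      return `https://cdn.example.com/${key}`;
    }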

Conclusion

That was a short overview of our approach to dynamic sharing templates. The images in this article show what the generated output looks like. I encourage you to give it a try yourself and share some content from the Onefootball web app or our mobile apps (Android, iOS, Windows). You can also try it out in the Facebook debugger tool. I’ll be happy to hear your take on this topic (or bugs you find :)) in the comment section below.

