Adding a List of Medium Posts to Your Personal Website
A classic story of underestimating a development task
Dropping by for a quick solution? Here are the highlights:
📝 Medium has an API, but it doesn’t provide a way to retrieve posts
🔈 Medium does have an RSS feed per user and per publication
🙉 That RSS feed does not set CORS headers
✨ You can use any of the following options as a workaround:
- A CORS proxy like cors.io or cors-anywhere
- Set up an API Gateway using a serverless (FaaS) provider like AWS
- Use a free service like rss2json.com
- SHAKE YOUR FIST ANGRILY AT THE API GODS ✊ 😧 💢
False Assumptions
I see a lot of my colleagues in tech using Medium as a blogging platform. It looks nice, is easy to use, and provides a nice central location to aggregate your interests. All of this led me to believe that Medium would have an API, and that API would provide basic endpoints for retrieving and modifying content. I was half right.
You can find Medium’s API Documentation here: https://github.com/Medium/medium-api-docs
GREAT! I go to my account’s settings, create a token, issue a test call in Postman and see results immediately 👍
But wait… There’s no endpoint to retrieve user or publication posts? The API has endpoints for retrieving very, very basic user info and the publications a user belongs to, for CREATING a post, and for uploading an image. That’s it.
Nope. “Use RSS feeds,” says Medium staff.
Whatever. We’ll make it work.
Alright, simple enough. Let’s use the popular promise-based HTTP client Axios to grab the RSS feed, and an RSS parser like FeedMe to turn it into JSON that’s easier for us to play with.
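To show what that parsing step yields without pulling in dependencies, here’s a rough sketch that scans a sample feed string with regexes instead of FeedMe (a real project should use an actual parser like FeedMe — regexes are not a robust way to handle XML):

```javascript
// Naive illustration of what an RSS feed's <item> entries contain.
// NOTE: this regex scan stands in for a real parser like FeedMe;
// it is a sketch only, not robust XML handling.
const sampleFeed = `
<rss version="2.0">
  <channel>
    <title>Coffee Driven Development</title>
    <item>
      <title>Adding a List of Medium Posts to Your Personal Website</title>
      <link>https://medium.com/coffee-driven-dev/example</link>
    </item>
  </channel>
</rss>`;

function parseItems(xml) {
  const items = [];
  const itemRe = /<item>([\s\S]*?)<\/item>/g;
  let match;
  while ((match = itemRe.exec(xml)) !== null) {
    const block = match[1];
    const title = (block.match(/<title>([\s\S]*?)<\/title>/) || [])[1];
    const link = (block.match(/<link>([\s\S]*?)<\/link>/) || [])[1];
    items.push({ title, link });
  }
  return items;
}

const posts = parseItems(sampleFeed);
console.log(posts[0].title);
```

In the browser, though, the Axios request itself never gets this far — the feed response is blocked because Medium doesn’t set CORS headers.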
Okay. This has impacted others. What are their solutions?
- Use AWS Lambda (FaaS) behind an API Gateway 💪
- Use a CORS Proxy to get around the Access-Control-Allow-Origin issue 🤔
- Use a service like rss2json that we found in a random CodePen 💡
Leveraging FaaS is cool, but the complexity and cost are higher than makes sense for this “quick” project. A CORS proxy is elegant and would be my “go to” had we not stumbled upon rss2json.com. For now, we’ll use it, but a CORS proxy is an excellent fallback should the service go down (it is a free service after all).
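For reference, the CORS proxy fallback is just URL prefixing: the proxy fetches the feed server-side and echoes it back with `Access-Control-Allow-Origin` set. A minimal sketch (the proxy host below is one example of a public proxy; availability varies, and `@username` is a placeholder):

```javascript
// A CORS proxy works by prefixing the target URL; the proxy makes the
// request server-side, where CORS doesn't apply, then relays the body
// back with the headers the browser needs.
const PROXY = 'https://cors-anywhere.herokuapp.com/'; // example proxy host
const FEED_URL = 'https://medium.com/feed/@username'; // placeholder handle

function proxiedUrl(feedUrl) {
  return PROXY + feedUrl;
}

console.log(proxiedUrl(FEED_URL));
```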
Glue it all together
This took a lot more time than I originally estimated. It’s a great example of the everyday development tasks we’re handed, and how they can quickly grow in time and complexity. Here’s the itty bitty snippet that makes all the magic happen:
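The original embedded snippet isn’t reproduced here, but a minimal sketch of the idea looks like this — build the rss2json request URL, then make the GET with a promise-based client like Axios (the `rss_url` query parameter is rss2json’s documented way to pass the feed; `@username` is a placeholder for the real Medium handle; the client is injected here just to keep the sketch self-contained):

```javascript
// rss2json.com converts the Medium RSS feed to JSON and sets CORS
// headers, so the browser can call it directly.
const FEED_URL = 'https://medium.com/feed/@username'; // placeholder handle
const API_URL =
  'https://api.rss2json.com/v1/api.json?rss_url=' +
  encodeURIComponent(FEED_URL);

function getPosts(http) {
  // `http` is any promise-based client with a get() method, e.g. Axios.
  // In Axios's response shape, the posts live at response.data.items.
  return http.get(API_URL).then(response => response.data.items);
}
```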
This gives us the list of posts at response.data.items, an array of objects that look like this:
{
"title": "Adding A List of Medium Posts to Your Personal Website",
"pubDate": "2018-05-15 11:14:46",
"link": "https://medium.com/coffee-driven-dev/...",
"guid": "https://medium.com/p/b2988b63e60f",
"author": "Danny Brown",
"thumbnail": "https://cdn-images-1.medium.com/...",
"description": "...",
"content": "...",
"enclosure": {},
"categories": ["programming", "api", "medium"]
}
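Since the whole point is a list of posts on a personal site, here’s a quick sketch of turning those items into markup. The field names come from the sample object above; note that a real version should HTML-escape the values, which this sketch skips:

```javascript
// Turn the rss2json items (shape shown above) into a simple HTML list.
// Caveat: values are interpolated unescaped; escape them in real use.
function renderPostList(items) {
  const entries = items.map(item =>
    `<li><a href="${item.link}">${item.title}</a> <small>${item.pubDate}</small></li>`
  );
  return `<ul>${entries.join('')}</ul>`;
}

const html = renderPostList([{
  title: 'Adding A List of Medium Posts to Your Personal Website',
  link: 'https://medium.com/p/b2988b63e60f',
  pubDate: '2018-05-15 11:14:46',
}]);
```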
Parting thoughts
While this solution works well enough for coffee_driven_dev, it’s worth noting that it could be more resilient. What happens…
- If rss2json.com goes down?
- If Medium has a service disruption?
- If we hit some kind of rate limit we didn’t know about?
I tell my daughter not to ask “what if” questions, but in programming, they can be a great tool to highlight your points of failure. To harden this, we could cache the result every X hours, and pull from that cached result instead of directly from rss2json. If rss2json is down, we don’t update our cached result, and we can still reliably pull from it.
This cache (really just a fancy way of saying a “recent copy”) could be a flat “.json” file, a database entry, or really anything that can be persisted and pulled from.
Thanks for reading! If you want more content like this, check out coffee_driven_dev at: