
TECH GUIDE

RapidAPI: first steps with Python

Comprehensive guide

Thorin Schiffer
Dec 23, 2020 · 8 min read

A friend working for a bank once told me desperately: “Why do we pay so much for the IBAN parser? 300 Euro per month? I can’t believe it’s this expensive.” All I had to answer was: “I can do it for 50 :D”. He confirmed he would take the deal, and I started researching. Well, you know how deals with business people go.


While looking for specialized hosting for APIs, I stumbled upon RapidAPI, the Andreessen Horowitz-backed API marketplace company from California. This was it, byotch; maybe this was even what I had been looking for my entire career.

As a backend engineer, you have fewer opportunities to play single-player: you can’t design a nice-looking app everyone wants to lick, or a website with outstanding usability, the way your less abstract colleagues can. Besides that, no Instagram influencer or teenager needs a facial-recognition API. Our creed is, in a nutshell, B2B service.

The problem is that you usually can’t reach the decision-makers so easily, and a lot of boilerplate comes on top: billing, accounting, security, and whatnot. App stores handle all of that for app developers, for example. And finally, here we are: RapidAPI.

On one side, RapidAPI offers API developers a platform that lets them concentrate on the essential part of their business: complex problems and new pluggable functionality. The platform supports REST and, since recently, GraphQL APIs. On the other side, it lets companies configure access once and request services from different API providers over a single gateway with a single key. Quite a charming idea, actually!
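To make the single-key idea concrete, here is a purely illustrative sketch: two unrelated APIs behind the RapidAPI gateway, consumed with one and the same key (the host names below are made up):

import requests

RAPIDAPI_KEY = "XXX"  # one key for every API you are subscribed to


def call_rapidapi(host: str, path: str, **params):
    """Call any RapidAPI-hosted endpoint; only the host header changes per API."""
    return requests.get(
        f"https://{host}{path}",
        headers={"x-rapidapi-key": RAPIDAPI_KEY, "x-rapidapi-host": host},
        params=params,
    )


# Hypothetical providers, same key and the same gateway for both
weather = call_rapidapi("some-weather-api.p.rapidapi.com", "/forecast", city="Berlin")
faces = call_rapidapi("some-face-api.p.rapidapi.com", "/detect", image_url="https://example.com/photo.jpg")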


In everyday programming, we fellow devs ask ourselves: should we write it ourselves, or should we use a library? If the library is well developed and has survived multiple releases, then why not! RapidAPI takes this idea even further. Instead of a program, it offers you a service where load balancing, SLAs, and availability are no longer your problems. Well, at least not all of them.

There is always a fly in the ointment, though: the project is not perfect. Many of the APIs are buggy, and the platform itself has a rather disjointed usability concept. Maybe it’s because I’m in Europe, maybe not, but the web apps are fairly slow. Some of the elements are buggy and poorly documented, but that, I am sure, is caused by rapid growth. Pun intended?

FastAPI and RapidAPI

A good Python mini-framework for APIs has been long overdue. Django scares off newbies and data scientists with its monstrosity, age, and existential crisis. Flask is very slim and pluggable, but neither Flask nor Django was created for programming APIs. And although Django REST framework is an amazing piece of software, it’s just not what you need here.

Python is celebrating its second (third?) coming. With all the new amazing technologies being released, Python is the lingua franca of data science. Asyncio has spawned a whole new branch of web frameworks and attracted more interest from the community.

One of those frameworks is FastAPI: Flask for APIs, with asyncio, WebSockets, and HTTP/2. Everything you wanted. Fun fact: I first heard about it from a data scientist I met at Cornelsen Verlag. It is so easy to use that even non-specialists in distributed systems can start right away.
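For readers who have never touched FastAPI, a minimal app takes only a few lines (a generic example, not part of the IBAN parser):

from fastapi import FastAPI

app = FastAPI()


@app.get("/ping")
async def ping():
    """A trivial async endpoint; run it with: uvicorn main:app --reload"""
    return {"pong": True}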

The framework is incredibly well documented. Everything I needed could be found immediately. Like many other REST frameworks, FastAPI supports schema generation based on models from pydantic, a marshmallow competitor built on Python’s new typing functionality.

Like this:

from pydantic import BaseModel


class ParsedIBAN(BaseModel):
    iban: str
    compact: str
    country_code: str
    bank_code: str
    account_code: str
    length: int
    bic: BIC  # a separate model for the BIC, defined elsewhere in the project

    class Config:
        # expose camelCase aliases in the generated schema
        alias_generator = to_camel
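The to_camel alias generator itself is not shown here; a minimal sketch of such a helper could look like this (an assumption, the project might just as well use a ready-made one):

def to_camel(snake_str: str) -> str:
    """Convert snake_case field names to camelCase aliases (illustrative helper)."""
    first, *rest = snake_str.split("_")
    return first + "".join(word.capitalize() for word in rest)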

And here is what a typical endpoint looks like:

@app.post(
    "/ibans/",
    response_model=ParsedIBAN,
    responses={400: {"model": Error}, 401: {"model": Error}, 422: {"model": Error}},
)
def parse_iban(iban_str: IBANString, current_user: str = Depends(get_current_user)):
    """
    Parse IBAN supplied in the post payload.

    ERRORS:...
    """

Quite straightforward and self-explanatory.

RapidAPI requires an OpenAPI schema file, and it can be created with FastAPI’s built-in functionality like this:

from fastapi.openapi.utils import get_openapi


def open_api():
    api = get_openapi(
        title="IBAN PARSER",
        version=VERSION,
        description="Parses IBANS",
        routes=app.routes,
    )
    # tell RapidAPI where the API actually lives
    api["servers"] = [
        {"url": "https://iban-parser.herokuapp.com", "description": "Production server"}
    ]
    ...
    return api

Then dump the result into a YAML file like this:

import invoke
import yaml


@invoke.task
def generate_schema(context):
    """
    Generates the OpenAPI schema YAML for the API
    """
    from main import open_api

    with open("schema.yaml", "w") as yaml_s:
        yaml.dump(open_api(), yaml_s)

Schema upload and automatic updates from CI


To provision the API, and first of all its endpoints, you can upload the schema file over the web interface. This approach is quite simple but not really compatible with the lifecycle of a modern, 12-factor-like API. Endpoints change quite often during an application’s lifetime, and manually uploading a schema file every time you need an update is not the way of the samurai.

RapidAPI offers a way to integrate those updates into your CI, funnily enough over their own API for meta-information provisioning. Unfortunately, that API is quite buggy, and it took me a while to figure out what was going on. For instance, it returns “204 No Content” instead of 200 when all is fine, and 400 with blurry error messages otherwise. I’ve posted a working curl with the schema YAML in the discussions on the API page. Here it is once more:

curl --request PUT --url https://openapi-provisioning.p.rapidapi.com/v1/apis/$RAPID_API_API_ID \
--header 'x-rapidapi-host: openapi-provisioning.p.rapidapi.com' \
--header "x-rapidapi-key: $RAPID_API_APP_KEY" \
--header 'content-type: multipart/form-data' --form "file=@schema.yaml;type=application/yaml" -i

You can find the API id (RAPID_API_API_ID) on your marketplace page, and the app key should be inserted in the provisioning API overview.
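The GitLab pipeline shown further below also calls an upload-schema task that is not listed in this article; wrapping the curl above into invoke could look roughly like this (a sketch, reusing the same environment variables and the env helper as the other tasks):

from invoke import Exit  # invoke's exception for failing a task with an exit code

# "env" is assumed to be an environs-style Env() instance defined in tasks.py,
# the same one the other tasks use via env.str(...)


@invoke.task(pre=[generate_schema])
def upload_schema(context):
    """
    Uploads schema.yaml to the RapidAPI provisioning API (sketch).
    """
    import requests

    url = f"https://openapi-provisioning.p.rapidapi.com/v1/apis/{env.str('RAPID_API_API_ID')}"
    headers = {
        "x-rapidapi-key": env.str("RAPID_API_APP_KEY"),
        "x-rapidapi-host": "openapi-provisioning.p.rapidapi.com",
    }
    with open("schema.yaml", "rb") as f:
        # the schema goes up as multipart/form-data, exactly like in the curl call
        response = requests.put(
            url, headers=headers, files={"file": ("schema.yaml", f, "application/yaml")}
        )
    if response.status_code != 204:  # the provisioning API answers 204 on success
        raise Exit(f"Upload not OK: {response.status_code}: {response.text}", code=127)
    print("Upload OK")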

Besides that, while trying to guess the names of the fields, I figured out that the frontend’s GraphQL backend talks to the same API, so you can also update the API metadata that appears on the overview page but is not directly configurable from the schema file.

Here is a little invoke task I came up with:

@invoke.task
def update_api_data(context):
    """
    Updates rapid api data from the schema file
    """
    import requests
    from main import open_api

    url = f"https://openapi-provisioning.p.rapidapi.com/v1/apis/{env.str('RAPID_API_API_ID')}"

    api = open_api()
    with open("RapidApiDescription.md", "r") as f:
        long_description = f.read()
    payload = {
        "name": api["info"]["title"],
        "description": api["info"]["description"],
        "longDescription": long_description,
        "websiteUrl": "http://iban-parser.herokuapp.com/docs",
    }
    headers = {
        "x-rapidapi-key": env.str("RAPID_API_APP_KEY"),
        "x-rapidapi-host": "openapi-provisioning.p.rapidapi.com",
    }

    response = requests.request("PUT", url, data=payload, headers=headers)
    if response.status_code == 204:  # somehow they think 204 is okay
        print("Upload OK")
    else:
        raise Exit(f"Upload not OK: {response.status_code}: {response.text}", code=127)

RapidAPI testing

Another interesting part of the platform is the testing engine. For a perfect API, you would probably cover it with unit tests anyway. RapidAPI additionally allows you to monitor and watch over the API from the outside, combining integration testing and monitoring in one tool.


And now one more bug: the schedules don’t work properly; they start not immediately but at some random point in time. I hope to see that fixed.

The configuration tool is pretty much self-explanatory. A pretty neat feature is building tests based on the endpoint descriptions from your schema file: the parameters and URLs are already there!


Trigger tests from CI

Every test in the testing tool can be triggered automatically.


The API gives you information about the test’s status, so you can track it and fail the CI pipeline if something goes wrong. Here is the little invoke task I’ve scribbled for triggering and polling from the CI.

@invoke.task(help={"wait": "How many seconds to poll the result of the test"})
def trigger_rapid_api_test(context, wait=5):
    """
    Triggers rapid api test
    """
    import time

    import requests

    overall_success = True
    failed_count = 0

    trigger_urls = env.str("RAPID_API_TESTS").split(" ")
    for trigger_url in trigger_urls:
        response = requests.get(trigger_url)
        if response.status_code != 201:
            print(
                f"{trigger_url} response status not OK: {response.status_code} {response.text}"
            )
            # a test that could not even be triggered counts as failed
            overall_success = False
            failed_count += 1
            continue

        data = response.json()
        print(data["message"])

        status_url = data["statusUrl"]
        for i in range(wait):
            data = requests.get(status_url).json()
            if data["status"] == "complete":
                print(f"STATUS: {data['status']}")
                test_success = bool(data["succesful"])
                break
            else:
                print(f"STATUS: {data['status']}")
                time.sleep(1)
        else:
            # the test did not complete within the polling window
            test_success = False

        if not test_success:
            failed_count += 1
        overall_success &= test_success

    if overall_success:
        print("OVERALL STATUS: OK, ALL TESTS PASSED")
    else:
        raise Exit(f"OVERALL STATUS: NOT OK, {failed_count} FAILED", code=127)

RapidAPI itself refers to GitHub Actions in its CI examples. Here is how it would look for GitLab with the tasks above:

rapid_api:
  stage: .post
  image: python:3.8.6
  script:
    - pip install -r requirements.txt
    - inv upload-schema
    - inv update-api-data
    - inv trigger-rapid-api-test
  only:
    - master

Using the API

So all good, the API is deployed, what’s next? There are a couple of text boxes you need to fill in, and then your API becomes actually accessible over the RapidAPI client. One of the things you need to think about is authentication with the custom HTTP headers that RapidAPI passes to distinguish calls coming through RapidAPI from any other calls. You will probably even want to prevent users from calling the API as long as those credentials are not passed.


Here is how it would be done with FastAPI (remember current_user: str = Depends(get_current_user)?):

from typing import Optional

from fastapi import Header, HTTPException, status


def get_current_user(
    x_rapidapi_proxy_secret: Optional[str] = Header(None),
    x_rapidapi_user: Optional[str] = Header(None),
):
    rapid_api_secret = env("RAPID_API_SECRET", None)
    if not rapid_api_secret:
        return None

    if rapid_api_secret == x_rapidapi_proxy_secret:
        return x_rapidapi_user

    raise HTTPException(
        status_code=status.HTTP_401_UNAUTHORIZED,
        detail={"message": "No credentials provided", "code": "NotAuthorized"},
    )
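A quick way to sanity-check this locally is FastAPI’s TestClient; a minimal sketch, assuming the app and get_current_user from above and a made-up secret value:

import os

os.environ["RAPID_API_SECRET"] = "test-secret"  # hypothetical value for local testing

from fastapi.testclient import TestClient

from main import app  # the FastAPI app shown above

client = TestClient(app)

# Without the proxy secret that the RapidAPI gateway would send, we get 401
response = client.post("/ibans/", json={"iban": "GB33BUKB20201555555555"})
assert response.status_code == 401

# With the secret and a user header, the request passes the dependency check
response = client.post(
    "/ibans/",
    json={"iban": "GB33BUKB20201555555555"},
    headers={"x-rapidapi-proxy-secret": "test-secret", "x-rapidapi-user": "someone"},
)
assert response.status_code != 401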

On the client side, the usage is symmetrically simple:

import requests

url = "https://iban-parser.p.rapidapi.com/ibans/"

payload = """{
    \"iban\": \"GB33BUKB20201555555555\"
}"""
headers = {
    'content-type': "application/json",
    'x-rapidapi-key': "XXX",
    'x-rapidapi-host': "iban-parser.p.rapidapi.com"
}

response = requests.request("POST", url, data=payload, headers=headers)
print(response.text)

Voilà! Fertig! Done!

The marketplace then rounds up the rest: billing, counting the requests, and so on. I am really excited about where RapidAPI will go, cuz byotch I don’t wanna do any of that myself!
