How to create a REST API in Python using the Flask Microframework
An API (Application Programming Interface) can be defined as a set of clearly defined methods of communication between various software components.
I built an API around an application called BucketList. The application enables a user to Create, Read, Update and Delete (CRUD) a bucket and the items within it. In this article, I will describe the process I went through while developing the API and share a few code snippets, strictly for example purposes. The code lives on GitHub and you can find it here.
The initial folder structure
Above is my initial folder structure, with api as the name of the application folder. The application logic resides in the app folder, which contains three Flask Blueprints: auth, bucket and bucketitems.
- The auth blueprint contains the logic for application authentication and authorization.
- The bucket blueprint contains the logic for carrying out Bucket CRUD operations.
- The bucketitems blueprint contains the Bucket Items CRUD operations.
Exploring the file structure
The __init__.py file contains the code for setting up the Flask application and registering the different Blueprints onto the application instance.
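A minimal sketch of what such an __init__.py might look like. The blueprint bodies are defined inline here to keep the example self-contained; in the real project each blueprint lives in its own package and only the factory would appear in this file, with the route names below being illustrative:

```python
from flask import Flask, Blueprint, jsonify

# Hypothetical stand-ins: in the real project each blueprint lives in its
# own package (app/auth, app/bucket, app/bucketitems) and is imported here.
auth = Blueprint('auth', __name__)
bucket = Blueprint('bucket', __name__)

@auth.route('/login', methods=['POST'])
def login():
    return jsonify(message='login endpoint')

@bucket.route('/')
def list_buckets():
    return jsonify(buckets=[])

def create_app():
    """Application factory: create the app and register each blueprint."""
    app = Flask(__name__)
    app.register_blueprint(auth, url_prefix='/auth')
    app.register_blueprint(bucket, url_prefix='/buckets')
    return app
```

The factory pattern keeps the application importable without side effects, which also makes it easy to create a fresh app per test.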
The config.py file contains the different application configuration options: testing, development and production. A different configuration is used depending on the environment in which the application is being run.
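A sketch of what the configuration classes might look like; the class names, attributes and database URL below are illustrative, not taken from the project:

```python
import os

class Config:
    """Base settings shared by every environment."""
    SECRET_KEY = os.environ.get('SECRET_KEY', 'change-me')
    DEBUG = False
    TESTING = False

class DevelopmentConfig(Config):
    """Local development: verbose errors and auto-reload."""
    DEBUG = True

class TestingConfig(Config):
    """Test runs: flag TESTING and point at a throwaway database."""
    TESTING = True
    SQLALCHEMY_DATABASE_URI = 'postgresql://localhost/test_db'  # illustrative

class ProductionConfig(Config):
    """Production keeps the safe defaults inherited from Config."""
    pass
```

Inheriting from a base Config class means each environment only overrides what actually differs.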
The models.py file contains the different database models, such as bucketitems and the other models.
The views.py file contains the application-wide routes.
The .gitignore file contains the files that should not be added and tracked by the Version Control System (VCS), in this case Git.
The manage.py file contains the application commands defined by you and made available through the Flask-Script extension.
The requirements.txt file contains a reference to the application extensions and their versions. It is generated by running pip freeze > requirements.txt in the terminal.
The README.md markdown file contains any description or information the developer makes available to the application users.
Test Driven Development
Test Driven Development (TDD) is a good practice when writing code. From my experience, it enables you to think about the different aspects of the functionality you are going to write, and to come up with tests upfront before writing the feature itself. You then write the code and make sure that your previously failing tests pass. If you change something in your code later, just run your tests to make sure nothing is broken. If they fail, you know you have a bug to fix.
During the development of the API, I practiced TDD in some parts of the implementation. I later on continued and wrote tests covering at least 97% of all the code I wrote.
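As a sketch of that red-then-green cycle, here is a small unittest test case; make_bucket is a hypothetical helper standing in for real application code, and in strict TDD the tests would be written before it exists:

```python
import unittest

def make_bucket(name):
    """Hypothetical helper under test: builds a bucket record."""
    if not name:
        raise ValueError('a bucket needs a name')
    return {'name': name, 'items': []}

class BucketTestCase(unittest.TestCase):
    def test_bucket_has_name_and_no_items(self):
        bucket = make_bucket('Travel')
        self.assertEqual(bucket['name'], 'Travel')
        self.assertEqual(bucket['items'], [])

    def test_empty_name_is_rejected(self):
        # the failure mode is specified up front, before the code is written
        with self.assertRaises(ValueError):
            make_bucket('')
```

Running these with a coverage tool is what produces the kind of coverage percentage mentioned above.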
Token Based Authentication
The entire API authentication is built on a JSON Web Token (JWT) based authentication system. JWTs provide a way of representing claims securely between two parties.
I used a Python package called PyJWT to encode and decode JWTs. When a user signs up or logs in, a token is assigned to them. This token expires after a given period of time or when the user logs out. When the user logs out, the token is invalidated and added to the blacklisted-tokens table in the database.
Since most of the API resources should only be accessed by authenticated users, I created a view decorator to be used on all routes that require an authenticated user.
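Below is a simplified sketch of how such a PyJWT-based flow can be wired up. The route, helper names and in-memory blacklist are illustrative stand-ins for the project's actual code and database table:

```python
from datetime import datetime, timedelta
from functools import wraps

import jwt
from flask import Flask, jsonify, request

app = Flask(__name__)
app.config['SECRET_KEY'] = 'change-me'  # illustrative only

BLACKLIST = set()  # stands in for the blacklisted-tokens table

def generate_token(user_id):
    """Encode a short-lived JWT carrying the user's id."""
    payload = {'sub': str(user_id),
               'exp': datetime.utcnow() + timedelta(hours=1)}
    token = jwt.encode(payload, app.config['SECRET_KEY'], algorithm='HS256')
    # older PyJWT versions return bytes; normalise to str
    return token if isinstance(token, str) else token.decode('utf-8')

def token_required(view):
    """Reject requests whose token is missing, invalid or blacklisted."""
    @wraps(view)
    def wrapper(*args, **kwargs):
        token = request.headers.get('Authorization', '').replace('Bearer ', '')
        if not token or token in BLACKLIST:
            return jsonify(message='Invalid token'), 401
        try:
            # decode also verifies the exp claim and rejects expired tokens
            payload = jwt.decode(token, app.config['SECRET_KEY'],
                                 algorithms=['HS256'])
        except jwt.InvalidTokenError:
            return jsonify(message='Invalid token'), 401
        return view(payload['sub'], *args, **kwargs)
    return wrapper

@app.route('/buckets')
@token_required
def buckets(user_id):
    return jsonify(user=user_id, buckets=[])
```

Logging out would then simply add the presented token to the blacklist so the same token can never be reused.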
Version control, as the name suggests, is all about managing the different versions of the application during its development. Git is probably the most common and easiest version control tool to use when developing an application.
For most of the development, I used (simulated) the Gitflow workflow. This workflow suggests that you create a develop branch off of the master branch and then create feature branches off the develop branch. The most important takeaway from this workflow is that the master branch should only contain ‘working’ code (code ready for release). I also used GitHub Pull Requests (PRs) so that my colleagues could review and comment on my code before I merged it into the develop branch.
As you know, most applications have to store user data in one way or another. My API uses a PostgreSQL database to persist the users’ data, specifically the user sign-in credentials, buckets and their items.
I used the Flask-SQLAlchemy extension to interact with the database. It enabled me to create the user, bucket and bucket-item model classes.
I also used the Flask-Migrate extension to create migrations for the database. Database migrations are like ‘version control’ for the database schema.
I used another extension, ForgeryPy, to create dummy data for the application so that I could test out features such as API pagination that require a fair amount of data.
The API runs within three environments: testing, development and production. In order to switch configurations between these environments, I created a config.py file to handle all the different variables for each environment. The active configuration is determined by the APP_SETTING environment variable, which is set to one of the three configurations depending on the environment.
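One way such a lookup might be wired up, assuming hypothetical dotted paths to the classes in config.py:

```python
import os

# Dotted paths to the (hypothetical) classes defined in config.py
CONFIGS = {
    'development': 'config.DevelopmentConfig',
    'testing': 'config.TestingConfig',
    'production': 'config.ProductionConfig',
}

def active_config():
    """Resolve the configuration named by APP_SETTING, defaulting to
    development when the variable is unset."""
    return CONFIGS[os.environ.get('APP_SETTING', 'development')]
```

A call like app.config.from_object(active_config()) would then apply the selected configuration to the Flask instance.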
I am new to the whole concept of continuous integration (CI), so I am only going to point out one tool, Travis CI, which I used during development.
I used Travis to run my builds whenever I pushed a commit or branch, or opened a PR. Through the .travis.yml file, Travis also enabled me to set up a PostgreSQL database so that the tests that required a database would run without errors. After the tests ran successfully, the coverage details were sent to Coveralls for further analysis.
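A .travis.yml along these lines could produce that behaviour; the Python version, database name and commands below are illustrative, not copied from the project:

```yaml
language: python
python:
  - "3.6"
services:
  - postgresql          # Travis starts a PostgreSQL service for the build
install:
  - pip install -r requirements.txt
before_script:
  - psql -c 'create database test_db;' -U postgres
script:
  - coverage run manage.py test
after_success:
  - coveralls           # push coverage details to Coveralls
```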
Adding Badges to your repository
- The Travis badge showed whether the build had passed or failed.
- The Coveralls badge showed the percentage test coverage.
- The Bettercode badge showed the quality of the code based on ten attributes.
- The Codacy badge showed the general quality of the code.
Most of the APIs I have used before have some sort of guide on how to use them, which made it easy and simple for me to test them out. I felt the need to return the favor by writing documentation for my API using a tool called Apiary, hosted with the API as its homepage. It is live, so you can try it out.
Final folder structure
A lot changed during development, and so did the application folder structure. At the end of development, my folder structure had changed to the one below.
I added a docs blueprint to handle everything concerned with the API documentation, including the apiary.js file that contained the embed code for the Apiary documentation and the site favicon. The tests folder contained the application tests, the migrations folder contained the model migration files and the Procfile contained the Heroku configurations.
Finally, after development I had to deploy my API. I chose to host my application on Heroku, a platform that enables one to host applications written in a variety of programming languages.
The Procfile in the above folder structure contained the configurations that enabled Heroku to run the application. These included the gunicorn server configuration and a configuration to create the database tables.
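A Procfile along these lines would cover both; the module path and migration command here are assumptions, not the project's actual entries:

```
web: gunicorn manage:app
release: python manage.py db upgrade
```

The web process serves the app through gunicorn, while the release phase runs the database migrations before each deploy goes live.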
You can find my Post about Hosting this API on Heroku here.
The API is available at the URL below or click here to test it out.
All the code is hosted here.
If you found this article helpful, hit the clap button and help me get it to more people who need help getting started writing their own APIs. I appreciate your responses as well. You can also find me on Twitter.