Creating a Cryptocoin Price Ticker with Django 2.0 — Part One
We’re going to create a simple cryptocoin price ticker page using Django 2.0, Celery, Redis and Channels. You don’t need experience with any of the aforementioned technologies to follow the tutorial, though some familiarity will be helpful.
Cryptocoin prices will be fetched from the CryptoCompare API every ten seconds, and then published to tickers on our Django-based website via WebSockets whenever they change.
The tutorial will be split into two parts and the finished code is in this GitHub repo.
- First, we’ll set up some Celery workers to periodically fetch and store the prices of some cryptocoins into a local Redis cache
- Then, we’ll flesh out a Django application that has a dashboard for cryptocoin prices
1. Setting up Redis
You need to ensure that you have a local Redis instance running on your machine, exposed on port 6379. Redis is an in-memory data structure store that is great for holding data that changes often and doesn’t need to be saved to persistent storage. We’ll be using it both as a general Django cache and as the canonical store for the prices of different cryptocoins. Since we’re updating the prices so frequently, Redis is a better fit than a regular database.
If you are using macOS, you can run brew install redis and then brew services start redis to get a Redis instance up and running. On Linux, your package manager most likely offers something similar. Otherwise, or on Windows, the easiest solution is to use Docker to run a local Redis container with the command docker run -d -p 6379:6379 redis.
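To sanity-check that the instance is reachable, you can run redis-cli ping and look for a PONG reply. Here is the same check from Python, as a minimal sketch (this assumes the redis pip package, which isn’t otherwise needed for the tutorial):

import redis

# Connect to the local Redis instance on the default port.
r = redis.Redis(host='127.0.0.1', port=6379)
print(r.ping())  # prints True if the instance is reachable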
2. Starting our codebase
We’re going to keep everything in one codebase for ease of use, and use Git for version control. Create an empty directory and initialise a Git repository inside it with git init.
Now let’s create a new pipenv environment for our codebase. We’ll be using Python 3.6, and you will need the pipenv package, which you can get by running pip install pipenv. Pipenv is a tool for defining Python environments programmatically and for managing your virtualenvs. The Pipfile contains details of all the required pip packages for our project, kind of like a requirements.txt file.
Run pipenv install django to:
- create a new Pipfile for recording the dependencies of our project
- add Django as a dependency to our Pipfile
- create a new virtual environment and install our Pipfile dependencies into it
All in one command! Now run pipenv shell to activate our newly created virtual environment. Then we can run python -m django startproject ccticker to start our new Django project.
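For reference, the generated Pipfile should look roughly like this (the exact contents pipenv writes may differ slightly):

[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
django = "*"

[requires]
python_version = "3.6"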
3. Setting up Redis as a cache for Django
Do pipenv install django-redis and add this to your settings.py file:
CACHES = {
    'default': {
        'BACKEND': 'django_redis.cache.RedisCache',
        'LOCATION': 'redis://127.0.0.1:6379/0',
        'OPTIONS': {
            'CLIENT_CLASS': 'django_redis.client.DefaultClient',
        }
    },
}
Django will now use your local Redis instance as its default cache backend. For example, if you enable the cache middleware or use the {% cache %} template tag, rendered pages and fragments will be stored in Redis so they can be served up faster to visitors. We can also set and get arbitrary data in the cache from our own Django code like so:
from django.core.cache import cache

def example():
    cache.set('key', 'value', 30)  # store 'value' under 'key' for 30 seconds
    cache.get('key')               # returns 'value'
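The third argument to cache.set is the timeout in seconds. A couple of variations of the standard Django cache API are worth knowing:

from django.core.cache import cache

cache.set('key', 'value')         # uses the default timeout (300 seconds unless configured)
cache.set('key', 'value', None)   # cache forever
cache.get('missing', 'fallback')  # returns 'fallback' if the key is absent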
4. Integrating Celery with Django
Celery is a way to run code outside of Django’s HTTP request/response cycle, for things like processing uploads or periodic tasks that have nothing to do with web requests. In our case, we will be using Celery to fetch and store prices from CryptoCompare. Run pipenv install celery and create a celery.py file inside the ccticker/ directory. Your project structure should now look like this:
ccticker/
    ccticker/
        __init__.py
        celery.py
        settings.py
        urls.py
        wsgi.py
    manage.py
    Pipfile
    Pipfile.lock
The contents of your celery.py file should look like this:
import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'ccticker.settings')

app = Celery('ccticker')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
Add this to your ccticker/__init__.py file as well:
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ['celery_app']
And finally, add this to your settings.py file. This tells Celery to communicate between processes using our local Redis instance (we point it at database 1 so the broker’s keys stay separate from the cache’s, which lives in database 0). Redis does a lot of work in this project!
CELERY_BROKER_URL = 'redis://localhost:6379/1'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'
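Because celery.py passes namespace='CELERY' to config_from_object, Celery strips the CELERY_ prefix and lowercases the rest, so these two settings map onto its broker_url and result_backend options. Any other Celery option can be set the same way, for example:

CELERY_TASK_SERIALIZER = 'json'  # read by Celery as task_serializer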
The celery.py code above also sets up Celery so that it will autodiscover any tasks registered inside tasks.py files in your Django apps. Tasks are simple Python functions decorated with the @shared_task decorator.
Run ./manage.py startapp ccapp and create a tasks.py file inside it. First, we’re going to write a simple example task so that we can test that everything is set up correctly. Add the below to tasks.py:
import logging

from celery import shared_task

@shared_task
def example_task():
    logging.info("This is a test message!")
Don’t forget to register the newly created ccapp application in your settings.py like so:
INSTALLED_APPS = [
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
] + [
'ccapp.apps.CcappConfig',
]
Now we will confirm that Celery workers can run tasks registered this way. Celery workers run in separate processes from your Django server. To start up a Celery worker, run celery -A ccticker worker -l info from the same directory level as your manage.py file. In another pipenv shell, run ./manage.py shell and execute the following code:
Python 3.6.5 (default, May 25 2018, 18:26:17)
[GCC 4.2.1 Compatible Apple LLVM 9.1.0 (clang-902.0.39.1)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
(InteractiveConsole)
>>> import ccapp.tasks
>>> ccapp.tasks.example_task.delay()
<AsyncResult: 97cf4530-60d0-49b3-afba-7764ca3ce1fb>
You call ccapp.tasks.example_task.delay() instead of calling ccapp.tasks.example_task() directly in order to send the task to be run by a Celery worker process, rather than running it in the shell. You should see “This is a test message!” printed to the stdout of your Celery process.
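delay() is a shortcut for Celery’s more general apply_async(), which also accepts execution options:

from ccapp.tasks import example_task

example_task.delay()                    # enqueue to run as soon as a worker is free
example_task.apply_async(countdown=10)  # enqueue to run roughly 10 seconds from now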
Now that we know sending tasks to Celery from a Django shell process works, we’re going to be bold and assume it will work from the Django server process as well. We can now write our first real task, for fetching ticker prices from CryptoCompare. We’ll be using the cryptocompy library to interact with the API, so install it with pipenv install cryptocompy requests, then add this to your tasks.py:
from celery import shared_task
from cryptocompy import price
from django.core.cache import cache

@shared_task
def update_cc_prices():
    cryptocoins = ['ETH', 'BTC']
    currencies = ['EUR', 'USD']
    # Returns a nested dict, e.g. {'ETH': {'EUR': ..., 'USD': ...}, ...}
    response = price.get_current_price(cryptocoins, currencies)
    for cryptocoin in cryptocoins:
        for currency in currencies:
            ticker_code = cryptocoin + currency  # e.g. 'ETHEUR'
            cache.set(ticker_code, response[cryptocoin][currency])
If you want, you can test this task in the same way as with example_task to make sure it’s working. If it works, you should see the prices of cryptocoins appear in your Redis cache. You can look inside your Redis cache using a GUI tool, or the redis-cli command line tool. Run keys * to see all the keys stored in your Redis instance, and you should see one for each ticker.
Mon May 28 11:14:42 - ~/src/github.com/jameshiew/cryptocoin-ticker-tutorial/ccticker [0]
[james@Jamess-MacBook-Pro.local] redis-cli
127.0.0.1:6379> keys *
1) ":1:ETHEUR"
2) ":1:ETHUSD"
3) ":1:BTCEUR"
4) ":1:BTCUSD"
127.0.0.1:6379> GET ":1:ETHEUR"
"\x80\x04\x95\n\x00\x00\x00\x00\x00\x00\x00G@|\xc5p\xa3\xd7\n=."
127.0.0.1:6379> exit
If you look at the value of the ticker, you will see it is not a simple number — this is because it is encoded using Python’s pickle codec.
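You don’t need to handle the pickled bytes yourself though: reading the value back through Django’s cache API deserializes it for you (the bytes shown above decode to the float 460.34). For example, in a ./manage.py shell:

>>> from django.core.cache import cache
>>> cache.get('ETHEUR')
460.34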
We want to run this task every ten seconds. We’ll use Celery beat, the task scheduler that ships with Celery and has good Django integration through the django-celery-beat package. Do pipenv install django-celery-beat and add this to your settings.py:
# django-celery-beat
INSTALLED_APPS += [
'django_celery_beat'
]
Now run ./manage.py migrate and ./manage.py createsuperuser to create an admin user. Then run ./manage.py runserver and, in the admin, create a new periodic task that calls ccapp.tasks.update_cc_prices on an interval of every 10 seconds.
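If you’d rather create the periodic task from code than click through the admin, here’s a sketch using django-celery-beat’s models (run it inside ./manage.py shell; the task name is just a human-readable label):

from django_celery_beat.models import IntervalSchedule, PeriodicTask

# A schedule that fires every 10 seconds.
schedule, _ = IntervalSchedule.objects.get_or_create(
    every=10,
    period=IntervalSchedule.SECONDS,
)

# Point the schedule at our task by its dotted path.
PeriodicTask.objects.get_or_create(
    interval=schedule,
    name='Update cryptocoin prices',
    task='ccapp.tasks.update_cc_prices',
)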
Once you’ve set this up in the admin, it’s time to run:
- a Celery worker process that will actually run the update_cc_prices task as and when necessary
- a Celery beat process to enqueue the task as per the schedule
You can do this with the following two commands, in separate pipenv shell environments:
celery -A ccticker worker -l info
celery -A ccticker beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler
If everything has gone smoothly so far, your Redis cache should be getting the latest cryptocoin prices every ten seconds. In the next part of the tutorial, we’ll stream these prices (and changes to them) from our Redis cache to tickers on webpages served up by Django using WebSockets.
You can find part two of the tutorial here.