Integrate Django with Celery
Learn how to integrate Django with Celery
This page explains how to integrate
Django with Celery, two powerful technologies used in many applications and production-ready projects.
The source code
is saved on GitHub for later reference and support.
- 👉 Integrate Django & Celery - sample project
- 🚀 Free Support via Email & Discord
In the end, the integration will look similar to the UI of the sample project linked above.
✅ Introduction
Django is a leading web framework with many features built in. Although Django is fast and scalable, certain tasks can increase the strain on web servers and reduce the response time of a Django application.
Celery is a package that aims to solve this problem.
What is Celery?
Celery is a task queue: a powerful tool for managing asynchronous tasks in Django. It helps developers build more scalable and efficient applications by running time-consuming tasks in separate processes.
Celery works by using a message queue, such as RabbitMQ or Redis (known as a broker), to distribute tasks across worker processes. When a task is submitted, it is added to the queue, and a worker process executes it. This allows tasks to be executed asynchronously, without blocking the main Django process or affecting the responsiveness of the application.
Tasks such as sending emails, processing data, or generating reports can be time-consuming and thus reduce the response time of the Django application. Offloading them to worker processes, to be executed asynchronously, keeps the Django application fast.
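As a quick preview of what this looks like in code (the task name and argument below are illustrative; the full setup is covered step by step in this tutorial), a regular function becomes a Celery task via a decorator, and calling `.delay()` pushes it onto the queue instead of running it inline:

```python
from celery import shared_task

@shared_task
def send_welcome_email(user_id):
    # time-consuming work runs on a Celery worker,
    # not in the web process handling the request
    ...

# inside a view: this returns immediately, a worker picks the task up
send_welcome_email.delay(42)
```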
Why use Celery?
- Celery increases the responsiveness of the application by handling operations that would normally block the Django process.
- It allows developers to schedule tasks to run at a specific time or interval.
- It provides mechanisms for retrying failed tasks and handling errors.
- It can also store the results of tasks in a backend such as Redis, so they can be retrieved later.
Using Django and Celery will improve the user experience of your application and also reduce the load on Django servers. Django and Celery integrate seamlessly because Celery supports Django out of the box.
Django Async vs. Celery
Django Async provides a built-in way to handle asynchronous tasks, without relying on third-party libraries like Celery. It uses async/await syntax and the asyncio library to run tasks in the background, inside the same Django process as the web server. This means there is no need to set up a separate task queue, broker, or worker process, as is the case with Celery.
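For illustration, a minimal Django async view could look like the sketch below (the view name is hypothetical; Django 3.1+ and an ASGI server such as Daphne or Uvicorn are assumed):

```python
import asyncio

from django.http import HttpResponse

async def async_homepage(request):
    # awaiting yields control to the event loop, so the same process
    # can serve other requests while this one waits
    await asyncio.sleep(10)
    return HttpResponse('Hello user!')
```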
Celery, on the other hand, is a mature, battle-tested, and widely-used distributed task queue platform that provides a feature-rich and robust solution for running asynchronous tasks in Django and other Python applications. It requires a separate server and worker processes to execute tasks, usually on a different machine or cluster of machines, which increases its scalability and performance.
Some other key differences between Django Async and Celery are:
- Django Async only supports asyncio-based code, while Celery supports a wider range of task execution strategies, including threads, processes, and coroutines.
- Django Async is easier to set up and configure, as it does not require any extra dependencies or infrastructure. Celery, however, requires additional configuration and setup, such as a message broker (e.g. RabbitMQ, Redis, or SQS) which can store the task messages and dispatch them to workers.
- Django Async is more suitable for smaller applications or simple use cases, while Celery is better suited for more complex, high-throughput applications that require scalability and durability.
In summary, Django Async provides a simple, built-in solution for handling asynchronous tasks, while Celery provides a more robust and scalable solution for large, complex web applications.
✅ Setting up the Environment
To use Celery, you need a broker such as RabbitMQ or Redis. Several other brokers are available; see Celery Broker Overview for a full list. For this tutorial, we will be using Redis.
Redis Installation
Redis is an in-memory data structure store, used as a distributed, in-memory key-value database, cache and message broker, with optional durability. Redis supports different kinds of abstract data structures, such as strings, lists, maps, sets, sorted sets, HyperLogLogs, bitmaps, streams, and spatial indices. It can be used as a database, cache, streaming engine, and message broker.
Redis and Celery are not supported on Windows, although you can use Windows Subsystem for Linux (WSL). Follow this link to set up WSL on your machine, then install Redis using the Linux steps below.
For Linux:
From your terminal, execute the following commands:
```bash
$ sudo apt update
$ sudo apt install redis
```
Once Redis is installed, you can check whether it is running using the command below:
```bash
$ sudo service redis status
```
If Redis is not running, start it with the command below:
```bash
$ sudo service redis start
```
Redis runs on port 6379 by default, so this is the port we will use to access the Redis service.
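To confirm that Redis is reachable on that port, you can ping it (assuming `redis-cli`, which ships with the Redis package, is available):

```bash
$ redis-cli ping
PONG
```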
Using Docker:
The command below will start a new Docker container with Redis installed, mapping host port 6379 to the port Redis uses inside the container:
```bash
$ docker run -d -p 6379:6379 redis
```
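You can run the same reachability check against the container (the container ID below is a placeholder; `docker ps` shows the real one):

```bash
$ docker ps --filter ancestor=redis   # find the container ID
$ docker exec <container_id> redis-cli ping
PONG
```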
Create a Django project
- First, we will create the project directory using the commands below:
```bash
$ mkdir sample_django_task_manager
$ cd sample_django_task_manager
sample_django_task_manager$
```
- Next, we will create a virtual environment to isolate our development packages from other packages.
```bash
sample_django_task_manager$ virtualenv venv
sample_django_task_manager$ source venv/bin/activate
(venv)sample_django_task_manager$
```
- Run the command below to install the packages needed for the project:
```bash
(venv)sample_django_task_manager$ pip install django celery redis django-celery-results
```
`celery` will be used to start worker processes, `redis` provides the connector to the running Redis service, and `django-celery-results` will be used to view the results of tasks carried out by Celery workers.
- Next, we will create our Django project:
```bash
(venv) sample_django_task_manager$ django-admin startproject core .
```
Django & Celery Integration
We will now make changes to our project so it can work with Celery.
- Create a file called `celery.py` inside the `core` directory and add the following code:
```python
# core/celery.py
import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'core.settings')

app = Celery('core')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django apps.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')
```
The `@app.task(bind=True)` decorator attaches the `debug_task` function to the Celery app object created earlier. We will look at a different way to create tasks, without binding them to a particular app, as we proceed in this tutorial.
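Once a worker is running (we start one later in this tutorial) and the configuration below is in place, you can try the task from a Django shell as a quick smoke test; this step is optional:

```python
# inside: python manage.py shell
from core.celery import debug_task

result = debug_task.delay()  # a worker prints the request metadata
result.ready()               # True once the worker has finished
```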
- Then you need to import this app in your `core/__init__.py` module. This ensures that the app is loaded when Django starts, so that the `@shared_task` decorator (mentioned later) will use it:
```python
# core/__init__.py
from .celery import app as celery_app

__all__ = ('celery_app',)
```
- The next step is to add the installed packages to the list of installed applications for our Django project. Make the following changes to `core/settings.py`:
```python
# core/settings.py
import os  # make sure this import is present at the top of the file
...
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'django_celery_results', # Django Celery Results # <-- NEW
]
...
CELERY_BROKER_URL = os.environ.get("CELERY_BROKER", "redis://localhost:6379")
# CELERY_RESULT_BACKEND = os.environ.get("CELERY_BROKER", "redis://localhost:6379")
CELERY_TASK_TRACK_STARTED = True
CELERY_CACHE_BACKEND = "django-cache"
CELERY_RESULT_BACKEND = "django-db"
CELERY_ACCEPT_CONTENT = ["json"]
CELERY_TASK_SERIALIZER = "json"
```
`CELERY_BROKER_URL` is the address of the message broker; the `redis://` prefix lets the connector installed earlier connect to the Redis instance running at that address.
`CELERY_RESULT_BACKEND` is the storage location for the results of tasks executed by Celery. `django-db` is the database used by our Django project; this value can be changed, as shown by the commented-out line above.
`CELERY_ACCEPT_CONTENT` is a whitelist of the content-types/serializers to allow. By default, only json is enabled, but any content type can be added.
`CELERY_TASK_SERIALIZER` is a string identifying the default serialization method to use; this can be changed. Check out the full list of Serializers.
Celery can be further customized using a wide range of options; Celery Configuration contains the list of configuration options.
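As a sketch of common variations (the values are illustrative and not required for this tutorial), results can be stored in Redis instead of the Django database, and the broker URL can carry authentication:

```python
# core/settings.py -- optional variations, not used in this tutorial

# store task results in Redis database 1 instead of the Django DB:
# CELERY_RESULT_BACKEND = "redis://localhost:6379/1"

# a password-protected Redis broker would look like this:
# CELERY_BROKER_URL = "redis://:s3cret@localhost:6379/0"
```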
- Now, we will create a Django application that will be used to create tasks, and see how Celery can help us process them, allowing Django to run without being held up by blocking operations.
```bash
(venv) sample_django_task_manager$ python manage.py startapp django_celery
(venv) sample_django_task_manager$ python manage.py migrate
(venv) sample_django_task_manager$ python manage.py createsuperuser
```
- After creating the application and a superuser, we have access to the admin panel. Execute the command below and open `http://127.0.0.1:8000` in your browser.
```bash
(venv) sample_django_task_manager$ python manage.py runserver
```
- We will now add `django_celery` to the list of installed applications for our Django project. Make the following changes to `core/settings.py`:
```python
# core/settings.py
...
INSTALLED_APPS = [
    ...
    'django_celery', # <-- NEW
]
...
```
- Create a file `tasks.py` inside the `django_celery` folder. This file will contain the tasks we want the Celery workers to handle. Add the function below to `django_celery/tasks.py`:
```python
# django_celery/tasks.py
from time import sleep

def slowdown_django_process():
    sleep(10)  # simulates a blocking process
    return {'status': 'Process slowdown complete'}
```
- Now, we will create a view that makes use of the function we just created. Add the following content to `django_celery/views.py`:
```python
# django_celery/views.py
from django.http import HttpResponse

from .tasks import slowdown_django_process

def homepage(request):
    slowdown_django_process()  # the blocking process
    return HttpResponse('Hello user!')
```
- Create a file named `urls.py` inside `django_celery` and add the following, to create a route for the `homepage` function:
```python
# django_celery/urls.py
from django.urls import path

from . import views

urlpatterns = [
    path("", views.homepage),
]
```
- Changes will be made to `core/urls.py` to create a route for `django_celery`:
```python
# core/urls.py
...
from django.urls import path, include # <-- UPDATED: Added 'include' HELPER

urlpatterns = [
    ...
    path("", include("django_celery.urls")),
]
```
- If your Django server has been stopped, restart it and visit `http://127.0.0.1:8000` in your browser. Notice how long the page takes to render the words `Hello user!` (about 10 seconds, because the view blocks on the slow function).
Now we will change our Django application to use Celery for blocking tasks, freeing up the Django process.
- Inside the `django_celery/tasks.py` file, make the following changes:
```python
# django_celery/tasks.py
...
from celery import shared_task

@shared_task
def slowdown_django_process():
    sleep(10)  # simulates a blocking process
    return {'status': 'Process slowdown complete'}
```
The `@shared_task` decorator lets you create tasks without having a concrete app instance; this helps when creating reusable apps, because the tasks do not depend on the project itself. Celery automatically detects tasks carrying the `shared_task` decorator.
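Since `@shared_task` accepts the same options as `@app.task`, it can also enable the retry behavior mentioned earlier. A minimal sketch (the task name, body, and error type are illustrative):

```python
from celery import shared_task

@shared_task(bind=True, max_retries=3)
def fetch_report(self):
    try:
        ...  # e.g. call an external API that may fail
    except ConnectionError as exc:
        # re-queue the task, waiting 5 seconds before the next attempt
        raise self.retry(exc=exc, countdown=5)
```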
- Update `django_celery/views.py` with the following code:
```python
# django_celery/views.py
...
def homepage(request):
    slowdown_django_process.delay()  # <-- UPDATED
    return HttpResponse('Hello user!')
```
The `delay` method causes the `slowdown_django_process` function to be executed asynchronously, on a worker process created by Celery.
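`delay()` is a shortcut for the more general `apply_async()`, which exposes scheduling options; the returned `AsyncResult` lets you check on the task. A short sketch, continuing with our task:

```python
# delay(*args) is equivalent to apply_async(args=(...))
result = slowdown_django_process.apply_async(countdown=30)  # run ~30s from now

result.id       # the task id, also visible in the Task Results admin page
result.ready()  # False until a worker finishes the task
result.get()    # blocks until the result is available (avoid in views)
```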
Now we have our application set up to use Celery workers.
- Open a new terminal in the project directory (with the virtual environment activated) and execute the command below:
```bash
(venv) sample_django_task_manager$ celery -A core.celery.app worker --loglevel=info
...
[tasks]
  . core.celery.debug_task
  . django_celery.tasks.slowdown_django_process
...
```
The `-A` option of the `celery` command specifies the application instance to use when creating the workers; `core.celery.app` is the `Celery` app instance that was created in `core/celery.py`. `--loglevel=info` gives us verbose logs in the terminal.
Under `[tasks]` we see all the tasks we created being recognized and registered by Celery.
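A couple of optional variations of the worker command (both flags are standard Celery options):

```bash
# shorthand: Celery finds the app instance exposed in core/__init__.py
(venv) sample_django_task_manager$ celery -A core worker --loglevel=info

# control the number of worker processes (defaults to the CPU count)
(venv) sample_django_task_manager$ celery -A core worker --loglevel=info --concurrency=4
```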
- From your browser, open `http://127.0.0.1:8000` and see how fast you now get the response to your request. If you check the terminal running Celery, you will see something like:
```bash
...
[2023-04-23 18:45:42,120: INFO/MainProcess] Task django_celery.tasks.slowdown_django_process[952c4afe-c7c0-46a7-9373-b0002f8aefa1] received
[2023-04-23 18:45:52,407: INFO/ForkPoolWorker-4] Task django_celery.tasks.slowdown_django_process[952c4afe-c7c0-46a7-9373-b0002f8aefa1] succeeded in 10.285309484999743s
```
This shows the task being executed by Celery, the time taken to complete it, and the result of the task, which is the return value of the `slowdown_django_process` function.
- Open `http://127.0.0.1:8000/admin` in your browser and click on `Task Results` to see all the tasks executed by Celery, along with details about each one. Clicking the `id` of any task opens a page with more information about that task.
✅ Django Tasks Manager
Django Tasks Manager is a Django & Celery integration library, actively supported by AppSeed. It can be used to manage tasks, check their status, and view the logs and results of tasks performed by Celery.
Features:
- Create/Revoke Celery tasks
- View LOGS & Output
- Minimal configuration
- Available TASKS (provided as starting samples):
  - `users_in_db()` - lists all registered users
  - `execute_script()` - lets users execute the scripts defined in `CELERY_SCRIPTS_DIR` (CFG parameter):
    - check-db-health.py (sample)
    - check-disk-free.py (sample)
    - clean-database.py (sample)
The samples are there to provide a usage guide; you can write your own tasks and execute them using Django Tasks Manager.
Setting up Django Tasks Manager
Django Tasks Manager can be installed using `pip`; install the package with the command below:
```bash
(venv) sample_django_task_manager$ pip install django-tasks-manager
```
- After installation, you need to set it up as an application in `core/settings.py`:
```python
# core/settings.py
...
INSTALLED_APPS = [
    ...
    'django_tm', # Django Tasks Manager # <-- NEW
    'django_celery_results',
    'django_celery',
]
...
CELERY_SCRIPTS_DIR = os.path.join(BASE_DIR, "celery_scripts")  # path for scripts to be executed
CELERY_LOGS_DIR = os.path.join(BASE_DIR, "celery_logs")        # path for log files of each executed task
```
- Next, we need to create the directories where our Celery scripts and logs will be saved. In the root directory of the project, create the folders `celery_logs` and `celery_scripts`:
```
# new directory structure
.
├── celery_logs
├── celery_scripts
├── core
├── django_celery
└── manage.py
```
- Add the `django_tm` URL route to `core/urls.py`; this makes the `django_tm` pages accessible at `http://127.0.0.1:8000/tasks`:
```python
# core/urls.py
...
urlpatterns = [
    ...
    path('', include("django_tm.urls")), # <-- NEW
]
```
- Stop the Celery worker using `Ctrl + C` and restart it, so that it picks up the tasks registered in `django_tm`. With the Django server and Celery both running, head to `http://127.0.0.1:8000/tasks` to see the working application.
If you open the webpage, you will notice it contains four tasks: two were created during this tutorial, and two were added by `django-tasks-manager`. Another thing to note is that the input to the `execute_script` task is empty; this is because our `celery_scripts` folder is empty.
By adding Python files to the `celery_scripts` folder, we can execute them from within the task manager; such files can perform a wide range of functions, as the sample below illustrates.
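As an example, here is what such a script could look like; this is a hypothetical take on the `check-disk-free.py` sample, and any Python file placed in the folder is handled the same way:

```python
# celery_scripts/check-disk-free.py
import shutil

# report free disk space on the root partition; the printed output
# ends up in the task log collected by the task manager
total, used, free = shutil.disk_usage("/")
print(f"Free: {free // 2**30} GiB of {total // 2**30} GiB total")
```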
- You can add any script of your choosing to the `celery_scripts` folder, or use the samples here to get started.
After adding those Python scripts, we can start executing them using `django-tasks-manager`.
✅ Conclusion
In conclusion, integrating Django and Celery can enhance the functionality and efficiency of your application by allowing you to offload time-consuming or resource-intensive tasks to background workers. With Celery handling your asynchronous task queue, Django can focus on serving user requests and generating responses.
To successfully integrate Django and Celery, you need to install and configure Celery to work with your Django project. This includes setting up a Celery worker, task queue, broker, and task scheduler.
Once the setup is complete, you can create and register tasks in Celery's task queue, and use decorators to define your tasks and pass arguments to them.
While the integration process can seem complex, it is worth the effort, especially considering the numerous benefits it offers. With Django and Celery combined, you can build faster, more responsive, and more scalable web applications to meet the needs of your users.
✅ Resources
- 👉 Access AppSeed for more starters and support
- 👉 Deploy Projects on Aws, Azure and DO via DeployPRO
- 👉 Create landing pages with Simpllo, an open-source site builder
- 👉 Build apps with Django App Generator (free service)