Introduction
In modern web applications, certain operations can take a significant amount of time — such as sending emails, generating PDF reports, resizing images, or making API calls to third-party services.
If these tasks are executed during a normal HTTP request, they can make your app slow and degrade the user experience.
This is where Celery comes in.
Celery is a powerful, production-grade asynchronous task queue that works seamlessly with Django. It allows you to offload long-running tasks to the background, keeping your application responsive and fast.
In this post, we’ll explore how to integrate Celery into a Django project, configure it with a message broker (like Redis), and implement background task processing — step by step.
1. What Is Celery?
Celery is an open-source distributed task queue written in Python.
It allows you to run time-consuming or scheduled operations asynchronously, outside of the main request-response cycle.
Core Features of Celery
- Asynchronous Task Execution: run tasks in the background without blocking user requests.
- Scheduled Tasks: perform tasks at regular intervals (like cron jobs).
- Reliable Delivery: with message acknowledgments and retries, tasks can survive worker restarts and broker hiccups.
- Scalability: Celery can run across multiple servers and workers, handling thousands of tasks simultaneously.
- Integration: Celery integrates easily with Django, Flask, FastAPI, and other Python frameworks.
2. Why Use Celery with Django?
Let’s imagine a simple Django app that sends a confirmation email every time a user signs up.
If you send the email directly inside the view, the user has to wait until the email is sent before getting a response.
Example (synchronous):
from django.core.mail import send_mail
from django.http import HttpResponse

def register_user(request):
    # ... registration logic ...
    send_mail(
        "Welcome to MySite!",
        "Thank you for registering.",
        "[email protected]",
        [user_email],  # collected during registration
        fail_silently=False,
    )
    return HttpResponse("Registration successful!")
This approach blocks the response until the email is sent — inefficient and slow.
Now, using Celery, we can send the email in the background:
from django.http import HttpResponse

from .tasks import send_welcome_email

def register_user(request):
    # ... registration logic ...
    send_welcome_email.delay(user_email)
    return HttpResponse("Registration successful!")
Notice the .delay() method — it queues the task for Celery to execute asynchronously.
The user gets an instant response while the email is sent in the background.
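By the way, .delay() is just a shortcut for the more general apply_async(), which accepts extra options. A quick sketch:

# .delay(user_email) is shorthand for .apply_async(args=[user_email]).
# apply_async also accepts scheduling options, e.g. a delayed start:
send_welcome_email.apply_async(args=[user_email], countdown=10)  # run ~10 seconds from now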
3. Installing Celery and Redis
Celery requires a message broker to manage task queues.
Common brokers are Redis and RabbitMQ. Redis is simple, fast, and ideal for Django projects.
Step 1: Install Dependencies
pip install celery redis
Make sure the Redis server is installed and running. On Ubuntu:
sudo apt install redis-server
sudo systemctl enable redis-server
sudo systemctl start redis-server
Verify Redis is working:
redis-cli ping
If it returns PONG, it’s running correctly.
4. Django Project Setup
Let’s assume you already have a Django project.
If not, create a sample one:
django-admin startproject celery_project
cd celery_project
python manage.py startapp core
Add core to INSTALLED_APPS in settings.py so Celery can later discover its tasks. Your structure now looks like this:
celery_project/
    celery_project/
        __init__.py
        settings.py
        urls.py
        wsgi.py
    core/
        __init__.py
        views.py
        tasks.py
    manage.py
5. Configuring Celery in Django
We’ll now configure Celery for our Django project.
Step 1: Create a celery.py File
Inside your project directory (next to settings.py), create a new file named celery.py:
import os

from celery import Celery

# Set the default Django settings module before creating the app.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'celery_project.settings')

app = Celery('celery_project')

# Read all CELERY_-prefixed settings from Django's settings.py.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Auto-discover tasks.py modules in all installed apps.
app.autodiscover_tasks()
Step 2: Update __init__.py
Inside your project’s main __init__.py, import Celery to ensure it runs when Django starts:
from .celery import app as celery_app

__all__ = ('celery_app',)
This ensures the Celery app is loaded whenever Django starts, so that @shared_task will use it.
6. Configuring Redis as the Broker
Open settings.py and add the Celery broker configuration:
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'UTC'
Explanation:
- CELERY_BROKER_URL: defines Redis as the broker.
- CELERY_RESULT_BACKEND: stores task results in Redis.
- CELERY_ACCEPT_CONTENT: only accepts JSON-serialized messages.
- CELERY_TASK_SERIALIZER: serializes task messages as JSON.
- CELERY_RESULT_SERIALIZER: serializes task results as JSON.
- CELERY_TIMEZONE: ensures timezone consistency.
7. Creating a Simple Task
In your app directory (core/), create a file named tasks.py if it doesn’t exist:
from celery import shared_task
from django.core.mail import send_mail
@shared_task
def send_welcome_email(email):
    send_mail(
        "Welcome to Celery Demo!",
        "Thanks for registering on our website.",
        "[email protected]",
        [email],
        fail_silently=False,
    )
    return f"Email sent to {email}"
The @shared_task decorator makes this function available as a Celery task that can be executed asynchronously.
8. Calling the Task in Views
In core/views.py, call the task when a user registers or triggers an action.
from django.http import HttpResponse
from .tasks import send_welcome_email
def register(request):
    email = "[email protected]"
    send_welcome_email.delay(email)
    return HttpResponse("Email will be sent in the background!")
When .delay() is called, Celery queues the task and returns immediately.
9. Running Celery
To run Celery workers that process the queued tasks, open a terminal and run:
celery -A celery_project worker -l info
- -A celery_project: tells Celery to use your Django project's configuration.
- worker: starts a worker process.
- -l info: enables detailed logs.
If everything is configured properly, you’ll see Celery start up and listen for tasks.
10. Running Django
In another terminal, start the Django server:
python manage.py runserver
Now, when you hit the endpoint for the register view, Django queues the email task, and Celery sends it in the background.
11. Monitoring Celery Tasks
Celery provides multiple ways to monitor tasks, but the simplest method is to check the worker logs.
For more advanced visualization, you can use Flower — a web-based Celery monitoring tool.
Install Flower:
pip install flower
Run Flower:
celery -A celery_project flower
By default, Flower runs at:
http://localhost:5555
Here you can see:
- Task history
- Task state (Pending, Started, Success, Failed)
- Execution time
- Worker status
12. Retrying Failed Tasks
Sometimes tasks fail — maybe due to network issues or an external API being down.
Celery can automatically retry failed tasks.
Example with retries:
from celery import shared_task
from django.core.mail import send_mail
@shared_task(bind=True, max_retries=3, default_retry_delay=10)
def send_email_with_retry(self, email):
    try:
        send_mail(
            "Retry Example",
            "Testing Celery retry feature.",
            "[email protected]",
            [email],
        )
    except Exception as exc:
        # Re-queue the task with the original exception attached.
        raise self.retry(exc=exc)
Explanation:
- bind=True gives the task access to self, so it can call self.retry.
- max_retries=3 retries up to three times.
- default_retry_delay=10 waits 10 seconds between retries.
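Since Celery 4 you can also declare retries on the decorator instead of calling self.retry yourself. A minimal sketch of the same idea using autoretry_for:

from celery import shared_task
from django.core.mail import send_mail

@shared_task(
    autoretry_for=(Exception,),       # retry on any exception
    retry_kwargs={'max_retries': 3},  # up to three attempts
    retry_backoff=True,               # exponential delay between attempts
)
def send_email_with_autoretry(email):
    send_mail(
        "Retry Example",
        "Testing Celery autoretry.",
        "[email protected]",
        [email],
    )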
13. Periodic and Scheduled Tasks
Celery can also schedule tasks using Celery Beat — a scheduler that sends tasks at regular intervals.
Step 1: Add Celery Beat to Settings
In settings.py:
CELERY_BEAT_SCHEDULE = {
    'send-report-every-morning': {
        'task': 'core.tasks.send_daily_report',
        'schedule': 86400,  # every 24 hours, in seconds
    },
}
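Note that a numeric schedule runs the task every 86,400 seconds from whenever Beat starts, not at a fixed time of day. For a true "every morning" schedule you can use a crontab entry instead; a sketch assuming a 7:00 AM send time:

from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'send-report-every-morning': {
        'task': 'core.tasks.send_daily_report',
        'schedule': crontab(hour=7, minute=0),  # every day at 07:00
    },
}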
Step 2: Define the Scheduled Task
In core/tasks.py:
from celery import shared_task
import datetime
@shared_task
def send_daily_report():
    print(f"Daily report sent at {datetime.datetime.now()}")
    return "Report Sent"
Step 3: Run Celery Beat
In a new terminal:
celery -A celery_project beat -l info
Now Celery Beat will send tasks at scheduled intervals automatically.
14. Combining Celery Worker and Beat
Instead of running separate processes for the worker and Beat, you can combine them:
celery -A celery_project worker --beat -l info
This runs both the worker and the scheduler in one process. It's convenient for development and smaller projects, though separate processes are recommended in production.
15. Handling Task Results
If you want to store or access the result of a task, you can do so easily:
result = send_welcome_email.delay('[email protected]')
print(result.id) # Get task ID
print(result.status) # Get current status
To check if it’s done:
from celery.result import AsyncResult
res = AsyncResult(result.id)
print(res.ready())  # True if completed
print(res.get())    # Blocks until the task finishes, then returns its value
16. Using Celery with Django Models
You can use Celery to perform operations on Django models asynchronously.
Example — updating a record after processing:
from celery import shared_task
from .models import UserProfile
@shared_task
def process_user_data(user_id):
    user = UserProfile.objects.get(id=user_id)
    user.processed = True
    user.save()
    return f"User {user_id} processed"
This allows you to offload heavy database operations without blocking the main request cycle.
17. Chaining and Grouping Tasks
Celery supports task chaining, groups, and chords for complex workflows.
Chaining Example
from celery import chain
from .tasks import task_a, task_b, task_c
result = chain(task_a.s(), task_b.s(), task_c.s())()
Here:
- task_a runs first.
- Its result is passed to task_b.
- Then task_c runs last.
Group Example
from celery import group
from .tasks import send_welcome_email
emails = ['[email protected]', '[email protected]', '[email protected]']
group(send_welcome_email.s(email) for email in emails)()
This executes multiple tasks concurrently — ideal for sending bulk notifications or processing large datasets.
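Chords, the third primitive mentioned above, attach a callback that receives all of the group's results. A minimal sketch, assuming a hypothetical summarize_results task (chords require a result backend, which we configured earlier):

from celery import chord

from .tasks import send_welcome_email, summarize_results  # summarize_results is hypothetical

# Run the group in parallel, then pass the list of results to the callback.
chord(send_welcome_email.s(email) for email in emails)(summarize_results.s())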
18. Configuring Celery in Docker
If you deploy your Django app using Docker, Celery can easily be added as a separate container.
Example docker-compose.yml:
version: '3'

services:
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"
    depends_on:
      - redis
  redis:
    image: redis:latest
  celery:
    build: .
    command: celery -A celery_project worker -l info
    depends_on:
      - redis
This setup runs:
- Django web server
- Redis as broker
- Celery worker as background container
19. Debugging Celery Tasks
If tasks aren’t executing, here’s a checklist:
- Is the worker running? Run celery -A project worker -l info.
- Is Redis running? Check with redis-cli ping.
- Is the task correctly imported? Ensure the task lives in an app listed in INSTALLED_APPS; see the snippet after this list for a quick check.
- Check the logs: Celery logs detailed error messages that help debug import issues, broker connections, or missing tasks.
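A quick way to confirm that workers actually see your tasks is to inspect them from a Django shell (python manage.py shell); a sketch:

# Ask running workers which tasks they have registered.
from celery_project.celery import app

inspector = app.control.inspect()
print(inspector.registered())  # None means no workers responded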
20. Optimizing Celery Performance
For production, optimize Celery by:
- Running multiple worker processes: celery -A project worker --concurrency=4 -l info
- Using prefetch limits: CELERY_WORKER_PREFETCH_MULTIPLIER = 1
- Monitoring with Flower or Prometheus.
- Offloading non-critical tasks (emails, notifications, cleanup).
21. Security Considerations
Celery serializes task arguments and sends them through the broker, so avoid passing sensitive information like passwords or card numbers.
Instead, pass only identifiers (IDs) and fetch sensitive data from secure storage inside the task.
Example:
Instead of this:
send_payment_task.delay(card_number, cvv)
Do this:
send_payment_task.delay(user_id)
Then fetch data securely from the database inside the task.
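A sketch of that pattern, assuming a hypothetical PaymentProfile model that stores the payment details server-side:

from celery import shared_task

from .models import PaymentProfile  # hypothetical model holding payment details

@shared_task
def send_payment_task(user_id):
    # Fetch sensitive data inside the task rather than passing it through the broker.
    profile = PaymentProfile.objects.get(user_id=user_id)
    # ... charge the card using profile data ...
    return f"Payment processed for user {user_id}"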
22. Real-World Use Cases of Celery in Django
- Sending Emails and Notifications: trigger welcome or password reset emails in the background.
- Generating Reports: build large PDF or CSV reports asynchronously.
- Image Processing: resize or compress user-uploaded images in background tasks.
- External API Integration: offload API requests or web scraping to Celery.
- Scheduled Maintenance: clean old records or caches periodically using Celery Beat.
23. Testing Celery Tasks
During development or testing, you can configure Celery to run tasks synchronously.
In settings.py:
CELERY_TASK_ALWAYS_EAGER = True
Now tasks run immediately, without a broker — perfect for unit testing.
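One common pattern is to enable eager mode only while tests run; a sketch in settings.py, also setting CELERY_TASK_EAGER_PROPAGATES so task exceptions surface in your test runner:

import sys

# Run tasks inline (no broker needed) when the test runner is active.
if 'test' in sys.argv:
    CELERY_TASK_ALWAYS_EAGER = True
    CELERY_TASK_EAGER_PROPAGATES = True  # re-raise task exceptions instead of storing them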
24. Deploying Celery to Production
Step 1: Use Supervisor or Systemd
Use Supervisor or systemd to ensure Celery workers stay alive.
Example Supervisor config:
[program:celery]
command=celery -A celery_project worker -l info
directory=/home/user/celery_project
user=user
autostart=true
autorestart=true
Step 2: Run Celery Beat
[program:celerybeat]
command=celery -A celery_project beat -l info
autostart=true
autorestart=true
Both processes will restart automatically if they crash.