This guide demonstrates how to set up Django with Celery, Redis, and Celery Beat following Django best practices and the TestDriven.io tutorial approach.
- Start the development environment:
```bash
docker compose up -d --build
```
- Check service health:
```bash
# Check all services
docker compose ps

# Check individual service logs
docker compose logs app
docker compose logs celery
docker compose logs celery-beat
docker compose logs redis
```
- Test Celery is working:
```bash
# Check celery logs - you should see the sample task running every minute
docker compose logs -f celery
```
- Start with production profile:
```bash
docker compose --profile production up -d --build
```
This includes the Nginx proxy and SSL certificate management.
- app: Django application server
- db: PostgreSQL database
- redis: Redis broker for Celery
- celery: Celery worker for background tasks
- celery-beat: Celery Beat scheduler for periodic tasks
- proxy: Nginx reverse proxy (production only)
- certbot: SSL certificate management (production only)
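
The `celery` and `celery-beat` services both load the project's Celery application. That module is not shown in this guide; a minimal sketch, assuming the Django project package is named `app`, looks roughly like this:

```python
# app/celery.py (sketch; module path and app name are assumptions)
import os

from celery import Celery

# Tell Celery where the Django settings live before creating the app.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'app.settings')

app = Celery('app')

# Read all CELERY_* settings from Django's settings module.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Discover tasks.py modules in installed apps (e.g. core/tasks.py, client/tasks.py).
app.autodiscover_tasks()
```

The worker and beat containers would then invoke something like `celery -A app worker -l info` and `celery -A app beat -l info`; the exact commands live in the compose file.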
Core Celery tasks:
- `sample_task`: Runs every minute (for testing)
- `send_daily_system_report`: Runs daily at 8 AM via a Django management command
- `cleanup_expired_sessions`: Runs daily at 2 AM for maintenance
- `debug_task`: For testing the Celery setup

Client Celery tasks:
- `generate_employee_report`: Generate tenant-specific employee reports
- `run_employee_report_command`: Run the employee report via a Django management command
- `send_employee_birthday_reminders`: Birthday reminder notifications

Core management commands:
- `daily_system_report`: Generate and email system-wide reports
- `check_email_config`: Verify the email configuration
- `setup_initial_data`: Set up initial super admin and tenant data
- `wait_for_db`: Wait for the database to be ready

Client management commands:
- `employee_report`: Generate tenant-specific employee reports
- `init_permissions`: Initialize the permission system
- `seed_permissions`: Seed default permissions
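
As an illustration of how these pieces fit together (a sketch only; the actual implementations may differ), a task such as `send_daily_system_report` can stay thin and simply delegate to the matching management command:

```python
# core/tasks.py (sketch; the real implementations may differ)
import logging

from celery import shared_task
from django.core.management import call_command

logger = logging.getLogger(__name__)

@shared_task
def sample_task():
    # Heartbeat task used to verify that the worker and Beat are wired up
    logger.info("sample_task executed")
    return "ok"

@shared_task
def send_daily_system_report():
    # Delegate to the daily_system_report management command so the same
    # logic can also be run manually with manage.py
    call_command("daily_system_report")
```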
Create a .env file in your project root:
```
# Django
SECRET_KEY=your-secret-key-here
DEBUG=1
ALLOWED_HOSTS=localhost,127.0.0.1

# Database
DB_HOST=db
DB_NAME=devdb
DB_USER=devuser
DB_PASS=changeme

# Redis
REDIS_URL=redis://redis:6379/0

# Email Configuration
EMAIL_BACKEND=django.core.mail.backends.console.EmailBackend
EMAIL_HOST=smtp-relay.brevo.com
EMAIL_PORT=587
EMAIL_HOST_USER=your-email@domain.com
EMAIL_HOST_PASSWORD=your-password
EMAIL_USE_TLS=1
MAIL_FROM=noreply@yourdomain.com
DEFAULT_FROM_EMAIL=noreply@yourdomain.com

# Production (optional)
DOMAIN=yourdomain.com
ACME_DEFAULT_EMAIL=admin@yourdomain.com
```

The following settings are configured in `app/settings.py`:
```python
import os

from celery.schedules import crontab

# Celery Configuration
CELERY_BROKER_URL = os.environ.get('REDIS_URL', 'redis://localhost:6379/0')
CELERY_RESULT_BACKEND = CELERY_BROKER_URL
CELERY_TIMEZONE = TIME_ZONE
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'

# Periodic Tasks Schedule
CELERY_BEAT_SCHEDULE = {
    'sample-task': {
        'task': 'core.tasks.sample_task',
        'schedule': crontab(minute='*/1'),  # Every minute for testing
    },
    'daily-system-report': {
        'task': 'core.tasks.send_daily_system_report',
        'schedule': crontab(hour=8, minute=0),  # Daily at 8 AM
    },
    'cleanup-expired-sessions': {
        'task': 'core.tasks.cleanup_expired_sessions',
        'schedule': crontab(hour=2, minute=0),  # Daily at 2 AM
    },
}
```

Because `CELERY_BEAT_SCHEDULER` points at django-celery-beat's `DatabaseScheduler`, the `CELERY_BEAT_SCHEDULE` entries above are loaded into the scheduler's database tables when Beat starts and can then also be managed from the Django admin.

```bash
# Check if Celery worker is processing tasks
docker compose logs -f celery

# You should see:
# [tasks]
#   . core.tasks.sample_task
#   . core.tasks.send_daily_system_report
#   . core.tasks.cleanup_expired_sessions
```

```bash
# Check if Celery Beat is scheduling tasks
docker compose logs -f celery-beat

# You should see the sample task scheduled every minute
```

```bash
# Test system report command
docker compose exec app uv run python manage.py daily_system_report

# Test employee report command (requires a tenant)
docker compose exec app uv run python manage.py employee_report samplecompany --email admin@example.com

# Test email configuration
docker compose exec app uv run python manage.py check_email_config
```

```bash
# Enter Django shell
docker compose exec app uv run python manage.py shell
```

```python
# In the shell, run:
from core.tasks import sample_task, send_daily_system_report
from client.tasks import run_employee_report_command
# Test sample task
result = sample_task.delay()
print(result.get())
# Test system report
result = send_daily_system_report.delay()
print(result.get())
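
# The calls above return AsyncResult objects; with the Redis result backend
# configured you can also inspect task state without blocking on .get():
print(result.status)   # e.g. 'PENDING', 'STARTED', 'SUCCESS'
print(result.ready())  # True once the task has finished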
# Test employee report (replace 'samplecompany' with your tenant schema)
result = run_employee_report_command.delay('samplecompany', email='admin@example.com')
print(result.get())
```

```bash
# All services
docker compose ps

# Service health
docker compose exec app curl -f http://localhost:8000/health/
docker compose exec redis redis-cli ping
```

```bash
# Logs for all services
docker compose logs

# Specific service with follow
docker compose logs -f celery
docker compose logs -f celery-beat
docker compose logs -f app
```

```bash
# Connect to Redis CLI
docker compose exec redis redis-cli

# In Redis CLI:
INFO server
MONITOR       # Watch commands in real-time
KEYS celery*  # View Celery keys
```

```bash
# Restart all services
docker compose restart

# Restart specific services
docker compose restart celery celery-beat

# Rebuild and restart
docker compose up -d --build
```

```bash
# Run migrations
docker compose exec app uv run python manage.py migrate

# Create superuser
docker compose exec app uv run python manage.py createsuperuser

# Setup initial data
docker compose exec app uv run python manage.py setup_initial_data
```

- Create the task in `core/tasks.py` or `client/tasks.py`:
```python
import logging
from celery import shared_task

logger = logging.getLogger(__name__)

@shared_task
def my_custom_task():
    logger.info("My custom task is running")
    # Your task logic here
    return "Task completed"
```

- Add to `CELERY_BEAT_SCHEDULE` in `settings.py`:
```python
CELERY_BEAT_SCHEDULE = {
    # ... existing tasks ...
    'my-custom-task': {
        'task': 'core.tasks.my_custom_task',
        'schedule': crontab(hour=10, minute=30),  # Daily at 10:30 AM
    },
}
```

- Restart Celery Beat:
```bash
docker compose restart celery-beat
```
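
Because `CELERY_BEAT_SCHEDULER` is django-celery-beat's `DatabaseScheduler`, schedules can also be created or changed at runtime through the ORM (or the Django admin) instead of editing `settings.py`. A sketch of the equivalent database entry:

```python
# Sketch: the same schedule expressed as django-celery-beat database records
from django_celery_beat.models import CrontabSchedule, PeriodicTask

schedule, _ = CrontabSchedule.objects.get_or_create(
    minute='30',
    hour='10',
    day_of_week='*',
    day_of_month='*',
    month_of_year='*',
)

PeriodicTask.objects.update_or_create(
    name='my-custom-task',
    defaults={'task': 'core.tasks.my_custom_task', 'crontab': schedule},
)
```

Entries created this way are normally picked up by the running `celery-beat` process without a restart.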
- Celery can't connect to Redis:
  - Check Redis is running: `docker compose logs redis`
  - Verify the `REDIS_URL` environment variable
- Tasks not being scheduled:
  - Check Celery Beat logs: `docker compose logs celery-beat`
  - Verify the `CELERY_BEAT_SCHEDULE` syntax
- Email not working:
  - Check the email configuration: `docker compose exec app uv run python manage.py check_email_config`
  - Test email: `docker compose exec app uv run python manage.py test_email`
- Database connection issues:
  - Wait for the DB: `docker compose exec app uv run python manage.py wait_for_db`
  - Check DB health: `docker compose exec db pg_isready`
Set `CELERY_TASK_ALWAYS_EAGER = True` in settings for synchronous task execution during debugging.
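
For example, a local or test settings module might pair it with eager propagation so task exceptions surface immediately (a sketch; do not enable these in production):

```python
# Debug/test-only settings: run tasks synchronously in the calling process
CELERY_TASK_ALWAYS_EAGER = True
# Re-raise task exceptions instead of only recording them on the result
CELERY_TASK_EAGER_PROPAGATES = True
```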
- Separation of Concerns: Tasks, management commands, and business logic are separated
- Error Handling: Comprehensive error handling and logging
- Multi-Tenant Support: Tasks work with django-tenants (see the sketch after this list)
- Docker Best Practices: Health checks, proper dependencies, and volume management
- Django Patterns: Using management commands for reusable operations
- Production Ready: Includes proxy, SSL, and monitoring capabilities
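
For the multi-tenant point above, tenant-scoped tasks can switch into a tenant's schema before touching tenant data. A minimal sketch using django-tenants' `schema_context` (the task body and arguments are illustrative):

```python
# Sketch: a tenant-aware task (body and arguments are illustrative)
from celery import shared_task
from django_tenants.utils import schema_context

@shared_task
def generate_employee_report(schema_name, email=None):
    # All ORM queries inside this block run against the tenant's schema
    with schema_context(schema_name):
        # ... build the report for this tenant and email it ...
        pass
```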
- Django Celery Integration
- Celery Beat Scheduling
- Django Management Commands
- TestDriven.io Celery Tutorial
This setup provides a robust foundation for building scalable Django applications with background task processing and periodic job scheduling.