Here I’m assuming you already have a basic Django project set up, and that you already know what Celery is. If not, I’d suggest getting a basic understanding of it first. So let’s jump directly into the steps.
Please follow the comments to get a basic understanding of the code.
Install Celery into your project. Celery also needs a broker, a separate service that sends and receives messages (called a message broker). Check the list of available brokers: . You can install Celery bundled together with the broker directly:

(env)$ pip install "celery[redis]"
Once installed, head to the project folder that contains settings.py, create a new file called celery.py, and put the following code into it.

### celery.py

import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'meupBackend.settings')

app = Celery('meupBackend', backend='redis', broker='redis://localhost:6379')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
Now head over to __init__.py in the same folder and put the following code into it.

### __init__.py

from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)
### settings.py

# Because celery.py passes namespace='CELERY', every Celery setting
# here must carry the `CELERY_` prefix (a plain BROKER_URL would be
# ignored).
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
INSTALLED_APPS = [
# others app
'celery',
'django_celery_results',
'django_celery_beat',
]
"""
django_celery_results:
This extension enables you to store Celery task results using the Django ORM.
"""
"""
django_celery_beat:
This extension enables you to store the periodic task schedule in the database.
The periodic tasks can be managed from the Django Admin interface, where you can create, edit, and delete periodic tasks and set how often they should run.
"""
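For completeness: besides the database-backed schedule that django_celery_beat manages through the admin, the beat scheduler can also read a code-defined schedule from settings. A minimal sketch, assuming your tasks.py lives in an app called `myapp` (substitute your own app and task names):

```python
# settings.py (sketch) -- a code-defined schedule as an alternative
# to the database-backed one django_celery_beat provides.
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    # run the `test` task every 30 seconds
    'run-test-every-30s': {
        'task': 'myapp.tasks.test',   # assumed app name -- use your own
        'schedule': 30.0,             # interval in seconds
        'args': ('periodic',),
    },
    # also run it every Monday morning at 7:30
    'run-test-monday-morning': {
        'task': 'myapp.tasks.test',
        'schedule': crontab(hour=7, minute=30, day_of_week=1),
        'args': ('weekly',),
    },
}
```

With this in place, a running beat process dispatches the entries on schedule; django_celery_beat is only needed if you want the schedule editable in the admin instead.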
### tasks.py (in any of your apps)

from __future__ import absolute_import
from celery import shared_task

@shared_task
def test(param):
    return 'The test task executed with argument "%s" ' % param
### Terminal
(env)$ brew install redis
# once installed, run:
(env)$ redis-server
# Terminal 2: open another terminal or tab
(env)$ redis-cli ping
PONG

If you get the PONG response, you’re fine to move forward; you can quit the server and close the terminals.
If you’re on Windows or Linux, please check out how you can install Redis here: . Now run:

(env)$ python manage.py migrate
(env)$ pip install flower
Terminal 1:
(env)$ redis-server
Terminal 2:
(env)$ python manage.py runserver
Terminal 3:
(env)$ flower -A meupBackend ## here `meupBackend` is the project name
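Note that none of the terminals above actually start a Celery worker, and without one, queued tasks are never executed (and Flower has nothing to monitor). Assuming the project name `meupBackend` from celery.py, a fourth terminal would run:

```shell
# Terminal 4: start a Celery worker for the project.
# -A points at the Celery app defined in celery.py; -l sets the log level.
(env)$ celery -A meupBackend worker -l info
```

Leave this running alongside the other terminals; you should see the worker pick up tasks in its log output, and they will also appear in Flower.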
Now your project will be running on localhost:8000, Redis will be running on port 6379, and Flower will be running on localhost:5555 (Flower’s default port). Please make sure your Redis server is actually running on port 6379; it prints the port it bound to when it starts. Put that same port into the broker URL in your Celery configuration file. 🙏

Previously published at