Celery

1. What is Celery?

Celery is a simple, flexible, and reliable distributed system for processing large numbers of messages.

It is an asynchronous task queue focused on real-time processing, and it also supports task scheduling.

2. Celery architecture

Celery's architecture consists of three parts: the message broker (message middleware), the task execution units (workers), and the task result store.


Message middleware

Celery itself does not provide a message service, but it integrates easily with message middleware provided by third parties, including RabbitMQ, Redis, etc.

Task execution unit

The worker is the task execution unit provided by Celery; workers run concurrently on the nodes of a distributed system.

Task result storage

The task result store is used to store the results of tasks executed by workers. Celery supports storing task results in different backends, including AMQP, Redis, etc.

Execution process:

The user is the one who submits tasks; tasks submitted by a program go to the broker, the message middleware. The worker, acting like the consumer in a producer-consumer model, takes tasks out of the broker and executes them. The result store, put simply, holds the results returned after the workers finish executing.

Version support

Celery version 4.0 runs on:

Python (2.7, 3.4, 3.5)
PyPy (5.4, 5.5)

This is the last version to support Python 2.7, and from the next version (Celery 5.x) Python 3.5 or newer is required.

If you're running an older version of Python, you need to be running an older version of Celery:

Python 2.6: Celery series 3.1 or earlier.
Python 2.5: Celery series 3.0 or earlier.
Python 2.4: Celery series 2.2 or earlier.

Celery is a project with minimal funding, so we don't support Microsoft Windows. Please don't open any issues related to that platform.

Note: Celery does not officially support Windows, but it can still be used there with the help of third-party modules. However, if a problem occurs on Windows, the official project will not provide help.

3. Celery installation and configuration

pip install celery

Message middleware: RabbitMQ/Redis

app = Celery('task name', backend='xxx', broker='xxx')

4. Test case


# Import first
from celery import Celery

# Configuration information; Redis is used as the example here
# broker = 'redis://127.0.0.1:6379/2'   # form to use when Redis has no password

# Result storage (backend)
backend = 'redis://:[emailprotected]**.**:6380/7'
# Message middleware (broker)
broker = 'redis://:[emailprotected]**.**:6380/8'
# Note: if Redis has a password, it goes between "redis://:" and the "@"; the final number selects which Redis database to use
# Instantiating produces a Celery object; a project may use multiple Celery apps
# The first parameter is the name of the current Celery app and must be provided
APP = Celery('test', broker=broker, backend=backend)

1. First, Celery must be imported. As described above, Celery uses message middleware, workers, and a result store; Redis is used for this test.

2. Configure the message middleware and the result storage location

APP = Celery('test', broker=broker, backend=backend)

# Instantiating produces a Celery object; a project may use multiple Celery apps,
# so a unique name is passed as the first parameter and it must not be repeated.
# The variable name of the object itself does not matter.

3. Create a Celery task


# A task is actually just a function

# It must be decorated to mark it as a task managed by Celery so that Celery can execute it
# The decorator comes from the instantiated Celery object (it is a method on that object)
@APP.task
def add(x, y):
    import time
    time.sleep(2)
    return x + y

4. Submit the task

Submitting the task synchronously (calling it like a normal function):


A task submitted asynchronously is not executed immediately; the worker will execute it:


The value returned corresponds to the id under which the submitted task is stored in the Redis middleware; this id is used later to query the task and retrieve its result.
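Since the screenshots are not available, here is a minimal sketch of what submitting the task might look like, assuming the APP object and the add task above live in a module named celery_test_cs (the module name is an assumption taken from the worker commands below):

# submit_task.py -- a minimal sketch; the module name celery_test_cs is an assumption
from celery_test_cs import add

# Synchronous call: runs in the current process like a normal function
print(add(3, 4))           # prints 7 after the 2-second sleep

# Asynchronous submission: the task is only sent to the broker, not executed here
result = add.delay(3, 4)
print(result.id)           # the task id, later used to query the result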


5. After the task is submitted, a worker needs to be started to execute it

Create a py file (e.g. run.py) to start the worker from code, or start it from the command line (the plain command is not available on Windows): celery worker -A celery_test_cs -l info (celery_test_cs is the name of the py file where the task was created; -l info sets the logging level).

Under Windows: celery worker -A celery_test_cs -l info -P eventlet

Windows installation: pip install eventlet

Starting the worker from code (usually not used):

# celery_app_task is the module where the Celery app (here named cel) was created
from celery_app_task import cel

if __name__ == '__main__':
    cel.worker_main()
    # cel.worker_main(argv=['--loglevel=info'])

Pay attention to the import path when submitting the task; an incorrect import path can cause an error.


After startup, the worker output is as follows:


After receiving the task, the worker executes it and logs the execution information; the 7 shown is the result of the task submitted above, and the execution time is also printed in the info-level log.


Redis stores the result data as follows; the execution status and the result are both stored there.
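Since the screenshot is missing, a rough sketch of looking at the stored result directly in Redis (assuming the default key naming of Celery's Redis result backend, celery-task-meta-<task id>, and the connection details configured above) might look like this:

# inspect_result.py -- a rough sketch; host, port, db, password and key naming are assumptions
import json
import redis

r = redis.StrictRedis(host='127.0.0.1', port=6380, db=7, password='***')
task_id = '<the id printed when the task was submitted>'
raw = r.get('celery-task-meta-' + task_id)   # key written by the Redis result backend
print(json.loads(raw))                       # contains the task status and its result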


6. View the results

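Since the screenshot is not available, a minimal sketch of querying the result by id, assuming the APP object from the celery_test_cs module above, might look like this:

# check_result.py -- a minimal sketch; module and variable names are assumptions
from celery.result import AsyncResult
from celery_test_cs import APP

task_id = '<the id printed when the task was submitted>'
result = AsyncResult(id=task_id, app=APP)

if result.successful():
    print(result.get())        # the return value of add, e.g. 7
elif result.failed():
    print('task failed')
else:
    print('task state:', result.state)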

Process summary:

Using celery:
1. First install it: pip install celery
2. Write a py file, e.g. celery_task
3. Specify the broker (message middleware) and the backend (result storage)
4. Instantiate a Celery object: app = Celery('name', broker=broker, backend=backend)
5. Bind the task with a decorator: add the @app.task decorator to the function (add)
6. Submit tasks from another program: first import add, then call add.delay(args). The call is submitted to the message middleware but not executed yet; it has a return value, and printing it directly prints the task id, which is then used to query whether the task has completed
7. Start the worker to execute the task:
   Linux: celery worker -A celery_task -l info (celery_task is the py file where the task was created)
   Windows: celery worker -A celery_task -l info -P eventlet
8. View the results: query by id

Application scenarios

Asynchronous tasks: submit time-consuming operations to Celery for asynchronous execution, such as generating charts, sending SMS/email, message push, audio and video processing, etc.

Timed tasks: perform certain things on a schedule, such as daily statistics.
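As a rough sketch of the timed-task scenario (the schedule, task name and module name here are assumptions, not taken from the original article), a periodic task can be registered in the app's beat schedule and driven by celery beat:

# periodic.py -- a rough sketch; module name, task name and schedule are assumptions
from celery.schedules import crontab
from celery_test_cs import APP

APP.conf.beat_schedule = {
    'daily-statistics': {
        'task': 'celery_test_cs.add',           # default registered name of the add task above
        'schedule': crontab(hour=0, minute=0),  # run every day at midnight
        'args': (1, 2),
    },
}
# Start the scheduler with: celery beat -A celery_test_cs -l info
# A worker must also be running to actually execute the scheduled tasks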

