Celery

The `logfire.instrument_celery()` method will create a span for every task executed by your Celery workers.

The integration also supports Celery beat.
Installation

Install `logfire` with the `celery` extra:

```bash
pip install 'logfire[celery]'
```

or, with uv:

```bash
uv add 'logfire[celery]'
```
Celery Worker

Info

The broker you use doesn't matter for the Celery instrumentation. Any broker supported by Celery will work.

For our example, we'll use Redis. You can run it with Docker:

```bash
docker run --rm -d -p 6379:6379 redis
```

Below is a minimal example using Celery. You can run it with `celery -A tasks worker --loglevel=info`:
```python
import logfire
from celery import Celery
from celery.signals import worker_init


@worker_init.connect()  # (1)!
def init_worker(*args, **kwargs):
    logfire.configure(service_name="worker")  # (2)!
    logfire.instrument_celery()


app = Celery("tasks", broker="redis://localhost:6379/0")  # (3)!


@app.task
def add(x: int, y: int):
    return x + y


add.delay(42, 50)  # (4)!
```
1. Celery implements different signals that you can use to run code at specific points in the application lifecycle. You can read more about Celery signals in the Celery documentation.
2. Use a `service_name` to identify the service that is sending the spans.
3. Install `redis` with `pip install redis`.
4. Trigger the task. In your application, you will probably want to use `app.send_task("tasks.add", args=[42, 50])` instead, which sends the task to the broker and returns immediately (see the producer sketch below).
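To make the `app.send_task(...)` call above concrete, here is a minimal sketch of a separate producer process. The file name `producer.py` and the `service_name="producer"` are illustrative choices, not something the integration requires.

```python
# producer.py -- a hypothetical separate process that enqueues tasks by name.
import logfire
from celery import Celery

logfire.configure(service_name="producer")  # illustrative service name
logfire.instrument_celery()

# The producer only needs the broker URL and the task's registered name;
# it does not need to import the task function itself.
app = Celery("tasks", broker="redis://localhost:6379/0")

# send_task publishes the message to the broker and returns immediately.
result = app.send_task("tasks.add", args=[42, 50])
print(result.id)  # the task id assigned by Celery
```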
Celery Beat

As mentioned before, you may also have periodic tasks scheduled with Celery beat.

Let's add the beat to the previous example:
```python
import logfire
from celery import Celery
from celery.signals import worker_init, beat_init


@worker_init.connect()
def init_worker(*args, **kwargs):
    logfire.configure(service_name="worker")
    logfire.instrument_celery()


@beat_init.connect()  # (1)!
def init_beat(*args, **kwargs):
    logfire.configure(service_name="beat")  # (2)!
    logfire.instrument_celery()


app = Celery("tasks", broker="redis://localhost:6379/0")

app.conf.beat_schedule = {  # (3)!
    "add-every-30-seconds": {
        "task": "tasks.add",
        "schedule": 30.0,
        "args": (16, 16),
    },
}


@app.task
def add(x: int, y: int):
    return x + y
```
1. The `beat_init` signal is emitted when the beat process starts.
2. Use a different `service_name` to identify the beat process.
3. Add a task to the beat schedule. See the Celery documentation for more about the beat schedule.
The code above will schedule the `add` task to run every 30 seconds with the arguments `16` and `16`.
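If you want calendar-style scheduling instead of a fixed interval, Celery's `crontab` schedules work in the same `beat_schedule` dict. Here is a small sketch; the entry name and the 07:30 time are arbitrary examples:

```python
from celery import Celery
from celery.schedules import crontab

app = Celery("tasks", broker="redis://localhost:6379/0")

# Illustrative alternative: run tasks.add every day at 07:30 instead of every
# 30 seconds. The entry name and time are arbitrary examples.
app.conf.beat_schedule = {
    "add-every-morning": {
        "task": "tasks.add",
        "schedule": crontab(hour=7, minute=30),
        "args": (16, 16),
    },
}
```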
To run the beat, you can use the following command:

```bash
celery -A tasks beat --loglevel=info
```
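For local development you can also embed the beat inside a worker process with the worker's `-B`/`--beat` flag, so that a single process handles both roles. Celery discourages this for production, where a dedicated beat process is preferred:

```bash
# Development convenience only: one process acts as both worker and beat.
celery -A tasks worker -B --loglevel=info
```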
The keyword arguments of `logfire.instrument_celery()` are passed to the `CeleryInstrumentor().instrument()` method.
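As a sketch of that forwarding, the snippet below passes a `tracer_provider` keyword, which is one argument the OpenTelemetry Celery instrumentor accepts. In most setups you would omit it entirely and let Logfire supply its own provider; it is shown here only to illustrate how extra keyword arguments flow through.

```python
import logfire
from opentelemetry.sdk.trace import TracerProvider

logfire.configure(service_name="worker")

# Sketch only: keyword arguments given here are forwarded unchanged to
# CeleryInstrumentor().instrument(). Normally you omit tracer_provider and
# let Logfire configure the provider for you.
logfire.instrument_celery(tracer_provider=TracerProvider())
```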