I've set up djcelery with Redis as the broker. I can see the traffic in Redis:
redis-cli monitor
1454060863.881506 [0 [::1]:59091] "info"
1454060863.883295 [0 [::1]:59093] "multi"
1454060863.883314 [0 [::1]:59093] "llen" "celery"
1454060863.883319 [0 [::1]:59093] "llen" "celery\x06\x163"
1454060863.883323 [0 [::1]:59093] "llen" "celery\x06\x166"
1454060863.883326 [0 [::1]:59093] "llen" "celery\x06\x169"
1454060863.883331 [0 [::1]:59093] "exec"
1454060863.883704 [0 [::1]:59093] "sadd" "_kombu.binding.celery" "celery\x06\x16\x06\x16celery"
1454060863.884054 [0 [::1]:59093] "smembers" "_kombu.binding.celery"
1454060863.884421 [0 [::1]:59093] "lpush" "celery" "{\"body\": \"gaj9cqeovqdlehbpcmvzcqjovqn1dgnxa4hvbgfyz3nxbeseswsgcqvvbwnob3jkcqzovqljywxsymfja3nxb05vcgvycmjhy2tzcqhovqd0yxnrc2v0cqlovqjpzhekvsqwzgqwzmjmzc0zytq0ltqxmdmtotnioc01nmi4zmfjnje0mdjxc1uhcmv0cmllc3emswbvbhrhc2txdvupdxrpbhmudgfza3muywrkcq5vcxrpbwvsaw1pdheptk6gvqnldgfxee5vbmt3yxjnc3erfxesds4=\", \"headers\": {}, \"content-type\": \"application/x-python-serialize\", \"properties\": {\"body_encoding\": \"base64\", \"correlation_id\": \"0dd0fbfd-3a44-4103-93b8-56b8fac61402\", \"reply_to\": \"b9f02cee-e562-3e86-90aa-683d205d060c\", \"delivery_info\": {\"priority\": 0, \"routing_key\": \"celery\", \"exchange\": \"celery\"}, \"delivery_mode\": 2, \"delivery_tag\": \"b44aafc9-e6aa-4cd8-8a70-201d895cbd7f\"}, \"content-encoding\": \"binary\"}"
1454060872.513147 [0 [::1]:59204] "get" "celery-task-meta-0dd0fbfd-3a44-4103-93b8-56b8fac61402"
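The long "body" field in that LPUSH is the task payload, base64-encoded because the message's body_encoding is "base64". A minimal stdlib-only sketch of how such a wrapping round-trips (the payload dict here is illustrative, not the exact pickled body above, which uses the python-serialize content type):

```python
import base64
import json

# Illustrative payload; the real body above is a pickled task message,
# but the base64 wrapping works the same way.
payload = {"task": "utils.tasks.add", "args": [2, 3]}

raw = json.dumps(payload).encode("utf-8")
encoded = base64.b64encode(raw).decode("ascii")  # what lands in the "body" field
decoded = json.loads(base64.b64decode(encoded))

print(decoded["task"])  # utils.tasks.add
```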
but nothing shows up in the Celery worker's logs:
python manage.py celeryd -l debug
[2016-01-29 10:14:51,073: DEBUG/MainProcess] | Worker: Preparing bootsteps.
[2016-01-29 10:14:51,075: DEBUG/MainProcess] | Worker: Building graph...
[2016-01-29 10:14:51,075: DEBUG/MainProcess] | Worker: New boot order: {Timer, Hub, Queues (intra), Pool, Autoscaler, StateDB, Autoreloader, Beat, Consumer}
[2016-01-29 10:14:51,084: DEBUG/MainProcess] | Consumer: Preparing bootsteps.
[2016-01-29 10:14:51,085: DEBUG/MainProcess] | Consumer: Building graph...
[2016-01-29 10:14:51,097: DEBUG/MainProcess] | Consumer: New boot order: {Connection, Events, Heart, Agent, Mingle, Gossip, Tasks, Control, event loop}

 -------------- celery@kumars-MacBook-Pro-2.local v3.1.20 (Cipater)
---- **** -----
--- * ***  * -- Darwin-14.5.0-x86_64-i386-64bit
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         default:0x109177ed0 (djcelery.loaders.DjangoLoader)
- ** ---------- .> transport:   redis://localhost:6379/2
- ** ---------- .> results:     redis://localhost:6379/2
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ----
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery

[tasks]
  . celery.backend_cleanup
  . celery.chain
  . celery.chord
  . celery.chord_unlock
  . celery.chunks
  . celery.group
  . celery.map
  . celery.starmap
  . utils.tasks.add

[2016-01-29 10:14:51,113: DEBUG/MainProcess] | Worker: Starting Pool
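The banner shows the worker's transport and results backend both on Redis database 2, which is the first thing to compare against what the producer uses. A small stdlib sketch for pulling the database number out of a Redis URL, so the two sides can be checked programmatically:

```python
from urllib.parse import urlparse

# Both the worker's transport and result backend point at this URL.
url = urlparse("redis://localhost:6379/2")
db = int(url.path.lstrip("/"))  # the Redis database number

print(url.hostname, url.port, db)  # localhost 6379 2
```

If the producer's BROKER_URL parses to a different host, port, or db, the worker is simply listening on a different queue than the one being filled.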
python manage.py celerycam
-> evcam: Taking snapshots with djcelery.snapshot.Camera (every 1.0 secs.)
[2016-01-29 09:58:10,377: INFO/MainProcess] Connected to redis://localhost:6379/2
Here are my settings file and task:
settings.py
# Celery settings
import djcelery
djcelery.setup_loader()

BROKER_URL = 'redis://localhost:6379/2'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/2'

INSTALLED_APPS = (
    ...
    'djcelery',
)
tasks.py
from celery import task

@task()
def add(x, y):
    return x + y
When I run the task, its state stays 'PENDING'. Any idea what's going on?
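Worth noting: in Celery, PENDING also means "unknown task id". The Redis result backend only writes a result key once a worker has actually executed the task; the GET in the monitor output above is looking for exactly that key. A small sketch of how the key is formed:

```python
# Celery's Redis result backend stores outcomes under "celery-task-meta-<task_id>".
# Until a worker consumes and runs the task, that key never exists, so
# AsyncResult(task_id).state keeps reporting PENDING.
task_id = "0dd0fbfd-3a44-4103-93b8-56b8fac61402"  # the id from the monitor output
key = "celery-task-meta-" + task_id
print(key)
```

You can check for the key by hand with `redis-cli -n 2 get celery-task-meta-<task_id>`; if it is absent, the worker never picked the message up.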