python - Client keeps waiting for RabbitMQ response
I am using RabbitMQ to launch processes on remote hosts located in other parts of the world. For example, RabbitMQ runs on a host in Oregon and receives a client message to launch processes in Ireland and California.
Most of the time the processes are launched and, when they finish, RabbitMQ returns the output to the client. But sometimes the jobs finish and RabbitMQ has not returned the output, so the client keeps hanging, waiting for the response. These processes can take 10 minutes to execute, so the client is left hanging well past those 10 minutes waiting for a response.
I am using Celery to connect to RabbitMQ, and the client makes a blocking call using task.get(). In other words, the client hangs until it receives the response to the call. I don't understand why the client does not get a response when the jobs have finished. How can I debug this problem?
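For reference, the client-side calling pattern looks roughly like this; a minimal sketch, where launch_process is a hypothetical task name standing in for the real task defined in one of the medusa modules:

# client-side sketch of the blocking call described above; `launch_process`
# is a hypothetical task name standing in for the real task defined in one
# of the medusa modules.
from celery.exceptions import TimeoutError
from medusa.medusasystem import launch_process

result = launch_process.delay("some-command")   # publish the job to RabbitMQ

# output = result.get()        # current behaviour: blocks forever if the
#                              # result message never arrives

# A timeout plus a state check at least exposes the hang instead of
# blocking indefinitely:
try:
    output = result.get(timeout=900)            # give up after 15 minutes
    print(output)
except TimeoutError:
    print("no result after 15 min, task state:", result.state)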
Here is my celeryconfig.py:
import os
import sys

# add the hadoop python env to the path when running
sys.path.append(os.path.dirname(os.path.basename(__file__)))

# Broker configuration
# medusa-rabbitmq is the name of the host where RabbitMQ is running
BROKER_URL = "amqp://celeryuser:celery@medusa-rabbitmq/celeryvhost"

CELERY_RESULT_BACKEND = "amqp"

TEST_RUNNER = 'celery.contrib.test_runner.run_tests'

# Debug
# CELERY_ALWAYS_EAGER = True

# Modules to be loaded
CELERY_IMPORTS = ("medusa.mergedirs", "medusa.medusasystem", "medusa.utility",
                  "medusa.pingdaemon", "medusa.hdfs", "medusa.vote.voting")
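For completeness, this config is loaded the same way on the client and on the remote workers; a minimal sketch (the celeryapp module name is illustrative, not necessarily what I use):

# celeryapp.py -- a minimal sketch of how celeryconfig.py is picked up.
from celery import Celery

app = Celery('medusa')
app.config_from_object('celeryconfig')   # reads BROKER_URL, CELERY_RESULT_BACKEND, ...

# The workers on the remote hosts are started against the same RabbitMQ
# broker in Oregon, e.g.:
#   celery -A celeryapp worker --loglevel=info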