Using Amazon SQS with Django and Celery
I'm currently working on a new Django project which relies heavily on Celery. I normally use RabbitMQ for these kinds of projects but I decided to give Amazon SQS a try this time as it's very cheap and will simplify my setup.
While the Celery docs do a pretty good job of explaining how to set it up, there were a few things they didn't cover that I'm sure other devs will run into. Here's how I set it up.
1. Create an IAM User and give it access to your SQS region.
Make sure to create a User Policy that includes the 'CreateQueue' permission, as Celery will create the queues automatically. Here's a sample policy that grants full access to SQS in a single region.
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1234567890",
            "Effect": "Allow",
            "Action": [
                "sqs:*"
            ],
            "Resource": [
                "arn:aws:sqs:us-east-1:1234567890:*"
            ]
        }
    ]
}
```
2. Configure the Celery settings in your Django settings file.
```python
# settings/stage.py
import os
import urllib

AWS_ACCESS_KEY_ID = os.getenv('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = os.getenv('AWS_SECRET_ACCESS_KEY')

BROKER_URL = 'sqs://{0}:{1}@'.format(
    urllib.quote(AWS_ACCESS_KEY_ID, safe=''),
    urllib.quote(AWS_SECRET_ACCESS_KEY, safe=''),
)

BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-east-1',
    'polling_interval': 3,
    'visibility_timeout': 3600,
}
BROKER_TRANSPORT_OPTIONS['queue_name_prefix'] = 'repricer-stage-'

CELERY_SEND_TASK_ERROR_EMAILS = True
```
The Celery docs explain all of these settings. The main thing to note here is BROKER_URL: the urllib.quote() calls ensure that forward slashes in the secret key are percent-escaped, since a raw slash would break the URL (more info here).
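To see why the escaping matters, here's a quick sketch of what quote() does to a key containing a slash. The secret value below is made up for illustration; note that on Python 3 the function lives at urllib.parse.quote rather than urllib.quote.

```python
from urllib.parse import quote  # Python 2: from urllib import quote

# A hypothetical secret key containing characters that would break a URL
secret = 'abc/def+ghi'

# safe='' means no characters are exempt from escaping ('/' is exempt by default)
escaped = quote(secret, safe='')
print(escaped)  # abc%2Fdef%2Bghi

# The escaped value can now be embedded in the broker URL safely
broker_url = 'sqs://{0}:{1}@'.format(quote('AKIAEXAMPLE', safe=''), escaped)
print(broker_url)  # sqs://AKIAEXAMPLE:abc%2Fdef%2Bghi@
```

Without safe='', quote() would leave the '/' untouched and Celery would parse everything after it as part of the URL path.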
Now when you start Celery, you should see your queues get created automatically on Amazon SQS.
I've been using this setup for a couple of weeks now and it's working quite well. I'm still undecided, though, whether to keep it or switch back to RabbitMQ, as one of the main features missing from the SQS transport is support for events. This means you can't use monitoring tools like Flower to see the current status of your tasks and workers or control them remotely.