python - How to set up an AWS Elastic Beanstalk Worker Tier Environment + Cron + Django for periodic tasks? 403 Forbidden error


The app needs to run a periodic task in the background to delete expired files. The app is running both on a web server environment and on a worker tier environment.

There is a cron.yaml file at the root of the app:

version: 1
cron:
  - name: "delete_expired_files"
    url: "/networks_app/delete_expired_files"
    schedule: "*/10 * * * *"
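For reference, the schedule field is standard five-field cron syntax, so `*/10 * * * *` fires at every minute divisible by 10. A minimal standalone sketch of that matching rule (illustration only, not part of the app):

```python
def matches_every_10_minutes(minute):
    """Return True when a '*/10' minute field would fire (minutes 0, 10, 20, ...)."""
    return minute % 10 == 0

# The worker daemon would POST to the cron URL at these minutes of each hour:
fire_minutes = [m for m in range(60) if matches_every_10_minutes(m)]
print(fire_minutes)  # [0, 10, 20, 30, 40, 50]
```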

The cron URL points to this app view:

from datetime import timedelta

from django.http import HttpResponse
from django.utils import timezone

# DemoUser and Document are the app's models (their import path is omitted here)

def delete_expired_files(request):
    users = DemoUser.objects.all()
    for user in users:
        documents = Document.objects.filter(owner=user.id)
        for doc in documents:
            now = timezone.now()
            if now >= doc.date_published + timedelta(days=doc.owner.group.valid_time):
                doc.delete()
    return HttpResponse()  # the worker daemon expects a 200 response
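The expiry test inside the loop can be isolated as a pure function, which makes the cutoff arithmetic easy to verify on its own. A standalone sketch, assuming `valid_time` is a number of days as the view above suggests:

```python
from datetime import datetime, timedelta

def is_expired(date_published, valid_time_days, now):
    """A document expires `valid_time_days` days after publication."""
    return now >= date_published + timedelta(days=valid_time_days)

# Example: published Jan 1 with a 7-day validity window, checked on Jan 11 -> expired.
published = datetime(2017, 1, 1)
print(is_expired(published, 7, now=datetime(2017, 1, 11)))  # True
```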

The Django ALLOWED_HOSTS setting is as follows:

ALLOWED_HOSTS = ['127.0.0.1', 'localhost', 'networksapp.elasticbeanstalk.com']
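Worth noting: the worker daemon (aws-sqsd) delivers the queued cron messages as local HTTP POSTs, so the Host header it sends must pass this list. A rough sketch of the membership check (simplified; Django's real validation also handles wildcards and ports):

```python
ALLOWED_HOSTS = ['127.0.0.1', 'localhost', 'networksapp.elasticbeanstalk.com']

def host_allowed(host, allowed=ALLOWED_HOSTS):
    """Simplified version of Django's host validation: exact match only."""
    return host in allowed

print(host_allowed('localhost'))  # True
print(host_allowed('10.0.0.5'))   # False: an unlisted internal IP would be rejected
```

(A disallowed host would normally surface as a 400, though, while a POST that fails Django's CSRF check produces a 403 like the one in the log below.)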

The task is being scheduled and the daemon is sending requests to the right URL, but the messages end up in the WorkerDeadLetterQueue.

The worker tier environment log file shows a 403 error:

"post /networks_app/delete_expired_files http/1.1" 403 1374 "-" "aws-sqsd/2.0"

The task is not being executed (expired files aren't being deleted). However, when I access the URL manually, it executes the task properly.

I need to make this work automatically and periodically.

My IAM user has the following policies:

AmazonSQSFullAccess

AmazonS3FullAccess

AmazonDynamoDBFullAccess

AdministratorAccess

AWSElasticBeanstalkFullAccess

Why isn't the task being executed? Is it an IAM permission issue? Is there a missing configuration? How can I make it work? Thanks in advance.

