[Scrapinghub] Feed parameters to your crawler via AWS SQS
Problem:
If you change a parameter in your code, you need to deploy again. That is very annoying!
Solution:
We can feed parameters to our crawler through SQS.
Tool: Python 3
Library: boto3
Before you use boto3, you have to get credentials for your AWS account; these can be found in the IAM Console.
Step by Step:
1. Set up your SQS queue
2. Send your parameters to SQS
send_message_to_sqs('gt3',['apple','banana'])
Well done! Now go back to your SQS console, select your queue, choose “View/Delete Messages” from Queue Actions, and you will see your messages.
3. Create an SQS function in your crawler to receive the parameters
receive_message_from_sqs('gt3')
After running this function, you get the parameters you just sent from SQS.
Now you can use them easily in your crawler.
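For example, the received list can drive which pages the crawler visits (the URL pattern and helper name here are made up for illustration):

```python
def build_start_urls(parameters, base="https://example.com/search?q={}"):
    # Turn each received parameter (e.g. 'apple') into a start URL
    # for the crawler.
    return [base.format(p) for p in parameters]
```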
Give it a Go
You can see the full code on GitHub:
https://github.com/Kiollpt/SQS_scrapying