Recently, because of an integration with a third-party provider, we needed to set up an FTP/SFTP endpoint so they could drop a file with specific information that we needed to ingest into our system.
We decided to use DocEvent, which let us expose those endpoints without running a proper FTP server: the files are placed directly into an AWS S3 bucket (much better, indeed), or into whatever cloud provider you use (GCP, Azure, etc.).
Now, each time a file was placed into the S3 bucket, we needed to invoke our REST API, which would then ingest that file into the system. To do so, we used an AWS Lambda with an ObjectCreated trigger configured on the S3 bucket, which executes the following Python code (runtime: Python 3.6):
#####################################
# Author: @facundofarias #
#####################################

import json
import boto3
import urllib.parse
import urllib.request

print('Loading function')

s3 = boto3.client('s3')


def lambda_handler(event, context):
    print("Received event: " + json.dumps(event, indent=2))
    # Extract the bucket name and object key from the S3 event
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    url = 'YOUR_URL'
    values = {'Key': key, 'Bucket': bucket}
    try:
        # Set the data to send
        data = json.dumps(values)
        data = data.encode('utf-8')
        # Set the headers
        headers = {}
        headers['Content-Type'] = 'application/json'
        # Send the request
        req = urllib.request.Request(url, data, headers)
        resp = urllib.request.urlopen(req)
        # Receive the response
        respData = resp.read()
        print(respData)
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}.'.format(key, bucket))
        raise e
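To see what the handler actually receives, here is a minimal sketch of an ObjectCreated event payload and the bucket/key extraction, runnable locally without AWS. The bucket and key names are placeholders, and the event is trimmed to just the fields the handler reads; note that S3 URL-encodes keys in event payloads, which is why `unquote_plus` is needed.

```python
import json
import urllib.parse

# A trimmed ObjectCreated event, shaped like the payloads S3 delivers to Lambda.
# Bucket and key names here are made up for illustration.
sample_event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "my-docevent-bucket"},
                "object": {"key": "uploads/report+2020.csv"},
            }
        }
    ]
}

bucket = sample_event["Records"][0]["s3"]["bucket"]["name"]
# S3 URL-encodes object keys in events, so a space in the filename arrives as "+"
key = urllib.parse.unquote_plus(
    sample_event["Records"][0]["s3"]["object"]["key"], encoding="utf-8"
)
print(bucket)  # my-docevent-bucket
print(key)     # uploads/report 2020.csv
```

Pasting an event like this into the Lambda console's test feature is an easy way to exercise the function before wiring up the real trigger.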
This way, every time a file is placed in the S3 bucket, our backend receives the webhook and processes the file. We have had this setup for almost a year, and it works like a charm :)
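For completeness, the receiving side can be sketched with nothing but the standard library. This is not our actual backend (which is a full REST API), just a hypothetical minimal endpoint that accepts the same `{"Key": ..., "Bucket": ...}` JSON body the Lambda POSTs, so you can test the round trip end to end:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

received = []  # payloads collected by the webhook endpoint

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read and parse the JSON body the Lambda sends
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        received.append(payload)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, fmt, *args):
        pass  # silence per-request logging

# Bind to an ephemeral port and serve in the background
server = HTTPServer(("127.0.0.1", 0), WebhookHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Simulate the Lambda's POST against the local endpoint
url = "http://127.0.0.1:%d/" % server.server_port
data = json.dumps({"Key": "uploads/report.csv", "Bucket": "my-bucket"}).encode("utf-8")
req = urllib.request.Request(url, data, {"Content-Type": "application/json"})
resp = urllib.request.urlopen(req)
print(resp.read())  # b'ok'
server.shutdown()
```

In production the endpoint would look up the object in S3 using the received bucket and key, but the wire format is exactly this simple.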
