AWS Transfer for SFTP (Secure Shell File Transfer Protocol)

Sumit · Published in Tensult Blogs · Apr 16, 2019
Ref: https://bit.ly/2XaixvA

AWS Transfer for SFTP is a fully managed service from AWS that lets you transfer files into and out of Amazon S3. SFTP runs over SSH, the standard protocol for secure remote access and file transfer, which is why it is still widely used in financial services, healthcare, retail, and advertising to exchange data with business partners. The main advantage of a managed SFTP service is that it takes the operational burden off the entity running it: there are no SFTP servers to manage, no availability and uptime to worry about, and no patching to keep up with. Now that does sound like a useful service, doesn’t it? Of course it does. Let’s dive into the AWS SFTP setup and see how it works with Linux and Windows EC2 instances.

Configuring AWS SFTP for a Linux EC2 instance.

Prerequisites for this experiment:

  1. An S3 bucket. (Eg: sftp-testbucket2019)
  2. An SFTP client; in this case, it is going to be a Linux instance.
  3. IAM role for SFTP users with permissions and an updated trust relationship.

Step 1: Create an S3 bucket.

Creation of S3 bucket.

Click on ‘Next’ and create an S3 bucket with the default settings, as this is only for testing purposes. However, for security reasons, do not grant public write access to the bucket.
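If you prefer the command line, the bucket can also be created with the AWS CLI. This is just a sketch using the bucket name from this post; use your own globally unique name and region:

aws s3 mb s3://sftp-testbucket2019 --region us-east-1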

Step 2: Create 2 EC2 instances with internet access for this testing.

Spin up two EC2 instances as well: one Linux machine and one Windows machine.

Two EC2 instances with public IP attached.
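If you would rather launch the instances from the CLI, a minimal sketch looks like the one below (run it once with a Linux AMI and once with a Windows AMI). The AMI ID, key pair, and subnet here are placeholders to replace with values from your own account:

aws ec2 run-instances --image-id ami-0abcdef1234567890 --instance-type t2.micro --key-name my-keypair --subnet-id subnet-0123456789abcdef0 --associate-public-ip-address --count 1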

Step 3: Create an IAM role for SFTP users, with an updated trust relationship

1. Select EC2 in the Create Role option of IAM.
2. Select the AmazonS3FullAccess policy and click on Next.
3. Add a relevant tag and then click on Next.
4. Our role has been created.

Updating Trust Relationship

Once the IAM role has been created (here, sftp_role), go to the tab called Trust Relationships. As you can see in the above screenshot, under trusted entities we have ec2.amazonaws.com. We need to change that to transfer.amazonaws.com so that the SFTP service can assume this role. Click on ‘Edit trust relationship’ and replace the policy document with the one given below:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "transfer.amazonaws.com"
      },
      "Action": "sts:AssumeRole",
      "Condition": {}
    }
  ]
}

Update the trust policy with the document above and click on the Update Trust Policy button. Now we are done with the prerequisites for this setup. Let’s go ahead and create the SFTP server itself.
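As a side note, the whole role can also be created from the AWS CLI with the correct trust relationship from the start. This is only a sketch and assumes the trust policy above has been saved locally as trust-policy.json:

aws iam create-role --role-name sftp_role --assume-role-policy-document file://trust-policy.json
aws iam attach-role-policy --role-name sftp_role --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess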

Step 4: Create SFTP Server

From the ‘AWS Transfer for SFTP’ service, click on Create Server.

Leave all options at their default values (Endpoint type, Identity provider and Logging role) and click on Create Server.

Successful creation of sftp server

Once you have created an SFTP server, you will see a page just like the screenshot above. It usually takes a couple of minutes for the server to come online; in the meantime, you can check the server’s configuration by clicking on the server ID.
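For reference, a server with the same defaults could also be created from the CLI; this is just a sketch of the equivalent call with a service-managed identity provider and a public endpoint:

aws transfer create-server --identity-provider-type SERVICE_MANAGED --endpoint-type PUBLIC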

As you will notice on the server’s configuration page, there are no users created yet, so click on the Add User button to create the users who will access this SFTP service. Clicking on Add User takes you to the User Configuration page shown below:

User Configuration for sftp service-image2

Here we give a relevant username of our choice, say sftp_user, and for the role we select the one we created earlier, sftp_role. Under Home directory, we select the S3 bucket we created in Step 1. Next comes the slightly trickier part: the SSH public key. This public key is assigned to the user, and the easiest way to generate a key pair is on a Linux machine; you can also use the PuTTYgen application for the same purpose.

Use the command below on a Linux instance to create the key pair. Then run cat sftp-test-key.pub to display the public key; it is a single line, so copy it as-is and paste it into the SSH public key field on the user configuration page we are working on.

ssh-keygen -P "" -f "sftp-test-key"

Once the key pair has been created, copy the contents of sftp-test-key.pub and paste them into the field provided on the user configuration page.
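Alternatively, the user can be created from the CLI. The sketch below assumes the server ID from your own console (the s-xxxxxxxx value), a placeholder account ID in the role ARN, and the bucket, role and key names used in this post:

aws transfer create-user --server-id s-xxxxxxxxxxxxxxxxx --user-name sftp_user --role arn:aws:iam::123456789012:role/sftp_role --home-directory /sftp-testbucket2019 --ssh-public-key-body "$(cat sftp-test-key.pub)"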

Back in the console, click on Add and you will see a page just like the screenshot below:

Notice that our sftp server status has also changed from Starting to Online, which means we are ready to test out the connection as well.

Log in to the Linux instance and use the command below. Note that you need read access to the private key file, so depending on where and how you generated it, you may have to run this as root.

sftp -i sftp-test-key sftp_user@YOUR-SFTP-END-POINT

The value after @ in the above command is the SFTP endpoint, which you can find on the server configuration page shown earlier. If everything goes well, you will be able to log in to the SFTP server from the CLI and will see output like the below:

sftp login successful

You can test the connection by creating a directory from the sftp prompt and checking that it shows up in the AWS S3 console.

created testfolder dir
testfolder dir created in s3 as well.
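In command form, the test from the sftp prompt and the follow-up check with the AWS CLI on your local machine look roughly like this (bucket and directory names as used in this post):

sftp> mkdir testfolder
sftp> ls
testfolder

aws s3 ls s3://sftp-testbucket2019/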

I can guess your next question: how do we move a file from the local instance to the SFTP server and ultimately into the S3 bucket? For that, we use the put command.

put command
testfile has been uploaded to S3

As we can see in the above screenshot, the put command uploads a file from the local system to the SFTP server, and the object appears in the S3 bucket right away.
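Roughly, that transfer looks like this from the sftp prompt, and the uploaded object can then be verified with the AWS CLI (testfile is just the example name from this post):

sftp> put testfile
sftp> exit

aws s3 ls s3://sftp-testbucket2019/testfile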

Now that we have seen how to use the SFTP service from a Linux instance, let me show you how to do the same from Windows. On a Windows instance, we can use either the FileZilla or the WinSCP application. I will be using FileZilla for the purpose of this blog.

Step 1: Log in to the Windows instance and install FileZilla on it.

FileZilla application

Open FileZilla and you will see the console above. Make sure you have the .ppk file on this machine as well, because we will need it to configure the connection in FileZilla; you can produce it by converting the private key we generated earlier with PuTTYgen.
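If you have the puttygen command-line tool (it comes with PuTTY, or the putty-tools package on Linux), the conversion is a one-liner; on Windows, the PuTTYgen GUI does the same thing via Load and then Save private key:

puttygen sftp-test-key -O private -o sftp-test-key.ppk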

Step 2: Configure the authentication method in FileZilla and connect

Configure authentication

Go to Edit > Settings > SFTP and then add the .ppk file we just prepared.

Step 3: Input login values and click Quickconnect

Input login values

In the Host field, enter the endpoint value of the SFTP server from the AWS console; for the username, use the user we created earlier, here sftp_user. SFTP uses port 22, so enter 22 in the Port field. Click on Quickconnect right next to it and see the magic happen.

Successfully connected to sftp

Step 4: Drag and drop a file from Local site to Remote site and check the connectivity.

Copied file to sftp server

I copied the FileZilla installation file from the local machine to the remote site, and the upload completed quickly.

Successful upload of file using FileZilla

As you can see above, we have successfully uploaded a file to the S3 bucket using AWS Transfer for SFTP.

Conclusion:

Based on my experience with this testing, I would say it is a secure and fast way of syncing data with S3. There is no need to configure separate servers for SFTP, and that is a huge advantage over traditional methods. The service is fully compatible with the SFTP standard, can hook into Active Directory, LDAP and other identity systems, and also works with Route 53 for DNS routing. One more thing to note is the pricing for this service: it is $0.30 per hour from the time the SFTP server is provisioned, plus $0.04 per GB transferred (upload and download). Pricing can vary depending on the region you are working in, so please check the AWS Transfer for SFTP pricing page for detailed information.
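As a rough, illustrative estimate at those list prices: an endpoint left provisioned for a 30-day month runs about 720 hours x $0.30 ≈ $216, and moving 100 GB through it adds 100 x $0.04 = $4, so remember to delete test servers once you are done experimenting.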
