Access Amazon S3 bucket as a local drive on Windows

Francesco Larghi
4 min read · Mar 20, 2024


In today’s data-driven landscape, accessing cloud storage with the same ease as local files isn’t just convenient: it’s essential. Amazon S3 stands out as a premier option for durable, scalable, and cheap cloud storage, yet integrating it smoothly into everyday workflows can be a hurdle, particularly for Windows users. Overcoming this challenge can greatly boost efficiency and streamline data handling, especially for people without experience with, or direct access to, S3 through the Management Console, AWS CLI, SDKs, or other dedicated tools.

This article cuts straight to the chase, guiding you through the process of mounting an Amazon S3 bucket as a local drive on Windows. Such a setup not only brings the robust features of S3 storage to your fingertips but also aligns it with the familiar and intuitive file management system of Windows, without introducing new tools. Whether for personal use, small businesses, or larger enterprises, the ability to seamlessly integrate S3 buckets with your local environment is a powerful tool in optimizing your data access and management strategies.

Let’s dive into how you can transform the approach to cloud storage, making your S3 buckets as accessible and straightforward to navigate as any local drive.

Prerequisites

  • An Amazon S3 bucket (see the AWS documentation if you need to create one). For the rest of the article we assume it is named my-s3-bucket and was created in the region eu-central-1.
  • An AWS IAM user (with its Access Key ID and Secret Access Key) that has permission to access my-s3-bucket (see the AWS documentation for more on that). For quick reference, if simple full access is acceptable (it could also be read-only access, or read-write without delete), in the AWS Console go to IAM -> Create User (Next, Next, Create user), create an access key (save the keys for later), and add a permission policy like the following (substitute my-s3-bucket):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "S3Access",
      "Effect": "Allow",
      "Action": [
        "s3:*"
      ],
      "Resource": [
        "arn:aws:s3:::my-s3-bucket",
        "arn:aws:s3:::my-s3-bucket/*"
      ]
    }
  ]
}
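
For reference, the same setup can be done from the command line instead of the console. The sketch below assumes a Unix-like shell (e.g. Git Bash or WSL) and a hypothetical user name s3-drive-user; the aws commands require administrator credentials and are shown commented out:

```shell
# Write the policy shown above to a file (bucket name is a placeholder):
cat > s3-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "S3Access",
      "Effect": "Allow",
      "Action": ["s3:*"],
      "Resource": [
        "arn:aws:s3:::my-s3-bucket",
        "arn:aws:s3:::my-s3-bucket/*"
      ]
    }
  ]
}
EOF

# Sanity-check the JSON (catches errors such as a stray trailing comma):
python3 -m json.tool s3-policy.json > /dev/null && echo "policy OK"

# Then create the user, its access key, and the inline policy
# (requires admin AWS credentials; user name is a placeholder):
# aws iam create-user --user-name s3-drive-user
# aws iam create-access-key --user-name s3-drive-user   # save the printed keys
# aws iam put-user-policy --user-name s3-drive-user \
#     --policy-name S3Access --policy-document file://s3-policy.json
```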

Step 1: Download Rclone

Rclone is a free command-line program to manage files on cloud storage. It supports not only AWS but virtually any other cloud storage provider, and it is available for most operating systems. You can find and download the latest version of the tool from the official download page.
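
If you prefer a package manager to the manual download, rclone can also be installed from an elevated shell; the winget package ID below is an assumption based on current listings:

```shell
# Install rclone via a Windows package manager (elevated shell):
winget install Rclone.Rclone
# or, with Chocolatey:
choco install rclone
```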

Step 2: Configure Rclone on Windows

Extract the downloaded zip into a folder, say C:\rclone.

Open a command shell in this folder and type:

.\rclone.exe config

Note: you may need to install WinFsp first, if it is not already present on your Windows machine; rclone depends on it to mount drives on Windows.

This will run the complete guided configuration via the CLI. It’s pretty straightforward, but for reference a typical input sequence is the following (the exact menu numbers can vary between rclone versions):

  • my-remote (substitute this with the name you want to give this remote connection; we will assume this name from now on)
  • 4 (S3 storage type)
  • 1 (Amazon S3)
  • 1 (to pass IAM credentials directly in the next step)
  • Access Key ID (substitute with the one created)
  • Secret Access Key (substitute with the one created)
  • 11 (substitute with the number corresponding to your bucket region, eu-central-1 -> 11)
  • [ENTER] (leave blank endpoint step if using default S3 one)
  • 11 (substitute with the number corresponding to the group of regions you are using, EU regions -> 11)
  • [ENTER] (leave blank acl step if using default S3 one)
  • [ENTER] (leave blank server_side_encryption step if using default S3 one)
  • [ENTER] (leave blank sse_kms_key_id step if using default S3 one)
  • [ENTER] (leave blank storage_class step if using default S3 one)
  • [ENTER] (leave blank to avoid setting up advanced config)
  • y (confirm the creation of my-remote)
  • q (quit config)
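
As an alternative to the wizard, the same remote can be created non-interactively with rclone config create. This is a sketch with placeholder keys, assuming the same remote name and region as above:

```shell
# One-shot, non-interactive equivalent of the guided configuration
# (run in the C:\rclone folder; replace the key placeholders):
.\rclone.exe config create my-remote s3 ^
    provider AWS ^
    env_auth false ^
    access_key_id AKIA_PLACEHOLDER ^
    secret_access_key SECRET_PLACEHOLDER ^
    region eu-central-1

# Verify the stored configuration:
.\rclone.exe config show my-remote
```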

Step 3: Test mount

In the command shell, still in the C:\rclone folder, type the following command to mount your my-s3-bucket (through the my-remote connection) on the drive letter S: (of course, substitute the drive letter you want). The --vfs-cache-mode full flag tells rclone to cache file data locally, so that ordinary applications can open and write files on the drive normally:

.\rclone.exe mount my-remote:my-s3-bucket/ S: --vfs-cache-mode full

It should reply with: The service rclone has been started. The process stays in the foreground in that shell; without closing it, open File Explorer, navigate to the new drive, and check that it contains the data from the S3 bucket. You can now try copying files or creating folders on the new drive (S: in our case) to verify that changes are replicated to the S3 bucket, and vice versa.
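
You can also verify the round trip from a second shell (the mount keeps the first one busy), using rclone’s own listing command:

```shell
# Run in a second shell while the mount is active.
# List what is currently in the bucket:
.\rclone.exe ls my-remote:my-s3-bucket

# Create a file through the mounted drive...
echo hello > S:\mount-test.txt

# ...and list again: mount-test.txt should appear once the VFS cache
# has finished uploading it (this can take a few seconds):
.\rclone.exe ls my-remote:my-s3-bucket
```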

Step 4: Set up a permanent mount

Now that we have checked that everything works as expected, we can set this up to run automatically in the background each time the Windows PC or server starts.

First, let’s create a simple .cmd file containing our command with absolute paths; I will place it in C:\rclone\rclone-S3.cmd :

C:\rclone\rclone.exe mount my-remote:my-s3-bucket/ S: --vfs-cache-mode full

Now, to launch this automatically in the background, we can create a simple .vbs script that runs at startup. Let’s place the following script in C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Startup\rclone.vbs :

Set WshShell = CreateObject("WScript.Shell")
WshShell.Run Chr(34) & "C:\rclone\rclone-S3.cmd" & Chr(34), 0
Set WshShell = Nothing

This will run the mount in the background at each startup, without opening a command shell window.
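
Note that scripts in this Startup folder only run when a user logs on. If the mount should be available before any logon (e.g. on a server), a scheduled task is a possible alternative; the task name below is an arbitrary placeholder:

```shell
# Run the mount script at system startup, independent of user logon
# (elevated shell required):
schtasks /Create /SC ONSTART /TN "RcloneMount" ^
    /TR "C:\rclone\rclone-S3.cmd" /RU SYSTEM
```

Be aware that a drive letter mounted by the SYSTEM account may not be visible in regular user sessions; check the rclone mount documentation for Windows-specific options if you go this route.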

Conclusion

Now you have configured a local drive file system automatically synchronized with a remote S3 bucket, making it as accessible and straightforward to navigate as any other local drive on Windows!

Note: the same approach can be used to synchronize object storage from any other provider, on any other operating system; this is just a very common combination.
