Using S3 LocalStack with Spring Boot and R2DBC

Pavel Klindziuk
Dandelion Tutorials
Aug 13, 2021

What is LocalStack?
LocalStack is a fully functional local cloud stack. It provides an easy-to-use test/mocking framework for developing cloud applications: it spins up a testing environment on your local machine that provides the same functionality and APIs as the real AWS cloud environment.
Yes, that’s true — you can run your Lambda functions, store data to DynamoDB tables, feed events through Kinesis streams, put your application behind an API Gateway, and much more. And all this happens on your local machine, without ever talking to the cloud.

What is Amazon S3?
Amazon Simple Storage Service (Amazon S3) is storage for the Internet. It is designed to make web-scale computing easier.
Amazon S3 has a simple web services interface that you can use to store and retrieve any amount of data, at any time, from anywhere on the web. It gives any developer access to the same highly scalable, reliable, fast, and inexpensive data storage infrastructure that Amazon uses to run its own global network of web sites. The service aims to maximize benefits of scale and to pass those benefits on to developers.

What is Spring WebFlux?
Spring WebFlux is a web framework built on top of Project Reactor that gives you asynchronous, non-blocking I/O and allows your application to perform better.

What is R2DBC?
R2DBC stands for Reactive Relational Database Connectivity. It is an API specification initiative that declares a reactive API to be implemented by driver vendors to access their relational databases.

Demo Application

To get better acquainted with these technologies, we will develop a Spring Boot WebFlux application that lets us upload files to and download files from Amazon S3, save file metadata to the database, and retrieve that information.

Prerequisites

  • Install Docker
  • Install the AWS CLI for manual interactions with LocalStack

LocalStack and Postgres Setup

  • Create an init-s3-bucket.sh script to set up the Amazon S3 service
  • Put the init-s3-bucket.sh script into the ./script/localstack directory, or define any other path if you wish.
  • Create an init.sql file to create the database table.
  • Put the init.sql file into the ./script/sql directory, or define any other path if you wish.
  • Create docker-compose-s3.yml file with LocalStack and Postgres services
  • Mount volumes with the init scripts for LocalStack and Postgres:
    - For the LocalStack service:
      volumes:
        - ./script/localstack/s3/init-s3-bucket.sh:/docker-entrypoint-initaws.d/init-s3-bucket.sh
    - For the Postgres service:
      volumes:
        - ./script/sql:/docker-entrypoint-initdb.d/
  • Start the services:
    docker-compose -f docker-compose-s3.yml up
  • Verify that the Amazon S3 bucket was created:
    aws --endpoint-url=http://localhost:4566 --region=us-east-1 s3 ls
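As a sketch, the init-s3-bucket.sh script might look like the following. The bucket name dandelion-s3-bucket is taken from the responses shown later in this article; any name works as long as it matches your application configuration. LocalStack runs scripts mounted into /docker-entrypoint-initaws.d/ at startup, where its awslocal CLI wrapper is available:

```shell
#!/bin/bash
# Create the S3 bucket inside LocalStack at container startup.
# `awslocal` is the LocalStack-provided wrapper around the AWS CLI
# that targets the local endpoint automatically.
awslocal s3 mb s3://dandelion-s3-bucket
```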
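The init.sql file could create a metadata table along these lines. The table and column names here are assumptions inferred from the JSON responses shown later (id, fileName, fileUrl, uploadSuccessFull), so adjust them to match your entity:

```sql
-- Table for uploaded-file metadata (table and column names are illustrative).
CREATE TABLE IF NOT EXISTS file_info (
    id                  SERIAL PRIMARY KEY,
    file_name           VARCHAR(255) NOT NULL,
    file_url            VARCHAR(512),
    upload_success_full BOOLEAN NOT NULL DEFAULT FALSE
);
```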
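Putting the steps above together, a minimal docker-compose-s3.yml might look like this. The volume paths come from the mount step above; the image tags, ports, and Postgres credentials are assumptions, so replace them with whatever your application expects:

```yaml
version: '3.8'
services:
  localstack:
    image: localstack/localstack
    ports:
      - "4566:4566"            # LocalStack edge port used by the AWS CLI and the app
    environment:
      - SERVICES=s3            # only the S3 service is needed for this demo
    volumes:
      - ./script/localstack/s3/init-s3-bucket.sh:/docker-entrypoint-initaws.d/init-s3-bucket.sh

  postgres:
    image: postgres:13
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=files
    volumes:
      - ./script/sql:/docker-entrypoint-initdb.d/   # runs init.sql on first start
```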

Spring Boot WebFlux setup

  • Initialize a Spring Boot WebFlux project via Spring Initializr
  • Add dependencies to build.gradle
  • Create an application.yml or update the application.properties file. If you want to use the real Amazon S3 and Postgres services, just update the file with the appropriate values.
  • Configure an Amazon S3 bean to allow the application to talk to the Amazon S3 service provided by LocalStack
  • Start the service and send a GET request to the health path to verify that the service is running:
    curl -X GET http://localhost:8080/health
    {"status": "Healthy!"}
  • Send a POST request to the s3/upload path to upload a file:
    curl -F 'files=@/path-to-file/file.png' http://localhost:8080/s3/upload
    [{"id":1,"fileName":"file.png","fileUrl":"http://127.0.0.1:4566/dandelion-s3-bucket/file.png","uploadSuccessFull":true}]
  • Send a GET request to the s3/view-all path to verify that the file was uploaded:
    curl -X GET http://localhost:8080/s3/view-all
    [{"bucketName":"dandelion-s3-bucket","key":"file.png","size":0,"lastModified":"2021-08-13T14:41:51.000+00:00","storageClass":"STANDARD","owner":{"displayName":"webfile","id":"75aa57f09aa0c8caeab4f8c24e99d10f8e7faeebf76c078efc7c6caea54ba06a"},"etag":"d41d8cd98f00b204e9800998ecf8427e"}]
  • Send a GET request to the s3/view-all-db path to verify that the file info is present in the database:
    curl -X GET http://localhost:8080/s3/view-all-db
    [{"id":1,"fileName":"file.png","fileUrl":"http://127.0.0.1:4566/dandelion-s3-bucket/file.png","uploadSuccessFull":true}]
  • Voilà!
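For the build.gradle step above, the dependency block might look roughly like this (coordinates commonly used with Spring Boot 2.x at the time of writing; versions and the exact driver artifact may differ in your setup):

```groovy
dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-webflux'
    implementation 'org.springframework.boot:spring-boot-starter-data-r2dbc'
    runtimeOnly    'io.r2dbc:r2dbc-postgresql'        // reactive Postgres driver
    implementation 'com.amazonaws:aws-java-sdk-s3'    // AWS SDK v1 S3 client
}
```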
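A sketch of the corresponding application.yml: the spring.r2dbc.* keys are standard Spring Boot properties, while the aws.s3.* keys are hypothetical custom properties that the application's S3 configuration would read. The credentials and database name must match your docker-compose file, and the bucket name comes from the responses above:

```yaml
spring:
  r2dbc:
    url: r2dbc:postgresql://localhost:5432/files
    username: postgres
    password: postgres

# Custom properties (illustrative) consumed by the application's S3 configuration
aws:
  s3:
    endpoint: http://localhost:4566
    region: us-east-1
    bucket: dandelion-s3-bucket
```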
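The Amazon S3 bean from the configuration step could be sketched as follows, assuming AWS SDK v1 and hard-coded values for brevity. The endpoint override points the client at LocalStack instead of the real AWS, LocalStack accepts dummy static credentials, and path-style access avoids bucket-name DNS resolution against localhost:

```java
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AmazonS3Config {

    @Bean
    public AmazonS3 amazonS3() {
        return AmazonS3ClientBuilder.standard()
                // Point the SDK at LocalStack's edge port instead of the real AWS endpoint
                .withEndpointConfiguration(
                        new AwsClientBuilder.EndpointConfiguration("http://localhost:4566", "us-east-1"))
                // LocalStack accepts any static credentials
                .withCredentials(new AWSStaticCredentialsProvider(new BasicAWSCredentials("test", "test")))
                // Use http://localhost:4566/<bucket>/<key>-style URLs
                .withPathStyleAccessEnabled(true)
                .build();
    }
}
```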

Source code can be found on GitHub.
