Upload file to GCS, access with URL which expires using Spring Cloud

Anoop Hallimala
Google Cloud - Community
Nov 19, 2020 · 5 min read

If you want to upload a file to Google Cloud Storage (GCS), whether from inside or outside GCP, and access the uploaded file through a URL that expires, this is the post for you.

We will be using Spring Boot's reactive web library, WebFlux, together with the Spring Cloud GCP libraries to access Google Cloud Storage.

We will create a Bucket and a private key for its Service Account, then use this key to access the Bucket, even from outside GCP.

Implementation

Part 1: Create the bucket, download the key from the Service Account

Log in to your GCP Console, open Cloud Shell, and create a Bucket:

gsutil mb gs://some-bucket

Once the Bucket is created, navigate to the Service Accounts section.

Look for the service account associated with the newly created Bucket.

Add a new key:

The key will be downloaded automatically in the browser.

The content of the JSON key file will look something like this:

{
  "type": "service_account",
  "project_id": "someaccount-gcp-01-7f11684e7047",
  "private_key_id": "...",
  "private_key": "...",
  "client_email": "someaccount-gcp-01-7f11684e7047@someaccount-gcp-01-7f11684e7047.iam.gserviceaccount.com",
  "client_id": "...",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/someaccount-gcp-01-7f11684e7047%40someaccount-gcp-01-7f11684e7047.iam.gserviceaccount.com"
}

Part 2: Create a Spring Boot App

pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.3.5.RELEASE</version>
    <relativePath/> <!-- lookup parent from repository -->
  </parent>
  <groupId>com.example</groupId>
  <artifactId>springboot-gcs-signed-url</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>springboot-gcs-signed-url</name>
  <description>To upload a document to GCS and create a signed URL from it</description>
  <properties>
    <java.version>1.8</java.version>
    <spring-cloud.version>Hoxton.SR9</spring-cloud.version>
  </properties>
  <dependencies>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-webflux</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.cloud</groupId>
      <artifactId>spring-cloud-gcp-starter-storage</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-test</artifactId>
      <scope>test</scope>
      <exclusions>
        <exclusion>
          <groupId>org.junit.vintage</groupId>
          <artifactId>junit-vintage-engine</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>io.projectreactor</groupId>
      <artifactId>reactor-test</artifactId>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.projectlombok</groupId>
      <artifactId>lombok</artifactId>
    </dependency>
  </dependencies>
  <dependencyManagement>
    <dependencies>
      <dependency>
        <groupId>org.springframework.cloud</groupId>
        <artifactId>spring-cloud-dependencies</artifactId>
        <version>${spring-cloud.version}</version>
        <type>pom</type>
        <scope>import</scope>
      </dependency>
    </dependencies>
  </dependencyManagement>
  <build>
    <plugins>
      <plugin>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-maven-plugin</artifactId>
      </plugin>
    </plugins>
  </build>
</project>

The key library here is:

spring-cloud-gcp-starter-storage

Create a file named key.json in the resources folder and paste into it the contents of the key file downloaded in Part 1.

Tell the Spring Cloud GCP libraries to load the credentials from key.json by adding this property to application.properties:

spring.cloud.gcp.credentials.location=classpath:key.json
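
With the starter on the classpath, Spring Boot auto-configures a com.google.cloud.storage.Storage bean from this property; that is the bean we inject below. For reference, roughly equivalent manual wiring outside of Spring could look like the following sketch (the class name is mine, and it assumes key.json sits at the root of the classpath):

import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

import java.io.IOException;
import java.io.InputStream;

class ManualStorageWiring {

    // Roughly what the starter does for us: build a Storage client from the
    // service-account key instead of relying on auto-configuration.
    static Storage storageFromKey() throws IOException {
        try (InputStream keyStream =
                     ManualStorageWiring.class.getResourceAsStream("/key.json")) {
            GoogleCredentials credentials = GoogleCredentials.fromStream(keyStream);
            return StorageOptions.newBuilder()
                    .setCredentials(credentials)
                    .build()
                    .getService();
        }
    }
}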

Create StorageController.

@RestController
@Slf4j
class StorageController {

    @Getter(AccessLevel.PROTECTED)
    @Setter(AccessLevel.PROTECTED)
    @Autowired
    private Storage storage;

    @Value("${bucketname}")
    String bucketName;

    @Value("${subdirectory}")
    String subdirectory;

    @PostMapping(value = "/upload", consumes = MediaType.MULTIPART_FORM_DATA_VALUE)
    public Mono<URL> uploadFile(@RequestPart("file") FilePart filePart) {
        //Convert the file to a byte array
        final byte[] byteArray = convertToByteArray(filePart);

        //Prepare the blobId
        //BlobId is a combination of bucketName + subdirectory(optional) + fileName
        final BlobId blobId = constructBlobId(bucketName, subdirectory, filePart.filename());

        return Mono.just(blobId)
                //Create the blobInfo
                .map(bId -> BlobInfo.newBuilder(bId)
                        .setContentType("text/plain")
                        .build())
                //Upload the blob to GCS
                .doOnNext(blobInfo -> getStorage().create(blobInfo, byteArray))
                //Create a signed "Path Style" URL to access the newly created Blob
                //Set the URL expiry to 10 minutes
                .map(blobInfo -> createSignedPathStyleUrl(blobInfo, 10, TimeUnit.MINUTES));
    }

    private URL createSignedPathStyleUrl(BlobInfo blobInfo,
                                         int duration, TimeUnit timeUnit) {
        return getStorage()
                .signUrl(blobInfo, duration, timeUnit, Storage.SignUrlOption.withPathStyle());
    }

    /**
     * Construct the Blob ID.
     *
     * @param bucketName   name of the target bucket
     * @param subdirectory optional subdirectory inside the bucket
     * @param fileName     name of the uploaded file
     * @return the BlobId identifying the object in GCS
     */
    private BlobId constructBlobId(String bucketName, @Nullable String subdirectory,
                                   String fileName) {
        return Optional.ofNullable(subdirectory)
                .map(s -> BlobId.of(bucketName, subdirectory + "/" + fileName))
                .orElse(BlobId.of(bucketName, fileName));
    }

    /**
     * Here, we convert the file to a byte array to be sent to the GCS libraries.
     *
     * @param filePart file to be used
     * @return byte array with all the contents of the file
     */
    @SneakyThrows
    private byte[] convertToByteArray(FilePart filePart) {
        try (ByteArrayOutputStream bos = new ByteArrayOutputStream()) {
            filePart.content()
                    .subscribe(dataBuffer -> {
                        byte[] bytes = new byte[dataBuffer.readableByteCount()];
                        log.trace("readable byte count:" + dataBuffer.readableByteCount());
                        dataBuffer.read(bytes);
                        DataBufferUtils.release(dataBuffer);
                        try {
                            bos.write(bytes);
                        } catch (IOException e) {
                            log.error("read request body error...", e);
                        }
                    });

            return bos.toByteArray();
        }
    }

}

Let's break down the above code and understand each part.

First, we convert the FilePart to an array of bytes.

/**
 * Here, we convert the file to a byte array to be sent to the GCS libraries.
 *
 * @param filePart file to be used
 * @return byte array with all the contents of the file
 */
@SneakyThrows
private byte[] convertToByteArray(FilePart filePart) {
    try (ByteArrayOutputStream bos = new ByteArrayOutputStream()) {
        filePart.content()
                .subscribe(dataBuffer -> {
                    byte[] bytes = new byte[dataBuffer.readableByteCount()];
                    log.trace("readable byte count:" + dataBuffer.readableByteCount());
                    dataBuffer.read(bytes);
                    DataBufferUtils.release(dataBuffer);
                    try {
                        bos.write(bytes);
                    } catch (IOException e) {
                        log.error("read request body error...", e);
                    }
                });

        return bos.toByteArray();
    }
}
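
Note that subscribe() starts consuming the request body asynchronously, so this approach relies on the multipart content already being buffered when bos.toByteArray() is called. If you want a fully non-blocking variant, a sketch based on DataBufferUtils.join (the method name here is my own, not part of the original project) could look like this:

// A non-blocking sketch: join all DataBuffers of the upload into one buffer
// and copy it into a byte array, exposing the result as a Mono<byte[]>.
private Mono<byte[]> convertToByteArrayReactive(FilePart filePart) {
    return DataBufferUtils.join(filePart.content())
            .map(dataBuffer -> {
                byte[] bytes = new byte[dataBuffer.readableByteCount()];
                dataBuffer.read(bytes);
                DataBufferUtils.release(dataBuffer);
                return bytes;
            });
}

The rest of the pipeline would then start from this Mono (for example via flatMap) instead of working with a plain byte array.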

Then, we construct the BlobId.

/**
 * Construct the Blob ID.
 *
 * @param bucketName   name of the target bucket
 * @param subdirectory optional subdirectory inside the bucket
 * @param fileName     name of the uploaded file
 * @return the BlobId identifying the object in GCS
 */
private BlobId constructBlobId(String bucketName, @Nullable String subdirectory,
                               String fileName) {
    return Optional.ofNullable(subdirectory)
            .map(s -> BlobId.of(bucketName, subdirectory + "/" + fileName))
            .orElse(BlobId.of(bucketName, fileName));
}
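
For instance, with illustrative values, the resulting object names differ depending on whether a subdirectory is configured:

// Hypothetical values, just to illustrate the two branches:
BlobId withSubdir = constructBlobId("some-bucket", "media", "Sample.txt");
// -> object "media/Sample.txt" in bucket "some-bucket"

BlobId withoutSubdir = constructBlobId("some-bucket", null, "Sample.txt");
// -> object "Sample.txt" at the root of bucket "some-bucket"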

This BlobId is then wrapped in a BlobInfo, which is what the Storage client uses to upload the file to GCS.

Mono.just(blobId)
        //Create the blobInfo
        .map(bId -> BlobInfo.newBuilder(bId)
                .setContentType("text/plain")
                .build())
        //Upload the blob to GCS
        .doOnNext(blobInfo -> getStorage().create(blobInfo, byteArray))
        //Create a signed "Path Style" URL to access the newly created Blob
        //Set the URL expiry to 10 minutes
        .map(blobInfo -> createSignedPathStyleUrl(blobInfo, 10, TimeUnit.MINUTES));
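
The content type is hard-coded to text/plain in this example. If you want to preserve whatever the client uploaded, a small variation (my own sketch, not part of the original project) could read it from the multipart headers and fall back to a generic type:

// Derive the content type from the uploaded part instead of hard-coding it;
// fall back to application/octet-stream when the client did not send one.
private BlobInfo buildBlobInfo(BlobId blobId, FilePart filePart) {
    String contentType = Optional.ofNullable(filePart.headers().getContentType())
            .map(MediaType::toString)
            .orElse(MediaType.APPLICATION_OCTET_STREAM_VALUE);

    return BlobInfo.newBuilder(blobId)
            .setContentType(contentType)
            .build();
}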

We then create a signed URL with an expiry time.

private URL createSignedPathStyleUrl(BlobInfo blobInfo,
                                     int duration, TimeUnit timeUnit) {
    return getStorage()
            .signUrl(blobInfo, duration, timeUnit, Storage.SignUrlOption.withPathStyle());
}

Note: We are creating a URL using “Path Style”.

Storage.SignUrlOption.withPathStyle()

This generates a path-style URL, which places the bucket name in the path portion of the URL instead of in the hostname, e.g. https://storage.googleapis.com/mybucket/...

Alternatively, you can use other styles such as the virtual-hosted style, which puts the bucket in the host portion of the URI rather than the path, e.g. https://mybucket.storage.googleapis.com/...
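
If you prefer the virtual-hosted style, only the signing option changes. A sketch (the method name is mine, and it assumes a client library version that ships Storage.SignUrlOption.withVirtualHostedStyle()):

// Same signing flow, but the bucket ends up in the hostname of the URL.
private URL createSignedVirtualHostedUrl(BlobInfo blobInfo,
                                         int duration, TimeUnit timeUnit) {
    return getStorage()
            .signUrl(blobInfo, duration, timeUnit,
                    Storage.SignUrlOption.withVirtualHostedStyle());
}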

Finally, add the below properties to application.properties:

bucketname=some-bucket
subdirectory=media
spring.cloud.gcp.credentials.location=classpath:key.json


We can now run the app and execute the below curl.

curl --location --request POST 'http://localhost:8080/upload' \
--form 'file=@/Users/Sample.txt'

You will get a “Path Style” URL.

https://storage.googleapis.com/some-bucket/media/ad7313fc-0c54-4de4-9371-e3990bd2f7b4?GoogleAccessId=sa-gcs-someaccount@some-project.iam.gserviceaccount.com&Expires=1605773822&Signature=FwQjidH9lryP9UjnKMixOsKVzksowirRIYNnthbcL%2FyVmQ8DkLvBXhMGoXys3qAVAuYQAPHEnl2KvetB%2FyVwF5kdHR4o3UeB3AyGvL1zJRjNq2Ymf%2F%2FTOue3bBmZJ6pal4bpxwryWECIzxhJm8CCnAZ1d6TIZtcoU7SzJBILlLwzGTT7n%2FOFthWdbvA7mjMKw%2Fg%2BnlQA8Ltr4mlTIBpfCsmw5Hl2jm%2FbI8zaOJc7%2BtpWhGTdiZww3hgjpEqIZe%2F2r9ZRDQRDPgEntZSymoV3xuI1WBAheYkY0ov7QbRrWi9eH8vGjnBRmwT7znUmS9rj3nUwVKsKfCRziCsOf3jdgw%3D%3D

Paste this in the browser and you will be able to download the file.

Wait 10 minutes for the URL to expire, then try to access it again. You will get an error.
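
If you would rather exercise the endpoint from a test instead of curl, a rough WebTestClient sketch could look like the one below. Class and file names are illustrative, and because it signs a real URL it needs valid credentials and an existing bucket, so treat it as an integration test rather than a unit test.

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.reactive.AutoConfigureWebTestClient;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.core.io.ClassPathResource;
import org.springframework.http.MediaType;
import org.springframework.http.client.MultipartBodyBuilder;
import org.springframework.test.web.reactive.server.WebTestClient;
import org.springframework.web.reactive.function.BodyInserters;

import static org.junit.jupiter.api.Assertions.assertTrue;

@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@AutoConfigureWebTestClient
class StorageControllerIT {

    @Autowired
    private WebTestClient webTestClient;

    @Test
    void uploadReturnsSignedUrl() {
        // Build a multipart body with a "file" part, matching @RequestPart("file")
        MultipartBodyBuilder builder = new MultipartBodyBuilder();
        builder.part("file", new ClassPathResource("Sample.txt"));

        String body = webTestClient.post().uri("/upload")
                .contentType(MediaType.MULTIPART_FORM_DATA)
                .body(BodyInserters.fromMultipartData(builder.build()))
                .exchange()
                .expectStatus().isOk()
                .expectBody(String.class)
                .returnResult()
                .getResponseBody();

        // The response should be a signed URL pointing at GCS
        assertTrue(body != null && body.contains("storage.googleapis.com"));
    }
}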

You can find the full project on GitHub here.


I work as a Staff Engineer at VMware. I dabble in Open Source and Cloud-Native tech. I believe Software has to be invisible or beautiful.