Symfony 4: Static Files in AWS S3 with a Heroku Dyno

Today I’m going to explain why we need a storage service external to our Heroku dynos (app containers) to keep static files in our application, and what steps are needed to accomplish that.


Let’s imagine that we have file upload functionality in our application. We receive the uploaded file and then move it to a public images folder, right? No. With Heroku this is not recommended.

That’s because all files in the application, other than those included in the deploy, are regularly deleted.

Heroku has an ephemeral filesystem: the dyno is restarted once a day, erasing everything that was not part of the deploy, and the same happens on each new deploy.

Note: For a test environment instance this can actually be a good thing, since we get a clean application regularly and on each deploy.

Therefore, it is necessary to find a way to store files outside the dyno.

Why S3?

S3 is a cloud web service offered by Amazon Web Services. It provides object storage through web service interfaces.

Amazon launched it in 2006, and it was one of the first AWS web services made available to the public.

It is one of the biggest players in the market, it is free for non-intensive usage, and it has a lot of documentation.

That’s why I chose it for this post.

Configurations

Creating Credentials

Although the official Heroku documentation says that we only need to get the access keys, it is now recommended to use IAM profiles.

These are more restricted profiles, which avoids having to use our root user for everything.

So, just create an IAM profile, and then create the access credentials associated with this profile.

Install AWS SDK

Next, we have to install the AWS SDK for PHP. The easiest way to do this is with Composer.

Execute the following:

composer require aws/aws-sdk-php:~3.67

Defining Environment Variables

Now we need to provide the application with the access key ID, the secret access key, and the bucket name.

But it is not recommended to have this kind of information hard-coded in the application. Instead, it is better to store it in the application environment.

Let’s run the following command for this:

heroku config:set AWS_ACCESS_KEY_ID=aaa AWS_SECRET_ACCESS_KEY=bbb S3_BUCKET=ccc

If you need to check whether the environment variables were created correctly, you can use getenv().
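
For example, a quick way to confirm that the variables are visible to the application is a small helper script like the one below (the file name and the heroku run usage are just one possible approach):

<?php
// check_env.php — run with: heroku run php check_env.php
var_dump(getenv('S3_BUCKET'));                       // should print the bucket name
var_dump(getenv('AWS_ACCESS_KEY_ID') !== false);     // true if the key is set
var_dump(getenv('AWS_SECRET_ACCESS_KEY') !== false); // true if the secret is set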

Code Example

Finally, you can see a simple example of how to upload a file, with the necessary parameters.

use Aws\Credentials\CredentialProvider;
use Aws\Exception\AwsException;
use Aws\S3\S3Client;
(...)
try {
    // The credentials provider reads the keys from the environment variables set above.
    $s3Client = new S3Client([
        'region' => 'some-region',
        'version' => 'latest',
        'credentials' => CredentialProvider::env()
    ]);

    $result = $s3Client->putObject([
        'Bucket' => getenv('S3_BUCKET'),
        'Key' => $fileName,
        'SourceFile' => $filePath,
    ]);
} catch (AwsException $e) {
    echo $e->getMessage() . "\n";
}

Next you can see, in some detail, how to obtain some of these parameters.

Region

The region where your bucket is located is mandatory.

To obtain it, I opened the bucket page and, in the Overview tab, at the top right, checked the region name.

After that, to get the corresponding region code, I searched for it here.

Version

It is recommended to use the latest version of each client. We can also pin a previous version and update the client later.
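
As a sketch, pinning the S3 client to a specific API version instead of 'latest' only changes one option of the example above ('2006-03-01' is the S3 API version string):

$s3Client = new S3Client([
    'region' => 'some-region',
    'version' => '2006-03-01', // pin the S3 API version instead of using 'latest'
    'credentials' => CredentialProvider::env()
]);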

Credentials

Here I used a static method of the CredentialProvider class, CredentialProvider::env(), which gets the credentials from the environment variables (AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY) and returns a callable.
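
If you prefer not to use a provider callable, the client also accepts an explicit Credentials object built from the same environment variables; a minimal sketch:

use Aws\Credentials\Credentials;
use Aws\S3\S3Client;

// Equivalent setup, reading the environment variables ourselves.
$credentials = new Credentials(
    getenv('AWS_ACCESS_KEY_ID'),
    getenv('AWS_SECRET_ACCESS_KEY')
);

$s3Client = new S3Client([
    'region' => 'some-region',
    'version' => 'latest',
    'credentials' => $credentials
]);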

Create a Service: SaveUploadedFileService

Now it is better to create a service in the application that receives the uploaded file and then sends it to AWS S3.

And why a service, and not the controller? Because it is convenient to have thin controllers and fat services.

<?php

namespace App\Service\File;

use Aws\Credentials\CredentialProvider;
use Aws\Exception\AwsException;
use Aws\S3\S3Client;
use App\Service\ServiceInterface;
use Symfony\Component\HttpFoundation\File\UploadedFile;

class SaveUploadedFileService implements ServiceInterface
{
    /**
     * @var UploadedFile
     */
    private $uploadedFile;

    /**
     * @var string
     */
    private $tmpFolderPath;

    public function setUploadedFile(UploadedFile $uploadedFile): void
    {
        $this->uploadedFile = $uploadedFile;
    }

    public function setTmpFolderPath(string $tmpFolderPath): void
    {
        $this->tmpFolderPath = $tmpFolderPath;
    }

    public function call(): bool
    {
        $bucket = getenv('S3_BUCKET');

        // Generate a unique object key, keeping the original file extension.
        $fileName = md5(uniqid($bucket . '_', false)) . '.' . $this->uploadedFile->guessExtension();

        // Move the upload into the temporary folder inside the application.
        $this->uploadedFile->move(
            $this->tmpFolderPath,
            $fileName
        );

        // Build the full path, whether or not the injected folder path ends with a separator.
        $sourceFile = rtrim($this->tmpFolderPath, DIRECTORY_SEPARATOR) . DIRECTORY_SEPARATOR . $fileName;

        try {
            $s3Client = new S3Client([
                'region' => 'us-east-2',
                'version' => 'latest',
                'credentials' => CredentialProvider::env()
            ]);

            $s3Client->putObject([
                'Bucket' => $bucket,
                'Key' => $fileName,
                'SourceFile' => $sourceFile,
            ]);
        } catch (AwsException $e) {
            echo $e->getMessage() . "\n";

            return false;
        }

        return true;
    }
}

In this article I also talk about the possibility of defining an interface for all services, something that we also apply here. Check the interface below.

<?php

namespace App\Service;

interface ServiceInterface
{
    public function call();
}

What we have is a service that receives an instance of UploadedFile and sends it to S3.

The temporary folder is just a transition location for the file before it is sent to S3. This is because, in order to send a file from Heroku to an external service, the file must first be stored in a more permanent location inside the application folder.

For this, I simply added a temporary folder to the root of the project. Since Git does not track empty directories, I also added a placeholder file. Something like this:

mkdir tmp
touch tmp/.gitkeep
git add tmp/.gitkeep
git commit -m "Add tmp directory."

Then I just injected that path into the service.
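
To illustrate, a controller could wire everything together as in the sketch below. The kernel.project_dir parameter and autowiring are standard in Symfony 4, but the controller name, the route, and the form field name are assumptions made for this example:

<?php

namespace App\Controller;

use App\Service\File\SaveUploadedFileService;
use Symfony\Bundle\FrameworkBundle\Controller\AbstractController;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\Routing\Annotation\Route;

class UploadController extends AbstractController
{
    /**
     * @Route("/upload", methods={"POST"})
     */
    public function upload(Request $request, SaveUploadedFileService $saveUploadedFileService): Response
    {
        // 'attachment' is an assumed form field name.
        $uploadedFile = $request->files->get('attachment');

        $saveUploadedFileService->setUploadedFile($uploadedFile);
        // The tmp folder created above, relative to the project root
        // (getParameter() is available on AbstractController since Symfony 4.1).
        $saveUploadedFileService->setTmpFolderPath($this->getParameter('kernel.project_dir') . '/tmp/');

        if (!$saveUploadedFileService->call()) {
            return new Response('Upload failed.', Response::HTTP_INTERNAL_SERVER_ERROR);
        }

        return new Response('File uploaded.');
    }
}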

Show the File: GenerateFileUrlService

Then the question is: how do we show the uploaded image to any user? This isn’t immediate, because we need to configure the object access permissions.

If you go to the bucket, and click on the public link of the uploaded image, you will get an XML response with the message “AccessDenied”.

We need to define the access permissions to our objects.

And for our use case what makes sense is to give read-only permissions to all anonymous users.

In this article with examples of access policies, there is an appropriate example for our case. Just copy and paste it into the Permissions > Bucket Policy text area.
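
For reference, a read-only policy for anonymous users typically looks like the snippet below; replace <bucket> with your bucket name:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<bucket>/*"
        }
    ]
}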

This configuration makes all objects in the bucket public. We could also restrict it so that only a specific folder has public objects.

Then we can create a service that, given a file name, generates its public URL. The format is as follows:

https://s3.<zone>.amazonaws.com/<bucket>/<filename>

Following the same interface that we have been using for our services, we get a service like this.

<?php

namespace App\Service\File;

use App\Service\ServiceInterface;

class GenerateFileUrlService implements ServiceInterface
{
    /**
     * @var string
     */
    private $fileName;

    /**
     * @param string $fileName
     */
    public function setFileName(string $fileName): void
    {
        $this->fileName = $fileName;
    }

    public function call(): string
    {
        // Path-style public URL: https://s3.<region>.amazonaws.com/<bucket>/<filename>
        return sprintf(
            'https://s3.us-east-2.amazonaws.com/%s/%s',
            getenv('S3_BUCKET'),
            $this->fileName
        );
    }
}
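
Using it is then just a matter of setting the file name and calling the service; in a controller or Twig extension it would typically be injected, but the idea is simply this (variable names are illustrative):

$generateFileUrlService->setFileName($fileName);

// Something like https://s3.us-east-2.amazonaws.com/<bucket>/<filename>
$imageUrl = $generateFileUrlService->call();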

Was it useful for you?

Did you find this post useful?

Where do you store static files? Do you also use AWS S3, or another solution?

And if you have any questions or corrections, please feel free to comment.

This article belongs to a series of posts about Symfony 4: