Supporting a legacy PHP project: painless switching to cloud storage

Vlad Reshetylo
3 min read · Oct 11, 2022


Every developer knows this :)

We all know that supporting legacy projects is a pain, especially when the project isn't framework-based and has passed through a few different teams of developers. As a rule, you inherit highly messy code without any documentation. A pretty common situation for any experienced freelancer, right?

In most cases, there are not enough resources to rewrite everything from scratch. The client has a more or less stable project that brings in money or does its required job, and all they want is to keep it running.

There are plenty of tasks that usually need to be done on a legacy PHP project in this situation: adding Composer, setting up version control, wrapping everything in Docker, and so on. One of the most common is moving user-uploaded files to cloud-based file storage like Amazon S3 or DigitalOcean Spaces. Let's skip the reasons for that; they should be obvious.

The problem is that such a task can be highly complex. In a typical legacy PHP project, all file manipulations are done via plain PHP functions like file_get_contents, file_put_contents, fopen, fwrite, and so on, without any wrappers. And there can be plenty of such calls throughout the system! User avatars, message attachments, order PDFs: the number of domains that manipulate files can feel endless. Such calls can even be hidden somewhere in outdated dependencies if the system is built on some kind of CRM. So there is no way around rewriting all these places to use a modern file storage library. Or is there?

Luckily, there is. It's definitely not an approach for a brand-new project, but it can do an excellent job for a huge legacy one. Its name is stream_wrapper_register.

You may already know that you can open URLs with the file_get_contents function. In fact, HTTP is just one more protocol supported by this (and almost every other) built-in PHP file function. When we work with files on our local machine, we use the "file" protocol. The amazing part is that we can redefine its handler! And that's the painless way to force a huge PHP project to use cloud storage.
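A tiny illustration of that dispatch (the paths here are made up): the same built-in function picks its stream wrapper from the protocol prefix of the path it receives.

```php
<?php
// The protocol prefix decides which stream wrapper handles the call.
$page  = file_get_contents('https://example.com/');         // the "https" wrapper
$bytes = file_get_contents('/var/www/uploads/avatar.png');   // the implicit "file" wrapper
```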

Let's look at the code:
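What follows is a minimal sketch of the idea, not a drop-in implementation: the class name, bucket, region and the "/var/www/uploads/" prefix are assumptions for illustration, and a production wrapper would also need url_stat() (for file_exists and friends), unlink(), rename(), mkdir(), the dir_* methods, stream_seek() and so on.

```php
<?php
// A minimal sketch of the idea, not a drop-in implementation. The class name,
// bucket, region and the "/var/www/uploads/" prefix are assumptions; a real
// wrapper also needs url_stat(), unlink(), rename(), mkdir(), dir_* methods, etc.

require __DIR__ . '/vendor/autoload.php';

use Aws\S3\S3Client;

class S3FileStreamWrapper
{
    /** @var resource|null Populated by PHP when a stream context is passed. */
    public $context;

    private static $client;   // Aws\S3\S3Client instance
    private static $bucket;   // target bucket name

    private $stream;          // php://memory buffer (S3) or a real file handle
    private $key;             // S3 object key, or null for pass-through paths
    private $writable = false;

    public static function setup(S3Client $client, $bucket)
    {
        self::$client = $client;
        self::$bucket = $bucket;
    }

    // Run one operation with the built-in "file" handler temporarily restored.
    private static function withRealWrapper(callable $fn)
    {
        stream_wrapper_restore('file');
        try {
            return $fn();
        } finally {
            stream_wrapper_unregister('file');
            stream_wrapper_register('file', self::class);
        }
    }

    public function stream_open($path, $mode, $options, &$opened_path)
    {
        // Only user uploads live in S3; everything else (includes, templates,
        // the SDK's own files, ...) is passed through to the real filesystem.
        if (strpos($path, '/var/www/uploads/') !== 0) {
            $this->stream = self::withRealWrapper(function () use ($path, $mode) {
                return @fopen($path, $mode);
            });
            return $this->stream !== false;
        }

        $this->key      = ltrim(substr($path, strlen('/var/www/')), '/');
        $this->writable = $mode[0] !== 'r';   // simplified mode handling
        $this->stream   = fopen('php://memory', 'r+');

        if (!$this->writable) {
            // Read mode: pull the object from S3 into an in-memory buffer.
            try {
                $result = self::$client->getObject(['Bucket' => self::$bucket, 'Key' => $this->key]);
            } catch (\Aws\Exception\AwsException $e) {
                return false;
            }
            fwrite($this->stream, (string) $result['Body']);
            rewind($this->stream);
        }
        return true;
    }

    public function stream_read($count) { return fread($this->stream, $count); }
    public function stream_write($data) { return fwrite($this->stream, $data); }
    public function stream_eof()        { return feof($this->stream); }
    public function stream_stat()       { return fstat($this->stream); }
    public function stream_flush()      { return true; }

    public function stream_close()
    {
        if ($this->key !== null && $this->writable) {
            // Write mode: push the whole buffer to S3 when the stream is closed.
            rewind($this->stream);
            self::$client->putObject(['Bucket' => self::$bucket, 'Key' => $this->key, 'Body' => stream_get_contents($this->stream)]);
        }
        fclose($this->stream);
    }
}

S3FileStreamWrapper::setup(
    new S3Client(['region' => 'eu-central-1', 'version' => 'latest']),
    'my-legacy-project-bucket'
);

// The key trick: drop the built-in "file" handler and put ours in its place.
stream_wrapper_unregister('file');
stream_wrapper_register('file', S3FileStreamWrapper::class);

// From now on, untouched legacy calls like this transparently talk to S3:
// $avatar = file_get_contents('/var/www/uploads/avatars/42.png');
```

Note the pass-through branch: once the "file" wrapper is replaced, everything in the request that touches local paths goes through this class, including require/include, the autoloader and the SDK itself, so anything outside the uploads directory is handed back to the real filesystem via stream_wrapper_restore().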

That's all! We just need to implement the protocol handler methods on top of our cloud storage (as far as I remember, the Amazon S3 SDK even ships such a class out of the box) and register our handler as the "file" protocol stream wrapper.
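For reference, in version 3 of the AWS SDK for PHP that class is Aws\S3\StreamWrapper, and the client can register it for you, although under its own s3:// protocol rather than as a replacement for file:// (the bucket and key below are placeholders):

```php
<?php
require __DIR__ . '/vendor/autoload.php';

// The SDK's own wrapper serves the "s3://" protocol out of the box.
$s3 = new \Aws\S3\S3Client(['region' => 'eu-central-1', 'version' => 'latest']);
$s3->registerStreamWrapper();

$avatar = file_get_contents('s3://my-legacy-project-bucket/uploads/avatars/42.png');
```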

As a result, we have a system that doesn't even "know" it uses cloud-based storage. It still works through its countless file_get_contents and file_put_contents calls, and we can be confident we didn't break any logic during the migration. Even better, we can implement our custom wrapper so that it stores every file in a few places at the same time (backups on the fly!) or notifies us about predefined situations (a file not found, and the like). That's pretty cool for a legacy PHP project with a tonne of spaghetti code, isn't it? :)
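For instance, a hypothetical variation of the stream_close() method from the sketch above could push every written file to a second, backup bucket as well (the backup bucket name is made up):

```php
public function stream_close()
{
    if ($this->key !== null && $this->writable) {
        rewind($this->stream);
        $contents = stream_get_contents($this->stream);

        // The primary copy goes to the main bucket...
        self::$client->putObject(['Bucket' => self::$bucket, 'Key' => $this->key, 'Body' => $contents]);

        // ...and a second copy lands in a backup bucket: backups on the fly.
        self::$client->putObject(['Bucket' => 'my-legacy-project-backup', 'Key' => $this->key, 'Body' => $contents]);
    }
    fclose($this->stream);
}
```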

I tested this approach in production while dealing with an ancient system based on SuiteCRM. That CRM had no built-in way to switch storage, and this was an excellent way to force it to use Amazon S3 with a change to a single PHP file. Even more, for some time my implementation kept a fallback to the local filesystem, to be sure nothing would be lost while moving 100+ GB of files.
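That fallback can be sketched as an extra branch in the read path of stream_open() (a rough sketch reusing the helper and property names from the example above, not the production code): if the object isn't in S3 yet, read the original file from disk with the built-in handler and push it to S3, so the data migrates lazily.

```php
// Read branch of stream_open(): try S3 first, then fall back to the local disk.
try {
    $result = self::$client->getObject(['Bucket' => self::$bucket, 'Key' => $this->key]);
    $data = (string) $result['Body'];
} catch (\Aws\S3\Exception\S3Exception $e) {
    // Not migrated yet: read the original file with the built-in handler...
    $data = self::withRealWrapper(function () use ($path) {
        return is_file($path) ? file_get_contents($path) : false;
    });
    if ($data === false) {
        return false;   // not in S3 and not on disk either
    }
    // ...and lazily migrate it, so the next read is served from S3 directly.
    self::$client->putObject(['Bucket' => self::$bucket, 'Key' => $this->key, 'Body' => $data]);
}

fwrite($this->stream, $data);
rewind($this->stream);
```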

I hope my story will help someone update their old project, at least a little bit :)
