Tracking Cloud Storage usage for each user with Firebase and Cloud Functions

Adam Gerhant
Firebase Developers
4 min read · Oct 24, 2023

In order to prevent users from abusing your web app, it's important to limit access to services such as file uploading. If no limit is set, a user could upload an unlimited amount of data to your storage bucket, which could lead to a massive bill. In this article I will explain how I implemented a counter for the total bytes uploaded per user, as well as a daily counter that resets.

In order to validate limits for each user, it is important to send all requests through server-side validation logic. To create a secure server-side upload process for Firebase, check out my other article here:

Since Cloud Storage has no built-in functionality to get the size of all files in a folder, there are two ways of implementing this. The simplest is to call .getFiles({ prefix: userIDPath }) and sum the bytes of each file (a short sketch of this follows the security rules below). However, this is very inefficient, because every time the limit needs to be checked, the metadata for every file has to be retrieved. It is much more efficient to store the total file size in parallel, for example in a Firestore document. Whenever a file is added or removed, the counter can be incremented or decremented as necessary, so the file size only has to be read when a file is uploaded or deleted. I will be storing this data in a read-only Firestore document, which means changes to the document can only be made by server-side code. To accomplish this, I set the following security rules for the accountInformation document.

match /users/{userID}/data/accountInformation {
  allow read: if request.auth != null && request.auth.uid == userID;
}
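
For reference, the simpler but inefficient approach mentioned above could look roughly like this (the userIDPath prefix and the use of the default bucket are assumptions):

const admin = require('firebase-admin');

//lists every object under the user's folder and sums the sizes;
//every check has to read metadata for all of the user's files,
//which is why the Firestore counter below is preferred
async function getTotalBytesForUser(userIDPath) {
  const [files] = await admin.storage().bucket().getFiles({ prefix: userIDPath });
  return files.reduce((total, file) => total + Number(file.metadata.size), 0);
}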

To initialize the document, I use the auth.user().onCreate Cloud Function trigger, which runs whenever a new user is created. The maxStorage field is measured in gigabytes, while storageUsed and dailyStorageUsed are measured in bytes:

const functions = require('firebase-functions');
const admin = require('firebase-admin');
const { onObjectFinalized, onObjectDeleted } = require('firebase-functions/v2/storage');
const { onSchedule } = require('firebase-functions/v2/scheduler');

admin.initializeApp();

exports.initializeAccountInformation = functions.auth.user().onCreate((user) => {
  //if the user does not have an email, it is a guest account
  const isGuestAccount = !user.email;
  const accountInformationData = {};
  if (isGuestAccount) {
    accountInformationData.maxStorage = 0;
    accountInformationData.storageUsed = 0;
    accountInformationData.dailyStorageUsed = 0;
  } else {
    accountInformationData.maxStorage = 5;
    accountInformationData.storageUsed = 0;
    accountInformationData.dailyStorageUsed = 0;
  }
  const db = admin.firestore();
  return db.doc("users/" + user.uid + "/data/accountInformation").set(accountInformationData);
});

Next, the code to increment the bytes stored can be added. Since files can only be created or deleted, the onObjectFinalized and onObjectDeleted triggers cover every change. These functions run whenever a file is uploaded or deleted and provide information about the file. To increment or decrement the counter for the correct user, the file path can be parsed to find the folder the file is stored in, which represents the userID. Here's what it looks like:

exports.fileUpload = onObjectFinalized({ cpu: 2 }, async (event) => {
  const fileSize = Number(event.data.size);
  //the file is stored in a folder named after the user ID,
  //so the user ID can be parsed out of the file path
  const filePath = event.data.name;
  const pathSegments = filePath.split('/');
  const userID = pathSegments[1];

  const db = admin.firestore();
  await db.doc("users/" + userID + "/data/accountInformation").update({
    storageUsed: admin.firestore.FieldValue.increment(fileSize)
  });
});

The code for decrementing the storageUsed field is nearly identical; the file size just needs to be negative:

storageUsed: admin.firestore.FieldValue.increment(-fileSize)
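
Put together, the delete handler might look like this (the fileDelete export name is arbitrary; everything else mirrors the upload function above):

exports.fileDelete = onObjectDeleted(async (event) => {
  const fileSize = Number(event.data.size);
  const filePath = event.data.name;
  const pathSegments = filePath.split('/');
  const userID = pathSegments[1];

  const db = admin.firestore();
  //subtract the deleted file's size from the user's total
  await db.doc("users/" + userID + "/data/accountInformation").update({
    storageUsed: admin.firestore.FieldValue.increment(-fileSize)
  });
});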

For demonstration purposes, I will also show how to add a counter that resets daily in order to track daily bytes uploaded. This can be done with the onSchedule trigger: every document in the users collection is looped through, and its dailyStorageUsed field is reset. It is also important to create a document for each user in the users collection, since sub-documents and sub-collections can exist under a parent document that was never created. Such parent documents do not show up when querying the collection, so their accountInformation documents would be skipped and never reset:

exports.resetDailyStorageUsed = onSchedule("every day 12:00", async (event) => {
  const db = admin.firestore();
  const usersCollection = db.collection('users');

  try {
    const usersSnapshot = await usersCollection.get();
    //maximum batch size is 500 writes
    const batchSize = 500;
    //create a new batch for each group of 500 users
    for (let i = 0; i < usersSnapshot.size; i += batchSize) {
      const batch = db.batch();
      const batchUsers = usersSnapshot.docs.slice(i, i + batchSize);
      batchUsers.forEach((userDoc) => {
        const dataCollection = userDoc.ref.collection('data');
        const accountInformationDoc = dataCollection.doc('accountInformation');
        batch.update(accountInformationDoc, { dailyStorageUsed: 0 });
      });
      await batch.commit();
    }
  } catch (error) {
    console.error('Error resetting dailyStorageUsed:', error);
  }
});

The bytes stored in Cloud Storage for each user are now easily accessible through the storageUsed and dailyStorageUsed fields in Firestore. I use these fields to reject a file upload if storageUsed would exceed maxStorage (a sketch of this check is shown below), as well as to display how much storage the user has used. The general concepts can be applied in many other ways; for example, a counter that resets daily is essential for enforcing daily limits, and the same pattern can be used to store aggregate analytics across all users.
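
As a rough sketch, that check could look something like the following on the server. The canUpload helper and its parameters are just for illustration; note that maxStorage is stored in gigabytes and has to be converted to bytes before comparing:

//hypothetical helper called by the server-side upload endpoint
async function canUpload(userID, incomingFileSize) {
  const db = admin.firestore();
  const snapshot = await db.doc("users/" + userID + "/data/accountInformation").get();
  const { maxStorage, storageUsed } = snapshot.data();
  //maxStorage is in gigabytes; storageUsed and incomingFileSize are in bytes
  const maxStorageBytes = maxStorage * 1024 * 1024 * 1024;
  return storageUsed + incomingFileSize <= maxStorageBytes;
}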
