Upload and download files from Google Cloud Storage with Google Apps Script without using a service account

Martin Hawksey
Appsbroker CTS Google Cloud Tech Blog
6 min read · Feb 15, 2024

A key feature of Google Apps Script is its integration into Google Cloud. By default, whenever an Apps Script project is created, an associated Google Cloud project is created and configured. This default project is not accessible to the user, and for most scripts the user doesn’t need to worry about any of its configuration, such as enabling APIs or setting up authentication.

Other key aspects are identity and authentication. By default, scripts run as the account executing them, with Apps Script automatically determining what authorisation is required for different Google services, either by scanning your code or from scopes set explicitly in the Apps Script manifest file.

The last piece in the puzzle is the .getOAuthToken() method which is part of the ScriptApp Service:

Gets the OAuth 2.0 access token for the effective user. … The token returned by this method only includes scopes that the script currently needs. Scopes that were previously authorized but are no longer used by the script are not included in the returned token. If additional OAuth scopes are needed beyond what the script itself requires, they can be specified in the script’s manifest file.

What this means is that in script projects we can borrow an access token to use other services that the effective user has access to, provided the corresponding scopes have been declared in the script project. For example, if my Google account martin@example.com has been added to another Google Cloud project with the Google Cloud Storage service enabled, I can use Apps Script to generate a token to use the Cloud Storage service in that project.
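The pattern is simple: attach the borrowed token as a Bearer Authorization header on the HTTP request. Here is a minimal plain-JavaScript sketch; withBearerToken is a hypothetical helper, not an Apps Script API. In a real script the token argument would come from ScriptApp.getOAuthToken() and the resulting options object would be passed to UrlFetchApp.fetch():

```javascript
// Hypothetical helper: builds the request options object that
// UrlFetchApp.fetch() accepts, attaching a borrowed OAuth token.
// In Apps Script, token would be ScriptApp.getOAuthToken().
function withBearerToken(token, method) {
  return {
    method: method,
    headers: {
      'Authorization': 'Bearer ' + token
    }
  };
}

// Example: options for an authenticated GET request.
const options = withBearerToken('ya29.example-token', 'get');
console.log(options.headers.Authorization); // "Bearer ya29.example-token"
```

Both worked examples below follow exactly this shape.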

To help illustrate this, here are two examples of interacting with Google Cloud Storage buckets: downloading a bucket object to Google Drive, and uploading a Google Drive file to a bucket.

Download Google Cloud Storage Bucket resources to Google Drive

Assuming you have already set up a Cloud Storage bucket, the first step is to grant permission to the account that will be running your Google Apps Script code. In the steps below we are adding a principal to a bucket-level policy with read-only access. Other Identity and Access Management configurations are possible.

  1. In the Google Cloud console, go to the Cloud Storage Buckets page.
    Go to Buckets
  2. In the list of buckets, click the name of the bucket for which you want to grant a principal a role.
  3. Select the Permissions tab near the top of the page.
  4. Click the Grant access button.
    The Add principals dialog box appears.
  5. In the New principals field, enter one or more identities that need access to your bucket.
  6. From the Select a role drop-down menu, select the Storage Object Viewer role.
  7. Click Save.

To be able to access the Cloud Storage Bucket we need to declare the required scope in the Apps Script project by following these steps (or Make a Copy of this script project):

  1. Create a new Script Project
  2. Open the script project in the Apps Script editor.
  3. Click Project Settings.
  4. Select the Show “appsscript.json” manifest file in editor checkbox.

The manifest file appears as a project file named appsscript.json. Open this file in the Script Editor and add the oauthScopes:

"https://www.googleapis.com/auth/script.external_request",
"https://www.googleapis.com/auth/devstorage.read_only",
"https://www.googleapis.com/auth/drive"

An example appsscript.json with these scopes is included below:

{
  "timeZone": "Europe/London",
  "dependencies": {},
  "exceptionLogging": "STACKDRIVER",
  "runtimeVersion": "V8",
  "oauthScopes": [
    "https://www.googleapis.com/auth/script.external_request",
    "https://www.googleapis.com/auth/devstorage.read_only",
    "https://www.googleapis.com/auth/drive"
  ]
}

In your Code.gs file add the following code:

// Example usage
function getFile() {
  const bucketName = 'my-bucket';
  const fileName = 'path/to/my/file.txt';

  const blob = getBucketFile_(bucketName, fileName);

  if (blob) {
    const file = DriveApp.createFile(blob);
    console.log(file.getUrl());
  }
}

/**
 * Retrieves a file from a Google Cloud Storage bucket as a Blob.
 *
 * @param {string} BUCKET_NAME - The name of the bucket containing the file.
 * @param {string} OBJECT_NAME - The name of the file to retrieve.
 * @return {Blob} The retrieved file as a Blob object.
 *
 * @example
 * const fileBlob = getBucketFile_('my-bucket', 'path/to/my/file.txt');
 */
function getBucketFile_(BUCKET_NAME, OBJECT_NAME) {
  // Media download URL for the Cloud Storage JSON API
  const url = `https://storage.googleapis.com/storage/v1/b/${BUCKET_NAME}/o/${encodeURIComponent(OBJECT_NAME)}?alt=media`;
  try {
    const response = UrlFetchApp.fetch(url, {
      method: 'GET',
      headers: {
        'Authorization': 'Bearer ' + ScriptApp.getOAuthToken()
      }
    });
    return response.getBlob();
  } catch (error) {
    console.error('Error getting file:', error);
  }
}
  1. In the Script Editor replace the bucketName and fileName with the bucket and object details that you want to access.
  2. Click Save.
  3. Run the getFile() function
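One detail worth noting in getBucketFile_ is the call to encodeURIComponent: Cloud Storage object names routinely contain slashes, and the JSON API expects the whole name as a single percent-encoded path segment. A plain-JavaScript sketch of just the URL construction (buildDownloadUrl is a hypothetical helper, not part of the script above):

```javascript
// Sketch: object names may contain slashes, so they must be
// percent-encoded when placed in the URL path segment.
function buildDownloadUrl(bucketName, objectName) {
  return `https://storage.googleapis.com/storage/v1/b/${bucketName}` +
         `/o/${encodeURIComponent(objectName)}?alt=media`;
}

console.log(buildDownloadUrl('my-bucket', 'path/to/my/file.txt'));
// → https://storage.googleapis.com/storage/v1/b/my-bucket/o/path%2Fto%2Fmy%2Ffile.txt?alt=media
```

Without the encoding, the slashes would be read as extra path segments and the request would 404.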

When the script is run, the execution log will reference the Google Drive file link:

Example execution log

Upload a Google Drive file to a Google Cloud Storage Bucket

Again assuming you have already set up a Cloud Storage bucket, the first step is to grant write permission on the bucket to the account that will be running your Google Apps Script code. In the steps below we are adding a principal to a bucket-level policy with write-only access. Other Identity and Access Management configurations are possible.

  1. In the Google Cloud console, go to the Cloud Storage Buckets page.
    Go to Buckets
  2. In the list of buckets, click the name of the bucket for which you want to grant a principal a role.
  3. Select the Permissions tab near the top of the page.
  4. Click the Grant access button.
    The Add principals dialog box appears.
  5. In the New principals field, enter one or more identities that need access to your bucket.
  6. From the Select a role drop-down menu, select the Storage Object Creator role (this allows users to create objects, but does not give permission to view, delete, or replace them).
  7. Click Save.

To be able to access the Cloud Storage Bucket we need to declare the required scope in the Apps Script project by following these steps (or Make a Copy of this script project):

  1. Create a new Script Project
  2. Open the script project in the Apps Script editor.
  3. Click Project Settings.
  4. Select the Show “appsscript.json” manifest file in editor checkbox.

The manifest file appears as a project file named appsscript.json. Open this file in the Script Editor and add the oauthScopes:

"https://www.googleapis.com/auth/script.external_request",
"https://www.googleapis.com/auth/devstorage.read_write",
"https://www.googleapis.com/auth/drive.readonly"

An example appsscript.json with these scopes is included below:

{
  "timeZone": "Europe/London",
  "dependencies": {},
  "exceptionLogging": "STACKDRIVER",
  "runtimeVersion": "V8",
  "oauthScopes": [
    "https://www.googleapis.com/auth/script.external_request",
    "https://www.googleapis.com/auth/devstorage.read_write",
    "https://www.googleapis.com/auth/drive.readonly"
  ]
}

In your Code.gs file add the following code:

function addFile() {
  const bucketName = 'my-bucket';
  const fileName = 'path/to/my/file.txt';

  const driveBlob = DriveApp.getFileById('YOUR_DRIVE_FILE_ID').getBlob();

  const resource = addBucketFile_(driveBlob, bucketName, fileName);

  if (resource) {
    console.log(resource);
  }
}

/**
 * Adds a Blob to a Google Cloud Storage bucket.
 *
 * @param {Blob} BLOB - The Blob of the file to add to the bucket.
 * @param {string} BUCKET_NAME - The name of the bucket to add the file to.
 * @param {string} OBJECT_NAME - The name/path of the file to add.
 * @return {Object} objects#resource
 *
 * @example
 * const resource = addBucketFile_(driveBlob, 'my-bucket', 'path/to/my/file.txt');
 */
function addBucketFile_(BLOB, BUCKET_NAME, OBJECT_NAME) {
  const bytes = BLOB.getBytes();

  // Media upload URL for the Cloud Storage JSON API
  const url = `https://www.googleapis.com/upload/storage/v1/b/${BUCKET_NAME}/o?uploadType=media&name=${encodeURIComponent(OBJECT_NAME)}`;
  try {
    const response = UrlFetchApp.fetch(url, {
      method: 'POST',
      contentLength: bytes.length,
      contentType: BLOB.getContentType(),
      payload: bytes,
      headers: {
        'Authorization': 'Bearer ' + ScriptApp.getOAuthToken()
      }
    });
    return JSON.parse(response.getContentText());
  } catch (error) {
    console.error('Error uploading file:', error);
  }
}
  1. In the Script Editor replace bucketName, fileName and YOUR_DRIVE_FILE_ID with the bucket name, the object name/path, and the ID of the Drive file you want to upload.
  2. Click Save.
  3. Run the addFile() function
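For uploads, the object name travels in the name query parameter rather than the URL path, but it still needs percent-encoding. A plain-JavaScript sketch of the URL construction used by addBucketFile_ (buildUploadUrl is a hypothetical helper, not part of the script above):

```javascript
// Sketch: the media-upload endpoint takes the object name as a
// query parameter, which must still be percent-encoded so slashes
// and other reserved characters survive intact.
function buildUploadUrl(bucketName, objectName) {
  return `https://www.googleapis.com/upload/storage/v1/b/${bucketName}` +
         `/o?uploadType=media&name=${encodeURIComponent(objectName)}`;
}

console.log(buildUploadUrl('my-bucket', 'path/to/my/file.txt'));
// → https://www.googleapis.com/upload/storage/v1/b/my-bucket/o?uploadType=media&name=path%2Fto%2Fmy%2Ffile.txt
```

Note that uploadType=media suits single-request uploads of modest size; very large files would need a different upload type, which is beyond the scope of this post.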

Note: The above sample only deals with single Google Drive files. If you are looking for a solution to upload an entire Google Drive folder, which also handles conversion of Google document formats into Microsoft equivalents, here is a related post from Stéphane Giron on Automatically backup Google Drive folders to Cloud Storage.

Summary

The strategy of using Apps Script access tokens is great for internal process solutions, allowing the use of a specific account for data read/write operations. In addition, this method is not exclusive to Google Cloud Storage but extends to other Google APIs manageable through Google Cloud IAM permissions. To help identify the appropriate scopes to include in the Apps Script manifest file, refer to the OAuth 2.0 Scopes for Google APIs reference.

For other situations, such as Google Workspace Marketplace add-ons, where the account’s identity can’t be granted access to your storage bucket, creating and using a service account is required. If this is your situation, Amit Agarwal has shared this post on Upload Files from Google Drive to Google Cloud Storage with Google Apps Script, which uses a service account.

Martin Hawksey
Appsbroker CTS Google Cloud Tech Blog

Google Developers Expert and Google Cloud Champion Innovator in Google Workspace working at CTS