Google Cloud: DML Operation Limits in GCP Cloud Spanner (Mutations)

Bharat Tiwari
Google Cloud - Community
3 min read · Nov 21, 2022

Cloud Spanner is a fully managed, mission-critical relational database service that offers transactional consistency at global scale and automatic, synchronous replication for high availability.

Before architecting and developing Cloud Spanner applications on GCP, it is essential to understand the limits on DML operations in Spanner so that we don’t run into problems at later stages of development.

The major limit to check in the list below is the number of mutations allowed per DML operation, which is 20,000. These mutations are counted differently for insert, update, and delete operations; the different mutation counts are explained below with examples.

Limits for inserting, updating, and deleting data in Cloud Spanner:

  • Commit size (including indexes and change streams): 100 MB
  • Concurrent reads per session: 100
  • Mutations per commit (including indexes): 20,000
  • Concurrent partitioned DML statements per database: 20,000

Insert DML Example:

  • Insert mutations count as 1 per column. If a row has 10 columns, inserting it costs 10 mutations. The total number of mutations allowed per operation is 20,000.

In the example below, the table has 6 columns, so each inserted row costs 6 mutations.

The total number of rows that can be inserted in a single operation is therefore 20,000 / 6 = 3,333 rows.
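The insert arithmetic above can be sketched as a small helper (a Python sketch; the 20,000 limit and the one-mutation-per-column rule come from the limits listed above):

```python
# Max mutations allowed per commit in Cloud Spanner (including indexes).
MUTATION_LIMIT = 20_000

def max_insert_rows(num_columns: int, limit: int = MUTATION_LIMIT) -> int:
    """Each inserted row costs one mutation per column, so the number of
    rows that fit in a single operation is limit // num_columns."""
    return limit // num_columns

# For the 6-column table in the example: 20,000 // 6 = 3,333 rows.
print(max_insert_rows(6))  # -> 3333
```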

Below are some of the test screenshots of DML operations:

Tried inserting a million records; the query failed because the mutations exceed the limit (1,000,000 * 6 = 6,000,000):

Tried inserting 3,334 records; the query failed because the mutations exceed the limit (3,334 * 6 = 20,004):

Tried inserting 3,333 records; the query succeeded because the mutations are under 20K (3,333 * 6 = 19,998):

Update DML Example:

  • Update mutations count as the number of columns updated plus the number of indexes on the row. For example, updating one column in a row that has 2 indexes costs 3 mutations per row.

Tried updating 1M rows; the query failed because the number of mutations is at least 1 million (far more than 20K):

Tried updating 1 column in 10,001 rows; the query failed because the number of mutations is 10,001 * 2 = 20,002. Here there are 2 mutations per row: one for the updated column and one for the index on the row.

Tried updating 1 column in 10,000 rows; the query succeeded because 2 mutations per row multiplied by 10K comes to exactly 20K:
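The update arithmetic works the same way (a Python sketch, assuming the counting rule stated above: mutations per row = columns updated + indexes on the row):

```python
# Max mutations allowed per commit in Cloud Spanner (including indexes).
MUTATION_LIMIT = 20_000

def update_mutations_per_row(columns_updated: int, num_indexes: int) -> int:
    """An update costs one mutation per updated column plus one per index."""
    return columns_updated + num_indexes

def max_update_rows(columns_updated: int, num_indexes: int) -> int:
    """Number of rows one update operation can touch within the limit."""
    return MUTATION_LIMIT // update_mutations_per_row(columns_updated, num_indexes)

# 1 column updated on a table with 1 index: 2 mutations/row -> 10,000 rows.
print(max_update_rows(1, 1))  # -> 10000
```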

Delete DML Example:

  • Delete mutations count as 1 per deleted row, irrespective of indexes.

Deleting 200,001 rows:

The query below failed because we attempted a single operation on 200,001 rows (200,001 mutations).

The query below succeeded because we deleted exactly 20K rows (20,000 mutations) in a single operation.
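The delete results above can be checked the same way (a Python sketch; the cost of 1 mutation per deleted row is the counting rule stated above):

```python
# Max mutations allowed per commit in Cloud Spanner (including indexes).
MUTATION_LIMIT = 20_000

def delete_fits(num_rows: int) -> bool:
    """A delete costs 1 mutation per row, so a single operation
    fits only if num_rows is at most the mutation limit."""
    return num_rows <= MUTATION_LIMIT

print(delete_fits(200_001))  # -> False
print(delete_fits(20_000))   # -> True
```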

Key note: One way to deal with these limits is to schedule the queries to perform operations in batches of at most 20K mutations. We also have to keep in mind that the commit size, including indexes and change streams, must stay under 100 MB.
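The batching idea can be sketched as follows (a Python sketch of the batch-size planning only; actually executing each batch would use a DML statement via the google-cloud-spanner client library, which is not shown here):

```python
# Max mutations allowed per commit in Cloud Spanner (including indexes).
MUTATION_LIMIT = 20_000

def batch_sizes(total_rows: int, mutations_per_row: int,
                limit: int = MUTATION_LIMIT):
    """Yield per-batch row counts so every batch stays within the mutation
    limit. Per the rules above, mutations_per_row is 1 for deletes, the
    column count for inserts, or columns updated + indexes for updates."""
    rows_per_batch = limit // mutations_per_row
    remaining = total_rows
    while remaining > 0:
        n = min(rows_per_batch, remaining)
        yield n
        remaining -= n

# Deleting 200,001 rows (1 mutation per row) needs 11 batches:
# ten batches of 20,000 rows and one final batch of 1 row.
print(list(batch_sizes(200_001, 1))[:3])  # -> [20000, 20000, 20000]
```

Each yielded batch size would then bound one DML statement (for example, a delete whose `WHERE` clause selects at most that many keys), keeping every operation under the 20K limit.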
