Azure Table storage column size limitations and alternative approach for storing large data

Problem: Azure Table storage limits the size of each property. A byte[] property can store up to 64 KB, and a string property can likewise store up to 64 KB (32 K UTF-16 characters); an entire entity is limited to 1 MB. Because of these column size limitations, we usually encounter the errors below when storing large amounts of data for a particular partition key.

“The remote server returned an error: (413) The request body is too large and exceeds the maximum permissible limit.”


“Block length does not match with its complement.”


Storing data in Azure Table storage

We store configurations in Table storage, and for some partition keys the data field exceeded the maximum permissible byte[] size.

More information on Azure Table storage and its column limitations is here.

To overcome this, we stored the data in Blob storage and stored the Blob URI in the respective Table storage data column.

The steps for storing the data in Table storage are as follows.

1) Initialize Table and Blob storage.
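Initialization might look like the following. This is a minimal sketch assuming the classic WindowsAzure.Storage SDK; the development-storage connection string, the table name `Settings`, and the container name `settings-data` are placeholders.

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Table;

// Parse the storage connection string (development storage shown here).
CloudStorageAccount account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");

// Table client and a reference to the settings table.
CloudTable table = account.CreateCloudTableClient().GetTableReference("Settings");

// Blob client and a container for the oversized payloads.
CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("settings-data");
container.CreateIfNotExists();
```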

2) Add an entity to a table.

The entity class must have a parameterless constructor; otherwise executing table operations throws a serialization exception.
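A minimal entity for this scheme might look like the sketch below; the class name `SettingsEntity` and the property name `DataUri` are illustrative.

```csharp
using Microsoft.WindowsAzure.Storage.Table;

public class SettingsEntity : TableEntity
{
    // The Table storage serializer requires a parameterless constructor;
    // without one, table operations fail at runtime.
    public SettingsEntity() { }

    public SettingsEntity(string partitionKey, string rowKey)
    {
        PartitionKey = partitionKey;
        RowKey = rowKey;
    }

    // Stores the Blob URI instead of the raw byte[] payload,
    // keeping the entity well under the 64 KB property limit.
    public string DataUri { get; set; }
}
```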

3) Store byte[] data in Blob storage.
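Uploading the payload could be sketched like this, assuming `container` is the Blob container initialized in step 1; the blob name and payload are placeholders.

```csharp
using Microsoft.WindowsAzure.Storage.Blob;

// Example payload larger than the 64 KB property limit.
byte[] data = new byte[100 * 1024];

// Upload the bytes to a block blob; the blob name is illustrative.
CloudBlockBlob blockBlob = container.GetBlockBlobReference("config-partition1-row1");
blockBlob.UploadFromByteArray(data, 0, data.Length);
```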

4) Get the Blob URI.

var uri = blockBlob.Uri;

5) Store the Blob URI in Azure Table storage. Check whether the table exists:

a) If it exists, retrieve the existing table data and replace it with the updated entity.

b) If it does not exist, create the table and then store the data.
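With the classic SDK, `CreateIfNotExists` plus `InsertOrReplace` covers both branches; a sketch using the `SettingsEntity` shape assumed above (keys and names are placeholders):

```csharp
using Microsoft.WindowsAzure.Storage.Table;

// Create the table only if it does not already exist (case b).
table.CreateIfNotExists();

// InsertOrReplace inserts a new entity or replaces an existing one (case a).
var entity = new SettingsEntity("partition1", "row1")
{
    DataUri = blockBlob.Uri.ToString()
};
table.Execute(TableOperation.InsertOrReplace(entity));
```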

Here is the Table storage data after storing the payload in Blob storage and saving the blob URI in Azure Table storage.

Retrieve stored data

When retrieving, first read the values from Azure Table storage, then get the actual data from the blob stream.

1) Access the data from Azure Table storage.
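A point read by partition key and row key might look like the following sketch (keys and the `SettingsEntity` type are the assumptions made earlier):

```csharp
using Microsoft.WindowsAzure.Storage.Table;

// Point read by partition key and row key.
TableOperation retrieve = TableOperation.Retrieve<SettingsEntity>("partition1", "row1");
TableResult result = table.Execute(retrieve);
var entity = (SettingsEntity)result.Result;
```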

2) Read the actual data from the Blob URI (which is stored in Azure Table storage).

Assign the actual blob data to the Settings entity and return it to the front end (not implemented as part of this blog).
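Reading the payload back could be sketched like this, assuming the `DataUri` property and the `account` credentials from the earlier steps:

```csharp
using System;
using System.IO;
using Microsoft.WindowsAzure.Storage.Blob;

// Reconstruct a blob reference from the stored URI using the account credentials.
var blob = new CloudBlockBlob(new Uri(entity.DataUri), account.Credentials);

using (var stream = new MemoryStream())
{
    blob.DownloadToStream(stream);
    byte[] data = stream.ToArray();  // the original payload, ready to return
}
```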

Browse through the code here.