Original Photo by Jan Huber on Unsplash

Azure Table Storage: How to simulate a large data set

A small tip on how to create a large data set

Anna Melashkina · Published in medialesson · Mar 30, 2021 · 1 min read


Recently I was testing Azure Storage performance. For the stress test, I needed to create 15 million records. As a true C# developer, I created a TableBatchOperation and started executing it with ExecuteBatchAsync. As you might know, a TableBatchOperation contains at most 100 operations… After one night of running, my code was still far from finished :D So I thought: I definitely need another way to generate the data. Today I will tell you my secret ;)
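To see why this crawls, consider the arithmetic: with at most 100 operations per batch, 15 million records mean 150,000 round trips to the service. The batching itself can be sketched like this (a minimal illustration; the article used the C# SDK, so the submit call shown in the comment, based on the `@azure/data-tables` client, is an assumption):

```javascript
// Split a list of entities into batches of at most `size` items —
// 100 is the Table service's per-transaction operation limit.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Each batch would then go to the service as one transaction, e.g.
// (illustrative, using the Node.js @azure/data-tables client):
//   await client.submitTransaction(batch.map((e) => ["create", e]));
// For 15,000,000 entities that is 150,000 such calls — one long night.
```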

You are probably already using Azure Storage Explorer. But maybe you didn't know that it supports CSV import for tables? CSV import works so much faster (it would be interesting to know why)!

So, when I learned that CSV import is much faster than uploading via C#, I decided to generate my 15 million records with Node.js:

CSV generation took me 30 minutes. After that, I imported the CSV via Azure Storage Explorer overnight, and I was ready to run my stress tests! Maybe this will help someone who needs to generate a large amount of data. Happy stress testing :)
