Deploying OpenAI Models to Azure with Bicep

Thomas Pentenrieder · Published in medialesson · Jul 4, 2024

Deploying an Azure OpenAI Service with GPT models is pretty straightforward. However, I ran into an issue where deploying two models at the same time failed most of the time with the following error:

“Another operation is being performed on the parent resource ‘/subscriptions/xxxxxxxxxxxxxxxx/resourceGroups/openaibiceptest/providers/Microsoft.CognitiveServices/accounts/mlbiceptest’. Please try again later.”

It turns out Azure does not like multiple model deployments running against the same OpenAI account at the same time, so we need to make sure they run in sequence. Luckily, this is straightforward to do with Bicep: the only thing we need to add is a dependency from one model deployment to the other, so the second one waits for the first to finish before starting.
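Here is a minimal sketch of what that can look like. The resource names, API version, SKUs, and model versions are illustrative assumptions rather than the exact values from my template; the important part is the dependsOn entry on the second deployment.

```bicep
// Sketch only: names, SKUs and model versions are example values.
resource openAi 'Microsoft.CognitiveServices/accounts@2023-05-01' = {
  name: 'mlbiceptest'
  location: resourceGroup().location
  kind: 'OpenAI'
  sku: {
    name: 'S0'
  }
}

// First model deployment
resource gpt4Deployment 'Microsoft.CognitiveServices/accounts/deployments@2023-05-01' = {
  parent: openAi
  name: 'gpt-4'
  sku: {
    name: 'Standard'
    capacity: 10
  }
  properties: {
    model: {
      format: 'OpenAI'
      name: 'gpt-4'
      version: '0613'
    }
  }
}

// Second model deployment explicitly waits for the first one to finish,
// so the two are never provisioned against the account simultaneously.
resource embeddingDeployment 'Microsoft.CognitiveServices/accounts/deployments@2023-05-01' = {
  parent: openAi
  name: 'text-embedding-ada-002'
  sku: {
    name: 'Standard'
    capacity: 10
  }
  properties: {
    model: {
      format: 'OpenAI'
      name: 'text-embedding-ada-002'
      version: '2'
    }
  }
  dependsOn: [
    gpt4Deployment
  ]
}
```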

You could also add even more models like this; just make sure the deployments form a chain so that each one depends on the previous one and they roll out in sequence.
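For a longer list of models, an alternative to a hand-written dependsOn chain is a resource loop with Bicep's @batchSize(1) decorator, which forces the loop iterations to run one at a time. The sketch below reuses the openAi account from the previous snippet; the model list is an assumed example.

```bicep
// Alternative sketch: deploy any number of models sequentially with a loop.
var models = [
  { name: 'gpt-4', version: '0613' }
  { name: 'gpt-35-turbo', version: '0613' }
  { name: 'text-embedding-ada-002', version: '2' }
]

@batchSize(1) // deploy one model at a time instead of in parallel
resource modelDeployments 'Microsoft.CognitiveServices/accounts/deployments@2023-05-01' = [for m in models: {
  parent: openAi
  name: m.name
  sku: {
    name: 'Standard'
    capacity: 10
  }
  properties: {
    model: {
      format: 'OpenAI'
      name: m.name
      version: m.version
    }
  }
}]
```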

Originally published at https://medienstudio.net on July 4, 2024.
