Microsoft Build 2022 Debrief — Day 1

Ryan Rowston
Published in Taptu
8 min read · May 26, 2022

Every year since it went virtual, I have attended Microsoft Build, Microsoft’s annual developer conference. Microsoft uses it as a platform to announce and talk about the new technologies it has been working on and releasing into the wild for the developer community to get their grubby mitts on.

It has now become tradition for me to summarise what I took away from the conference and bombard my long-suffering colleagues with those summaries in our discussion Slack channel. This year it was suggested that I also package these summaries into a blog post, thus, here we are. In no particular order, here are some of the summaries and thoughts I had at Build 2022.

SPAAAACE

Let’s start off with something that made me laugh. It is just a case study using existing technologies, not an announcement of anything new.

In the keynotes, they talked about some processing they are doing with AI on the ISS: an AI system to detect failures before they happen. If this doesn’t ring any bells, it is basically the thing that HAL did that ultimately triggered it (him?) to go homicidal. For now it is only looking at spacesuit integrity, at least, not calibrating antennas yet. Having just recently finished reading ‘2001: A Space Odyssey’, I found this both terrifying and hilarious, and hope that those working on it got the reference.

Neural Processing Units (NPUs)

An interesting theme through a number of the talks was dedicated Neural Processing Units (NPUs) for running machine learning models. These are a long way from mainstream as a component currently, but Microsoft is pushing a vision where they become almost as ubiquitous in computers as a dedicated GPU. Time will tell whether that vision sticks.

They have released a few things that they hope will push adoption, or at least make it even possible:

1. They announced ‘Project Volterra’, a dev kit for building with these NPUs that looks like an outright clone of a Mac mini. It ships with a dedicated NPU and will run Windows on ARM (using a Qualcomm Snapdragon processor).
Project Volterra

2. The other key feature they announced to potentially support use of NPUs (amongst other use cases) is a form of hybrid compute using their ONNX Runtime. This runtime lets a developer choose where a model is executed at runtime. The example given in the keynote talks was that an algorithm (or even slightly different algorithms) could be dynamically targeted to run against the CPU, GPU, NPU, or cloud computing resources in Azure.
I think the core theory behind this is to allow developers to target an NPU if the client machine has one. However, if there is no NPU present, it would be able to fall back on cloud resources, and if neither is available it could fail over to a GPU/CPU-based process, even if that is a less effective algorithm or takes longer. This looks like very young technology and I’m not sure it is going to catch on, but it is still interesting and has possibilities for a range of applications.
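For a rough sense of the mechanics, ONNX Runtime already exposes this kind of choice today through its execution providers. The hedged Python sketch below only shows local provider selection and fallback, not the Azure offload part of the announcement, and the model path and provider preferences are assumptions of mine rather than anything shown at Build.

```python
# A minimal sketch of letting ONNX Runtime pick where a model runs.
# Which providers are actually available depends on the onnxruntime build
# and the hardware present; "model.onnx" is a placeholder path.
import onnxruntime as ort

preferred = ["CUDAExecutionProvider", "CPUExecutionProvider"]
available = ort.get_available_providers()

# Ask for providers in order of preference; the runtime falls back down the
# list for anything the preferred provider cannot handle.
session = ort.InferenceSession(
    "model.onnx",
    providers=[p for p in preferred if p in available] or ["CPUExecutionProvider"],
)

print("Running on:", session.get_providers())
```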

Power Platform

Build wouldn’t be complete without huge segments dedicated to the growing capabilities of these systems. And to be fair, they have been improving quite a lot. Power Automate alone has become noticeably less janky, with more features becoming available.

There were several smaller announcements of minor features and case studies, though I haven’t dug deep into the weeds with these. I have only gotten an overview, but two features jumped out at me.

There is a new app joining the Power Platform: Power Pages. This is a website builder that seems to be focused on public-facing webpages/websites, using your Office 365 data as a source for the CMS. From what I can gather, under the hood this builds on existing tools Microsoft already had available and packages them in a more usable way aimed at ‘citizen developers’ within the Power Platform.

From the demonstrations shown so far, this may be a decent option when you put it up against some of its competitors in this space. It has a GUI designer to build the interface, structure components, and choose what data to surface where. In addition, you can modify the HTML/CSS/JS of the site to do custom work, either tweaks or outright custom components.

I think the intention is to allow for GUI design in broad strokes, then fine-tuning in code, but we all know how this is going to go down in practice when developers get their hands on it. It will be interesting to see how well the editor handles custom code being added to pages. I expect this to have plenty of issues, particularly in the early days. However, I have yet to try it, and it has a particularly high usability bar to clear if it is attempting to compete with the likes of WordPress and Drupal.

Where I think this really has an advantage is in being part of the O365 environment and the Power Platform, which means the entirety of your Dataverse could potentially be used as source data for the CMS. I can see integrations that surface backend data in a webpage being genuinely useful.

Oddly enough it sounds like this is based on Bootstrap rather than Fluent (Microsoft’s design language), a choice that raised some eyebrows in the audience during the talk.

A new capability for Power Apps was also announced, though I am somewhat skeptical of how well it will work. It is an integration that takes Figma designs (or even just raw images) and uses an algorithm to transform them directly into a Power App. They showed this working by generating a form interface in Power Apps.

They conceded that you would need to go and wire/rewire data to the forms that it generated. However, this could accelerate development and give a better starting point for a new app than a blank canvas (which is one of the hardest points to start creating from).

I do wonder how well it would go if you fed it source data that didn’t already look a LOT like a Power App in the first place though.

ThE MeTaVeRsE

I suppose I can’t not talk about this, given the amount of screen time it got at the conference and how topical it is. In recent times this has become a pet project of big tech everywhere, and a bugbear of most actual humans. Nothing highlights this better than an announcement of a partnership between Microsoft and Meta for Meta to draw on Azure resources to underpin parts of its AI powered ‘MeTaVeRsE’.

Microsoft focuses on the ‘mEtAvErSe’ from a business and work perspective, presenting it as a way of bringing back face-to-face communication and collaboration in a world of hybrid work. The goal is to re-enable casual conversations, ‘Aha’ moments, and the innovation that stems from collaboration.

Call me a philistine, but it looked as janky and/or gimmicky as we have grown to expect from the metaverse.

Legless avatars that looked to be out of the Wii era were heavily joked about in the chat during the presentations, but that is probably better than if they had legs. During presentations, avatars would perform a near-constant robot dance, chopping between gestures as they tried to keep up with the motion tracking of their controllers.

Maybe this’ll somehow take off and I’ll be looked at in a similar fashion to those who thought Web 2.0 was just a fad. But I’m not seeing the appeal in this one, and I’m not alone. This was the only topic at Build I saw that got a mostly negative reaction, or at least had a vocal minority that just sees it as a bit of a joke.

Microsoft Intelligent Data Platform

The final topic that I should probably mention is data. At Build they launched the Microsoft Intelligent Data Platform. This is a platform that uses several different Azure service offerings to streamline data capture, analytics, and governance.

I wish I could give you more than that; however, I am not overly familiar with Azure’s existing set of data management and manipulation services. Because it all integrates tightly (as a platform), it was too hard for me to follow all the service names and jargon while it was being presented without a background in them. Or maybe it was just because it was 2 am. Either way, if this sounds of interest, I am not going to pretend to understand this one and you will have to look elsewhere.

SQL Server 2022

On the topic of data though: SQL Server 2022 is in preview and coming soon, as is its counterpart in Azure. These bring improvements to the query optimiser, particularly for cases where the optimal execution plan and degree of parallelisation for a query change significantly depending on the supplied query parameters.
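To make that concrete, here is a hedged sketch of the kind of query this targets; the table, column, and connection details are hypothetical, and the comments describe the behaviour as I understand it rather than confirmed specifics of the new optimiser.

```python
# A parameterised query whose ideal plan depends heavily on the parameter:
# a huge customer favours a scan and a parallel plan, a tiny customer favours
# an index seek. Table, column, and connection details are made up.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=localhost;"
    "DATABASE=Sales;Trusted_Connection=yes;TrustServerCertificate=yes"
)
cursor = conn.cursor()

query = "SELECT OrderId, Total FROM dbo.Orders WHERE CustomerId = ?"

# Previously, whichever parameter value triggered plan compilation effectively
# dictated the cached plan for every later caller; the 2022 optimiser work is
# about keeping more than one plan around for a statement like this.
big_customer = cursor.execute(query, 1).fetchall()
small_customer = cursor.execute(query, 42).fetchall()
```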

There are a bunch of other new features as well. There is a good ‘what’s new’ article in the Microsoft docs. I’ll still highlight a couple that I found interesting.

Object Storage Integration is something I had never really considered before, but there are likely some use cases for it. It allows a SQL Server instance to talk to a system exposing an S3-compatible interface to read or write data in an appropriate format. This potentially gives you some interesting ways to control where you are putting data.
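As one hedged example of what that could look like, backing a database up straight to an S3-compatible store is one use of this integration; the endpoint, bucket, keys, and connection string below are all placeholders.

```python
# A minimal sketch, assuming a SQL Server 2022 instance and an S3-compatible
# endpoint; the endpoint, bucket, keys, and connection string are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=localhost;"
    "DATABASE=master;Trusted_Connection=yes;TrustServerCertificate=yes",
    autocommit=True,  # BACKUP cannot run inside a user transaction
)
cursor = conn.cursor()

# Register the object store's access key and secret as a credential named
# after the URL it applies to.
cursor.execute("""
CREATE CREDENTIAL [s3://object-store.local:9000/sql-backups]
WITH IDENTITY = 'S3 Access Key',
     SECRET   = 'myAccessKeyId:mySecretKey';
""")

# Back the database up straight to the bucket.
cursor.execute("""
BACKUP DATABASE Sales
TO URL = 's3://object-store.local:9000/sql-backups/sales.bak'
WITH FORMAT, COMPRESSION;
""")
```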

There are a few more stored procedures available natively, including one to call a REST endpoint. This means you could fairly easily have SQL Server make a REST call off the back of a trigger and use it as part of a data pipeline, rather than the database just being the end of the line for data.
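For flavour, the sketch below assumes the procedure in question is the sp_invoke_external_rest_endpoint that Microsoft has been previewing on the Azure SQL side; the procedure’s availability outside Azure, the webhook URL, and the payload are all assumptions rather than things confirmed in the talk.

```python
# A hedged sketch of a database-initiated REST call.
# sp_invoke_external_rest_endpoint is an Azure SQL preview feature; the
# webhook URL, payload, and connection details here are made up.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver.database.windows.net;"
    "DATABASE=Sales;UID=demo_user;PWD=demo_password"
)
cursor = conn.cursor()

# Have the database itself POST a message to a (hypothetical) webhook,
# e.g. the kind of call a trigger-driven pipeline step might make.
row = cursor.execute("""
SET NOCOUNT ON;
DECLARE @response nvarchar(max);
EXEC sp_invoke_external_rest_endpoint
     @url      = N'https://example.com/hooks/order-created',
     @method   = N'POST',
     @payload  = N'{"orderId": 42}',
     @response = @response OUTPUT;
SELECT @response AS response;
""").fetchone()

print(row.response)
```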

This probably isn’t actually very useful for any practical use cases, at least none that I can think of. I have just spent so much time over the last few years looking at data integrations and Kafka that any new capability for a data flow, even an impractical one, excites me more than it should.

Day 2

Wait, we’re not finished? That was just from the first day!

The adventure continues in the next article for Day 2.
