Google Cloud Next 2023 — Experience, Announcements, and Summaries of Favourite Sessions

Dazbo (Darren Lester)
Google Cloud - Community
10 min read · Sep 5, 2023

The Slide Deck

Every year I create a set of slides summarising the Google Cloud Next sessions I attended. The goal of the deck is to save you a load of time. In the deck, you’ll find:

  • A summary of significant announcements (including which are in Preview and which are GA).
  • A summary write-up of all the sessions I found particularly interesting, in more detail than you’ll find in this blog.
  • Each write-up also includes a “summary-on-a-page” (SoaP), which you can use to get the headlines, and decide if you want to look at this session in more detail.
  • Links to the YouTube videos for these sessions.

But in addition — in this blog — you’ll find a load of extra stuff, including my experience of visiting San Francisco and the Moscone Center itself, and a bunch of brief session summaries.

Enjoy!

Visiting San Francisco

I was lucky enough to be able to attend Google Next 2023 in San Francisco. I live in the UK, so the trip started with a flight that was a little over 10 hours.

I should say that the highlight was the Google event itself. But when I was a kid in the 90s, I used to love playing flight simulators on my Amiga A500. And in particular, I loved flying under the Golden Gate Bridge in the game “F/A-18 Interceptor”. Since then, I’ve ALWAYS wanted to visit the bridge. So, I went one day early, and visited the bridge with my friends, on the bank holiday Monday.

It did not disappoint!!

Me visiting the Golden Gate Bridge in SF
It’s unreal, isn’t it? But these are my genuine photos from my visit!

The Hotel

I stayed at the Hilton at Union Square. What I wasn’t expecting, on arriving at the hotel, was the need to haggle!! The guy on the check-in desk told me that I had a room at the lowest possible level, with no view. He told me I could upgrade to a room with a great view and a balcony, for an extra $180 per night!

I didn’t want my first ever California experience to be in a crappy room, and so — possibly influenced by the migraine I was struggling with at the time — I haggled with the guy. We eventually settled at an extra $60 per night. (This did NOT go on the company card!!)

I think the upgrade was probably worth it, because I had this view of the sunrise every morning:

Sunrise from my room

The other cool thing was that attendees of Google Next were given special “Next”-branded hotel key cards!

Next-branded hotel key card

The Event

Google Cloud Next 2023 was held at the Moscone Center in SF, spread over three days. I used the Google Next site to pre-book the sessions I wanted to attend, and then used the official Next ’23 app to manage my agenda and see where to go for each session.

The Moscone Center was busy, but not overly busy. The first morning looked a little crazy…

The Keynote

The keynote was a little frustrating. It was due to start at 0900 and I arrived at the Center at 0840, but the hall was already full, so we were directed towards overflow rooms. Unfortunately, the Wi-Fi was down in the Center, so we were unable to watch the event live. (I caught up at lunch.)

The keynote was all about AI. Surprise! And, in particular, Generative AI. For those who don’t know what Gen AI is, the keynote summarises it as:

“A type of AI that can create new content — text, images, speech, code, audio — by learning the patterns and structures of existing data, and creating new data with similar characteristics.”

Google Cloud partnership with Nvidia

For me, the key takeaways from the keynote were…

We’re now in the next chapter of digital transformation. And the key ingredients for this chapter are:

  • World-class AI infrastructure
  • Duet AI — the always-on AI collaborator
  • Google’s investment in a broad partner ecosystem.

Keynote headlines:

  • Google Cloud is “AI-first”. Making AI available to everyone.
  • Some infrastructure announcements. In particular, a complete stack partnership with Nvidia, to provide blistering AI performance.
  • New announcements for digital watermarking, PaLM 2, and Med-PaLM (see the PaLM 2 sketch after this list).
  • 3rd-party open-source models (like Meta’s Llama 2 and Code Llama) now available in the Vertex AI Model Garden.
  • Vertex AI Search and Conversation now GA.
  • Duet AI — a core product of the event — now GA in Workspace; soon to be GA for Google Cloud.
  • Duet AI now powers Database Migration Service (DMS).
  • Duet AI now integrated into Chronicle SIEM.
  • BigQuery Studio — natively integrating BigQuery, Looker and Vertex AI into one console.
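
To make the PaLM 2 announcement above a little more concrete, here is a minimal sketch of calling PaLM 2 for Text via the Vertex AI Python SDK. This is my own illustration rather than anything shown in the keynote: the project ID, region and prompt are placeholders, and it assumes the google-cloud-aiplatform package is installed and authenticated.

```python
# Minimal sketch (not from the keynote): calling PaLM 2 for Text via the
# Vertex AI SDK. Replace "my-project" and "us-central1" with your own
# project and region.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-project", location="us-central1")

# "text-bison" is the PaLM 2 for Text foundation model.
model = TextGenerationModel.from_pretrained("text-bison")
response = model.predict(
    "Summarise the key Generative AI announcements from Google Cloud Next 2023.",
    temperature=0.2,
    max_output_tokens=256,
)
print(response.text)
```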

“Together, we’re building the new way to cloud”.

Other Sessions I Attended

All of the following sessions have a full write-up in my deck. Here I’ll just capture some of the headline points…

Optimize your BigQuery Data Platform for Performance and Cost

From my summary deck
  • General cost optimisation tips for BigQuery.
  • Is BQ the right product? Do you have enough data to justify it?
  • Compute optimisations: such as setting maximum slots and per-project quotas.
  • Table design optimisations: such as partitioning, clustering and sharding (see the sketch after this list).
  • Storage optimisations: such as moving ageing data to GCS archive storage, and using the new compressed storage pricing model.
  • Architectural optimisations: such as moving some ETL processing out of BigQuery, and using micro-batching for ingestion, rather than streaming ingestion.
  • Monitoring and alerting: use the Monitoring tab, set up dashboards, and set up alerting.
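
To illustrate the table design and compute points, here is a minimal sketch using the google-cloud-bigquery Python client: it creates a partitioned, clustered table, and caps how much data a single query may scan. The project, dataset, table and column names are my own placeholders, not from the session.

```python
# Sketch of two optimisation levers discussed above, using the BigQuery
# Python client. All names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

# Table design: daily partitioning plus clustering helps prune scanned data.
table = bigquery.Table(
    "my-project.analytics.events",
    schema=[
        bigquery.SchemaField("event_ts", "TIMESTAMP"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("payload", "STRING"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
table.clustering_fields = ["customer_id"]
client.create_table(table, exists_ok=True)

# Compute control: cap the bytes a single query is allowed to scan.
job_config = bigquery.QueryJobConfig(maximum_bytes_billed=10 * 1024**3)  # 10 GiB
query = """
    SELECT customer_id, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    GROUP BY customer_id
"""
rows = client.query(query, job_config=job_config).result()
```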

Five Practical Considerations for Adopting AI

From my summary deck
  • Use case prioritisation — e.g. where we can decrease costs or increase revenue, where we have data available, and where the capabilities built might be reusable.
  • Create a culture of experimentation.
  • Measurement and improvement.
  • Security, privacy and responsibility. Including discussion of the shared responsibility model, and the new Secure AI Framework (SAIF).
  • MLOps, to deploy models at scale.

Build a Differentiated Generative AI Application on Google Cloud

Here we saw a comparison of building the same Gen AI solution using three different approaches, from the most customisable to the easiest to implement.

Common Business Use Cases for Generative AI

This was a panel discussion, covering a number of Gen AI use cases, including:

  • Customer service chatbot
  • Product catalogue data classification
  • Code generation, e.g. for migration use cases
  • AlphaFold, for drug design

Transformation, Data, People, and Migration: A Comprehensive Regional Bank Modernization

This was a case study, showing how Arvest Bank have made great strides in their cloud migration journey, using Google Cloud and Slalom.

Their approach:

  • Establish their transformation goals.
  • Establish their cloud provider and partner.
  • Conduct vulnerability assessment — what are their weak spots? E.g. weak customer touch points.
  • Conduct technology assessment.
  • Conduct data maturity assessment — do we know about our data? Do we know its quality? Can we get to it?
  • Define a data-centric strategy (using data products) and a transformation plan.
  • Get Exec buy-in.
  • Come up with a way to bring the team along, including upskilling and cross-training.
  • Application migration assessments and categorisation.
  • Create a small set of landing zones.
  • Map target technologies.

Oracle to PostgreSQL Migrations with Google Database Migration Service

From my deck
  • First, an overview of the typical challenges of a heterogeneous DB migration to the Cloud. Traditionally, code conversion is the most difficult phase.
  • Next, a summary of Google’s investment in database migration capability: including the tooling (DMS), the expertise, and migration seed funding.
  • Oracle DB to PostgreSQL (whether Cloud SQL or AlloyDB) is now GA!
  • Super-high automated conversion rates, and now — Duet AI can help further!
  • A dive into DMS, with a live demo.
  • Then, a case study from HSBC, who have the goal of migrating 10,000 databases, including 6,000 Oracle databases occupying 26 PB of data!

Dynamic Workloads at the Right Price: Cloud Run & GKE

From my deck
  • A useful session describing when Cloud Run is more suitable than GKE, and vice versa.
  • Cloud Run is great for simple architectures and standalone services, at ANY scale (see the sketch after this list).
  • GKE is perfect for more complex architectures, e.g. multiple services, or where we need control of compute types.
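
To illustrate the “simple standalone service” end of that spectrum, here is a minimal sketch of the kind of workload Cloud Run is built for: a single HTTP service listening on the PORT environment variable. Flask and the hello-world handler are my own choices, not something shown in the session.

```python
# Sketch of a minimal Cloud Run-style service (my example, not the session's).
# Cloud Run injects the PORT environment variable and scales container
# instances up and down with request load.
import os

from flask import Flask

app = Flask(__name__)


@app.route("/")
def hello():
    return "Hello from Cloud Run!"


if __name__ == "__main__":
    # Listen on all interfaces, on the port Cloud Run provides.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", "8080")))
```

Packaged into a container (or deployed straight from source with gcloud run deploy --source .), that is essentially all Cloud Run needs; the equivalent on GKE would also mean managing a cluster, node pools and Deployment/Service manifests, which is the control-versus-simplicity trade-off the session described.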

Santander Bank Modernizes and Moves Mainframe Workloads to the Cloud

This was a great session describing the successes that Santander have had using Dual Run to migrate their mainframe workloads to Google Cloud.

From my deck

We start with an overview of the options for dealing with mainframe workloads:

  • Augment (with Connector from cloud to mainframe)
  • Replatform (Dual Run)
  • Refactor (G4 + Gen AI)
  • Replace

A quick dive into each technology. And then, a deeper dive into the Dual Run case study with Santander. Dual Run allows the parallel running and validation of mainframe workloads, both on the mainframe and in the cloud. In the cloud environment, Dual Run provides a Micro Focus environment for running COBOL workloads, and migrates mainframe Db2 for z/OS to Db2 LUW. (I asked a few questions on this in the Q&A section, and captured them in the deck.)

Optimise Your Google Cloud Costs

This was a panel session on optimising cloud costs, led by the hilarious Pathik Sharma. (I like this guy a lot!)

From my deck
  • We start with a FinOps overview, including the Google Cloud FinOps cycle.
  • Then we discuss the three main categories of cost optimisation: resource optimisation (i.e. what you’re consuming), rate optimisation (i.e. the price you pay for what you’re consuming), and architectural optimisation (i.e. the extent to which you leverage managed and native services, eliminating traditional overheads).
  • We look at the Effort vs Benefit chart.
  • Then, some wins and lessons learned from the panel members, representing CME Group, General Mills, and Priceline.

Google Cloud FinOps Tools

This session included some of the latest announcements in Google Cloud FinOps tooling to help optimise costs and understand spend.

From my deck

The session covered:

  • Top priorities for required FinOps capabilities.
  • Google Cloud FinOps Cycle.
  • Inform enhancements: e.g. the new “detailed BigQuery billing export” and Duet AI integration (see the query sketch after this list).
  • Optimise enhancements: e.g. the new FinOps hub — providing recommendations, rationale, and benefits, all in one place.
  • Operate enhancements: e.g. project owners can now create and manage their own budgets and budget alerting.
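
As a small, hedged illustration of what the BigQuery billing export enables (the “Inform” point above), here is a sketch of a spend-by-service query via the BigQuery Python client. The dataset and table names are placeholders; the real export table name includes your billing account ID.

```python
# Sketch: last 30 days of spend by service from the BigQuery billing export.
# The dataset/table name is a placeholder; yours will include the billing
# account ID.
from google.cloud import bigquery

client = bigquery.Client()
query = """
    SELECT
      service.description AS service_name,
      ROUND(SUM(cost), 2) AS total_cost
    FROM `my-project.billing.gcp_billing_export_v1_XXXXXX`
    WHERE usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
    GROUP BY service_name
    ORDER BY total_cost DESC
"""
for row in client.query(query).result():
    print(f"{row.service_name}: ${row.total_cost}")
```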

A Few More Pictures

From the main exhibition area in the Moscone Center:

EPAM was given the prestigious Google Cloud Partner of the Year Award. There were awards in a few different categories, but EPAM received it for Social Impact, in recognition of EPAM’s global initiative in support of Ukraine.

EPAM — Google Cloud Partner of the Year. Showing off their virtual AI, Vivien.
EPAM — Partner of the Year

From the Partner Lounge:

Wednesday night, before LL Cool J arrives!

I know a Woowoo when I see one!

Closing Remarks

I hope you found this useful. Do check out my presentation, as it might save you a lot of time, and help you identify topics for further study.

I wrote most of the deck on the flight back to the UK. It was worth paying for the Wi-Fi!

Leaving SF, to head back to the UK

Before You Go

  • Please share this with anyone that you think will be interested. It might help them, and it really helps me!
  • Feel free to leave a comment 💬.
  • Follow and subscribe, so you don’t miss any of my content. Go to my Profile Page, and click on these icons:
Follow and Subscribe
