Google Cloud Platform Technology Nuggets — August 1–15, 2024 Edition

Romin Irani
Google Cloud - Community
9 min read · Aug 19, 2024


Welcome to the August 1–15, 2024 edition of Google Cloud Technology Nuggets.

Please feel free to give feedback on this issue and share the subscription form with your peers.

Infrastructure

Compute Engine’s X4 and C3 bare-metal machine types are now generally available. These machine types address the unique needs of enterprise workloads: very large memory requirements and direct access to bare metal, respectively. As the blog post states:

  1. The new X4 instance series includes three new instance types to address extra-large in-memory databases such as SAP HANA.
  2. Three new C3 bare-metal shapes cater to a variety of applications, such as commercial and custom hypervisors, that require direct access to CPU and memory resources.

Customers

Flipkart is, without any doubt, one of the largest e-commerce players in India and is well known for its Big Billion Days, a special sales period with large discounts that attracts traffic several times the magnitude of its typical daily volumes. One of the key foundational pieces that addresses this need is the Flipkart Data Platform, built in partnership with Google Cloud. Central to this platform is fStream, Flipkart’s in-house common streaming platform that operates seamlessly on Apache Spark and Apache Flink using Dataproc. Check out this story of how Flipkart migrated one of the key components of fStream, a durable StateStore, from on-premises Apache HBase to Bigtable on Google Cloud.

Identity and Security

We’ve got several updates in the Security area in this edition.

The second Cloud CISO Perspectives for July is out. Safe drinking water is such a critical part of our lives, and yet have we thought about safeguarding it from cyber attacks? Check out this edition of CISO Perspectives, which discusses that and much more.

If you are invested in the Google Cloud ecosystem and security is your area, you should sign up for the Google Cloud Security Summit on August 20. Check out the blog post that highlights some of the key sessions.

CIS Controls and CIS Benchmarks are globally recognized best practices for securing IT systems and data. When it comes to implementing these on provider-specific platforms, specifically GKE, it helps that the platform itself takes care of addressing most of them and offers a security posture dashboard too. Check out this blog post that talks about the updated CIS Benchmarks for GKE and GKE Autopilot.

In keeping with the general principle of providing security at multiple layers, when it comes to GKE, Google Cloud custom Org Policy and Policy Controller are two effective and complementary controls. Custom Org Policies (e.g., enforce Binary Authorization, enable Workload Identity for new clusters) are used to centralize controls, enforce them hierarchically, and ensure only compliant resources are permitted in your organization. In addition to that, think of Policy Controller as programmable policies for your GKE clusters, which can be applied at admission time, used to audit at runtime, or run from CI/CD pipelines to get early feedback on your code against policies. Check out the blog post for more details.

The use of Generative AI technologies to aid security is definitely an interesting combination to watch. With models like Gemini supporting long context windows of 1M+ tokens, there is now an option to approach code vulnerability scanning in a novel way. Check out this technical deep dive that discusses one such experiment.
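
To make the idea a bit more concrete, here is a minimal sketch (not the deep dive’s actual code) of passing a large slice of source code to Gemini 1.5 Pro via the Vertex AI Python SDK and asking it to flag potential vulnerabilities; the project, location, and file being scanned are assumptions for illustration.

```python
# Minimal sketch, not the blog post's implementation: feed a large chunk of
# source code to Gemini 1.5 Pro via the Vertex AI SDK and ask it to flag
# potential vulnerabilities. Project, location, and file path are assumptions.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="your-project-id", location="us-central1")

# Gemini 1.5 Pro's long context window lets us pass a large slice of a codebase
# in a single prompt instead of chunking it aggressively.
with open("app/handlers.py") as f:  # hypothetical file to scan
    source_code = f.read()

model = GenerativeModel("gemini-1.5-pro")
prompt = (
    "You are a security reviewer. Identify potential vulnerabilities "
    "(e.g., injection, insecure deserialization, path traversal) in the "
    "following code and explain each finding briefly:\n\n" + source_code
)

response = model.generate_content(prompt)
print(response.text)
```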

Machine Learning

The world of Generative AI is moving at such a rapid pace that it is difficult to keep track of newer versions of models, the additional features they bring to the table, and more. It is no different for the Gemini family of models, and posts like this one, which help us understand some of the recent improvements in Gemini, are welcome. The post highlights some of the key updates to Gemini, such as:

  • Support for more models in Model Garden with the latest models from Mistral and Llama being available in a fully managed version.
  • Support for 100+ new languages in Gemini 1.5 Flash and Gemini 1.5 Pro.
  • A feature named Provisioned Throughput (available through allowlist) that assures customers of capacity and price, while scaling the Gen AI workloads.
  • As we speak, Gemini 1.5 Flash input costs have been reduced by up to ~85% and output costs by up to ~80%.

Check out the post for more details.

Databases

There are several updates in the Databases section in this edition, starting with Cloud SQL.

Cloud SQL Studio, a lightweight tool to query your database directly from the console, is now in General Availability (GA). It supports all three database engines: MySQL, PostgreSQL, and SQL Server. And most importantly, it is integrated with Gemini to help you be more productive in the studio when writing queries, understanding and optimizing them, and more. Check out the blog post.

Cloud SQL Enterprise Plus for SQL Server is now in GA. Specific features include:

  • Two new machine families for enhanced performance and higher memory per vCPU.
  • A data cache for improved read performance.
  • Advanced disaster recovery (DR) capabilities and 99.99% availability SLA for business continuity.

Next up is a series of Spanner updates, including some important ones, most notably Spanner Graph, which brings graph capabilities to Spanner.

To start, let us look at a nice table that highlights the various Spanner editions and their capabilities. This can help you understand which edition would be the right fit for your workload. The blog post goes into detail on each of the editions.

Next up is Spanner Graph, which gives us a unified database that provides an ISO Graph Query Language (GQL) interface, integrates graph and relational models (with full interoperability between GQL and SQL), search (rich vector and full-text search), and AI capabilities (deep integration with Vertex AI), all coupled with Spanner’s scalability.

Check out the blog post for more details on Spanner Graph and another post that highlights Spanner’s support for Approximate Nearest Neighbor (ANN) search.
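
To give you a feel for GQL on Spanner, here is a minimal sketch using the Spanner Python client; the instance, database, and FinGraph-style graph and schema names are assumptions for illustration and not taken from the posts.

```python
# Minimal sketch of issuing a Spanner Graph (GQL) query with the Python client.
# Instance, database, graph, and schema names are assumptions for illustration.
from google.cloud import spanner

client = spanner.Client(project="your-project-id")
database = client.instance("your-instance").database("your-database")

# GQL and SQL interoperate; the GQL block below is submitted through the same
# query API as regular SQL statements.
gql = """
GRAPH FinGraph
MATCH (p:Person)-[:Owns]->(a:Account)
RETURN p.name AS owner, a.id AS account_id
"""

with database.snapshot() as snapshot:
    for owner, account_id in snapshot.execute_sql(gql):
        print(owner, account_id)
```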

Next up is Bigtable support for GoogleSQL, an ANSI-compliant SQL dialect used by Google products such as Spanner and BigQuery. As the blog post states, “now you can use the same SQL with Bigtable to write applications for AI, fraud detection, data mesh, recommendations, or any other application that would benefit from real-time data.” Check out the blog post that provides information on this feature, how it works and the typical use cases.
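
As a rough illustration (not taken from the blog post), here is the general shape of a GoogleSQL statement against Bigtable, where the row key surfaces as _key and column families read like maps of qualifiers; the table and column names below are made up, and you would run such a statement through Bigtable Studio or the client libraries’ SQL support.

```python
# Illustrative only: the shape of a GoogleSQL query against a Bigtable table.
# Table, column family, and qualifier names are assumptions for this sketch.
query = """
SELECT
  _key,                                     -- the Bigtable row key
  user_profile['latest_status'] AS status   -- column families read like maps of qualifiers
FROM user_events
WHERE STARTS_WITH(_key, 'user#123')         -- prefix scan on the row key
LIMIT 10
"""

# Execution goes through Bigtable Studio in the console or the Bigtable client
# libraries' SQL support; printed here just to keep the sketch self-contained.
print(query)
```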

Data Analytics

BigQuery already has close integration with Vertex AI, giving you access to Gemini models and other AI services from within BigQuery. Recently there have been updates to BigQuery’s Gemini support that address newer Gemini models, grounding, and fine-tuning. Specifically:

  • The ML.GENERATE_TEXT SQL function now supports the Gemini 1.5 Pro and Gemini 1.5 Flash foundation models. Support for these multimodal models makes it much easier to process unstructured data, especially media files (see the sketch after this list).
  • The ML.GENERATE_TEXT SQL function now supports grounding with Google search and customizable safety settings for responsible AI (RAI) responses.
  • You can now fine-tune and evaluate Gemini 1.0 Pro models.

Check out the post for more details.
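
To show what the ML.GENERATE_TEXT call can look like from the BigQuery Python client, here is a minimal sketch; the dataset, remote model, and table names are assumptions, and the remote model is assumed to already exist over a Vertex AI Gemini endpoint.

```python
# Minimal sketch, not from the post: calling ML.GENERATE_TEXT over a Gemini 1.5
# remote model from the BigQuery Python client. Dataset, model, and table names
# are assumptions; the remote model must already be created over a Vertex AI
# Gemini endpoint.
from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")

sql = """
SELECT ml_generate_text_llm_result
FROM ML.GENERATE_TEXT(
  MODEL `your_dataset.gemini_flash_model`,   -- remote model over Gemini 1.5 Flash
  (SELECT CONCAT('Summarize this review: ', review_text) AS prompt
   FROM `your_dataset.product_reviews`
   LIMIT 5),
  STRUCT(0.2 AS temperature, TRUE AS flatten_json_output)
)
"""

for row in client.query(sql).result():
    print(row.ml_generate_text_llm_result)
```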

The Data Analytics portfolio in Google Cloud continues to march ahead at full steam, with hundreds of features released this year and more exciting ones coming. Check out this blog post that highlights features like Gemini in BigQuery, Gemini in Looker, BigQuery’s unified data platform and more. If you have been looking for a single post that captures the key updates to Data Analytics in Google Cloud, this is it.

I am saving the best for last: a feature called “BigQuery continuous queries”. Let’s start with the use case, stated exactly as in the blog post: “You’ve poured your heart into creating a fantastic product, attracted potential customers to your website, and they’ve even added items to their cart. But then, they vanish without completing the purchase. Frustrating, right? Shopping cart abandonment is a widespread issue; the average cart abandonment rate hovers around a disheartening 70% according to the Baymard Institute. One solution? Real-time engagement that rekindles customer interest with a BigQuery continuous query.”

The solution, again as stated in the blog post: “a BigQuery table that logs our website’s abandoned cart events and captures: customer’s contact information, the abandoned cart contents, and the abandonment time. We’ll run a BigQuery continuous query that constantly monitors this abandoned cart table for new events, then sends any new abandoned carts through Vertex AI to generate a tailored promotional email for each customer, complete with product suggestions and perhaps a limited-time discount, and publishes the personalized email content to a Pub/Sub topic. Lastly we’ll use a simple Application Integration platform trigger to send an email for each Pub/Sub message received.”
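
To make that pipeline a little more concrete, here is an illustrative sketch of the general shape of such a continuous query; the table, model, and topic names and the exact Pub/Sub payload column are my assumptions rather than the blog post’s code, and the statement must be submitted as a continuous query (for example, from the BigQuery console or the bq CLI) rather than as a regular batch job.

```python
# Illustrative sketch of the shape of such a continuous query (names, schema,
# and the Pub/Sub payload column are assumptions, not the blog post's code).
# It watches an abandoned-cart table, asks a Gemini remote model to draft an
# email, and exports each result to a Pub/Sub topic.
continuous_query = """
EXPORT DATA
  OPTIONS (
    format = 'CLOUD_PUBSUB',
    uri = 'https://pubsub.googleapis.com/projects/your-project-id/topics/abandoned-cart-emails'
  )
AS (
  SELECT
    -- exported column(s) form the Pub/Sub message payload
    TO_JSON_STRING(STRUCT(customer_email, ml_generate_text_llm_result AS email_body)) AS message
  FROM ML.GENERATE_TEXT(
    MODEL `your_dataset.gemini_model`,
    (SELECT
       customer_email,
       CONCAT('Write a short, friendly email nudging the customer to complete ',
              'the purchase of: ', cart_contents) AS prompt
     FROM `your_dataset.abandoned_carts`),
    STRUCT(0.4 AS temperature, TRUE AS flatten_json_output)
  )
)
"""

# Submit this with the continuous-query job option (console or bq CLI); printed
# here just to keep the sketch self-contained.
print(continuous_query)
```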

API Management

In a microservices environment, latency is a key factor, and over the years gRPC has come to be regarded as a high-performance communication protocol, widely adopted for its low latency, efficient serialization, and strongly typed messages. However, we know that a majority of service-to-service communication happens over HTTP APIs. Would you want to change your services to gRPC and take on the risk of getting your developers up to speed with it and asking your clients to change protocols? Or is there an automated layer that could do the grunt work of providing an interface between HTTP and gRPC? Enter the gRPC-to-HTTP gateway, built in conjunction with the Apigee API Platform, to achieve exactly that. Check out the blog post for more details.

Developers and Practitioners

If you are developing LangChain applications and integrating with Google Cloud databases like AlloyDB and Cloud SQL for PostgreSQL, you can now look at a new managed integration: LangChain on Vertex AI for AlloyDB and Cloud SQL for PostgreSQL. The integration provides application templates, deployment on Reasoning Engine, much faster prototyping, and streamlined security integration with Google Cloud. Check out the blog post for more details; we specifically recommend taking a look at the table that compares developer workflow steps with and without LangChain on Vertex AI for AlloyDB and Cloud SQL for PostgreSQL.
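
For a sense of the developer workflow, here is a minimal sketch of prototyping a LangChain agent locally and then deploying it to Reasoning Engine with the Vertex AI SDK preview; the project, staging bucket, and model names are assumptions, and the AlloyDB / Cloud SQL for PostgreSQL vector store wiring that the templates provide is omitted for brevity.

```python
# Minimal sketch (project, bucket, and model names are assumptions) of
# prototyping a LangChain agent locally and deploying it to Reasoning Engine
# with the Vertex AI SDK preview. The managed templates described in the post
# additionally wire AlloyDB or Cloud SQL for PostgreSQL in as the vector store;
# that part is omitted here.
import vertexai
from vertexai.preview import reasoning_engines

vertexai.init(
    project="your-project-id",
    location="us-central1",
    staging_bucket="gs://your-staging-bucket",  # required for deployment
)

# Prototype locally first...
agent = reasoning_engines.LangchainAgent(model="gemini-1.5-pro")
print(agent.query(input="What is Cloud SQL for PostgreSQL?"))

# ...then deploy the same agent as a managed Reasoning Engine endpoint.
remote_agent = reasoning_engines.ReasoningEngine.create(
    agent,
    requirements=["google-cloud-aiplatform[langchain,reasoningengine]"],
)
print(remote_agent.query(input="What is Cloud SQL for PostgreSQL?"))
```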

DevOps and SRE

Looking to set up a Continuous Delivery pipeline to automate software delivery from code commit to production release? If you are using GitLab, you can check out the new Google Cloud GitLab integration that helps you set this up on Cloud Run using GitLab CI/CD and Cloud Deploy. Check out the blog post for more details.

Learn Google Cloud

We still have some way to go before 2024 ends. This has definitely been the year of Gen AI and if you haven’t yet started on this path, check out a curated list of courses that can get you up to speed on this.

Here is another reason to join the Cloud Innovators program: it provides you with 35 credits every month to use towards courses and hands-on labs, including the curated list above.

Join the Innovators program at no cost today!

Stay in Touch

Have questions, comments, or other feedback on this newsletter? Please send Feedback.

If any of your peers are interested in receiving this newsletter, send them the Subscribe link.

Want to keep tabs on new Google Cloud product announcements? We have a handy page that you should bookmark → What’s new with Google Cloud.
