Get Started with Generative AI: Free Courses Offered by Google Cloud

Learn the fundamentals of cutting-edge AI applications with beginner-friendly courses on Udacity, designed for non-technical enthusiasts

Lotus Lin
6 min read · May 19, 2023
Image by Google

Earlier this week, Udacity released four free courses from Google Cloud that introduce the fundamentals of Generative Artificial Intelligence (Generative AI or GenAI). The courses are designed to be beginner-friendly and require no prior experience in AI. In less than three hours, I completed all four and came away with a concise understanding of Generative AI, Large Language Models (LLMs), and the BERT Model.

As a product manager with limited exposure to AI projects, I found these courses an ideal starting point for grasping the essential concepts. They present relatable use cases and accessible resources that make complex topics easy to follow. If you, like me, are interested in AI and want to stay on top of the latest trends but are unfamiliar with AI development, I recommend the course overviews below; they will help you decide whether the courses are worth your time.

Images by Udacity Twitter and Udacity Generative AI Landing Page

1. Introduction to Generative AI with Google Cloud

Course components: a 22-minute video, suggested readings, and 5 quiz questions

Keywords: Artificial Intelligence (AI), Machine Learning (ML), Deep Learning (DL), Supervised and Unsupervised Learning, Generative and Discriminative models, Generative AI, Prompt Design

The course covers the following topics:

  • Establishing the relationship among AI, ML, and DL, providing a foundation for understanding their interconnectedness.
  • Highlighting the distinctions between supervised and unsupervised learning, helping you grasp their different approaches and applications.
  • Investigating Generative and Discriminative models, shedding light on their unique characteristics and how they contribute to the AI landscape.
  • Defining Generative AI and unraveling its inner workings, enabling you to comprehend how it generates new and innovative outputs.
  • Addressing challenges specific to Generative AI, including hallucinations, and emphasizing the role of prompt design in mitigating them (see the short prompt example after this list).
  • Exploring various Generative AI model types, expanding your knowledge of the diverse approaches within this field.
  • Introducing Google Tools like Bard and GenAI Studio, showcasing practical resources that can enhance your Generative AI journey.
Introduction to Generative AI with Google Cloud
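
To make the prompt design point concrete, here is a small illustrative example of my own (it is not taken from the course): the second prompt specifies a role, an output format, and context, which generally steers a generative model toward more grounded answers.

# Hypothetical prompts of my own, illustrating prompt design.
vague_prompt = "Tell me about Paris."

better_prompt = (
    "You are a travel guide. In exactly three bullet points, list "
    "family-friendly activities in Paris for a rainy weekend."
)
# Specifying a role, an output format, and concrete context usually yields
# more focused answers and leaves less room for hallucinated details.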

2. Introduction to Large Language Models with Google Cloud

Course components: a 15-minute video, suggested readings, and 4 quiz questions

Keywords: Large Language Models (LLMs), Pathways Language Model (PaLM), Question Answering (QA), Prompt, Tuning, and Parameter-Efficient Tuning Methods (PETM)

The course covers the following topics:

  • Defining Large Language Models (LLMs) and discovering how they intersect with the field of Generative AI.
  • Uncovering the benefits of utilizing LLMs, highlighting the advantages they bring to various applications and domains.
  • Exploring the Pathways Language Model (PaLM), a large language model developed by Google.
  • Comparing LLM development using pre-trained APIs with traditional ML development, providing insights into the two approaches and their implications (see the short sketch after this list).
  • Examining Question Answering (QA) in Natural Language Processing and delving into Generative QA, offering insight into how LLMs answer questions.
  • Understanding the importance of prompt design and prompt engineering, and how they impact the performance and output of LLMs.
  • Exploring the three main categories of LLMs: Generic (or raw) language models, instruction tuned models, and dialog tuned models, providing an overview of their specific applications and use cases.
  • Going deeper into the concepts of tuning, fine-tuning, observation, and Parameter-Efficient Tuning methods (PETM), uncovering strategies to enhance and optimize the performance of LLMs.
Introduction to Large Language Models with Google Cloud
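
To illustrate the "use a pre-trained model" approach that the course contrasts with building models from scratch, here is a minimal sketch of my own. It uses the open-source Hugging Face transformers library and GPT-2 as stand-ins; the course itself discusses Google's PaLM and its APIs, which follow the same idea.

from transformers import pipeline

# A pre-trained model is simply loaded and called; no training loop,
# labeled dataset, or model architecture work is required.
generator = pipeline("text-generation", model="gpt2")

result = generator("Large language models are useful because", max_new_tokens=25)
print(result[0]["generated_text"])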

3. Attention Mechanism with Google Cloud

Course components: a 5-minute video and 7 quiz questions

Keywords: Translation model, encoder-decoder models, Attention Mechanism, Neural Networks, Machine Translation

The course covers the following topics:

  • Introducing a Translation model based on the encoder-decoder framework, providing an overview of its structure and functionality.
  • Understanding how Attention Mechanism differs from traditional models, and unraveling the inner workings of this innovative approach.
  • Exploring how the Attention Mechanism improves translations, shedding light on its transformative impact on Machine Translation (see the small numeric sketch after this list).
Attention Mechanism with Google Cloud
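
For readers who want to see the idea in numbers, here is a minimal NumPy sketch of my own showing dot-product attention (the scaled variant used in Transformers, rather than the course's translation example). A query is compared against each encoder state, the similarity scores are turned into weights that sum to 1, and the output is the correspondingly weighted average of the values.

import numpy as np

def dot_product_attention(query, keys, values):
    """Weight each value by how similar its key is to the query."""
    d_k = query.shape[-1]
    scores = query @ keys.T / np.sqrt(d_k)           # similarity of the query to each key
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax: the weights sum to 1
    return weights @ values, weights

rng = np.random.default_rng(0)
query = rng.random((1, 4))    # one decoder state
keys = rng.random((3, 4))     # three encoder states
values = rng.random((3, 4))

context, weights = dot_product_attention(query, keys, values)
print(weights)   # three attention weights that sum to 1; context blends the values accordingly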

4. Transformer Models and BERT Model with Google Cloud

Note: This course is more technical and is better suited for individuals with a foundational understanding of data structures and programming.

Course components: two 11-minute videos, 9 quiz questions, and lab resources on GitHub

Keywords: Transformer Models, Bidirectional Encoder Representations from Transformers (BERT), Natural Language Processing, Masked Language Modeling (MLM), Next Sentence Prediction (NSP)

The course covers the following topics:

  • Tracing the history of language modeling and providing a brief overview of its evolution.
  • Introducing Transformer Models and understanding their mechanisms.
  • Exploring various pre-trained transformer models, such as encoder-decoder models (BART), decoder-only models (GPT-3), and encoder-only models (BERT).
  • Understanding the BERT Model and its significant impact on enhancing Google search capabilities.
  • Unpacking the two core tasks performed by BERT: Masked Language Modeling (MLM) and Next Sentence Prediction (NSP) (see the short masked-prediction example after this list).
  • Investigating the three essential embeddings utilized by BERT: token, segment, and position.
  • Examining examples of downstream tasks where BERT excels, highlighting the model’s versatility and applicability.
  • Walking through practical implementation guidance with a lab walkthrough of the BERT Model.
Transformer Models and BERT Model with Google Cloud
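
If you would like to try masked language modeling before (or instead of) the lab, here is a small sketch of my own using the open-source Hugging Face transformers library with a pre-trained BERT checkpoint; the course's lab uses its own GitHub resources instead.

from transformers import pipeline

# Masked language modeling with a pre-trained BERT checkpoint:
# BERT predicts the most likely tokens for the [MASK] position.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for prediction in fill_mask("The capital of France is [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))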

Selected Readings from the Courses

Here are some suggested readings to deepen your understanding of specific topics covered in the courses:

Enjoy your learning journey!

