Unlock the power of Generative AI in your applications with Gemini

Jay Whitsitt
Firebase Developers
6 min read · Jan 1, 2024

Need content for your app? Looking to make your app a little more dynamic? Does your app need a little extra spark? Generative AI has been a big buzzword but when used right, it can make for a unique, memorable experience. It’s now easier than ever to integrate purpose-built machine learning into your applications.

Photo from https://blog.google/technology/ai/google-gemini-pro-imagen-duet-ai-update/

Vertex AI is Google’s platform for building applications with either pre-built or custom machine learning models. It encapsulates the entire workflow, from simple generative AI to more complex training, deployment, and refinement of custom models. Whether you’re new to software development or experienced with AI and ML, Vertex AI can simplify the process with multiple ready-to-use models: Google’s latest multimodal model, Gemini; earlier text-only models such as PaLM 2; and others.

There is a lot of information about these new models and what they can do, so in this post, we will focus on how to integrate Google’s APIs into your applications. This example builds a fully usable application: a Flutter app sends user input to a Firebase Functions backend, which calls the Vertex AI APIs, and Functions then responds to the frontend with the formatted result.

There are four parts to this walkthrough:

  1. Testing your ideas
  2. Set up the Firebase project
  3. Set up your local environment
  4. Build your Functions application

Testing your ideas

Using Google’s Vertex AI APIs does require some setup, and both the development and cost overhead make it less than ideal for testing ideas. For quick ideation or prompt testing, Maker Suite is a great option.

Sample prompt and response from https://makersuite.google.com

It’s also a great way to adjust parameters like temperature, maximum output length, and top-k value, as well as get sample implementation code. You can request different formats of output, such as JSON, markdown, bullet lists, etc. We won’t get into prompt engineering or tuning these parameters here, but it’s worth looking into for full-production applications.
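To make the format options concrete, the same base prompt can be steered toward a different output format just by appending an instruction to it. A minimal sketch (the `formatPrompt` helper is illustrative, not part of any SDK):

```javascript
// Append an output-format instruction to a base prompt.
// The instruction strings here are just examples; tune them in Maker Suite.
function formatPrompt(basePrompt, format) {
  const instructions = {
    json: 'Respond with valid JSON only, no surrounding text.',
    markdown: 'Respond in markdown.',
    bullets: 'Respond as a bullet list.',
  };
  return `${basePrompt}\n\n${instructions[format] ?? ''}`.trim();
}

console.log(formatPrompt('List three uses of Firebase Functions', 'json'));
```

Requesting the format explicitly in the prompt like this is usually more reliable than trying to reformat free-form output after the fact.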

Your prompts don’t have to be perfect to start building. Once you’re comfortable, let’s set up the project.

Set up the Firebase project

To connect Firebase Functions to any Google Cloud APIs, including Vertex AI, which provides access to Gemini and other models, you’ll need to:

  1. If you don’t already have a project, create one at https://console.firebase.google.com
  2. Under Build, choose Functions.
  3. If you haven’t enabled billing yet, click “Upgrade project” and select a billing account to link to your Google Cloud project.

Using Firebase Functions requires the pay-as-you-go Blaze plan, but usage for this walkthrough should stay within the free quotas. The Gemini model’s pricing structure allows for 60 queries per minute for free. The billing account attached to the Cloud project will only incur costs if usage exceeds these free limits.

Set up your local environment

While the Vertex AI API is publicly accessible and can be consumed by any application with the proper authentication, using Firebase Functions or Cloud Functions provides a big benefit over other cloud solutions: no need to manage API keys. If you looked at the sample code in Maker Suite, you may have noticed the placeholder for an API key. When you deploy a Firebase Functions or Cloud Functions application, the Cloud Platform SDKs use your project’s default service account, which is already authorized to access other Cloud resources.

First, install a couple of prerequisites for Functions to work.

  • Install Node (I recommend using a version manager like nvm) if you haven’t already. As of this writing, Firebase requires at least Node version 18.
  • Ensure you have the latest Firebase CLI installed and logged in.
npm install --global firebase-tools
firebase login

Now let’s set up your local project.

cd the/folder/where/you/want/your/code
firebase init

For this demo, select Functions with Space, then press Enter to confirm.

? Which Firebase features do you want to set up for this directory? Press Space 
to select features, then Enter to confirm your choices. (Press <space> to
select, <a> to toggle all, <i> to invert selection, and <enter> to proceed)
◯ Realtime Database: Configure a security rules file for Realtime Database and
(optionally) provision default instance
◯ Firestore: Configure security rules and indexes files for Firestore
❯◉ Functions: Configure a Cloud Functions directory and its files
◯ Hosting: Configure files for Firebase Hosting and (optionally) set up GitHub
Action deploys
◯ Hosting: Set up GitHub Action deploys

Choose “Use an existing project”, then your Firebase project. The default options for the remaining questions are fine. We’ll use JavaScript as the language for this demo.

? Please select an option: (Use arrow keys)
❯ Use an existing project
Create a new project
Add Firebase to an existing Google Cloud Platform project
Don't set up a default project

Now you’re ready to code!

Build your Functions application

If you open the functions/index.js file, you’ll see the “hello world” sample. Deploying it now will publish one endpoint, /helloworld. At any point, you can deploy or start the local emulators.

# Deploy changes
firebase deploy

# Or start emulators - this only works from the functions directory
cd functions
npm run serve

Google has two Node.js client SDKs available to connect to their AI APIs: the older @google-cloud/aiplatform and the newer @google-cloud/vertexai.

Both work. This walkthrough uses the newer @google-cloud/vertexai, only because its syntax is more straightforward. Personally, I’ve not found any functional differences, so it’s ultimately just preference at this point.

Calling the APIs

First, add the npm module to the project with the following from the functions directory. Be sure you’re not in the Firebase project root folder.

npm install @google-cloud/vertexai

Once installed, you can use the SDK to build a new function. Add the following to your index.js.

const { onRequest } = require('firebase-functions/v2/https');
const { VertexAI } = require('@google-cloud/vertexai');

exports.prompt = onRequest(async (request, response) => {
  const vertexAI = new VertexAI({
    project: process.env.GCLOUD_PROJECT,
    location: 'us-central1',
  });

  // Available models: https://cloud.google.com/vertex-ai/docs/generative-ai/learn/models
  const model = 'gemini-pro';

  const generativeModel = vertexAI.preview.getGenerativeModel({
    model: model,
    generation_config: { // Test impact of parameters: https://makersuite.google.com
      max_output_tokens: 2048,
      temperature: 0.9,
      top_p: 1,
    },
  });

  const prompt = 'Tell me a joke';
  const req = {
    contents: [{role: 'user', parts: [{text: prompt}]}],
  };

  // Send the request and pull the generated text out of the first candidate.
  const content = await generativeModel.generateContent(req);
  const result = content.response.candidates.at(0).content.parts.at(0).text;
  response.send(result);
});
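The chained calls that pull the text out of the response assume the model always returns a candidate with a text part. For anything beyond a demo, you may want a defensive helper (`extractText` is a hypothetical name, not part of the SDK):

```javascript
// Safely pull the first text part out of a generateContent response.
// Returns a fallback string when the model returned no candidates,
// e.g. if the response was blocked by safety filters.
function extractText(content, fallback = 'No response generated.') {
  const text = content?.response?.candidates?.[0]?.content?.parts?.[0]?.text;
  return text ?? fallback;
}
```

With this in place, the last line of the function becomes `response.send(extractText(content));` and a blocked or empty response no longer throws.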

To test your changes, use the Functions emulator. Run the following from the functions directory.

npm run serve

If you haven’t changed the ports, you should be able to confirm the function response by opening http://127.0.0.1:5001/aiopoly/us-central1/prompt in your browser (replace aiopoly with your own project ID).

Sample response from Vertex AI

Making responses more dynamic

At this point, you have a deployable endpoint that responds with a generated value, but it’s not very dynamic. Let’s modify the code to take an input to customize the joke. Change the prompt line to this:

const prompt = `Tell me a joke about ${request.query.theme}`;

Now open http://127.0.0.1:5001/aiopoly/us-central1/prompt?theme=planes in a browser. Feel free to change the GET parameter value.

Generated joke relevant to the theme of “planes”

Any dynamic content within the prompt needs to be set up this way: take the input and build your prompt from that value, typically by interpolating a variable directly into the prompt string.
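Since the query parameter flows straight from untrusted users into the prompt, it is worth adding a small guard for missing or oversized input. A minimal sketch (`buildJokePrompt` is an illustrative helper, not from the SDK):

```javascript
// Build the joke prompt from untrusted user input.
// Falls back to a generic joke when no theme is provided, and
// truncates very long input to keep token usage predictable.
function buildJokePrompt(theme) {
  const cleaned = (theme ?? '').toString().trim().slice(0, 100);
  return cleaned ? `Tell me a joke about ${cleaned}` : 'Tell me a joke';
}
```

In the function body, the prompt line would then read `const prompt = buildJokePrompt(request.query.theme);`, so a request without a theme parameter still returns something sensible.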

When you’re ready, deploy your function to Firebase.

firebase deploy

The output should give you the public URL(s) for your application.

What’s next

You can see a fully working Functions backend and companion Flutter app using these APIs. The Flutter frontend takes a theme as user input and displays Monopoly properties for the given input. The prompt asks for the response in a specific JSON structure, which is then parsed by the frontend app.
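One practical wrinkle with JSON responses: models often wrap them in markdown code fences even when asked for raw JSON, so the parsing side typically strips those first. A hedged sketch of that cleanup step (`parseModelJson` is a hypothetical helper, not code from the linked app):

```javascript
// Parse JSON from a model response, tolerating the ```json ... ```
// fences that generative models often add around structured output.
function parseModelJson(raw) {
  const stripped = raw
    .replace(/^```(?:json)?\s*/i, '') // leading fence, with or without "json"
    .replace(/```\s*$/, '');          // trailing fence
  return JSON.parse(stripped);
}
```

Wrapping this in a try/catch with a retry or fallback is a common next step, since generated JSON is not guaranteed to be valid.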

This walkthrough and the samples linked above are just examples of using generative AI to build or enhance an application. This should give you the tools you need to integrate Vertex AI APIs into your applications to make them just that much better.

Since originally writing this, Google has released client SDKs for more platforms and languages to allow calling Vertex AI directly from the frontend. You can see how I updated the AIopoly app to use this SDK here.


Jay Whitsitt

Mobile developer, community organizer, and Kansas City native