Build a Google Cloud FinOps Assistant Agent with ADK & MCP Toolbox for Databases
Enterprises collect a variety of data to report on and manage cloud costs. Yet answering a simple question like “Which SKU costs the most?” turns into digging through tables, writing SQL, and managing credentials, and you may have to sift through a web console just to find the right place to run the query. In other words, extracting meaningful insights from this data often demands deep technical expertise, slowing down decision-making and innovation.
That’s where the next wave of FinOps agents comes in. Today, these agents are conversational — able to answer questions like “What’s the cheapest Compute Engine SKU in Europe?” directly from your billing data.
Tomorrow, these same agents could become autonomous, proactively optimizing costs, detecting anomalies, or even approving budget adjustments on their own.
This article shows the first step on that path — how to turn dead billing data into a conversational asset using the MCP Toolbox for Databases and the Agent Development Kit (ADK).
The Solution
MCP Toolbox connects agents to BigQuery via secure SQL tools; ADK builds conversational agents. Together, they let you talk to data — query SKUs, explain tiers — without APIs or auth hassles. Focus on insights, not infrastructure.
Key components:
- MCP Toolbox: Defines tools in a YAML file (e.g., get_sku_pricing for unnested tiered rates). Runs as a local server, using Application Default Credentials for auth — no API keys to manage.
- ADK Agent: Built with Python, integrates MCP tools, and uses models like Gemini 2.0 Flash for reasoning. Instructions define workflows.
This setup shifts focus: Users gain insights instantly, while developers avoid API integrations and auth hassles. Enterprises benefit from scalable, secure agents deployable to Vertex AI, turning dead data into a conversational asset.
If you are looking to get started with the MCP Toolbox for Databases and the Agent Development Kit, check out the resources given below.



Use Case: Simplifying Access to Cloud Billing and Pricing Data
Traditionally, interacting with Cloud Billing data is challenging. Users rely on the Google Cloud console or APIs (e.g., Billing APIs for SKUs), worrying about rate limits, authentication (OAuth 2.0 tokens), and authorization (IAM policies).
Querying BigQuery involves writing SQL for specific tables like cloud_pricing_export, handling partitions, and joining datasets — a process that’s error-prone and time-intensive.
For instance, to get tiered pricing for an SKU, you’d manually unnest arrays and filter currencies, often requiring multiple tools (console, CLI, scripts) and risking compliance issues with VPC controls.
With MCP Toolbox for Databases and ADK, this is streamlined:
- Scenario: A FinOps manager needs to analyze Compute Engine pricing in a region, check CUD eligibility, and convert to local currency without SQL.
- Traditional Pain: Use Billing API for SKUs, BigQuery for queries, handle auth via service accounts, and script unnesting — hours of work.
- MCP + ADK Solution: The agent (PricingBotV4) uses MCP tools to query cloud_pricing_export.
Ask: “List SKUs for Compute Engine in us-east1 on 2025-09-01.” The agent lists SKUs, flags CUDs, and explains: “Tiered pricing: from 0 units, $0.046/hour (INR ~3.84/hour).” No auth worries — MCP handles it securely.
- Efficiency Gains: Abstracts schema (e.g., auto-unnests tiered_rates), supports workflows (tiered explanations, CUD hints), and focuses on insights. Deploy to Vertex AI for team access, reducing tool fragmentation.
This use case demonstrates how MCP + ADK democratizes data, making billing management proactive and user-friendly.
Prerequisites
- Google Cloud project with BigQuery/Vertex AI APIs enabled.
- Cloud Billing data exported to BigQuery (the export can be enabled from the Billing export page in the Cloud Billing console).
- A Python virtual environment. Create one using python -m venv env and activate it (source env/bin/activate). Then run pip install google-adk toolbox-core to install the required dependencies.
- Application Default Credentials configured via gcloud auth application-default login.
Step 1: Prepare your BigQuery Dataset
For this article we will use the Cloud Pricing dataset, which helps developers easily look up prices for various Google Cloud services. To learn more about transferring your Cloud Billing data to BigQuery, check out the article given below.
Run a sample query to verify data:
SELECT sku.id, sku.description, list_price.*
FROM `your_dataset_id`
WHERE sku.id = '2DA5-55D3-E679'
LIMIT 1;
Step 2: Define Your Tools (tools.yaml)
The MCP Toolbox for Databases requires a file called tools.yaml, which is a YAML file that defines which database you’re connecting to and which SQL queries will be exposed as tools. These tools query the cloud_pricing_export table, abstracting schema complexity and enabling secure, parameterized access for the PricingBot agent.
sources:
  billing-pricing-bq:
    kind: bigquery
    project: your_project_id
tools:
  list_services:
    kind: bigquery-sql
    source: billing-pricing-bq
    description: Lists all available Google Cloud services in the pricing catalog.
    statement: |
      SELECT DISTINCT service.description
      FROM `your_dataset_id`
      ORDER BY service.description
We start off by declaring the sources for the tools file, followed by our first tool, list_services. The list_services tool queries the cloud_pricing_export table to retrieve a list of all Google Cloud services (e.g., BigQuery, Compute Engine) in the pricing catalog.
It simplifies discovery by providing a starting point for users to explore services without writing SQL, addressing the dead data problem by making the billing catalog accessible.
  list_skus_for_service:
    kind: bigquery-sql
    source: billing-pricing-bq
    description: Lists up to 20 SKUs for a given Google Cloud service.
    parameters:
      - name: service_name
        type: string
        description: The name of the service (case-insensitive).
    statement: |
      SELECT
        sku.id,
        sku.description,
        service.description as service_name
      FROM
        `your_dataset_id`
      WHERE
        LOWER(service.description) = LOWER(@service_name)
      ORDER BY
        sku.description
      LIMIT 20
The list_skus_for_service tool lists up to 20 SKUs for a given Google Cloud service (e.g., Cloud Storage). It returns SKU IDs and descriptions, simplifying service-specific pricing exploration and turning dead data into a navigable resource.
  list_skus_for_service_and_region:
    kind: bigquery-sql
    source: billing-pricing-bq
    description: Lists up to 20 SKUs for a given Google Cloud service in a specific region.
    parameters:
      - name: service_name
        type: string
        description: The name of the service (case-insensitive).
      - name: region
        type: string
        description: The region to filter by (e.g., 'us-east1').
    statement: |
      SELECT
        sku.id,
        sku.description,
        service.description as service_name,
        geo_taxonomy.regions
      FROM
        `your_dataset_id`
      WHERE
        LOWER(service.description) = LOWER(@service_name)
        AND @region IN UNNEST(geo_taxonomy.regions)
      ORDER BY
        sku.description
      LIMIT 20
The list_skus_for_service_and_region tool fetches up to 20 SKUs for a service in a specific region (e.g., us-east1), enabling region-specific pricing queries and making dead data actionable for targeted analysis.
  get_sku_pricing:
    kind: bigquery-sql
    source: billing-pricing-bq
    description: Retrieves the pricing information for a specific SKU.
    parameters:
      - name: sku_id
        type: string
        description: The ID of the SKU.
    statement: |
      SELECT
        sku.id,
        sku.description,
        service.description as service_name,
        pricing_unit,
        list_price,
        account_currency_code
      FROM
        `your_dataset_id`
      WHERE
        sku.id = @sku_id
The get_sku_pricing tool retrieves detailed pricing for a specific SKU, including the full list_price record with its tiered rates in USD and the account’s local currency. It supports tiered pricing explanations and CUD detection, transforming dead data into clear, actionable insights.
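To make the tiered-rate model concrete, here is a small, self-contained sketch (independent of the agent) of how cost accumulates across tiers. The field names mirror the tiered_rates records in the pricing export (start_usage_amount, usd_amount); the rate values themselves are made-up examples.

```python
def tiered_cost(tiered_rates, usage):
    """Compute the total cost for `usage` units under tiered pricing.

    `tiered_rates` is a list of dicts sorted by start_usage_amount,
    mirroring the tiered_rates records in cloud_pricing_export: each
    tier's rate applies from its start_usage_amount up to the next
    tier's start.
    """
    total = 0.0
    for i, tier in enumerate(tiered_rates):
        start = tier["start_usage_amount"]
        if usage <= start:
            break  # usage never reached this tier
        next_start = (tiered_rates[i + 1]["start_usage_amount"]
                      if i + 1 < len(tiered_rates) else float("inf"))
        units_in_tier = min(usage, next_start) - start
        total += units_in_tier * tier["usd_amount"]
    return total

# Example: first 1000 units free, then $0.046 per unit
rates = [
    {"start_usage_amount": 0, "usd_amount": 0.0},
    {"start_usage_amount": 1000, "usd_amount": 0.046},
]
print(tiered_cost(rates, 1500))  # 500 billable units at $0.046 each
```

This is exactly the kind of arithmetic the agent narrates when it explains a tiered SKU in plain language.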
toolsets:
  billing-pricing-tools-v4:
    - list_services
    - list_skus_for_service
    - list_skus_for_service_and_region
    - get_sku_pricing
In the end, we collect all our tools and list them under one toolset.
Run the Toolbox Server
./toolbox --tools-file="tools.yaml"
Now, your billing database is ready to talk.
Step 3: Build the ADK-powered Agent
With the MCP Toolbox configured to provide secure, parameterized access to the cloud_pricing_export table, the next step is to build the ADK agent, PricingBotV4, using the Agent Development Kit (ADK).
This agent integrates with the MCP tools to enable conversational queries, transforming dead billing data into a dynamic, user-friendly resource. The agent uses Google Cloud’s generative AI capabilities (e.g., Gemini 2.0 Flash model) to reason over data and follow predefined workflows for tasks like listing SKUs, retrieving pricing, explaining tiered rates, handling currency conversions, and detecting Committed Use Discounts (CUDs).
from google.adk.agents import Agent
from toolbox_core import ToolboxSyncClient

toolbox = ToolboxSyncClient("http://127.0.0.1:5000")

# Load all the tools
tools = toolbox.load_toolset('billing-pricing-tools-v4')
This code imports the libraries needed to build the agent and connects to the MCP Toolbox server, loading the billing-pricing-tools-v4 toolset. It sets up secure access to BigQuery billing data, eliminating the need for manual API setup or authentication and making dead data accessible.
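The agent construction itself can be sketched as follows. This is a minimal, illustrative version: the description and instruction strings are placeholder summaries of the agent’s workflows, not the author’s exact prompts, and the snippet assumes the Toolbox server from the previous step is running on its default port 5000.

```python
from google.adk.agents import Agent
from toolbox_core import ToolboxSyncClient

# Connect to the local MCP Toolbox server and load the toolset
toolbox = ToolboxSyncClient("http://127.0.0.1:5000")
tools = toolbox.load_toolset("billing-pricing-tools-v4")

# Define the conversational pricing agent
root_agent = Agent(
    name="PricingBotV4",
    model="gemini-2.0-flash",
    description="An expert assistant for Google Cloud pricing and billing SKUs.",
    instruction=(
        "Answer Google Cloud pricing questions using the provided tools. "
        "Use list_services and the list_skus_* tools for discovery, and "
        "get_sku_pricing for prices. Explain tiered pricing by inspecting "
        "list_price.tiered_rates, convert currencies using "
        "account_currency_amount, and flag SKUs whose description contains "
        "'Commitment' as CUD-related."
    ),
    tools=tools,
)
```

With a root_agent defined like this inside an agent package, adk web can discover it and serve the chat UI.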
This segment creates the PricingBotV4 agent using the Agent class from ADK, configuring it with the gemini-2.0-flash model for natural language processing and reasoning.
The description field positions it as a Google Cloud pricing expert, while the instruction provides detailed workflows to handle various query types: listing SKUs (with or without region filters), retrieving pricing, explaining tiered rates (by inspecting list_price.tiered_rates), handling currency conversions (using account_currency_amount), detecting CUDs (via “Commitment” in SKU descriptions), and providing service pricing overviews.
The tools parameter links to the MCP toolset (billing-pricing-tools-v4), enabling secure BigQuery queries. This setup addresses dead data by abstracting complex SQL (e.g., unnesting arrays, filtering partitions) and security concerns (e.g., IAM, VPC rules), allowing users to ask natural questions like —
“What’s the price of SKU 578C-5A2A-71EF in INR?” and receive clear, formatted responses (e.g., “$0.05/hour, INR ~4.38/hour, not tiered”).
The workflows ensure consistency and accuracy, transforming inaccessible billing data into a dynamic resource for users and enterprises.
Run and Test the Agent
To run locally, start the MCP Toolbox for Databases server and execute adk web. For production, deploy to Vertex AI Agent Engine or Cloud Run for scalability. This setup enables conversational queries, turning dead billing data into actionable insights without manual SQL or API hassles.
Test the agent with queries like “List services on 2025-09-01” or “Get pricing for SKU 578C-5A2A-71EF in INR”. These validate its ability to list SKUs, explain pricing, handle currencies, and flag CUDs, all without SQL expertise. Before testing, make sure the Toolbox server is running, permissions are set, and any views you layer over the export handle schema changes.
The agent successfully lists all Google Cloud services, then fetches the first 20 SKUs for BigQuery — showing how easily you can explore pricing data without manual API calls.
The agent lists available Compute Engine vCPU SKUs across regions and offers to find the cheapest one — eliminating the need to manually compare prices or check discounts.
The agent lists BigQuery storage SKUs in the US, fetches their prices, and highlights the cheapest option — turning a complex task into a simple query.
Conclusion
MCP Toolbox for Databases + ADK simplifies how you work with data by turning it into a conversational experience. You no longer need to manage API calls, credentials, or complex SQL queries — just connect your data and start asking questions. This makes it easier for developers, FinOps teams, and enterprises to get insights from their data without dealing with the usual technical friction.
Deploy the agent today to unlock actionable insights, streamline cost management, and make your billing data work for you.
You can check out the full code at the link given below.
Feel free to reach out if you have any issues/feedback at aryanirani123@gmail.com.

