This repository provides code examples and resources for building a Looker Extension that integrates with Vertex AI Large Language Models (LLMs). This extension allows users to leverage the power of LLMs to enhance data exploration and analysis within Looker.
Note: For the Looker Explore Assistant, visit https://github.com/looker-open-source/looker-explore-assistant/.
The Looker GenAI Extension offers two key functionalities:
1. Generative Explore:
- Ask natural language questions about your data in Looker Explores.
- The LLM automatically generates an Explore with the appropriate fields, filters, sorts, pivots, and limits.
- Visualize results using a variety of charts and dashboards.
2. Generative Insights on Dashboards:
- Analyze data from a Looker dashboard by asking natural language questions.
- The LLM considers all data from the dashboard tiles for context-aware insights.
The extension supports multiple LLM integration options:
- BQML Remote Models: (Default) Uses native BigQuery ML integration for simple and quick deployment.
- BQML Remote UDF with Vertex AI: (Recommended) Uses Google Cloud Functions with Vertex AI for greater flexibility and production-ready scenarios.
- Custom Fine Tune Model: (Optional) Enables training a customized fine-tuned model for tailored responses.
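To make the default option concrete, the BQML Remote Model path boils down to creating a remote model over a BigQuery connection. A minimal sketch; the `llm` dataset, `us.llm_connection` connection, and endpoint name are assumptions, so check what the deployment scripts actually create in your project:

```sql
-- Sketch only: dataset, connection, and endpoint names are assumptions.
-- The Terraform deployment may use different identifiers.
CREATE OR REPLACE MODEL `llm.llm_model`
  REMOTE WITH CONNECTION `us.llm_connection`
  OPTIONS (ENDPOINT = 'text-bison');
```

Once the remote model exists, BigQuery can call the LLM directly from SQL, which is what makes this option the simplest to deploy.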
This section guides you through deploying the necessary infrastructure using Terraform.
1. Clone the Repository:

       cloudshell_open --repo_url "https://github.com/looker-open-source/extension-gen-ai" --page "shell" --open_workspace "deployment/terraform" --force_new_clone
2. Set Project ID:

       gcloud config set project PROJECT-ID
3. IAM Roles: ensure the following IAM roles are assigned at the project level:

       roles/browser
       roles/cloudfunctions.developer
       roles/iam.serviceAccountUser
       roles/storage.admin
       roles/bigquery.user
       roles/bigquery.connectionAdmin
       roles/resourcemanager.projectIamAdmin
       roles/iam.serviceAccountCreator

   For more detailed IAM information, see deployment/terraform/iam-issues.md.
4. Create Terraform State Buckets:

       sh scripts/create-state-bucket.sh
5. Initialize Terraform Modules:

       terraform init
6. Deploy Resources:

       terraform apply -var="project_id=YOUR_PROJECT_ID"
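If you prefer to grant the IAM roles from step 3 on the command line rather than through the console, the list can be applied in a loop. A sketch, assuming you are granting the roles to your own user account; `PROJECT_ID` and `USER_EMAIL` are placeholders you must set:

```sh
# Sketch: grant each required role to the deploying user.
# PROJECT_ID and USER_EMAIL are placeholders, not values from this repo.
PROJECT_ID="your-project-id"
USER_EMAIL="you@example.com"
for role in roles/browser roles/cloudfunctions.developer \
    roles/iam.serviceAccountUser roles/storage.admin roles/bigquery.user \
    roles/bigquery.connectionAdmin roles/resourcemanager.projectIamAdmin \
    roles/iam.serviceAccountCreator; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="user:$USER_EMAIL" --role="$role"
done
```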
1. Create Looker Project:
   - Log into Looker and create a new project named `looker-genai`.
   - Use "Blank Project" as the "Starting Point."
2. Copy Extension Files:
   - Drag and drop the following files from the `looker-project-structure` folder into your Looker project:
     - `manifest.lkml`
     - `looker-genai.model`
     - `bundle.js`
3. Configure BigQuery Connection:
   - Modify `looker-genai.model` to include a Looker connection to BigQuery.
   - You can either create a new connection or use an existing one. If using an existing connection, ensure the service account has the necessary IAM permissions.
4. Connect to Git:
   - Set up a Git repository and connect your Looker project to it.
5. Commit and Deploy:
   - Commit your changes and deploy them to production.
6. Project Permissions:
   - Grant the project permission to use the selected BigQuery connection.
7. Service Account Permissions:
   - Verify that the service account associated with the connection has permission to access the `llm` dataset in your GCP project.
8. Test and Debug:
   - Test the extension and use the browser's Web Developer Console to troubleshoot any errors.
   - Review the `explore_logs` table in BigQuery to monitor queries.
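While debugging, it can help to pull the most recent log rows straight from BigQuery. A minimal sketch; the ordering column name is an assumption, so check the actual `explore_logs` schema in your project first:

```sql
-- Sketch: `creation_timestamp` is an assumed column name; verify the
-- schema of llm.explore_logs in your deployment before running.
SELECT *
FROM `llm.explore_logs`
ORDER BY creation_timestamp DESC
LIMIT 10
```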
Store example prompts in the `llm.explore_prompts` table:
INSERT INTO `llm.explore_prompts`
VALUES("Top 3 brands in sales", "What are the top 3 brands that had the most sales price in the last 4 months?", "thelook.order_items", "explore")
Values:
- name of example
- prompt
- `model.explore` (LookML explore name)
- type (`explore` or `dashboard`)
Settings are managed in the `llm.settings` table. You can adjust these settings in the "Developer Settings" tab of the extension.
- Console Log Level: Controls the verbosity of logs.
- Use Native BQML or Remote UDF: Choose between native BigQuery ML functions or custom remote UDFs.
- Custom Prompt: Optionally set a custom prompt for your user ID.
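Before changing anything, you can inspect what is currently stored. The `userId` and `config` columns appear in the update statements used elsewhere in this document, but verify the full table schema in your deployment:

```sql
-- Sketch: list per-user settings; a NULL userId row is the default config.
SELECT userId, config
FROM `llm.settings`
```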
Modify Settings with SQL:
- Change settings for all users:

      UPDATE `llm.settings` SET config = (SELECT config FROM `llm.settings` WHERE userId = "YOUR_USER_ID") WHERE True
- Change settings for the default user:

      UPDATE `llm.settings` SET config = (SELECT config FROM `llm.settings` WHERE userId = "YOUR_USER_ID") WHERE userId IS NULL
To run the extension locally, install dependencies and start the development server:

    yarn install
    yarn develop

The development server will run at `https://localhost:8080/bundle.js`.
To build for production:

    yarn build

This will generate the `dist/bundle.js` file. Replace the URL in your LookML manifest with the production `bundle.js`.
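The dev-to-prod switch happens in `manifest.lkml`. A sketch of the relevant `application` block, assuming a standard Looker extension manifest; your labels and entitlements may differ from what is shown here:

```lkml
application: looker-genai {
  label: "Looker GenAI"
  # Development: serve the bundle from the local dev server.
  url: "https://localhost:8080/bundle.js"
  # Production: comment out `url` above and use the built file instead.
  # file: "bundle.js"
}
```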
This section describes how to train and deploy a custom fine-tuned model using the provided Terraform scripts.
1. Infrastructure Setup:
   - The provided Terraform code sets up Vertex AI, Cloud Functions, and BigQuery resources.
   - It also includes the necessary IAM permissions.
2. Fine-Tuning:
   - Execute the Cloud Workflow:

         gcloud workflows execute fine_tuning_model
3. Update BigQuery Endpoint:
   - Modify the BigQuery endpoint to point to your custom fine-tuned model.
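One way to repoint BigQuery at the tuned model is to recreate the remote model with the Vertex AI endpoint produced by the fine-tuning workflow. A sketch with hypothetical names; the dataset, connection, and endpoint URL are all placeholders you must replace:

```sql
-- Sketch: every identifier here is a placeholder. Use the endpoint
-- created by the fine_tuning_model workflow in your own project.
CREATE OR REPLACE MODEL `llm.fine_tuned_model`
  REMOTE WITH CONNECTION `us.llm_connection`
  OPTIONS (ENDPOINT = 'https://us-central1-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/endpoints/ENDPOINT_ID');
```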
Note: The code for fine-tuned model integration is currently in progress and needs to be refactored for optimal use.