Marketplace deploy #149

Open: wants to merge 88 commits into base: main

Commits (88)
5376a53
Feat make smarter (#91)
LukaFontanilla Sep 11, 2024
40e802d
Update useSendVertexMessage.ts
dbarrbc Sep 17, 2024
caed43a
Fixed issue with parsing cleaned url
dbarrbc Sep 18, 2024
679d906
Origin/feature generate examples (#1)
colin-roy-ehri Oct 3, 2024
c2286a1
Error handling
colin-roy-ehri Oct 7, 2024
8998ee2
Merge remote-tracking branch 'personal-fork/main'
colin-roy-ehri Oct 7, 2024
2cb58eb
terraform bug fix
colin-roy-ehri Oct 7, 2024
b01afb2
Handle url queryPrompt parameter
colin-roy-ehri Oct 10, 2024
775c316
generate_exmples bug fix
colin-roy-ehri Oct 10, 2024
51c1525
Added number filter documentation
colin-roy-ehri Oct 21, 2024
3c924f9
work in progress - append examples
colin-roy-ehri Oct 22, 2024
665aefb
working and tested new concat function.
colin-roy-ehri Oct 22, 2024
2787e98
tested
colin-roy-ehri Oct 22, 2024
c1cb48b
Update looker_filter_doc.md
kalib-brayer Nov 14, 2024
79cc54e
Add files via upload
kalib-brayer Nov 14, 2024
f8ab7ec
Update useSendVertexMessage.ts
kalib-brayer Nov 14, 2024
fbbafaa
Update ExploreFilterHelper.ts
kalib-brayer Nov 14, 2024
42c4c1a
Merge branch 'looker-open-source:main' into make-it-smarter-bytecode
kalib-brayer Nov 14, 2024
b3315e4
mostly working with new configs
colin-roy-ehri Nov 22, 2024
f7d70c3
working settings!
colin-roy-ehri Dec 5, 2024
61101f1
refactoring to use lookml queries
colin-roy-ehri Dec 5, 2024
8d61945
working with cloud function and new lookml model
colin-roy-ehri Dec 5, 2024
a2d83f8
Update useSendVertexMessage.ts
colin-roy-ehri Dec 10, 2024
8dc2375
made settings admin-only and hide them for regular users.
colin-roy-ehri Dec 10, 2024
5736fbb
committing timeframe filter logic
kalib-brayer Dec 10, 2024
517bbca
secure fetchProxy added
colin-roy-ehri Dec 10, 2024
19c5dfb
working with Bigquery!
colin-roy-ehri Dec 12, 2024
aca5b63
Fixed problem with variability
colin-roy-ehri Dec 12, 2024
2ede46f
remove indeterminacy, fix filter bug
colin-roy-ehri Dec 13, 2024
0b7a5cc
Merge branch 'looker-open-source:main' into make-it-smarter-more-cont…
colin-roy-ehri Dec 13, 2024
7cca008
Merge pull request #1 from bytecodeio/make-it-smarter-more-context-by…
colin-roy-ehri Dec 13, 2024
e5b16f7
more context with restored filters call
colin-roy-ehri Dec 16, 2024
108cfce
bug fix
colin-roy-ehri Dec 17, 2024
b9b2d2d
bug fix
colin-roy-ehri Dec 19, 2024
16243c0
add back in filter mods
colin-roy-ehri Dec 19, 2024
15f8d54
handle NOT NULL better
colin-roy-ehri Dec 19, 2024
11fb052
Merge pull request #2 from bytecodeio/make-it-smarter-more-context-by…
colin-roy-ehri Dec 19, 2024
2ab4023
Merge branch 'make-it-smarter-bytecode' into marketplace_deploy
colin-roy-ehri Dec 20, 2024
b45c133
merge fixes
colin-roy-ehri Dec 20, 2024
530b8f1
rm trusted dashboards
colin-roy-ehri Dec 26, 2024
f9afacd
work in progress on be installer
colin-roy-ehri Dec 26, 2024
a6540e0
testing cloud shell script
colin-roy-ehri Dec 26, 2024
a2fa6d9
testing cloud console run
colin-roy-ehri Dec 26, 2024
c940392
testing
colin-roy-ehri Dec 26, 2024
2d076f8
readme updated
colin-roy-ehri Dec 26, 2024
001c48b
test
colin-roy-ehri Dec 26, 2024
0130352
testing
colin-roy-ehri Dec 26, 2024
5b18b8b
test
colin-roy-ehri Dec 26, 2024
474258d
test
colin-roy-ehri Dec 26, 2024
dc9ef34
test
colin-roy-ehri Dec 26, 2024
c0a29a2
test
colin-roy-ehri Dec 26, 2024
6ded395
test
colin-roy-ehri Dec 26, 2024
9e867d9
test
colin-roy-ehri Dec 26, 2024
7a1d2f4
test
colin-roy-ehri Dec 26, 2024
54204aa
test
colin-roy-ehri Dec 26, 2024
5444dd2
readme edits
colin-roy-ehri Dec 26, 2024
a3a32db
updated readme
colin-roy-ehri Dec 26, 2024
4ad93bb
readme update
colin-roy-ehri Dec 26, 2024
3180fb6
readme
colin-roy-ehri Dec 27, 2024
9a309e4
Fixed READMEs
colin-roy-ehri Dec 31, 2024
b262728
testing
colin-roy-ehri Jan 2, 2025
bf8f7c9
testing
colin-roy-ehri Jan 2, 2025
11af103
security setup
colin-roy-ehri Jan 2, 2025
0b77af4
updates for security testing
colin-roy-ehri Jan 2, 2025
11d6730
error handling fixes and example script updates
colin-roy-ehri Jan 2, 2025
394cf72
readme updates
colin-roy-ehri Jan 2, 2025
121aa7a
MIS changes
colin-roy-ehri Jan 4, 2025
e569083
changes from MIS
colin-roy-ehri Jan 4, 2025
a547c32
added important note to the backend BQ model creation
colin-roy-ehri Jan 9, 2025
1c1efcc
readme updated
colin-roy-ehri Jan 13, 2025
8e7ee93
update readme
colin-roy-ehri Jan 14, 2025
ea7bde3
Update init.sh to have sed command for backend file to pass in projec…
kate-bytecode Jan 16, 2025
8e502bb
Update backend-gcs.tf
kate-bytecode Jan 16, 2025
54b49ba
wip
colin-roy-ehri Jan 24, 2025
b752f84
Merge branch 'marketplace_deploy' of https://github.com/bytecodeio/lo…
colin-roy-ehri Jan 24, 2025
d8181c2
more idempotent
colin-roy-ehri Jan 24, 2025
cf125b7
another terraform mod
colin-roy-ehri Jan 24, 2025
ad5712c
better init
colin-roy-ehri Jan 24, 2025
e3e9875
tf module lifecycle reversion
colin-roy-ehri Jan 24, 2025
588ed50
fixed bug
colin-roy-ehri Jan 24, 2025
e71ae6f
added waiting instructions
colin-roy-ehri Jan 27, 2025
3836baf
new path attempt for cf
colin-roy-ehri Jan 27, 2025
faa06aa
better source path
colin-roy-ehri Jan 27, 2025
b835ff2
bf
colin-roy-ehri Jan 27, 2025
2ee4180
bf
colin-roy-ehri Jan 27, 2025
c0509d0
reversion of ziplocation
colin-roy-ehri Jan 27, 2025
e06e399
Merge remote-tracking branch 'bytecode-fork/main' into marketplace_de…
colin-roy-ehri Feb 6, 2025
088f6d0
changelog
colin-roy-ehri Feb 10, 2025
1 change: 1 addition & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -8,6 +8,7 @@ terraform.tfstate*
*.tfstate
.venv
node_modules
looker.ini

.vertex_cf_auth_token
dist
25 changes: 25 additions & 0 deletions CHANGELOG.md
@@ -1,4 +1,29 @@
# Changelog


## V4.1 (Marketplace PR)
- Instead of using SQL Runner queries, the required data is modeled in LookML. This removes the need for escalated user privileges (i.e., the `use_sql_runner` permission).
- Requires the use of a Looker block or remote project import.
- Environment variables previously built into the Frontend application are now entered in an admin menu of the UI and stored as user attributes.
- The compiled .js is now portable.
- Instead of calling a cloud function endpoint through the browser, this call is proxied through Looker.
- Allows a closed network design.
- Backend install flow available in Cloud Console (GCP Cloud Shell).
- Local IDE setup is now unnecessary.
- The Cloud Console flow for backend setup includes step-by-step instructions as well as a Terraform script.
- This makes installation in an existing Google project possible.


## V4.0 (Making it smarter)
- Updated Readme to provide clearer instructions and troubleshooting tips
- Simplified setup with a single shared environment file
- Improved bash scripts for example generation
- Added the ability to generate training prompts from trusted dashboards
- Improved error messages (added CORS headers) to smooth the setup process
- Improved the accuracy of Explore Assistant returning the correct results by adding relevant context. Specific improvements include:
- Date filters: Accuracy for basic, range, and complex date filters increased significantly (details in table below).
- Field selection: Accuracy for selecting fields with pivots and dates has also improved.

## v3.1

### Added
52 changes: 8 additions & 44 deletions README.md
@@ -19,6 +19,14 @@ Additionally, the extension provides:
- Insight Summarization
- Dynamic Explore Selection

## Setup

Please follow these steps in order for a successful installation.
1. Backend Setup - set up the GCP backend for communicating with the Vertex API [using these instructions.](./explore-assistant-backend/README.md)
2. Looker Connection - set up a Looker connection to the BigQuery dataset created in step 1 [using these instructions.](https://cloud.google.com/looker/docs/db-config-google-bigquery)
3. Example Generation - generate a list of examples and upload them to BigQuery [using these instructions.](./explore-assistant-examples/README.md)
4. Frontend Setup - set up the Looker Extension Framework application [using these instructions.](./explore-assistant-extension/README.md)

### Technologies Used
#### Frontend
- [React](https://reactjs.org/)
@@ -36,50 +44,6 @@ Additionally, the extension provides:
- [Vertex AI](https://cloud.google.com/vertex-ai)
- [Cloud Functions](https://cloud.google.com/functions)

## Get Started

Getting started involves (*in this order*):
1. Clone or download a copy of this repository to your development machine.
If you have a git ssh_config:
```bash
# cd ~/ Optional. your user directory is usually a good place to git clone to.
git clone [email protected]:looker-open-source/looker-explore-assistant.git
```

If not:
```bash
# cd ~/ Optional. your user directory is usually a good place to git clone to.
git clone https://github.com/looker-open-source/looker-explore-assistant.git
```
Alternatively, open this repository in:
[![Open in Cloud Shell](https://gstatic.com/cloudssh/images/open-btn.svg)](https://shell.cloud.google.com/cloudshell/editor?cloudshell_git_repo=https://github.com/looker-open-source/looker-explore-assistant.git&cloudshell_workspace=explore-assistant-extension)
2. Make sure [pip](https://pip.pypa.io/en/stable/cli/pip_install/) is installed on your computer to run the `pip install -r requirements.txt` command line in the setup section.
3. Install the [`google-cloud-sdk`](https://cloud.google.com/sdk/docs/install) before starting the backend setup.
>To install google-cloud-sdk with [Homebrew](https://brew.sh/), use `brew install --cask google-cloud-sdk`.
4. Create a GCP project (you’ll need the ID later). It does not have to be the same project as the prompt tables, but using the same project is recommended for simplicity.
5. Create a Looker connection for that BigQuery project
6. Create an empty Looker project
- Add the connection name to the model file
- Configure git
- That’s all you need to do for now. This is where the extension framework will be deployed. The connection should be the same as the one that holds the prompts

The local cloud function backend and example generation require some python packages. It is recommended to create a python virtual environment and install the dependencies:

```bash
# Use python3 on Mac OS
python -m venv .venv
source .venv/bin/activate
pip install -r ./explore-assistant-examples/requirements.txt
pip install -r ./explore-assistant-cloud-function/requirements.txt
```
> If you hit a blocker with directory permissions, use `chmod +x <FILE NAME>` to allow write permissions.

## Setup

1. Backend Setup - setup the GCP backend for communicating with the Vertex API [using these instructions.](./explore-assistant-backend/README.md)
2. Example generation - generate a list of examples and upload them to BigQuery [using these instructions.](./explore-assistant-examples/README.md)
3. Frontend Setup - setup Looker Extension Framework Applications by following [these instructions](./explore-assistant-extension/README.md).

## Recommendations for fine tuning the model

This app uses a one-shot prompt technique for tuning the model, meaning that all the metadata for the model is contained in the prompt. It's a good technique for a small dataset, but for a larger dataset you may want to use a more traditional fine-tuning approach. This is a simple implementation; a more sophisticated approach could generate embeddings for explore metadata and leverage a vector database for indexing.
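To make the one-shot idea concrete, here is a minimal sketch of assembling such a prompt. The metadata string, example pair, and delimiter format are invented for illustration and are not the repository's actual prompt template:

```python
def build_prompt(explore_metadata: str, examples: list, question: str) -> str:
    # One-shot technique: all explore metadata plus example input/output
    # pairs travel inside a single prompt; no model fine-tuning involved.
    shots = "\n".join(f"input: {e['input']}\noutput: {e['output']}" for e in examples)
    return f"{explore_metadata}\n\nExamples:\n{shots}\n\ninput: {question}\noutput:"

# Hypothetical metadata and example; real prompts are built from LookML fields.
prompt = build_prompt(
    "Explore: order_items. Fields: total_sale_price, created_date",
    [{"input": "total sales last year",
      "output": "fields=order_items.total_sale_price&f[order_items.created_date]=last year"}],
    "sales by state this month",
)
print(prompt.endswith("output:"))  # → True
```

The model then completes the text after the final `output:`, mirroring the example pair it was shown.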
19 changes: 17 additions & 2 deletions explore-assistant-backend/README.md
@@ -6,16 +6,32 @@ This Terraform configuration establishes a backend for the Looker Explore Assist

The Explore Assistant also uses a set of examples to improve the quality of its answers. We store those examples in BigQuery. Please see the comparisons below when deciding which deployment approach to use.

## Google Project

A Google Cloud project is necessary to provision backend resources. A new, empty project simplifies installation. If an existing project with resources in it is used, the Terraform flow cannot be used; follow the step-by-step configuration instead.

## Cloud Shell Setup

To simplify the backend installation, you can use the following link to open a Google Cloud Shell.

[![Open in Cloud Shell](https://gstatic.com/cloudssh/images/open-btn.svg)](https://ssh.cloud.google.com/cloudshell/editor?cloudshell_git_repo=https://github.com/bytecodeio/looker-explore-assistant&cloudshell_tutorial=./explore-assistant-backend/cloudshell_README.md&shellonly=true&cloudshell_git_branch=marketplace_deploy)
Within the cloud shell, these [installation instructions](./cloudshell_README.md) will be shown.

## Development Setup

Alternatively, follow the directions below for a manual setup and install.

### What backend should I use?

Here we list the reasons and tradeoffs of each deployment approach in an effort to scope the right backend deployment approach based on individual preferences and existing setups.
Here we list the reasons and tradeoffs of each deployment approach to help scope the right backend based on individual preferences and existing setups. The backend setup defaults to a Cloud Run installation.

**Regardless of Backend**:
* Any Looker database connection can be used for fetching the actual data returned from the natural-language query URL
* They implement the same API: no Looker credentials are stored in the backends, and the arguments are the same (*i.e., model parameters and a prompt*)
* By default both approaches fetch examples from a BigQuery table for simplicity. For Cloud Functions you can modify [this React Hook](../explore-assistant-extension/src/hooks/useBigQueryExamples.ts) and change the `connection_name` on line 18 to point to the non-BQ database connection in Looker that houses your example prompts/training data.

**For Cloud Function/Run**:
* Default method.
* Generally speaking, this approach is recommended for folks who want more development control on the backend
* Your programming language of choice can be used
* Workflows for custom codeflow like using custom models, combining models to improve results, fetching from external datastores, etc. are supported
@@ -32,7 +48,6 @@

## Prerequisites

- Terraform installed on your machine.
- Access to a GCP account with permission to create and manage resources.
- A GCP project where the resources will be deployed.

236 changes: 236 additions & 0 deletions explore-assistant-backend/cloudshell_README.md
@@ -0,0 +1,236 @@
# Looker Explore Assistant Backend Service

This is an automated installer for the GCP Cloud Run backend service.
It is intended to be run in an empty Google Cloud project. If you are using a project that already has resources in it, see the step-by-step configuration section below instead.
To begin, please execute:
```bash
cd explore-assistant-backend/terraform && ./init.sh
```

## Caution
If Terraform reports that it will delete or destroy existing resources, abort the process and use the step-by-step configuration section below instead. For more information on the choice of backend, please see [the Backend README.](./README.md)

# Step by Step Configuration

## 1: Set up Environment Variables
Replace the PROJECT_ID and REGION values with your actual values. The other values can be left as defaults.
``` bash
export PROJECT_ID="your-project-id"
export REGION="us-central1"
export DATASET_ID="explore_assistant"
export CLOUD_RUN_SERVICE_NAME="explore-assistant-api"
export VERTEX_CF_AUTH_TOKEN=$(openssl rand -base64 32)
gcloud config set project $PROJECT_ID
echo $VERTEX_CF_AUTH_TOKEN
```
Please copy the Vertex CF Auth Token that is printed out. This will be used later in the frontend setup.
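If `openssl` is not available, the same token shape can be produced in Python; a minimal sketch of the equivalent of `openssl rand -base64 32`:

```python
import base64
import secrets

def make_vertex_cf_auth_token(num_bytes: int = 32) -> str:
    # Cryptographically random bytes, base64-encoded.
    # 32 bytes of entropy encode to a 44-character string.
    return base64.b64encode(secrets.token_bytes(num_bytes)).decode("ascii")

print(len(make_vertex_cf_auth_token()))  # → 44
```

Any 32 bytes of cryptographically random data will do; the token is simply a shared secret between Looker and the backend.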

## 2: Enable Required APIs
(for Both backend types)
``` bash
gcloud services enable serviceusage.googleapis.com \
cloudresourcemanager.googleapis.com \
iam.googleapis.com \
aiplatform.googleapis.com \
bigquery.googleapis.com \
cloudapis.googleapis.com \
cloudbuild.googleapis.com \
cloudfunctions.googleapis.com \
run.googleapis.com \
storage-api.googleapis.com \
storage.googleapis.com \
compute.googleapis.com \
secretmanager.googleapis.com \
aiplatform.googleapis.com
```
THEN, you must wait a bit. These APIs need time to activate. Have a coffee, drink some water, then come back and proceed with the next step. Let the scripts do the heavy lifting, but give them time.

## 3: Create Service Account
Did you remember to take a break before starting this step? Please do; the APIs need time to activate and propagate changes to regionless services like IAM. :coffee:
(for Both backend types)
```bash
gcloud iam service-accounts create explore-assistant-cf-sa \
--display-name "Looker Explore Assistant Cloud Function SA"

gcloud projects add-iam-policy-binding $PROJECT_ID \
--member "serviceAccount:explore-assistant-cf-sa@$PROJECT_ID.iam.gserviceaccount.com" \
--role "roles/aiplatform.user"
```

## 4: Create Secret in Secret Manager
(for Cloud Function backend ONLY)
``` bash
echo -n $VERTEX_CF_AUTH_TOKEN | gcloud secrets create VERTEX_CF_AUTH_TOKEN \
--replication-policy=user-managed \
--locations=$REGION \
--data-file=-

gcloud secrets add-iam-policy-binding VERTEX_CF_AUTH_TOKEN \
--member "serviceAccount:explore-assistant-cf-sa@$PROJECT_ID.iam.gserviceaccount.com" \
--role "roles/secretmanager.secretAccessor"
```

## 5: Create Storage Bucket for Cloud Function Source
(for Cloud Function backend ONLY)
``` bash
BUCKET_NAME="${PROJECT_ID}-gcf-source-$(openssl rand -hex 4)"
gsutil mb -p $PROJECT_ID -l US gs://$BUCKET_NAME/
```

## 6: Upload Cloud Function Source Code
(for Cloud Function backend ONLY)
```bash
cd ./explore-assistant-cloud-function && zip -r ../function-source.zip * && cd ..
gsutil cp function-source.zip gs://$BUCKET_NAME/
```

## 7: Create Artifact Registry Repository
(for Cloud Function backend ONLY)
```bash
gcloud artifacts repositories create explore-assistant-repo \
--repository-format=docker \
--location=$REGION \
--project=$PROJECT_ID \
--description="Docker repository for Explore Assistant"
```

## 8: Deploy Cloud Function
(for Cloud Function backend ONLY)
```bash
gcloud functions deploy $CLOUD_RUN_SERVICE_NAME \
--gen2 \
--region=$REGION \
--project=$PROJECT_ID \
--runtime=python310 \
--entry-point=cloud_function_entrypoint \
--trigger-http \
--source=gs://$BUCKET_NAME/function-source.zip \
--set-env-vars=REGION=$REGION,PROJECT=$PROJECT_ID \
--set-secrets=VERTEX_CF_AUTH_TOKEN=VERTEX_CF_AUTH_TOKEN:latest \
--max-instances=10 \
--memory=4Gi \
--timeout=60s \
--service-account=explore-assistant-cf-sa@$PROJECT_ID.iam.gserviceaccount.com
```

## 9: Make Cloud Function Public
(for Cloud Function backend ONLY)
```bash
gcloud functions add-iam-policy-binding $CLOUD_RUN_SERVICE_NAME \
--region=$REGION \
--project=$PROJECT_ID \
--member="allUsers" \
--role="roles/cloudfunctions.invoker"
gcloud functions describe $CLOUD_RUN_SERVICE_NAME --gen2 --region=$REGION --format='value(serviceConfig.uri)'
```
The Cloud Function URL will be printed out and should be copied. This will be used in the frontend installation.

## 10: Create BigQuery Dataset
(for Both backend types)
``` bash
bq --location=$REGION mk --dataset $PROJECT_ID:$DATASET_ID
```

## 11: Create BigQuery Tables
(for Both backend types)
``` bash
bq query --use_legacy_sql=false --location=$REGION \
"CREATE OR REPLACE TABLE \`${DATASET_ID}.explore_assistant_examples\` (
explore_id STRING OPTIONS (description = 'Explore id of the explore to pull examples for in a format of -> lookml_model:lookml_explore'),
examples STRING OPTIONS (description = 'Examples for Explore Assistant training. JSON document with list hashes each with input and output keys.')
)"

bq query --use_legacy_sql=false --location=$REGION \
"CREATE OR REPLACE TABLE \`${DATASET_ID}.explore_assistant_refinement_examples\` (
explore_id STRING OPTIONS (description = 'Explore id of the explore to pull examples for in a format of -> lookml_model:lookml_explore'),
examples STRING OPTIONS (description = 'Examples for Explore Assistant training. JSON document with list hashes each with input and output keys.')
)"

bq query --use_legacy_sql=false --location=$REGION \
"CREATE OR REPLACE TABLE \`${DATASET_ID}.explore_assistant_samples\` (
explore_id STRING OPTIONS (description = 'Explore id of the explore to pull examples for in a format of -> lookml_model:lookml_explore'),
samples STRING OPTIONS (description = 'Samples for Explore Assistant Samples displayed in UI. JSON document with listed samples with category, prompt and color keys.')
)"
```
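The column descriptions above imply a specific row shape: `explore_id` in the form `lookml_model:lookml_explore`, and `examples` holding a JSON array of objects with `input` and `output` keys, serialized as a STRING. A sketch of assembling such a row (the values here are invented for illustration, not shipped examples):

```python
import json

examples = [
    {"input": "total sales by month last year",
     "output": "fields=order_items.total_sale_price,order_items.created_month"},
]

row = {
    # Format from the table description: lookml_model:lookml_explore
    "explore_id": "thelook:order_items",
    # The examples column stores the JSON document as a STRING.
    "examples": json.dumps(examples),
}

print(json.loads(row["examples"])[0]["input"])  # → total sales by month last year
```

Rows in this shape are what the example-generation step later inserts into these tables.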

## 12: Create BigQuery Connection and Model
(For BigQuery backend install ONLY)
> **Note:** This process is very expensive; it is better to use the Cloud Function backend.
``` bash
gcloud services enable bigqueryconnection.googleapis.com

bq mk --connection \
--connection_type=CLOUD_RESOURCE \
--project_id=$PROJECT_ID \
--location=$REGION \
explore_assistant_llm

gcloud projects add-iam-policy-binding $PROJECT_ID \
--member "serviceAccount:$(bq show --format=json --connection --project_id=$PROJECT_ID --location=$REGION explore_assistant_llm | jq -r .cloudResource.serviceAccountId)" \
--role "roles/aiplatform.user"

bq query --use_legacy_sql=false --location=$REGION \
"CREATE OR REPLACE MODEL \`${DATASET_ID}.explore_assistant_llm\`
REMOTE WITH CONNECTION \`$PROJECT_ID.$REGION.explore_assistant_llm\`
OPTIONS (endpoint = 'gemini-1.5-flash')"
```

## 13: Optional: Configure Security Settings
If you want to restrict access to your Cloud Function to Looker's specific IP ranges, you can run these additional steps.
First, determine the list of your [Looker IPs](https://cloud.google.com/looker/docs/enabling-secure-db-access#:~:text=The%20list%20of%20IP%20addresses,(es)%20that%20are%20shown.)

## 13.1: Set variables
Please modify the below command to include your [Looker IPs](https://cloud.google.com/looker/docs/enabling-secure-db-access#:~:text=The%20list%20of%20IP%20addresses,(es)%20that%20are%20shown.).
```bash
export ALLOWED_IP_ADDRESSES="your.ip.address/32,second.ip.address/32,third.ip.address/32"
export VPC_NETWORK_NAME=explore-assistant-vpc
export SUBNET_NAME=explore-assistant-subnet
export VPC_CONNECTOR_NAME=eavpcconnector
export SECURITY_POLICY_NAME=eapolicy
```
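A malformed entry in `ALLOWED_IP_ADDRESSES` will only surface later, when the security-policy rule is created, so it can be worth validating the list up front. A small sketch using Python's standard `ipaddress` module:

```python
import ipaddress

def parse_allowed_ips(csv: str) -> list:
    # ip_network raises ValueError on a malformed CIDR, so a bad entry
    # fails here instead of inside the gcloud security-policy command.
    return [str(ipaddress.ip_network(part.strip()))
            for part in csv.split(",") if part.strip()]

print(parse_allowed_ips("203.0.113.10/32, 198.51.100.0/24"))
# → ['203.0.113.10/32', '198.51.100.0/24']
```

The IP addresses shown are documentation placeholders; substitute your actual Looker IPs.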
## 13.2: Create network and subnet
```bash
gcloud compute networks create $VPC_NETWORK_NAME \
--subnet-mode=custom

gcloud compute networks subnets create $SUBNET_NAME \
--network=$VPC_NETWORK_NAME \
--region=$REGION \
--range=10.0.0.0/24
```
## 13.3 Create VPC Connector (takes a while)
```bash
gcloud compute networks vpc-access connectors create $VPC_CONNECTOR_NAME \
--network $VPC_NETWORK_NAME \
--region $REGION \
--range 10.8.0.0/28
```
## 13.4 Update Cloud Function to use VPC Connector
```bash
gcloud run services update $CLOUD_RUN_SERVICE_NAME \
--vpc-connector $VPC_CONNECTOR_NAME \
--region $REGION \
--project $PROJECT_ID
```

## 13.5 Create and configure security policy
```bash
gcloud compute security-policies create $SECURITY_POLICY_NAME \
--description "Restrict access to specific IP addresses"

gcloud compute security-policies rules create 1000 \
--security-policy $SECURITY_POLICY_NAME \
--description "Allow Looker IP addresses" \
--src-ip-ranges $ALLOWED_IP_ADDRESSES \
--action allow

gcloud compute security-policies rules create 2000 \
--security-policy $SECURITY_POLICY_NAME \
--description "Deny all other IP addresses" \
--src-ip-ranges="0.0.0.0/0" \
--action deny-403
```
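The two rules above depend on Cloud Armor evaluating rules in priority order (lowest number first) and stopping at the first match, so the priority-1000 allow fires before the priority-2000 catch-all deny. A toy sketch of that first-match semantics:

```python
import ipaddress

def evaluate(ip: str, rules) -> str:
    # Rules are (priority, source_ranges, action); lowest priority is checked
    # first, and the first matching rule decides the outcome.
    addr = ipaddress.ip_address(ip)
    for _priority, ranges, action in sorted(rules):
        if any(addr in ipaddress.ip_network(r) for r in ranges):
            return action
    return "allow"  # stand-in for the policy's default rule

rules = [
    (1000, ["203.0.113.10/32"], "allow"),  # a hypothetical Looker IP
    (2000, ["0.0.0.0/0"], "deny-403"),     # deny everything else
]

print(evaluate("203.0.113.10", rules))  # → allow
print(evaluate("198.51.100.7", rules))  # → deny-403
```

This is why the allow rule must carry a lower priority number than the deny rule; reversing them would block Looker as well.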

## 13.6: Done with optional security restrictions

For step-by-step debugging or manual configuration of the security settings, refer to the Google Cloud documentation on [Cloud Armor](https://cloud.google.com/armor/docs) and [VPC Service Controls](https://cloud.google.com/vpc-service-controls/docs).
@@ -1,5 +1,6 @@
terraform {
  backend "gcs" {
    bucket = "project-id-terraform-state"
    prefix = "terraform/state"
  }
}