Implementing support to use VPC Network Peering (#219)
* predicting for only the users with traffic in the past 72h - purchase propensity
* running inference only for users events in the past 72h
* including 72h users for all models predictions
* considering null values in TabWorkflow models
* deleting unused pipfile
* upgrading lib versions
* implementing reporting preprocessing as a new pipeline
* adding more code documentation
* adding important information on the main README.md and DEVELOPMENT.md
* adding schedule run name and more code documentation
* implementing a new scheduler using the vertex ai sdk & adding user_id to procedures for consistency (see the scheduling sketch after this list)
* adding more code documentation
* adding code doc to the python custom component
* adding more code documentation
* fixing aggregated predictions query
* removing unnecessary resources from deployment
* Writing MDS guide
* adding the MDS developer and troubleshooting documentation
* fixing deployment for activation pipelines and gemini dataset
* Update README.md
* Update README.md
* Update README.md
* Update README.md
* removing deprecated api
* fixing purchase propensity pipelines names
* adding extra condition for when there is not enough data for the window interval to be applied on backfill procedures
* adding more instructions for post deployment and fixing issues when GA4 export was configured for less than 10 days
* removing unnecessary comments
* adding the number of past days to process in the variables files
* adding comment about combining data from different ga4 export datasets to data store
* fixing small issues with feature engineering and ml pipelines
* fixing hyper parameter tuning for kmeans modeling
* fixing optuna parameters
* adding cloud shell image
* fixing the list of all possible users in the propensity training preparation tables
* additional guardrails for when there is not enough data
* adding more documentation
* adding more doc to feature store
* add feature store documentation
* adding ml pipelines docs
* adding ml pipelines docs
* adding more documentation
* adding user agent client info
* fixing scope of client info
* fix
* removing client_info from vertex components
* fixing versioning of tf submodules
* reconfiguring meta providers
* fixing issue 187
* chore(deps): upgrade terraform providers and modules version
* chore(deps): set the provider version
* chore: formatting
* fix: brand naming
* fix: typo
* fixing secrets issue
* implementing secrets region as tf variable
* implementing secrets region as tf variable
* last changes requested by lgrangeau
* documenting keys location better
* implementing vpc peering network

---------

Co-authored-by: Carlos Timoteo <[email protected]>
Co-authored-by: Laurent Grangeau <[email protected]>
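For context on the scheduler and VPC Network Peering bullets above, the pattern they refer to looks roughly like the sketch below. This is a minimal, illustrative example of the public google.cloud.aiplatform SDK, not the repository's actual code; the project ID, region, bucket, template path, cron string, and network name are placeholder assumptions.

# Minimal sketch (not the repository's code): scheduling a compiled KFP pipeline
# with the Vertex AI SDK. All identifiers below are placeholder assumptions.
from google.cloud import aiplatform

aiplatform.init(
    project="my-project",                    # assumed project ID
    location="us-central1",                  # assumed region
    staging_bucket="gs://my-pipeline-root",  # assumed staging bucket
)

pipeline_job = aiplatform.PipelineJob(
    display_name="purchase-propensity-prediction",        # assumed name
    template_path="gs://my-pipeline-root/pipeline.yaml",   # compiled KFP template (assumed path)
    pipeline_root="gs://my-pipeline-root",
    enable_caching=False,
)

# When the deployment uses VPC Network Peering, a job can be submitted against the
# peered network; the exact network name depends on the Terraform configuration.
# pipeline_job.submit(network="projects/1234567890/global/networks/my-peered-vpc")

schedule = aiplatform.PipelineJobSchedule(
    pipeline_job=pipeline_job,
    display_name="purchase-propensity-daily",  # assumed schedule run name
)

schedule.create(
    cron="TZ=America/New_York 0 6 * * *",  # assumed cron: daily at 06:00
    max_concurrent_run_count=1,
    max_run_count=None,  # no run limit
)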
1 parent: b4d0fa7 · commit: ec07683
Showing 8 changed files with 288 additions and 18 deletions.
Only one of the changed files is rendered below (the ma-components Poetry project file); the remaining diffs are large and are not rendered here.
@@ -2,25 +2,29 @@
 name = "ma-components"
 version = "1.0.0"
 description = "contains components used in marketing analytics project. the need is to package the components and containerise so that they can be used from the python function based component"
-authors = ["Christos Aniftos <[email protected]>"]
+authors = ["Marketing Analytics Solutions Architects <[email protected]>"]
 license = "Apache 2.0"
 readme = "README.md"
 packages = [{include = "ma_components"}]

 [tool.poetry.dependencies]
 python = ">=3.8,<3.11"
+pip = "23.3"
+kfp = "2.4.0"
 ## Fixing this error: https://stackoverflow.com/questions/76175487/sudden-importerror-cannot-import-name-appengine-from-requests-packages-urlli
-kfp = "2.0.0-rc.2"
+#kfp = "2.0.0-rc.2"
 #kfp = {version = "2.0.0-b12", allow-prereleases = true}
 #kfp = {version = "2.0.0-b16", allow-prereleases = true}
-kfp-server-api = "2.0.0-rc.1"
+kfp-server-api = "2.0.5"
+#kfp-server-api = "2.0.0-rc.1"
 #kfp-server-api = "2.0.0.a6"
 #kfp-server-api = "2.0.0b1"
 urllib3 = "1.26.18"
 toml = "^0.10.2"
 docker = "^6.0.1"
 google-cloud-bigquery = "2.30.0"
-google-cloud-aiplatform = "1.52.0"
+#google-cloud-aiplatform = "1.52.0"
+google-cloud-aiplatform = "1.70.0"
 shapely = "<2.0.0"
 google-cloud-pubsub = "2.15.0"
 #google-cloud-pipeline-components = "1.0.33"
@@ -35,6 +39,8 @@ pyarrow = "15.0.2"
 google-auth-oauthlib = "^1.2.1"
 oauth2client = "^4.1.3"
 google-cloud-core = "^2.4.1"
+sympy="1.13.1"
+google-cloud-resource-manager="1.13.0"

 [build-system]
 requires = ["poetry-core>=1.0.0"]
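Since most of these dependencies are exact pins, a quick check inside the built component image can confirm that the environment actually resolves to them. The snippet below is an illustrative sketch, not part of the repository; it only uses package names and versions visible in the diff above (caret-ranged dependencies are omitted).

# Illustrative sanity check (not part of the repository): verify that the
# installed distributions match the exact pins introduced in this commit.
from importlib.metadata import version, PackageNotFoundError

EXPECTED = {
    "kfp": "2.4.0",
    "kfp-server-api": "2.0.5",
    "google-cloud-aiplatform": "1.70.0",
    "google-cloud-bigquery": "2.30.0",
    "urllib3": "1.26.18",
    "sympy": "1.13.1",
    "google-cloud-resource-manager": "1.13.0",
}

for dist, expected in EXPECTED.items():
    try:
        installed = version(dist)
    except PackageNotFoundError:
        print(f"MISSING  {dist} (expected {expected})")
        continue
    status = "OK" if installed == expected else "MISMATCH"
    print(f"{status:8} {dist}: installed {installed}, expected {expected}")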