[Do not merge] DAI e2e test flow to support migration #22

Draft · wants to merge 1 commit into base: main
387 changes: 387 additions & 0 deletions DAI:Migration Test.ipynb
@@ -0,0 +1,387 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "2d6f9dba",
"metadata": {},
"source": [
"## Create an AI engine\n",
"Create a Driverless AI engine to get access to automated machine learning for building models on our data.\n",
"\n",
"See the `2 AI Engines` tutorial for more details on how to use and interact with **Steam** for creating and managing your AI Engines.\n"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "7737ea3f",
"metadata": {},
"outputs": [],
"source": [
"from h2o_ai_cloud import steam_client\n",
"from h2osteam.clients import DriverlessClient"
]
},
{
"cell_type": "markdown",
"id": "9dd339a5",
"metadata": {},
"source": [
"## Securely connect \n",
"We first connect to the H2O AI Cloud using our platform token to create a token provider object. We can then use this object to log into Steam and other APIs."
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "6df7371e",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Visit https://cloud-dev.h2o.ai/auth/get-platform-token to get your platform token\n"
]
}
],
"source": [
"steam = steam_client()"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "5092a663",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"['jg-driverless-kubernetes', 'aaa', 'admin-driverless-kubernetes', 'dai-with-h2o-drive-k8s', 'default-driverless-kubernetes-low-mem', 'gbq-driverless-kubernetes', 'default-driverless-kubernetes', 'visitor-driverless-kubernetes', 'se-driverless-kubernetes', 'dai-multinode-tmpfs', 'dai-multinode-tmpfs-14cores', 'multinode-single-user', 'multinode-single-user-fixed-workers', 'multinode-single-user-no-dask', 'multinode-single-user-no-gpu', 'default-driverless-kubernetes-migration', 'multinode-single-user-cpu', 'default-driverless-kubernetes-max_cores8', 'custom-driverless-cpu-test', 'custom-driverless-cpu-test-4core']\n"
]
}
],
"source": [
"# Profiles determine the resources that an AI Engine will have access to - see the Steam tutorial for more information\n",
"dai_profiles = [profile[\"name\"] for profile in steam.get_profiles() if profile[\"profile_type\"] == \"driverless_kubernetes\"]\n",
"\n",
"print(dai_profiles)"
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "2398dfce",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"['default-driverless-kubernetes-migration']\n"
]
}
],
"source": [
"# Get the migration profile\n",
"migration_profile = \"default-driverless-kubernetes-migration\"\n",
"dai_profiles = [profile for profile in dai_profiles if profile == migration_profile]\n",
"print(dai_profiles)"
]
},
{
"cell_type": "code",
"execution_count": 26,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"DAI status is running\n",
"DAI run status is running\n"
]
}
],
"source": [
"# Create DAI instance\n",
"DAI_VERSION = \"1.10.4.99\"\n",
"dai_instance_name = \"joby-dai-e2e-migration\"\n",
"\n",
"try:\n",
" dai_instance = DriverlessClient(steam).get_instance(name=dai_instance_name)\n",
"except Exception:\n",
" print(f'Starting new DAI {DAI_VERSION} instance')\n",
" dai_instance = DriverlessClient(steam).launch_instance(\n",
" name = dai_instance_name,\n",
" version = DAI_VERSION,\n",
" profile_name=dai_profiles[0],\n",
" )\n",
"status = dai_instance.status()\n",
"print(f'DAI status is {status}')\n",
"\n",
"if status == \"stopped\":\n",
" dai_instance.start()\n",
"\n",
"print(f'DAI run status is {dai_instance.status()}')"
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "code",
"execution_count": 28,
"outputs": [],
"source": [
"# Get the DAI python client\n",
"dai_client = dai_instance.connect()"
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "markdown",
"source": [
"## Driverless Python client APIs\n",
"Explore DAI functionality through the Python client."
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "code",
"execution_count": 32,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"1.10.4\n",
"['upload', 'file', 'hdfs', 's3', 'recipe_file', 'recipe_url', 'h2o_drive', 'feature_store']\n"
]
}
],
"source": [
"print(dai_client.server.version)\n",
"print(dai_client.connectors.list())"
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "code",
"execution_count": 39,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"1.10.2-walmart-catboost-byor (c74a42bc-aeb1-11ec-8f4b-0242ac120002)\n"
]
}
],
"source": [
"# List of experiments available\n",
"list_of_experiments = dai_client.experiments.list()\n",
"experiment = [exp for exp in list_of_experiments if '1.10.2' in exp.name][0]\n",
"print(experiment)"
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "markdown",
"source": [
"### Fetch experiment datasets\n"
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "code",
"execution_count": 48,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{'train_dataset': <class 'Dataset'> ff759814-aeaf-11ec-b224-0242ac120002 walmart_train.csv, 'validation_dataset': None, 'test_dataset': None}\n",
"ff759814-aeaf-11ec-b224-0242ac120002\n"
]
}
],
"source": [
"assert experiment.is_running() is False\n",
"print(experiment.datasets)\n",
"dataset = experiment.datasets.get('train_dataset')\n",
"print(dataset.key)"
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "code",
"execution_count": 55,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"dai_e2e_test_migration\n"
]
}
],
"source": [
"# Create a test project\n",
"project_name = \"dai_e2e_test_migration\"\n",
"projects = dai_client.projects.list()\n",
"\n",
"list_of_project = [p for p in projects if project_name in p.name]\n",
"if len(list_of_project) == 0:\n",
" project = dai_client.projects.create(\n",
" name = project_name,\n",
" description = \"Test migration\",\n",
" )\n",
"else:\n",
" project = list_of_project[0]\n",
"print(project.name)"
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "code",
"execution_count": 56,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
" | Type | Key | Name\n",
"----+------------+--------------------------------------+------------------------------\n",
" 0 | Experiment | c74a42bc-aeb1-11ec-8f4b-0242ac120002 | 1.10.2-walmart-catboost-byor\n",
"{'test_datasets': | Type | Key | Name\n",
"----+--------+-------+--------, 'train_datasets': | Type | Key | Name\n",
"----+---------+--------------------------------------+-------------------\n",
" 0 | Dataset | ff759814-aeaf-11ec-b224-0242ac120002 | walmart_train.csv, 'validation_datasets': | Type | Key | Name\n",
"----+--------+-------+--------}\n"
]
}
],
"source": [
"# Attach experiment and dataset to the project\n",
"project.link_experiment(experiment = experiment)\n",
"\n",
"print(project.experiments)\n",
"print(project.datasets)"
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "markdown",
"source": [
"### Query for the experiment artifacts in MLOps\n"
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "code",
"execution_count": 58,
"outputs": [],
"source": [
"from h2o_ai_cloud import mlops_client"
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "code",
"execution_count": 60,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Visit https://cloud-dev.h2o.ai/auth/get-platform-token to get your platform token\n"
]
}
],
"source": [
"mlops = mlops_client()"
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "code",
"execution_count": 63,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"27498d52-15c5-4a48-b2fa-f136bc90522c dai_e2e_test_migration\n",
"492b8ec5-5a48-4551-870c-edcc606fc5c3 dai_e2e_migration\n",
"cca74583-bf46-492b-b017-4070197f7ae3 test_project\n"
]
}
],
"source": [
"mlops_projects = mlops.storage.project.list_projects(body={}).project\n",
"for p in mlops_projects:\n",
" print(p.id, p.display_name)"
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "code",
"execution_count": null,
"outputs": [],
"source": [
"# Stop DAI instance\n",
"dai_instance.stop()"
],
"metadata": {
"collapsed": false
}
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.15"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
4 changes: 2 additions & 2 deletions README.md
@@ -8,8 +8,8 @@ These tutorials were explicitly tested last in H2O AI Cloud v22.07.0 and python 3

```bash
pip install h2o_authn==0.1.1
pip install https://enterprise-steam.s3.amazonaws.com/release/1.8.12/python/h2osteam-1.8.12-py2.py3-none-any.whl
pip install https://s3.amazonaws.com/artifacts.h2o.ai/releases/ai/h2o/mlops/rel-0.56.1/2/h2o_mlops_client-0.56.1%2Bdd66f93.rel0.56.1.2-py2.py3-none-any.whl
pip install https://enterprise-steam.s3.amazonaws.com/release/1.8.14/python/h2osteam-1.8.14-py2.py3-none-any.whl
pip install https://s3.amazonaws.com/artifacts.h2o.ai/releases/ai/h2o/mlops/rel-0.57.2/2/h2o_mlops_client-0.57.2%2B7b19723.rel0.57.2.2-py2.py3-none-any.whl
pip install https://h2o-release.s3.amazonaws.com/h2o/rel-zumbo/2/Python/h2o-3.36.1.2-py2.py3-none-any.whl

```
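After installing these wheels, a quick check that the client packages are importable can catch a broken install early. A minimal sketch; the top-level module names are inferred from the pip commands above and may differ from the actual wheel contents:

```python
import importlib.util


def missing_packages(names):
    """Return the module names that cannot be found in the current environment."""
    return [name for name in names if importlib.util.find_spec(name) is None]


# Module names assumed to correspond to the wheels installed above:
print(missing_packages(["h2o_authn", "h2osteam", "h2o_mlops_client", "h2o"]))
```

An empty list means every package resolved; any names printed point at the wheel that needs reinstalling.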