Handle changes from Databricks Python SDK 0.37.0 #320

Merged
merged 6 commits on Nov 15, 2024

Conversation

JCZuurmond
Member

@JCZuurmond JCZuurmond commented Nov 15, 2024

Handle changes from Databricks Python SDK 0.37.0: LakeviewAPI now works with a Dashboard object

  • Requires a change in the Python SDK --> implemented a workaround; see this issue about removing the workaround once the SDK resolves the LakeviewAPI deployment issues (a sketch of the API change follows below)
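
For context, a minimal sketch of the SDK change this PR adapts to, assuming the post-0.37.0 signatures `lakeview.create(dashboard=...)` and `lakeview.update(dashboard_id=..., dashboard=...)`; the field names and values below are illustrative and not taken from this PR's diff:

```python
# Hedged sketch: with databricks-sdk >= 0.37.0 the Lakeview API is assumed to take a
# Dashboard object instead of the individual keyword arguments it accepted before.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.dashboards import Dashboard

w = WorkspaceClient()

dashboard = Dashboard(  # illustrative field values
    display_name="Sales overview",
    serialized_dashboard='{"pages": []}',
    parent_path="/Workspace/Users/someone@example.com",
    warehouse_id="<warehouse-id>",
)

# Previously the same fields were passed as keyword arguments to create()/update();
# now the whole Dashboard object is passed instead.
created = w.lakeview.create(dashboard=dashboard)
w.lakeview.update(dashboard_id=created.dashboard_id, dashboard=dashboard)
```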

@JCZuurmond JCZuurmond marked this pull request as ready for review November 15, 2024 10:22

github-actions bot commented Nov 15, 2024

❌ 33/36 passed, 3 failed, 4 skipped, 4m41s total

❌ test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import NotFound\nbackend = RuntimeBackend()\ntry:\n query_response = backend.fetch("SELECT * FROM TEST_SCHEMA.__RANDOM__")\n return "FAILED"\nexcept NotFound as e:\n return "PASSED"\n]: assert '{"ts": "2024...]}}\n"PASSED"' == 'PASSED' (20.956s)
... (skipped 14294 bytes)
Using Databricks Metadata Service authentication
10:31 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
<   "active": true,
<   "displayName": "labs-runtime-identity",
<   "emails": [
<     {
<       "primary": true,
<       "type": "work",
<       "value": "**REDACTED**"
<     }
<   ],
<   "entitlements": [
<     {
<       "value": "**REDACTED**"
<     },
<     "... (1 additional elements)"
<   ],
<   "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
<   "groups": [
<     {
<       "$ref": "Groups/300667344111082",
<       "display": "labs.scope.runtime",
<       "type": "direct",
<       "value": "**REDACTED**"
<     }
<   ],
<   "id": "4643477475987733",
<   "name": {
<     "givenName": "labs-runtime-identity"
<   },
<   "schemas": [
<     "urn:ietf:params:scim:schemas:core:2.0:User",
<     "... (1 additional elements)"
<   ],
<   "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
10:31 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmpgcxhfdi7/working-copy in /tmp/tmpgcxhfdi7
10:31 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.Q6hW/wheels/databricks_labs_lsql-0.13.1+720241115103101-py3-none-any.whl
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.Q6hW/wheels) does not exist."
< }
10:31 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.Q6hW/wheels
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
>   "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.Q6hW/wheels"
> }
< 200 OK
< {}
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 4237054998684589
< }
10:31 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
10:31 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.Q6hW/version.json
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 4237054998684595
< }
10:31 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 8.0,
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_memory_mb": 32768,
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "[email protected]",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "[email protected]",
<     "DatabricksInstanceGroupId": "-8854613105865987054",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "[email protected]",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.8.17",
<     "instance_id": "925dca160e3b4a04b67c09f0c33a089e",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "880251be7d884b56b843a65cec9b0b08",
<     "private_ip": "10.179.10.17",
<     "public_dns": "",
<     "start_timestamp": 1731666196640
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D8as_v4",
<   "effective_spark_version": "16.0.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1731666293600,
<   "last_restarted_time": 1731666236329,
<   "last_state_loss_time": 1731666236300,
<   "node_type_id": "Standard_D8as_v4",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 5394234943045964788,
<   "spark_version": "16.0.x-scala2.12",
<   "spec": {
<     "autotermination_minutes": 60,
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "16.0.x-scala2.12"
<   },
<   "start_time": 1731598210709,
<   "state": "RUNNING",
<   "state_message": ""
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/contexts/create
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "8982472358871536790"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=8982472358871536790
< 200 OK
< {
<   "id": "8982472358871536790",
<   "status": "Pending"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=8982472358871536790: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~1s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=8982472358871536790
< 200 OK
< {
<   "id": "8982472358871536790",
<   "status": "Pending"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=8982472358871536790: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=8982472358871536790
< 200 OK
< {
<   "id": "8982472358871536790",
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (110 more bytes)",
>   "contextId": "8982472358871536790",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "65fb120b399340a98b3eea4b2147da45"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=65fb120b399340a98b3eea4b2147da45&contextId=8982472358871536790
< 200 OK
< {
<   "id": "65fb120b399340a98b3eea4b2147da45",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=65fb120b399340a98b3eea4b2147da45, context_id=8982472358871536790: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=65fb120b399340a98b3eea4b2147da45&contextId=8982472358871536790
< 200 OK
< {
<   "id": "65fb120b399340a98b3eea4b2147da45",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=65fb120b399340a98b3eea4b2147da45, context_id=8982472358871536790: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=65fb120b399340a98b3eea4b2147da45&contextId=8982472358871536790
< 200 OK
< {
<   "id": "65fb120b399340a98b3eea4b2147da45",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=65fb120b399340a98b3eea4b2147da45, context_id=8982472358871536790: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=65fb120b399340a98b3eea4b2147da45&contextId=8982472358871536790
< 200 OK
< {
<   "id": "65fb120b399340a98b3eea4b2147da45",
<   "results": {
<     "data": "Processing /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.Q6hW/wheels/databricks_labs_ls... (3270 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "import json\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors ... (204 more bytes)",
>   "contextId": "8982472358871536790",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "f6be8b10548a4b6e8594f1ec92adf849"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=f6be8b10548a4b6e8594f1ec92adf849&contextId=8982472358871536790
< 200 OK
< {
<   "id": "f6be8b10548a4b6e8594f1ec92adf849",
<   "results": {
<     "data": "{\"ts\": \"2024-11-15 10:31:20,603\", \"level\": \"ERROR\", \"logger\": \"SQLQueryContextLogger\", \"msg\": \"[... (13306 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 WARNING [databricks.sdk] cannot parse converted return statement. Just returning text
Traceback (most recent call last):
  File "/home/runner/work/lsql/lsql/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/commands.py", line 123, in run
    return json.loads(results.data)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/__init__.py", line 346, in loads
    return _TEST_SCHEMA_decoder.decode(s)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 2 column 1 (char 13394)
[gw0] linux -- Python 3.10.15 /home/runner/work/lsql/lsql/.venv/bin/python
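
The failure above (and the two below) follows one pattern: the remote command returns an ERROR log record from SQLQueryContextLogger on the first line and the JSON-encoded return value ("PASSED") on the next line, so `json.loads` in blueprint's `commands.run` parses the first object and then raises on the extra data. A minimal reproduction, with illustrative record content rather than the full payload from the log:

```python
# Reproduces the JSONDecodeError seen in the traceback: a JSON log record followed
# by the JSON-encoded return value on the next line is not a single JSON document.
import json

returned_text = (
    '{"ts": "2024-11-15 10:31:20,603", "level": "ERROR", '
    '"logger": "SQLQueryContextLogger", "msg": "..."}\n'
    '"PASSED"'
)

try:
    json.loads(returned_text)  # parses the first object, then hits line 2
except json.JSONDecodeError as err:
    print(err)  # Extra data: line 2 column 1 (char ...)
```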
❌ test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import NotFound\nbackend = RuntimeBackend()\ntry:\n backend.execute("SELECT * FROM TEST_SCHEMA.__RANDOM__")\n return "FAILED"\nexcept NotFound as e:\n return "PASSED"\n]: assert '{"ts": "2024...]}}\n"PASSED"' == 'PASSED' (21.482s)
... (skipped 14294 bytes)
Using Databricks Metadata Service authentication
10:31 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
<   "active": true,
<   "displayName": "labs-runtime-identity",
<   "emails": [
<     {
<       "primary": true,
<       "type": "work",
<       "value": "**REDACTED**"
<     }
<   ],
<   "entitlements": [
<     {
<       "value": "**REDACTED**"
<     },
<     "... (1 additional elements)"
<   ],
<   "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
<   "groups": [
<     {
<       "$ref": "Groups/300667344111082",
<       "display": "labs.scope.runtime",
<       "type": "direct",
<       "value": "**REDACTED**"
<     }
<   ],
<   "id": "4643477475987733",
<   "name": {
<     "givenName": "labs-runtime-identity"
<   },
<   "schemas": [
<     "urn:ietf:params:scim:schemas:core:2.0:User",
<     "... (1 additional elements)"
<   ],
<   "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
10:31 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmp0ue9srb7/working-copy in /tmp/tmp0ue9srb7
10:31 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.NAAt/wheels/databricks_labs_lsql-0.13.1+720241115103101-py3-none-any.whl
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.NAAt/wheels) does not exist."
< }
10:31 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.NAAt/wheels
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
>   "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.NAAt/wheels"
> }
< 200 OK
< {}
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 4237054998684590
< }
10:31 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
10:31 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.NAAt/version.json
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 4237054998684594
< }
10:31 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 8.0,
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_memory_mb": 32768,
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "[email protected]",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "[email protected]",
<     "DatabricksInstanceGroupId": "-8854613105865987054",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "[email protected]",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.8.17",
<     "instance_id": "925dca160e3b4a04b67c09f0c33a089e",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "880251be7d884b56b843a65cec9b0b08",
<     "private_ip": "10.179.10.17",
<     "public_dns": "",
<     "start_timestamp": 1731666196640
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D8as_v4",
<   "effective_spark_version": "16.0.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1731666293600,
<   "last_restarted_time": 1731666236329,
<   "last_state_loss_time": 1731666236300,
<   "node_type_id": "Standard_D8as_v4",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 5394234943045964788,
<   "spark_version": "16.0.x-scala2.12",
<   "spec": {
<     "autotermination_minutes": 60,
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "16.0.x-scala2.12"
<   },
<   "start_time": 1731598210709,
<   "state": "RUNNING",
<   "state_message": ""
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/contexts/create
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "7323052038081084640"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7323052038081084640
< 200 OK
< {
<   "id": "7323052038081084640",
<   "status": "Pending"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7323052038081084640: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~1s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7323052038081084640
< 200 OK
< {
<   "id": "7323052038081084640",
<   "status": "Pending"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7323052038081084640: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7323052038081084640
< 200 OK
< {
<   "id": "7323052038081084640",
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (110 more bytes)",
>   "contextId": "7323052038081084640",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "9ed8ee611424496ebc4535affb6a5db4"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=9ed8ee611424496ebc4535affb6a5db4&contextId=7323052038081084640
< 200 OK
< {
<   "id": "9ed8ee611424496ebc4535affb6a5db4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=9ed8ee611424496ebc4535affb6a5db4, context_id=7323052038081084640: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=9ed8ee611424496ebc4535affb6a5db4&contextId=7323052038081084640
< 200 OK
< {
<   "id": "9ed8ee611424496ebc4535affb6a5db4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=9ed8ee611424496ebc4535affb6a5db4, context_id=7323052038081084640: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=9ed8ee611424496ebc4535affb6a5db4&contextId=7323052038081084640
< 200 OK
< {
<   "id": "9ed8ee611424496ebc4535affb6a5db4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=9ed8ee611424496ebc4535affb6a5db4, context_id=7323052038081084640: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=9ed8ee611424496ebc4535affb6a5db4&contextId=7323052038081084640
< 200 OK
< {
<   "id": "9ed8ee611424496ebc4535affb6a5db4",
<   "results": {
<     "data": "Processing /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.NAAt/wheels/databricks_labs_ls... (3270 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "import json\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors ... (189 more bytes)",
>   "contextId": "7323052038081084640",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "741685f193fd40aba06f633767eb504c"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=741685f193fd40aba06f633767eb504c&contextId=7323052038081084640
< 200 OK
< {
<   "id": "741685f193fd40aba06f633767eb504c",
<   "results": {
<     "data": "{\"ts\": \"2024-11-15 10:31:21,146\", \"level\": \"ERROR\", \"logger\": \"SQLQueryContextLogger\", \"msg\": \"[... (13306 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 WARNING [databricks.sdk] cannot parse converted return statement. Just returning text
Traceback (most recent call last):
  File "/home/runner/work/lsql/lsql/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/commands.py", line 123, in run
    return json.loads(results.data)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/__init__.py", line 346, in loads
    return _TEST_SCHEMA_decoder.decode(s)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 2 column 1 (char 13394)
[gw1] linux -- Python 3.10.15 /home/runner/work/lsql/lsql/.venv/bin/python
❌ test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import NotFound\nbackend = RuntimeBackend()\ntry:\n query_response = backend.fetch("DESCRIBE __RANDOM__")\n return "FAILED"\nexcept NotFound as e:\n return "PASSED"\n]: assert '{"ts": "2024...]}}\n"PASSED"' == 'PASSED' (19.868s)
... (skipped 14138 bytes)
Using Databricks Metadata Service authentication
10:31 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
<   "active": true,
<   "displayName": "labs-runtime-identity",
<   "emails": [
<     {
<       "primary": true,
<       "type": "work",
<       "value": "**REDACTED**"
<     }
<   ],
<   "entitlements": [
<     {
<       "value": "**REDACTED**"
<     },
<     "... (1 additional elements)"
<   ],
<   "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
<   "groups": [
<     {
<       "$ref": "Groups/300667344111082",
<       "display": "labs.scope.runtime",
<       "type": "direct",
<       "value": "**REDACTED**"
<     }
<   ],
<   "id": "4643477475987733",
<   "name": {
<     "givenName": "labs-runtime-identity"
<   },
<   "schemas": [
<     "urn:ietf:params:scim:schemas:core:2.0:User",
<     "... (1 additional elements)"
<   ],
<   "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
10:31 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmp6vyfvfx8/working-copy in /tmp/tmp6vyfvfx8
10:31 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.nOgJ/wheels/databricks_labs_lsql-0.13.1+720241115103121-py3-none-any.whl
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.nOgJ/wheels) does not exist."
< }
10:31 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.nOgJ/wheels
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
>   "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.nOgJ/wheels"
> }
< 200 OK
< {}
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 4237054998684624
< }
10:31 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
10:31 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.nOgJ/version.json
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 4237054998684626
< }
10:31 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 8.0,
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_memory_mb": 32768,
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "[email protected]",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "[email protected]",
<     "DatabricksInstanceGroupId": "-8854613105865987054",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "[email protected]",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.8.17",
<     "instance_id": "925dca160e3b4a04b67c09f0c33a089e",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "880251be7d884b56b843a65cec9b0b08",
<     "private_ip": "10.179.10.17",
<     "public_dns": "",
<     "start_timestamp": 1731666196640
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D8as_v4",
<   "effective_spark_version": "16.0.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1731666293600,
<   "last_restarted_time": 1731666236329,
<   "last_state_loss_time": 1731666236300,
<   "node_type_id": "Standard_D8as_v4",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 5394234943045964788,
<   "spark_version": "16.0.x-scala2.12",
<   "spec": {
<     "autotermination_minutes": 60,
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "16.0.x-scala2.12"
<   },
<   "start_time": 1731598210709,
<   "state": "RUNNING",
<   "state_message": ""
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/contexts/create
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "4908315292167573232"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=4908315292167573232
< 200 OK
< {
<   "id": "4908315292167573232",
<   "status": "Pending"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=4908315292167573232: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~1s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=4908315292167573232
< 200 OK
< {
<   "id": "4908315292167573232",
<   "status": "Pending"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=4908315292167573232: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=4908315292167573232
< 200 OK
< {
<   "id": "4908315292167573232",
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (110 more bytes)",
>   "contextId": "4908315292167573232",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=657f3f2567d044deb834f4cdde1dcfc4, context_id=4908315292167573232: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=657f3f2567d044deb834f4cdde1dcfc4, context_id=4908315292167573232: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=657f3f2567d044deb834f4cdde1dcfc4, context_id=4908315292167573232: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": {
<     "data": "Processing /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.nOgJ/wheels/databricks_labs_ls... (3270 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "import json\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors ... (191 more bytes)",
>   "contextId": "4908315292167573232",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "b9f2d30da7e34f8280e2603fc8bcc90a"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=b9f2d30da7e34f8280e2603fc8bcc90a&contextId=4908315292167573232
< 200 OK
< {
<   "id": "b9f2d30da7e34f8280e2603fc8bcc90a",
<   "results": {
<     "data": "{\"ts\": \"2024-11-15 10:31:41,015\", \"level\": \"ERROR\", \"logger\": \"SQLQueryContextLogger\", \"msg\": \"[... (13170 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 WARNING [databricks.sdk] cannot parse converted return statement. Just returning text
Traceback (most recent call last):
  File "/home/runner/work/lsql/lsql/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/commands.py", line 123, in run
    return json.loads(results.data)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/__init__.py", line 346, in loads
    return _TEST_SCHEMA_decoder.decode(s)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 2 column 1 (char 13258)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=4908315292167573232
< 200 OK
< {
<   "id": "4908315292167573232",
<   "status": "Pending"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=4908315292167573232: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=4908315292167573232
< 200 OK
< {
<   "id": "4908315292167573232",
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (110 more bytes)",
>   "contextId": "4908315292167573232",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=657f3f2567d044deb834f4cdde1dcfc4, context_id=4908315292167573232: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=657f3f2567d044deb834f4cdde1dcfc4, context_id=4908315292167573232: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=657f3f2567d044deb834f4cdde1dcfc4, context_id=4908315292167573232: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": {
<     "data": "Processing /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.nOgJ/wheels/databricks_labs_ls... (3270 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "import json\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors ... (191 more bytes)",
>   "contextId": "4908315292167573232",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "b9f2d30da7e34f8280e2603fc8bcc90a"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=b9f2d30da7e34f8280e2603fc8bcc90a&contextId=4908315292167573232
< 200 OK
< {
<   "id": "b9f2d30da7e34f8280e2603fc8bcc90a",
<   "results": {
<     "data": "{\"ts\": \"2024-11-15 10:31:41,015\", \"level\": \"ERROR\", \"logger\": \"SQLQueryContextLogger\", \"msg\": \"[... (13170 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 WARNING [databricks.sdk] cannot parse converted return statement. Just returning text
Traceback (most recent call last):
  File "/home/runner/work/lsql/lsql/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/commands.py", line 123, in run
    return json.loads(results.data)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/__init__.py", line 346, in loads
    return _TEST_SCHEMA_decoder.decode(s)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 2 column 1 (char 13258)
[gw1] linux -- Python 3.10.15 /home/runner/work/lsql/lsql/.venv/bin/python

Running from acceptance #450
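
For context on the JSONDecodeError above: `json.loads` accepts exactly one JSON document per call, and here the command output starts with a JSON-formatted log record followed by the converted return value on the next line, so the decoder stops after the log record and reports the trailing data as "Extra data" at line 2. A minimal sketch of that failure mode (the sample payload below is illustrative, not the actual ~13 KB output):

```python
import json

# Illustrative stand-in for the command output: a JSON log record, then the return value.
payload = '{"ts": "2024-11-15 10:31:41", "level": "ERROR", "msg": "query failed"}\n"PASSED"'

try:
    json.loads(payload)  # only one JSON document is allowed per call
except json.JSONDecodeError as err:
    print(err)  # Extra data: line 2 column 1 ...

# Under the assumption that the real return value is the last line, it can still be recovered:
print(json.loads(payload.splitlines()[-1]))  # PASSED
```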

Collaborator

@nfx nfx left a comment

lgtm

@nfx nfx merged commit 69c6e97 into main Nov 15, 2024
8 of 9 checks passed
@nfx nfx deleted the fix/databricks-sdk-0370-changes branch November 15, 2024 10:56
github-merge-queue bot pushed a commit to databrickslabs/ucx that referenced this pull request Nov 18, 2024
Pin the Databricks SDK to a minor version to circumvent the problems with the
[0.37.0
release](https://github.com/databricks/databricks-sdk-py/releases/tag/v0.37.0).
Will unpin later once those issues are resolved.

Waiting on solution in:
- [ ] [lsql](databrickslabs/lsql#320)
- [ ] [Databricks python
sdk](databricks/databricks-sdk-py#824)
@asnare
Contributor

asnare commented Nov 18, 2024

@nfx, @JCZuurmond: Should this PR have included a change to pyproject.toml? After these changes I think we require databricks-sdk~=0.37?

@JCZuurmond
Member Author

That is already within range: "databricks-sdk>=0.29.0".
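
For readers double-checking the constraint arithmetic, here is a small sketch using the third-party `packaging` library (an assumption for illustration, not a dependency of this project): 0.37.0 already satisfies the existing `>=0.29.0` bound, while a compatible-release pin such as `~=0.37.0` keeps 0.37.x and excludes 0.38.0.

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# The existing pyproject constraint admits 0.37.x (and anything newer).
print(Version("0.37.0") in SpecifierSet(">=0.29.0"))  # True

# A compatible-release pin such as "~=0.37.0" keeps 0.37.x but excludes 0.38.0.
print(Version("0.37.2") in SpecifierSet("~=0.37.0"))  # True
print(Version("0.38.0") in SpecifierSet("~=0.37.0"))  # False
```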

JCZuurmond pushed a commit that referenced this pull request Nov 19, 2024
…331)

This PR updates the project specification to require a Databricks SDK
version compatible with 0.37.0. This is needed because of the changes in
#320.

Resolves #330.
JCZuurmond added a commit that referenced this pull request Nov 19, 2024
* Changes to work with Databricks SDK `v0.38.0` ([#350](#350)). In this release, the open-source library has been updated to be compatible with Databricks SDK version 0.38.0, addressing issues [#349](#349) to [#332](#332). The changes include modifying the `create_dashboard` function, which now directly passes the `SDKDashboard` instance to the `ws.lakeview.create` and `ws.lakeview.update` methods, eliminating the need for dictionary conversions. A new SQL query, `NYC_TAXI_TRIPS_LIMITED`, has been introduced, and the `test_sql_execution` method has been updated to use this query. The `test_sql_execution_partial` and `test_sql_execution_as_iterator` methods have been removed, and the `test_fetch_one_works` method now includes an assertion to verify the returned row's `pickup_zip` value. These updates improve the library's compatibility with the latest Databricks SDK, refactor SQL query-based tests, and enhance test reliability.
* Specify the minimum required version of `databricks-sdk` as 0.37.0 ([#331](#331)). In this release, we have updated the minimum required version of the `databricks-sdk` package to 0.37.0, as specified in the project's `pyproject.toml` file. This update is necessary due to the modifications made in pull request [#320](#320), which constrains the `databricks-sdk` version to 0.37.x for compatible updates within the same minor version. This change also resolves issue [#330](#330). It is important to note that no new methods have been added or existing functionality changed in the codebase as part of this commit. Therefore, the impact on the existing functionality should be minimal and confined to the interaction with the `databricks-sdk` package.
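
The release note above drops the `.as_dict()` conversion because the Lakeview API now accepts a `Dashboard` object directly. A minimal sketch of that call shape, assuming the SDK 0.37.0+ signatures where `create` takes a `Dashboard` and `update` takes a dashboard id plus a `Dashboard`; the field values below are illustrative and not taken from this repository:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.dashboards import Dashboard

ws = WorkspaceClient()

# Build the SDK dataclass once and pass it straight through; no .as_dict() needed.
sdk_dashboard = Dashboard(
    display_name="Example dashboard",                # illustrative value
    serialized_dashboard='{"pages": []}',            # illustrative value
    parent_path="/Workspace/Users/someone/example",  # illustrative value
)

created = ws.lakeview.create(sdk_dashboard)                         # SDK >= 0.37.0: takes the object
updated = ws.lakeview.update(created.dashboard_id, sdk_dashboard)   # likewise for update
```
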
@JCZuurmond JCZuurmond mentioned this pull request Nov 19, 2024
JCZuurmond added a commit that referenced this pull request Nov 19, 2024
* Changes to work with Databricks SDK `v0.38.0` ([#350](#350)). In this release, we have upgraded the Databricks SDK to version 0.38.0 from version 0.37.0 to ensure compatibility with the latest SDK and address several issues. The update includes changes to make the code compatible with the new SDK version, removing the need for `.as_dict()` method calls when creating or updating dashboards and utilizing a `sdk_dashboard` variable for interacting with the Databricks workspace. We also updated the dependencies to "databricks-labs-blueprint[yaml]" package version greater than or equal to 0.4.2 and `sqlglot` package version greater than or equal to 22.3.1. The `test_core.py` file has been updated to address multiple issues ([#349](#349) to [#332](#332)) related to the Databricks SDK and the `test_dashboards.py` file has been revised to work with the new SDK version. These changes improve integration with Databricks' lakeview dashboards, simplify the code, and ensure compatibility with the latest SDK version, resolving issues [#349](#349) to [#332](#332).
* Specify the minimum required version of `databricks-sdk` as 0.37.0 ([#331](#331)). In this release, we have updated the minimum required version of the `databricks-sdk` package to 0.37.0 from 0.29.0 in the `pyproject.toml` file to ensure compatibility with the latest version. This change was made necessary due to updates made in issue [#320](#320). To accommodate any patch release of `databricks-sdk` with a major and minor version of 0.37, we have updated the dependency constraint to use the `~=` operator, resolving issue [#330](#330). These changes are intended to enhance the compatibility and stability of our software.
@JCZuurmond JCZuurmond mentioned this pull request Nov 19, 2024
JCZuurmond added a commit that referenced this pull request Nov 19, 2024
* Changes to work with Databricks SDK `v0.38.0`
([#350](#350)). In this
release, we have upgraded the Databricks SDK to version 0.38.0 from
version 0.37.0 to ensure compatibility with the latest SDK and address
several issues. The update includes changes to make the code compatible
with the new SDK version, removing the need for `.as_dict()` method
calls when creating or updating dashboards and utilizing a
`sdk_dashboard` variable for interacting with the Databricks workspace.
We also updated the dependencies to "databricks-labs-blueprint[yaml]"
package version greater than or equal to 0.4.2 and `sqlglot` package
version greater than or equal to 22.3.1. The `test_core.py` file has
been updated to address multiple issues
([#349](#349) to
[#332](#332)) related to
the Databricks SDK and the `test_dashboards.py` file has been revised to
work with the new SDK version. These changes improve integration with
Databricks' lakeview dashboards, simplify the code, and ensure
compatibility with the latest SDK version, resolving issues
[#349](#349) to
[#332](#332).
* Specify the minimum required version of `databricks-sdk` as 0.37.0
([#331](#331)). In this
release, we have updated the minimum required version of the
`databricks-sdk` package to 0.37.0 from 0.29.0 in the `pyproject.toml`
file to ensure compatibility with the latest version. This change was
made necessary due to updates made in issue
[#320](#320). To
accommodate any patch release of `databricks-sdk` with a major and minor
version of 0.37, we have updated the dependency constraint to use the
`~=` operator, resolving issue
[#330](#330). These changes
are intended to enhance the compatibility and stability of our software.
gueniai added a commit that referenced this pull request Nov 19, 2024
* Changes to work with Databricks SDK `v0.38.0` ([#350](#350)). In this release, we have updated the Databricks SDK from version 0.37 to 0.38 to ensure compatibility with the latest SDK version. This update includes modifications to the `test_dashboards.py` file, where the `create_dashboard` function has been updated to use the new `sdk_dashboard` object directly, eliminating the need for dictionary conversion using the `as_dict()` method. This change has been implemented in both the `ws.lakeview.create` and `ws.lakeview.update` methods. These updates address several issues, as listed in the commit message, which were related to compatibility and functionality with the previous SDK version. Additionally, new methods such as `test_sql_execution`, `test_sql_execution_as_iterator`, and an updated `test_fetch_one_works` method have been introduced to improve compatibility with the Databricks SDK version 0.38.0. These changes are essential for adopters of the project who wish to ensure compatibility with the latest version of the Databricks SDK.
* Release v0.14.1 ([#352](#352)). 0.14.1 release brings updates for compatibility with Databricks SDK v0.38.0, addressing several issues and improving code compatibility with the new SDK version. It removes the need for `.as_dict()` method calls when creating or updating dashboards, introduces a `sdk_dashboard` variable for interacting with the Databricks workspace, and updates dependencies to include the "databricks-labs-blueprint[yaml]" package version 0.4.2 or greater and `sqlglot` package version 22.3.1 or greater. Test files have been revised to address multiple issues related to the Databricks SDK, and the minimum required version of `databricks-sdk` has been updated to 0.37.0 from 0.29.0. These changes improve integration with Databricks' lakeview dashboards, simplify the code, and enhance compatibility and stability of the software.
* Specify the minimum required version of `databricks-sdk` as 0.37.0 ([#331](#331)). In this release, we have updated the minimum required version of the `databricks-sdk` library to a version compatible with 0.37.0. This update is necessary due to changes introduced in pull request [#320](#320). We now specify the `databricks-sdk` version using the "~=" operator, indicating that any version with a major version number of 0 and minor version number of 37 is acceptable. This allows for future updates while ensuring compatibility with the required features. This change enhances the reliability and compatibility of the project by ensuring that it utilizes a recent and supported version of the `databricks-sdk`.
Labels
bug, internal