Hi, I use the Terraform `databricks_job` resource to submit a job with many tasks. AFAIK, each task is a separate Spark application with its own applicationId (https://spark.apache.org/docs/3.1.3/api/python/reference/api/pyspark.SparkContext.applicationId.html). These tasks work fine, except they all end up with the same applicationId: Databricks derives the ID as `app_yyyyMMddHHmmss`, and since Terraform submits all of these tasks at the same time, they all get the same value. Because the applicationIds are identical, I'm unable to filter the metrics per task in my Log Analytics workspace. Has anyone faced the same problem?
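For reference, here is a minimal sketch (assuming each task runs ordinary PySpark code) of how the duplication can be confirmed from inside a task by printing the applicationId reported by the SparkContext:

```python
# Hypothetical task snippet: print the Spark applicationId so the value can be
# compared across the tasks of the same databricks_job run.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
sc = spark.sparkContext

# On Databricks the ID follows the app_yyyyMMddHHmmss pattern, so tasks
# submitted at the same second report the same value.
print("applicationId:", sc.applicationId)
print("app name     :", sc.appName)
```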