spark-submit_issue.txt
./bin/spark-submit \
--master k8s://https://k8s-api-pa1.cloud.garr.it:443 \
--deploy-mode cluster \
--name spark-pi \
--class org.apache.spark.examples.SparkPi \
--conf spark.executor.instances=2 \
--conf spark.kubernetes.container.image=dr4thmos/spark:0.5 \
--conf spark.kubernetes.namespace=g-thomascecconello-unimibit \
--conf spark.kubernetes.authenticate.driver.serviceAccountName=spark \
local:///$SPARK_HOME/examples/jars/spark-examples_2.12-3.0.0.jar 100000
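For reference: the working invocation above differs from the failing one further down by pinning the namespace (spark.kubernetes.namespace) and a dedicated driver service account (spark.kubernetes.authenticate.driver.serviceAccountName=spark). If that "spark" service account does not exist yet, a minimal setup sketch would be the following (the rolebinding name and the choice of the edit cluster role are assumptions, adapt to your cluster's RBAC policy):
# Hypothetical service-account setup; names are assumptions
>> kubectl create serviceaccount spark --namespace g-thomascecconello-unimibit
>> kubectl create rolebinding spark-role --clusterrole=edit --serviceaccount=g-thomascecconello-unimibit:spark --namespace=g-thomascecconello-unimibit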
---------- SOLVED --------------
I think it's something related to
"ERROR Config: Failed to parse the kubeconfig." and
"WARN Config: Failed to read openstack env vars: Missing environment variable [OS_AUTH_URL]",
because the "Missing environment variable [OS_AUTH_URL]" message also shows up in other, unrelated commands, such as:
>> export SPARK_SERVICE_IP=$(kubectl get svc --namespace g-thomascecconello-unimibit spark-webui)
Failed to read openstack env vars: Missing environment variable [OS_AUTH_URL]
Unable to connect to the server: getting credentials: exec: exit status 1
Conversely, this command works on its own (it is the inner part of the command above):
>> kubectl get svc --namespace g-thomascecconello-unimibit spark-webui
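To see why the wrapped command fails while the bare one works, it helps to check which credential mechanism the current context uses and which kubeconfig file is actually being read. A quick diagnostic sketch (standard kubectl/shell only; the default ~/.kube/config path applies only when KUBECONFIG is unset):
# Show the current context's cluster/user config, including any exec credential plugin
>> kubectl config view --minify
# Show which kubeconfig file is actually in effect
>> echo ${KUBECONFIG:-$HOME/.kube/config}
If the context relies on an exec credential plugin that needs OpenStack variables (OS_AUTH_URL etc.), any client running without them exported, including spark-submit's own fabric8 client, would likely hit the same message.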
The offending command:
./bin/spark-submit \
--master k8s://https://k8s-api-pa1.cloud.garr.it:443 \
--deploy-mode cluster \
--name spark-pi \
--class org.apache.spark.examples.SparkPi \
--conf spark.executor.instances=2 \
--conf spark.kubernetes.container.image=dr4thmos/spark:0.5 \
local:///home/neanias/spark/spark-3.0.0-bin-hadoop2.7/examples/jars/spark-examples_2.12-3.0.0.jar 100000
20/07/31 10:29:00 ERROR Config: Failed to parse the kubeconfig.
io.fabric8.kubernetes.client.KubernetesClientException: An error has occurred.
at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:64)
at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:53)
at io.fabric8.kubernetes.client.utils.Serialization.unmarshal(Serialization.java:248)
at io.fabric8.kubernetes.client.utils.Serialization.unmarshal(Serialization.java:199)
at io.fabric8.kubernetes.client.utils.Serialization.unmarshal(Serialization.java:186)
at io.fabric8.kubernetes.client.Config.loadFromKubeconfig(Config.java:574)
at io.fabric8.kubernetes.client.Config.tryKubeConfig(Config.java:491)
at io.fabric8.kubernetes.client.Config.autoConfigure(Config.java:230)
at io.fabric8.kubernetes.client.Config.<init>(Config.java:214)
at io.fabric8.kubernetes.client.Config.autoConfigure(Config.java:225)
at org.apache.spark.deploy.k8s.SparkKubernetesClientFactory$.createKubernetesClient(SparkKubernetesClientFactory.scala:80)
at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.$anonfun$run$3(KubernetesClientApplication.scala:215)
at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2538)
at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:215)
at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:188)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: com.fasterxml.jackson.databind.exc.MismatchedInputException: No content to map due to end-of-input
at [Source: (BufferedInputStream); line: 1, column: 1]
at com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:59)
at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:4344)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4189)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3250)
at io.fabric8.kubernetes.client.utils.Serialization.unmarshal(Serialization.java:246)
... 19 more
20/07/31 10:29:00 WARN Config: Failed to read openstack env vars: Missing environment variable [OS_AUTH_URL]
20/07/31 10:29:00 ERROR Config: Failed to parse the kubeconfig.
io.fabric8.kubernetes.client.KubernetesClientException: An error has occurred.
at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:64)
at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:53)
at io.fabric8.kubernetes.client.utils.Serialization.unmarshal(Serialization.java:248)
at io.fabric8.kubernetes.client.utils.Serialization.unmarshal(Serialization.java:199)
at io.fabric8.kubernetes.client.utils.Serialization.unmarshal(Serialization.java:186)
at io.fabric8.kubernetes.client.Config.loadFromKubeconfig(Config.java:574)
at io.fabric8.kubernetes.client.Config.tryKubeConfig(Config.java:491)
at io.fabric8.kubernetes.client.Config.autoConfigure(Config.java:230)
at io.fabric8.kubernetes.client.Config.autoConfigure(Config.java:226)
at org.apache.spark.deploy.k8s.SparkKubernetesClientFactory$.createKubernetesClient(SparkKubernetesClientFactory.scala:80)
at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.$anonfun$run$3(KubernetesClientApplication.scala:215)
at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2538)
at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:215)
at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:188)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: com.fasterxml.jackson.databind.exc.MismatchedInputException: No content to map due to end-of-input
at [Source: (BufferedInputStream); line: 1, column: 1]
at com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:59)
at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:4344)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4189)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3250)
at io.fabric8.kubernetes.client.utils.Serialization.unmarshal(Serialization.java:246)
... 18 more
20/07/31 10:29:00 INFO KerberosConfDriverFeatureStep: You have not specified a krb5.conf file locally or via a ConfigMap. Make sure that you have the krb5.conf locally on the driver image.
20/07/31 10:29:01 WARN WatchConnectionManager: Exec Failure: HTTP 401, Status: 401 - Unauthorized
java.net.ProtocolException: Expected HTTP 101 response but was '401 Unauthorized'
at okhttp3.internal.ws.RealWebSocket.checkResponse(RealWebSocket.java:229)
at okhttp3.internal.ws.RealWebSocket$2.onResponse(RealWebSocket.java:196)
at okhttp3.RealCall$AsyncCall.execute(RealCall.java:203)
at okhttp3.internal.NamedRunnable.run(NamedRunnable.java:32)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Exception in thread "main" io.fabric8.kubernetes.client.KubernetesClientException: Unauthorized
at io.fabric8.kubernetes.client.dsl.internal.WatchConnectionManager$1.onFailure(WatchConnectionManager.java:203)
at okhttp3.internal.ws.RealWebSocket.failWebSocket(RealWebSocket.java:571)
at okhttp3.internal.ws.RealWebSocket$2.onResponse(RealWebSocket.java:198)
at okhttp3.RealCall$AsyncCall.execute(RealCall.java:203)
at okhttp3.internal.NamedRunnable.run(NamedRunnable.java:32)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Suppressed: java.lang.Throwable: waiting here
at io.fabric8.kubernetes.client.utils.Utils.waitUntilReady(Utils.java:144)
at io.fabric8.kubernetes.client.dsl.internal.WatchConnectionManager.waitUntilReady(WatchConnectionManager.java:341)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.watch(BaseOperation.java:755)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.watch(BaseOperation.java:739)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.watch(BaseOperation.java:70)
at org.apache.spark.deploy.k8s.submit.Client.$anonfun$run$1(KubernetesClientApplication.scala:129)
at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2538)
at org.apache.spark.deploy.k8s.submit.Client.run(KubernetesClientApplication.scala:129)
at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.$anonfun$run$4(KubernetesClientApplication.scala:221)
at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.$anonfun$run$4$adapted(KubernetesClientApplication.scala:215)
at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2539)
at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:215)
at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:188)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
20/07/31 10:29:01 INFO ShutdownHookManager: Shutdown hook called
20/07/31 10:29:01 INFO ShutdownHookManager: Deleting directory /tmp/spark-7a74a268-b11e-4869-af67-c2fc6509b04e
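A closing note on the trace: the Jackson error "No content to map due to end-of-input" at line 1, column 1 means the fabric8 client read an empty kubeconfig stream, which would explain the unauthenticated request and the HTTP 401 Unauthorized that follows. A first sanity check is therefore to confirm the kubeconfig file is present and non-empty (a sketch; the default path assumes KUBECONFIG is unset):
# Byte count of the kubeconfig spark-submit's client will try to parse; 0 reproduces the error above
>> wc -c ${KUBECONFIG:-$HOME/.kube/config}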