[Bug] The status of batch job is ERROR even the job is executed successfully #5169
Comments
Hello @zhifanggao,
Sorry I missed this issue.
Why? The configuration you provided indicates that you are going to run the Spark application in K8s cluster mode, so the driver should be launched in a dedicated pod.
I roughly remember we had an offline discussion on WeChat, and the root cause is that, for some reason (a configuration issue?), the Spark application runs in local mode rather than K8s cluster mode. Let me close this for now; feel free to re-open if you still hit the issue.
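For reference, a minimal sketch of a spark-submit invocation that runs the driver in a dedicated pod; every `<...>` value (API server host, namespace, image, service account) and the example application are placeholders, not values taken from this issue:

```bash
# Hedged sketch: all <...> values below are placeholders, not from this issue.
# --deploy-mode cluster is what makes Spark launch the driver in its own pod;
# with --master local[*] or --deploy-mode client, no driver pod is created.
spark-submit \
  --master k8s://https://<k8s-apiserver-host>:6443 \
  --deploy-mode cluster \
  --conf spark.kubernetes.namespace=<namespace> \
  --conf spark.kubernetes.container.image=<spark-image> \
  --conf spark.kubernetes.authenticate.driver.serviceAccountName=<service-account> \
  --class org.apache.spark.examples.SparkPi \
  local:///opt/spark/examples/jars/spark-examples.jar
```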
Cluster mode has been used, but the correct status of the task still cannot be obtained. Here is the spark-submit info:
[spark-submit output]
@wardlican have you checked via [...]
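One quick way to verify on the cluster side (a hedged sketch; the namespace is a placeholder, not taken from this thread) is to list the pods and their labels and confirm a driver pod actually exists:

```bash
# Hedged sketch: <namespace> is a placeholder.
# In cluster mode there should be a dedicated driver pod; if none appears,
# the application is likely not running in K8s cluster mode at all.
kubectl get pods -n <namespace> --show-labels
```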
@wardlican BTW, if possible, please try 1.8.1 or at least 1.7.3; we have made significant improvements for Kyuubi/Spark on K8s in recent versions.
Code of Conduct
Search before asking
Describe the bug
Test steps:
The status of the batch job is ERROR; in fact, the batch job executed successfully.
Checked the Kyuubi logs:
The Kyuubi server checks for a driver pod tagged with the batch id; once it is not found, it marks the batch job as ERROR.
But in fact, no driver pod is created.
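Conceptually, the server-side check amounts to looking up the driver pod by a batch-id label. A hedged sketch of that lookup; the label key, batch id, and namespace are illustrative assumptions, not taken from Kyuubi's source or this issue:

```bash
# Hedged sketch: <batch-tag-label>, <batch-id>, and <namespace> are illustrative only.
# Look up the driver pod by its batch-id label.
kubectl get pods -n <namespace> -l <batch-tag-label>=<batch-id>

# In local mode no driver pod exists, so this lookup returns nothing and the
# batch is marked ERROR even though the job itself finished successfully.
```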
Affects Version(s)
1.7.1
Kyuubi Server Log Output
No response
Kyuubi Engine Log Output
No response
Kyuubi Server Configurations
Kyuubi Engine Configurations
No response
Additional context
No response
Are you willing to submit PR?