### Describe the bug, including details regarding any error messages, version, and platform.
The Spark integration jobs were building Spark against the Java implementation of Arrow. They are currently failing with:
```
Removed build tracker: '/tmp/pip-build-tracker-_6u01r9l'
+ popd
+ '[' OFF == ON ']'
/
/arrow/ci/scripts/integration_spark.sh: line 36: pushd: /arrow/java: No such file or directory
```
This should be investigated and fixed. I am unsure whether this job should be moved to the arrow-java repository.
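For context, the failure comes from an unconditional `pushd` into `/arrow/java`, which no longer exists now that the Java sources live in the arrow-java repository. A minimal sketch of the kind of guard that avoids the error is shown below; the flag name and surrounding lines are assumptions for illustration, not the actual contents of `ci/scripts/integration_spark.sh`:

```bash
#!/usr/bin/env bash
# Hypothetical sketch only: BUILD_ARROW_JAVA and this layout are assumed for
# illustration; they are not the real contents of integration_spark.sh.
set -euo pipefail

build_arrow_java=${BUILD_ARROW_JAVA:-OFF}

# Only enter the Java tree when it is present and the Java build is requested;
# an unconditional `pushd /arrow/java` fails once the directory is gone.
if [ "${build_arrow_java}" == "ON" ] && [ -d /arrow/java ]; then
  pushd /arrow/java
  # mvn -B -DskipTests install   # build Arrow Java for Spark to consume
  popd
fi
```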
### Component(s)
Continuous Integration, Java, Python
@kou it seems we removed building Spark with the latest Java Arrow here: #44946, but we missed removing the version retrieval.
Do we want to add a job that builds Spark against Arrow Java main? That job should potentially live in arrow-java. (A sketch of the version retrieval in question follows.)
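The "version retrieval" mentioned above would typically read the Arrow Java version out of the Java checkout, for example via Maven's help plugin. This is a hedged illustration of that pattern, not the exact command used by `integration_spark.sh`:

```bash
# Illustrative only: one common way to read a project version from a Maven
# checkout; the actual retrieval in integration_spark.sh may differ.
pushd /arrow/java
arrow_java_version=$(mvn -q help:evaluate -Dexpression=project.version -DforceStdout)
popd
echo "Arrow Java version: ${arrow_java_version}"
```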
raulcd added a commit to raulcd/arrow that referenced this issue on Dec 9, 2024:
…ntegration and update test structure for PySpark (#44981)
### Rationale for this change
The job is currently failing.
### What changes are included in this PR?
Remove the unnecessary check on Java code and refactor the PySpark test modules to follow the new test structure: apache/spark#49104
### Are these changes tested?
Via archery
### Are there any user-facing changes?
No
* GitHub Issue: #44980
Authored-by: Raúl Cumplido <[email protected]>
Signed-off-by: Raúl Cumplido <[email protected]>
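The "Via archery" note above refers to running the Spark integration build locally through Arrow's docker tooling. A hedged usage example follows; the service name `conda-python-spark` is an assumption here and may differ from the one defined in the repository's docker-compose.yml:

```bash
# Run the Spark integration job locally via archery's docker wrapper.
# The service name is assumed for illustration.
archery docker run conda-python-spark
```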