[Hudi] Add integration tests for Hudi #3338
Conversation
run-integration-tests.py
Outdated
@@ -256,7 +256,8 @@ def run_uniform_hudi_integration_tests(root_dir, version, extra_maven_repo, use_
     python_root_dir = path.join(root_dir, "python")
     extra_class_path = path.join(python_root_dir, path.join("delta", "testing"))
     package = ','.join([
-        "io.delta:delta-%s_2.12:%s" % (get_artifact_name(version), version)])
+        "io.delta:delta-%s_2.12:%s" % (get_artifact_name(version), version),
+        "org.apache.hudi:hudi-spark3.5-bundle_2.12:0.15.0"])
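The review thread below asks whether the Hudi version has to be hardcoded inline. A minimal sketch of how the package string could instead be built from parameters — the function name `build_packages` and the default version argument are illustrative, not the script's actual API:

```python
# Hypothetical sketch: build the Spark --packages string with the Hudi bundle
# version passed in as a parameter rather than hardcoded in the join.
def build_packages(delta_artifact, delta_version, hudi_version="0.15.0"):
    return ','.join([
        "io.delta:%s_2.12:%s" % (delta_artifact, delta_version),
        "org.apache.hudi:hudi-spark3.5-bundle_2.12:%s" % hudi_version,
    ])

# Illustrative values only; the real script derives these from the release
# version under test via get_artifact_name(version).
print(build_packages("delta-spark", "3.2.0"))
```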
Do we have to hardcode the dependency versions here?
Fixed
lgtm
Which Delta project/connector is this regarding?

- [ ] Spark
- [ ] Standalone
- [ ] Flink
- [ ] Kernel
- [x] Other (Hudi)
Description
This PR adds integration tests for Hudi. Previously, the integration test could not actually read the table from a Hudi engine due to Spark incompatibility, but since the new Hudi 0.15.0 release supports Spark 3.5, we can now add verifications that actually read the tables through Hudi.
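As a rough illustration of what the Hudi read verification entails (the script name and Delta version below are placeholders, not taken from this PR), the test launches Spark 3.5 with both bundles on the classpath and reads the UniForm table through the Hudi datasource:

```shell
# Illustrative invocation only: placeholders for the verification script and
# the Delta version; Hudi requires the Kryo serializer on Spark.
spark-submit \
  --packages "io.delta:delta-spark_2.12:<delta-version>,org.apache.hudi:hudi-spark3.5-bundle_2.12:0.15.0" \
  --conf "spark.serializer=org.apache.spark.serializer.KryoSerializer" \
  read_hudi_table.py  # e.g. spark.read.format("hudi").load(table_path)
```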
How was this patch tested?
Added integration tests
Does this PR introduce any user-facing changes?
No