
[WIP] add unit tests for katib sdk #2305

Conversation

@shashank-iitbhu (Contributor)

What this PR does / why we need it:

Which issue(s) this PR fixes (optional, in fixes #<issue number>(, fixes #<issue_number>, ...) format, will close the issue(s) when PR gets merged):
Fixes #2184

Checklist:

  • Docs included if any changes are user facing

Signed-off-by: Shashank Mittal <[email protected]>

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by: shashank-iitbhu
Once this PR has been reviewed and has the lgtm label, please assign tenzen-y for approval. For more information see the Kubernetes Code Review Process.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

mock_get_namespaced_custom_object.return_value = mock_response
experiment = katib_client.get_experiment("test-experiment")
assert experiment == valid_experiment

@shashank-iitbhu (Contributor, Author)


test_get_experiment_success is currently failing; can you help me fix this?

================================================================ short test summary info ================================================================
FAILED sdk/python/v1beta1/kubeflow/katib/api/katib_client_test.py::TestGetExperiment::test_get_experiment_success - AssertionError: assert None == {'api_version': None,\n 'kind': None,\n 'metadata': {'annotations': None,\n              'cluster_name': None,\n     ...
======================================================= 1 failed, 5 passed, 100 warnings in 0.28s =======================================================
make: *** [pytest] Error 1

@andreyvelich (Member)

As I can see, get_experiment returns a null object; did you mock the call correctly?
Should you add patches on the KatibClient, similar to the TrainingClient: https://github.com/kubeflow/training-operator/blob/master/sdk/python/kubeflow/training/api/training_client_test.py#L181-L195 ?
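
For context, a minimal sketch of the fixture-based patching being suggested, assuming (as in the linked TrainingClient tests) that the KatibClient constructor instantiates kubernetes.client.CustomObjectsApi and loads the kubeconfig internally; the fixture name and mocked return value are illustrative placeholders, not the real Experiment payload:

```python
from unittest.mock import MagicMock, patch

import pytest
from kubeflow.katib import KatibClient


@pytest.fixture
def katib_client():
    # Patch the Kubernetes API classes before the client is constructed so the
    # KatibClient under test never talks to a real cluster.
    with patch(
        "kubernetes.client.CustomObjectsApi",
        return_value=MagicMock(
            get_namespaced_custom_object=MagicMock(
                # Return a plain dict the client has a chance to deserialize,
                # instead of a bare MagicMock.
                return_value={"metadata": {"name": "test-experiment"}}
            )
        ),
    ), patch("kubernetes.config.load_kube_config", return_value=None):
        yield KatibClient()
```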

@andreyvelich (Member) left a comment

Thank you for your first contribution @shashank-iitbhu 🎉
I left a few comments.
/assign @kubeflow/wg-training-leads @droctothorpe

Comment on lines +132 to +134
success_message = f"Experiment {namespace}/{experiment_name} has been created"
print(success_message)
return success_message
@andreyvelich (Member)

Why do you need this change?

Comment on lines +28 to +35
        with pytest.raises(TimeoutError):
            katib_client.create_experiment(valid_experiment, namespace="timeout")

    @patch('kubeflow.katib.api.katib_client.client.CustomObjectsApi.create_namespaced_custom_object')
    def test_create_experiment_runtime_error(self, mock_create_namespaced_custom_object, katib_client, valid_experiment):
        mock_create_namespaced_custom_object.side_effect = RuntimeError()
        with pytest.raises(RuntimeError):
            katib_client.create_experiment(valid_experiment, namespace="runtime")
@andreyvelich (Member)

Are we testing here that create_experiment raises the correct exception based on the create_namespaced_custom_object call?

    def test_create_experiment_success(self, mock_create_namespaced_custom_object, katib_client, valid_experiment):
        mock_create_namespaced_custom_object.return_value = {"metadata": {"name": "test-experiment"}}
        experiment = katib_client.create_experiment(valid_experiment)
        assert experiment == "Experiment default/test-experiment has been created"
@andreyvelich (Member)

The create_experiment API doesn't return any info, so you should not check this assert.
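
A hedged sketch of how the success test might look without that assert, reusing the @patch target already used in this file; since create_experiment returns nothing, the test can verify the mocked Kubernetes call instead:

```python
class TestCreateExperiment:
    @patch("kubeflow.katib.api.katib_client.client.CustomObjectsApi.create_namespaced_custom_object")
    def test_create_experiment_success(self, mock_create_namespaced_custom_object, katib_client, valid_experiment):
        katib_client.create_experiment(valid_experiment, namespace="default")
        # create_experiment has no return value, so assert that the underlying
        # Kubernetes API call was made rather than checking a result string.
        mock_create_namespaced_custom_object.assert_called_once()
```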


class TestCreateExperiment:
    @patch('kubeflow.katib.api.katib_client.client.CustomObjectsApi.create_namespaced_custom_object')
    def test_create_experiment_success(self, mock_create_namespaced_custom_object, katib_client, valid_experiment):
@andreyvelich (Member)

I think it would be nice if you could add a unit test to verify these conditions: https://github.com/kubeflow/katib/blob/master/sdk/python/v1beta1/kubeflow/katib/api/katib_client.py#L96-L108.
E.g. you can create a test_data list under the TestCreateExperiment class and add various test cases there, similar to the TrainingClient tests.
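
A hedged sketch of that test_data pattern, loosely modeled on the TrainingClient tests; the case names, the namespace-to-exception mapping, and the katib_client / valid_experiment fixtures are assumptions based on the snippets above:

```python
import pytest

# Each tuple: (test name, namespace passed to the client, expected outcome).
test_data = [
    ("valid flow", "default", "success"),
    ("timeout from the API", "timeout", TimeoutError),
    ("runtime error from the API", "runtime", RuntimeError),
]


class TestCreateExperiment:
    @pytest.mark.parametrize("test_name,namespace,expected_output", test_data)
    def test_create_experiment(self, katib_client, valid_experiment, test_name, namespace, expected_output):
        # The patched create_namespaced_custom_object is assumed to raise the
        # matching exception based on the namespace, as in the snippets above.
        if expected_output == "success":
            katib_client.create_experiment(valid_experiment, namespace=namespace)
        else:
            with pytest.raises(expected_output):
                katib_client.create_experiment(valid_experiment, namespace=namespace)
```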

@andreyvelich (Member)

@shashank-iitbhu Please let us know if you have time to address our comments this week or next.
We are planning to cut the first Katib 0.17 RC.0 release soon.

@tariq-hasan (Contributor)

Hello @andreyvelich I was wondering if I can pick up on this work if you are planning on including this in Katib 0.18.

@andreyvelich (Member)

> Hello @andreyvelich I was wondering if I can pick up on this work if you are planning on including this in Katib 0.18.

That would be great @tariq-hasan! We can even include this in Katib 0.17 by cherry-picking this PR to the release branch.
@shashank-iitbhu Are you ok to delegate this PR to @tariq-hasan so we can include that important feature in the next Katib release?

@shashank-iitbhu (Contributor, Author)

> Hello @andreyvelich I was wondering if I can pick up on this work if you are planning on including this in Katib 0.18.

> That would be great @tariq-hasan! We can even include this in Katib 0.17 by cherry-picking this PR to the release branch. @shashank-iitbhu Are you ok to delegate this PR to @tariq-hasan so we can include that important feature in the next Katib release?

@andreyvelich I'm okay with @tariq-hasan working on this PR.

@tariq-hasan (Contributor)

Sounds good.

@andreyvelich I notice that the scope of testing in training_client_test.py is limited to create_job but that the tests were written before the train API was implemented.

I was therefore wondering if we limit the scope of katib_client_test.py to create_experiment or if we also include the tune API.

Do you plan to include tests for all the functions in training_client.py and katib_client.py in upcoming releases but limit the scope for this issue?

I plan to submit a PR within the next few days but I was also wondering if there is any strict timeline I would need to follow to get this code in the release.

@andreyvelich (Member) commented Apr 30, 2024

> Sounds good.

> I was therefore wondering if we limit the scope of katib_client_test.py to create_experiment or if we also include the tune API.

I think we can start with something; it would be nice to add tests for the create_experiment and tune APIs, but if you want, you can add tests for just one of these APIs.

> Do you plan to include tests for all the functions in training_client.py and katib_client.py in upcoming releases but limit the scope for this issue?

I don't think we have enough time to include tests for all APIs before the next release.

> I plan to submit a PR within the next few days but I was also wondering if there is any strict timeline I would need to follow to get this code in the release.

Sure, that sounds great! If we want to include that in the Katib 0.17 release, it would be nice to merge your PR before the distribution testing phase, which is May 20th: https://github.com/kubeflow/community/tree/master/releases/release-1.9

@tariq-hasan (Contributor)

> Sounds good.
> I was therefore wondering if we limit the scope of katib_client_test.py to create_experiment or if we also include the tune API.

> I think we can start with something; it would be nice to add tests for the create_experiment and tune APIs, but if you want, you can add tests for just one of these APIs.

> Do you plan to include tests for all the functions in training_client.py and katib_client.py in upcoming releases but limit the scope for this issue?

> I don't think we have enough time to include tests for all APIs before the next release.

> I plan to submit a PR within the next few days but I was also wondering if there is any strict timeline I would need to follow to get this code in the release.

> Sure, that sounds great! If we want to include that in the Katib 0.17 release, it would be nice to merge your PR before the distribution testing phase, which is May 20th: https://github.com/kubeflow/community/tree/master/releases/release-1.9

Sounds good. I created a PR for the create_experiment function in the katib_client module and plan to add another for the tune function once the current PR is merged.

@andreyvelich (Member)

This was implemented here: #2325
