
Implemented precision_score function in _classification.py #23063

Closed
wants to merge 3 commits

Conversation

@muzakkirhussain011 (Contributor) commented Sep 5, 2023

The precision_score function calculates the precision score for classification tasks, a measure of the accuracy of positive predictions.

The function takes the true labels y_true and predicted labels y_pred as input. It also accepts optional parameters: normalize, which controls whether the result is normalized, and sample_weight, for handling per-sample weights.

The function first determines the type of target variable, checking whether the problem is multilabel classification. If it is, it computes the difference between y_true and y_pred and counts the non-zero elements along the label axis, i.e. the number of labels that differ for each sample. It then checks whether this count is zero for each sample and converts the result to an integer.

For binary or multiclass classification problems, the code simply compares y_true and y_pred for equality, again converting the result to integers.

The function sums these integer values, which represent the number of correct predictions, and optionally normalizes the result by dividing by the total number of samples. The final result is cast to a float when normalization is enabled.
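The logic described above can be sketched as follows. This is a hypothetical NumPy-based illustration of the described behavior, not the PR's actual Ivy code; the function and parameter names follow the description.

```python
# Hypothetical sketch of the described logic (NumPy illustration,
# not the PR's actual Ivy implementation).
import numpy as np

def precision_score(y_true, y_pred, normalize=True, sample_weight=None):
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)

    if y_true.ndim == 2:
        # Multilabel: a sample counts as correct only when no label differs,
        # i.e. the per-sample count of non-zero differences is zero.
        correct = (np.count_nonzero(y_true - y_pred, axis=1) == 0).astype(int)
    else:
        # Binary/multiclass: element-wise equality, converted to integers.
        correct = (y_true == y_pred).astype(int)

    if sample_weight is not None:
        correct = correct * np.asarray(sample_weight)

    score = correct.sum()
    if normalize:
        # Divide by the number of samples and cast to float.
        score = float(score) / correct.shape[0]
    return score
```

Note that, as described, this counts correct predictions per sample, so the sketch mirrors the description rather than the textbook true-positives-over-predicted-positives definition of precision.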

PR Description

Related Issue

Close #

Checklist

  • Did you add a function?
  • Did you add the tests?
  • Did you follow the steps we provided?


github-actions bot commented Sep 5, 2023

Thanks for contributing to Ivy! 😊👏
Here are some of the important points from our Contributing Guidelines 📝:
1. Feel free to ignore the run_tests (1), run_tests (2), … jobs, and only look at the display_test_results job. 👀 It contains the following two sections:
- Combined Test Results: This shows the results of all the ivy tests that ran on the PR. ✔️
- New Failures Introduced: This lists the tests that are passing on main, but fail on the PR Fork. Please try to make sure that there are no such tests. 💪
2. The lint / Check formatting / check-formatting tests check the formatting of your code. 📜 If they fail, please check the exact error message in the logs and fix it. ⚠️🔧
3. Finally, the test-docstrings / run-docstring-tests job checks the changes made to the docstrings of the functions. This may be skipped as well. 📚
Happy coding! 🎉👨‍💻


muzakkirhussain011 commented Sep 6, 2023

Hi @AnnaTz ,

I've added a new function, precision_score, to the _classification.py module. This function is tailored to classification tasks and is designed to work seamlessly with Ivy.

The precision_score function calculates precision, an essential metric for classification tasks, and improves Ivy's compatibility with machine learning projects that rely on it.

I'd greatly appreciate it if you could review and provide your feedback on this addition.

Thank you

@AnnaTz AnnaTz requested a review from umairjavaid September 8, 2023 09:52

AnnaTz commented Sep 8, 2023

Hi @muzakkirhussain011, you will also need to add a test for the function to make sure it works as expected.

This commit adds a test case for the precision_score function to ensure its correctness. The test compares the results of the custom precision_score function with scikit-learn's precision_score function to validate its accuracy.

Tested precision_score function against example data and scikit-learn's implementation.
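The validation pattern described in the commit message could look like the following sketch. The helper name check_against_sklearn and the custom_precision_score parameter are illustrative placeholders, not the PR's actual test code.

```python
# Illustrative sketch of validating a custom metric against scikit-learn's
# reference implementation; names here are placeholders, not the PR's code.
import numpy as np
from sklearn.metrics import precision_score as sk_precision_score

def check_against_sklearn(custom_precision_score, y_true, y_pred):
    """Assert that a custom implementation matches sklearn on the same data."""
    expected = sk_precision_score(y_true, y_pred)  # default: binary average
    actual = custom_precision_score(y_true, y_pred)
    assert np.isclose(actual, expected), f"{actual} != {expected}"
```

In practice such a check would be run over several example datasets (binary, multiclass, multilabel) to cover each code path.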
@muzakkirhussain011 (Contributor, Author)

Hi @muzakkirhussain011, you will also need to add a test for the function to make sure it works as expected.

Hi @AnnaTz ,

Thank you for your feedback! I'm pleased to let you know that a test for the precision_score function has been added to ensure it works as expected.

You can check out the details in the pull request
#23264

If you have any more suggestions or questions, please feel free to let me know.

Best regards


AnnaTz commented Sep 8, 2023


Thanks! However, we prefer to merge a single PR containing a new function and its test. Please combine your 2 PRs to review them together.

@muzakkirhussain011 (Contributor, Author)

Thanks! However, we prefer to merge a single PR containing a new function and its test. Please combine your 2 PRs to review them together.

Hi @AnnaTz ,

Thank you for your feedback. I've now combined the two PRs into a single one for review.

You can find the updated PR with both the new function and its corresponding test at the following link:
#23277

Please review it, and let me know if you have any further suggestions or feedback.

Thanks again for your guidance.

Best regards

@zeus2x7 (Contributor) left a comment

Hi @muzakkirhussain011 the test function needs to be written in the ivy_tests/test_ivy/test_frontends/test_sklearn/test_metrics/test_classification.py file
You can go through these sections of the docs for a deeper understanding:
Ivy Tests
Ivy Frontend Tests

@muzakkirhussain011 (Contributor, Author)

Hi @muzakkirhussain011 the test function needs to be written in the ivy_tests/test_ivy/test_frontends/test_sklearn/test_metrics/test_classification.py file
You can go through these sections of the docs for a deeper understanding:
Ivy Tests
Ivy Frontend Tests

Hi @zeus2x7 ,

Thank you for your guidance. I've written the test function and placed it in the ivy_tests/test_ivy/test_frontends/test_sklearn/test_metrics/test_classification.py file.

You can find the updated PR with both the new function and its corresponding test at the following link:
#23277

If you have any specific details or requirements for the test function, please let me know, and I'll make any necessary adjustments. Additionally, I'll review the provided sections of the documentation for a deeper understanding.

Thanks again for your assistance.

Best regards


zeus2x7 commented Sep 8, 2023

@muzakkirhussain011 could you close this PR, since it doesn't have the test function implemented? That would be better.

@muzakkirhussain011 (Contributor, Author)

@muzakkirhussain011 could you close this PR, since it doesn't have the test function implemented? That would be better.

Hi @zeus2x7 ,

Thank you for bringing this to my attention. I wanted to clarify that the PR containing both the code and the test function hasn't been assigned any reviewers yet. I've now added the test function to the other PR that was lacking it.

I appreciate your attention to detail, and please feel free to let me know if you have any further suggestions or requirements.

Best regards


zeus2x7 commented Sep 8, 2023

Hi @muzakkirhussain011, your other PR #23277 has been assigned a reviewer.

4 participants