ADA-SVR (1/4) Unused of parameter full_rank #58

Open · wants to merge 8 commits into main

Conversation

@lionelkusch (Collaborator)

In the file hidimstat/adaptive_permutation_threshold.py, the function _manual_inversion has a parameter full_rank. This parameter is never set by the main function ada_svr, whose call on line 39 is the only use of _manual_inversion.

I propose to remove it because it's actually dead code. Can you check whether there are too many assertions and whether I missed any?
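
(A minimal sketch of the pattern, for readers without the file open: only _manual_inversion, full_rank, ada_svr, and the np.dot(X, X.T) call site come from the PR; the SVD body and the ada_svr_gram wrapper name are illustrative assumptions.)

```python
import numpy as np

def _manual_inversion(X, full_rank=False):
    # Illustrative body: pseudo-invert X via SVD. The only caller never
    # sets `full_rank`, so any branch keyed on it is dead code.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return np.linalg.multi_dot([Vt.T, np.diag(1.0 / s), U.T])

def ada_svr_gram(X):
    # Hypothetical stand-in for the single call site (line 39 in the
    # real ada_svr): `full_rank` is left at its default.
    return _manual_inversion(np.dot(X, X.T))
```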

Remove the ValueError because it cannot be reached: np.dot(X, X.T) on line 39 is the only line using this function.
@lionelkusch (Collaborator, Author)

I removed the ValueError because it cannot be reached, due to the np.dot(X, X.T) operation on line 39, which is the only line using this function.
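
(A small sketch of why the check is unreachable, with illustrative shapes: whatever the shape of X, np.dot(X, X.T) is always square.)

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 50))   # 10 samples, 50 features
K = np.dot(X, X.T)                  # Gram matrix, shape (10, 10)
assert K.shape[0] == K.shape[1]     # square for any X, so a
                                    # "matrix not square" ValueError
                                    # can never trigger from ada_svr
```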

Remove a variable that is unnecessary and without meaning in this context

codecov bot commented Dec 13, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 77.00%. Comparing base (e95b4f6) to head (d063e93).
Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main      #58      +/-   ##
==========================================
- Coverage   77.09%   77.00%   -0.09%     
==========================================
  Files          46       46              
  Lines        2471     2453      -18     
==========================================
- Hits         1905     1889      -16     
+ Misses        566      564       -2     


@bthirion (Contributor) left a comment

LGTM.
Is this part of the code base tested?

@lionelkusch (Collaborator, Author)

There were no tests for this part of the code. I found the dead code while trying to test it through the main function ada_svr.

Add a comment about the condition on X from the removed lines
@bthirion (Contributor)

I suggest taking this opportunity to add a test. Best,

@lionelkusch (Collaborator, Author)

Do you want to remove it or keep it and add some tests for it?

@bthirion (Contributor)

My feeling is that we should add a test.

@lionelkusch (Collaborator, Author)

The pull request removes lines that cannot be reached (and therefore cannot be tested) via the main function.
Consequently, testing those lines through ada_svr makes no sense.
On the other hand, if we keep them, I can test the private function directly.
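
(If the helper is kept, a direct test could look like this sketch; it assumes the corrected helper matches np.linalg.pinv, as noted below, and that the import path stays hidimstat.adaptive_permutation_threshold.)

```python
import numpy as np
from hidimstat.adaptive_permutation_threshold import _manual_inversion

def test_manual_inversion_matches_pinv():
    # Assumes the (fixed) helper computes a Moore-Penrose pseudo-inverse.
    rng = np.random.default_rng(42)
    X = rng.standard_normal((6, 6))
    np.testing.assert_allclose(
        _manual_inversion(X), np.linalg.pinv(X), atol=1e-10
    )
```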

@lionelkusch (Collaborator, Author)

I replaced the function _manual_inversion with numpy's numpy.linalg.pinv: https://numpy.org/doc/stable/reference/generated/numpy.linalg.pinv.html
It does the same thing; the only difference is that it does not check whether the matrix is square.
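
(A quick check of the replacement on a small, well-conditioned matrix; the values are illustrative only.)

```python
import numpy as np

K = np.array([[4.0, 2.0],
              [2.0, 3.0]])           # stands in for np.dot(X, X.T)
K_inv = np.linalg.pinv(K)            # SVD-based Moore-Penrose inverse
np.testing.assert_allclose(K @ K_inv, np.eye(2), atol=1e-12)
```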

@lionelkusch (Collaborator, Author)

Just to keep a trace of it:
I noticed a bug in the previous matrix-inversion function.
Line 69: X_inv = np.linalg.multi_dot([U, np.diag(s_inv), V])
should be:
Line 69: X_inv = np.linalg.multi_dot([V.T, np.diag(s_inv), U.T])
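
(A sketch checking the corrected formula; note that np.linalg.svd returns the right factor already transposed, so the V above is numpy's Vh.)

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4))
U, s, V = np.linalg.svd(X, full_matrices=False)  # V is V^H, per numpy
s_inv = 1.0 / s
X_inv = np.linalg.multi_dot([V.T, np.diag(s_inv), U.T])  # corrected line
np.testing.assert_allclose(X_inv, np.linalg.pinv(X), atol=1e-8)
```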

@bthirion (Contributor) left a comment

LGTM, thx.

@lionelkusch changed the title from "Unused of parameter full_rank" to "ADA-SVR (1/4) Unused of parameter full_rank" on Dec 19, 2024