Relaxed Lasso #58

Open
azev77 opened this issue Feb 9, 2021 · 9 comments

@azev77

azev77 commented Feb 9, 2021

The GLMNet package includes the Relaxed Lasso option, which recent research has shown performs very well.
Would it be possible for GLMNet.jl to allow this?

@JackDunnNZ
Collaborator

I had a quick look at how this is implemented in the R package, and it looks like the logic for the relaxed option sits in the R code rather than in the core Fortran library. So unfortunately we can't simply access the relaxed option from the compiled library; the R logic would need to be duplicated in the Julia package, which is a bigger undertaking.
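For context, the relaxed step in glmnet refits an unpenalized model on the active set at each lambda and blends it with the penalized coefficients via a mixing parameter gamma. Below is a minimal sketch of that blend in plain Julia, assuming a single lambda, a Gaussian model, and no intercept; the function and variable names are hypothetical and not part of the GLMNet.jl API.

```julia
using LinearAlgebra

# Sketch of the relaxed-lasso blend for one lambda (hypothetical helper,
# not GLMNet.jl API):
#   1. take the lasso coefficients beta_lasso,
#   2. refit an unpenalized least-squares model on the active set,
#   3. blend the two with a mixing parameter gamma in [0, 1].
function relax_blend(X::AbstractMatrix, y::AbstractVector,
                     beta_lasso::AbstractVector, gamma::Real)
    active = findall(!iszero, beta_lasso)        # indices of selected predictors
    beta_refit = zeros(length(beta_lasso))
    if !isempty(active)
        # Unpenalized least-squares refit restricted to the active set
        beta_refit[active] = X[:, active] \ y
    end
    # gamma = 1 recovers the lasso fit, gamma = 0 the unpenalized refit
    return gamma .* beta_lasso .+ (1 - gamma) .* beta_refit
end
```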

@azev77
Author

azev77 commented Feb 10, 2021

I see. That means it's likely to be faster in the Julia version.

@AdaemmerP

Thanks for the great package!
Would it be possible to change 'CompressedPredictorMatrix' to a mutable struct? This would allow modifying the predicted values and implementing the relaxed lasso.

@JackDunnNZ
Collaborator

I think that should be fine, although it might be better to update the code to use a generic sparse matrix rather than the custom struct. I'm not too familiar with the internals of the package, but it feels like that should be possible?

@AdaemmerP

Yes, a sparse matrix might be better for storing the parameters. Regarding the struct, I think line 90 should be changed to a mutable struct: https://github.com/JuliaStats/GLMNet.jl/blob/master/src/GLMNet.jl
Or is there any other way to change and save the values? I want to modify the coefficients and then use them with GLMNet.predict().

@JackDunnNZ
Collaborator

> Or is there any other way to change and save the values?

You may be able to use Setfield.jl or Accessors.jl to easily update the GLMNetPath with new coefficients, something like

new_path = @set path.betas = new_betas
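For illustration, here is a self-contained sketch of that @set pattern on a stand-in struct. ToyPath is hypothetical and only mirrors the shape of GLMNetPath; whether @set works on the real type without conversion depends on how its betas field is typed.

```julia
using Accessors

# Stand-in for GLMNetPath, used only to illustrate the @set pattern on
# an immutable struct; the real type and its field types may differ.
struct ToyPath
    a0::Vector{Float64}
    betas::Matrix{Float64}
    lambda::Vector{Float64}
end

path = ToyPath([0.0, 0.1], randn(5, 2), [0.5, 0.1])

# Modified coefficients, e.g. after an unpenalized refit on the active set
new_betas = copy(path.betas)
new_betas[1, :] .= 0.0

# @set builds a *new* ToyPath carrying the replacement field; the
# original immutable struct is left untouched.
new_path = @set path.betas = new_betas
```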

@AdaemmerP

Nice, thanks for the tip!

@azev77
Author

azev77 commented Sep 2, 2022

@AdaemmerP did you have any luck implementing the relaxed Lasso?

@AdaemmerP

@azev77 Yes, I was able to implement it, but within a time series framework (https://github.com/AdaemmerP/DetectSparsity/blob/main/CaseStudies/Functions.jl, lines 337 - 501). I also used the Lasso.jl package for it.
