🚀 Instate documentation for MLJFlux #252
Conversation
Includes API index and placeholders for the structure
@ablaom I know you prefer to review changes live (i.e., after deploying the docs). I'm trying to recall whether that was possible before merging a PR with the initial documentation and GitHub action.
@ablaom this is now ready for a review. However, it includes only the pages/sections under the "Introduction" or "Interface" headings in the sidebar (tutorials, workflow examples, and contributing are still work in progress, even though I have committed some work related to them).
Perfect. I will likely review before the weekend. P.S. This looked super polished in our offline discussion today. Thanks for the great work.
Maybe it doesn't matter, but I don't ordinarily track docs/Manifest.toml.
Still to do:
While I will defer to your superior sense of aesthetics, I must admit a preference for serif fonts; I find them easier to read. I think screen resolutions are so good these days that the old arguments against serif fonts no longer apply. But again, I'm happy to defer to your judgement and taste.
docs/src/index.md
Outdated
# 3. Wrap it in a machine in fit
mach = machine(clf, X, y)
fit!(mach)
We don't need to `fit!` if we're going to evaluate.
fit!(mach)
Oops. Such a rookie mistake. Thanks.
P.S.: I need to modify the comment as well; will do now in an independent commit.
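For context, here is a minimal sketch of the point being made: MLJ's `evaluate!` (re)trains the machine on each resampling fold itself, so an explicit `fit!` beforehand is redundant when the goal is evaluation. The data and default classifier below are illustrative assumptions; the actual docs snippet may differ.

```julia
using MLJ, MLJFlux

# Hypothetical data/model, for illustration only:
X, y = @load_iris
clf = NeuralNetworkClassifier()

# 3. Wrap the model in a machine:
mach = machine(clf, X, y)

# No fit!(mach) needed here: evaluate! trains the machine on each
# cross-validation fold as part of the evaluation itself.
evaluate!(mach, resampling = CV(nfolds = 5), measure = log_loss)
```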
docs/src/index.md
Outdated
- **Comparing** your model with non-deep learning models
Thus, for a model that could be implemented in both `Flux` and `MLJFlux`, one could choose to work with `MLJFlux` instead of `Flux` if they are interested in any of the functionality above but are not willing to implement it from scratch, and/or if they prefer working with a more high-level interface equivalent to that of MLJ for their task.
Suggested change: [`FastAI`](https://fluxml.ai/FastAI.jl/dev/FastAI@dev/doc/docs/introduction.md.html) also provides a high-level interface for interacting with `Flux` models. If you are working exclusively with neural network models, and are less interested in comparison/composition with other kinds of machine learning models, this may be a more suitable choice. Otherwise, `MLJFlux` may be better, especially for users already familiar with the MLJ interface.
@ablaom while I have only been browsing and searching the docs for the last few minutes, my conclusion so far is that `FastAI` does not go much beyond providing a high-level interface for interacting with `Flux` models.
For instance, I could not find, either by searching their docs or by quickly browsing the pages, that they support features such as advanced out-of-sample evaluation (e.g., cross-validation) or hyperparameter search (e.g., tuning the number of hidden layers or the number of neurons per layer), and of course composability and comparison, as you mentioned. FluxTraining.jl (related to FastAI in some way) seems to only support hyperparameter scheduling, which is not exactly hyperparameter search (e.g., you still can't tune the number of layers with this; one could argue it isn't really "tuning" at all). That same package does support early stopping, though.
Thus, I find the following statement not entirely accurate, in that `FastAI` is not a drop-in replacement for `MLJFlux`:

> If you are working exclusively with neural network models, and are less interested in comparison/composition with other kinds of machine learning models, this (i.e., FastAI) may be a more suitable choice.

I'm also not sure about the mention of "exclusively with neural network models", because in their README they tend to speak more generically of deep learning models.
I would recommend one of the following lines of action:

- That we don't mention `FastAI` at all (i.e., what an average package would do; this also avoids confusing users that (i) only know `MLJFlux` and (ii) for whom `MLJFlux` is sufficient; if (ii) did not hold, they would likely have been able to find FastAI themselves).
- Go out of our way and maximize sincerity, as you intended, and mention what we know about `FastAI` (e.g., after the comparison with Flux we add a line like "A comparable project, FastAI/FluxTraining, also provides a high-level interface for interacting with Flux models and supports a subset of features that may overlap with (but not include all of) those supported by MLJFlux").

For the last point, one could also go into more detail in the comparison, but I don't find this necessary, as both packages are likely to change with time and users may in general be less interested in such a comparison. Comparing `MLJFlux` and `Flux` makes a lot of sense because practically everyone knows `Flux`, and this helps them understand `MLJFlux` better.
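As a concrete illustration of the kind of hyperparameter search mentioned above, here is a hedged sketch of tuning the hidden-layer width of an MLJFlux model with MLJ's `TunedModel`. The `Short` builder's `n_hidden` field and the other settings are assumptions about the current MLJFlux/MLJ APIs, not a quote from the docs.

```julia
using MLJ, MLJFlux

X, y = @load_iris

# Assumed builder with a tunable hidden-layer width (n_hidden):
clf = NeuralNetworkClassifier(builder = MLJFlux.Short())

# Nested hyperparameter range over the builder's n_hidden field:
r = range(clf, :(builder.n_hidden), lower = 4, upper = 64, scale = :log)

tuned_clf = TunedModel(
    model      = clf,
    tuning     = Grid(goal = 5),
    resampling = CV(nfolds = 3),
    range      = r,
    measure    = log_loss,
)

mach = machine(tuned_clf, X, y)
fit!(mach)                       # runs the grid search
fitted_params(mach).best_model   # the model with the selected n_hidden
```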
I think it's helpful to make links to other related packages. I also admit that (out of some laziness) I may have been too generous in stating what FastAI.jl can do. That said, I have had some interactions with the developers and indirectly with the project, and have the impression it is a well thought-out design by experts, and I remember a lot of community engagement. If I were working exclusively with images today, and certainly video, I expect I might prefer FastAI to MLJFlux.
So the second suggestion sounds good, which I agree is more accurate.
Alright. I agree with everything you said. So we will go with replacing the text above with:
A comparable project, FastAI/FluxTraining, also provides a high-level interface for interacting with Flux models and supports a set of features that may overlap with (but not include all of) those supported by MLJFlux.
Right? Let me know if you would like to fine-tune it further.
all good
Thank you so much for your time and effort in this fantastic review, @ablaom. I think it would make it easier for me to automate the documentation deployment if we merge this (after finishing with the remaining comments and removing any pages yet to be added, e.g., in workflow examples) and then, after merging, make another PR for tutorials/workflow examples. The idea is that I want the action to trigger the deploy "on PR to dev", and for that to take effect the docs must be deployable from dev. It's likely that there is a way to do this without merging first, but I'm not sure of it just yet.
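For reference, a hedged sketch of the Documenter.jl end of the setup being described (the file path `docs/make.jl`, the options shown, and the repo URL are assumptions; the PR's actual script may differ). The key point is that `deploydocs` with `devbranch = "dev"` is what lets the built docs be served from `dev` once the GitHub action runs on that branch.

```julia
# docs/make.jl (sketch)
using Documenter, MLJFlux

makedocs(
    sitename = "MLJFlux.jl",
    modules  = [MLJFlux],
)

# When run from CI (e.g., the GitHub action triggered on pushes/PRs to dev),
# deploydocs pushes the built site; devbranch tells Documenter that "dev",
# rather than the default branch, is where development docs come from.
deploydocs(
    repo      = "github.com/FluxML/MLJFlux.jl.git",
    devbranch = "dev",
)
```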
If I remember correctly, I tried to copy the font used in … I know you didn't require this much justification, but I also wanted to clarify why I think what I think.
Appreciate it! Yes, this may well be a generational thing, and I support your decision to keep the "modern" fonts.
Great to see this progress. Merge when ready.
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files (Coverage Diff):

| | dev | #252 | +/- |
| --- | --- | --- | --- |
| Coverage | 92.08% | 92.08% | |
| Files | 12 | 12 | |
| Lines | 316 | 316 | |
| Hits | 291 | 291 | |
| Misses | 25 | 25 | |

☔ View full report in Codecov by Sentry.
@ablaom I can't merge because I don't have write access. Could you do it instead? I added the script for deployment, but it may be the case that someone with access to the repo needs to go to …
@EssamWisam You should have an invite for access now. |