Releases · JuliaAI/DecisionTree.jl
v0.10.13
DecisionTree v0.10.13
- (bug fixes) Ensure `nfoldCV` passes the RNG on to AdaBoostClassifier models. Fix non-threadsafe use of RNGs in random forests (#174) @rikhuijzer (a seeded-RNG sketch follows this list)
- (testing) Replace all RNGs in tests with `StableRNGs` to ensure consistency across Julia versions, and introduce tests over multiple seeds (#174) @rikhuijzer
- (enhancement) Improve the display returned by `print_tree` (making it consistent with the AbstractTrees.jl fallback) and fix its docstring (#171, #172, #173) @rikhuijzer
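The RNG-related fixes above concern reproducibility. As a minimal sketch (not taken from the release itself), the snippet below seeds a random forest with a fixed RNG via the `rng` keyword of `build_forest`; the data, dimensions, and hyperparameters are invented for illustration.

```julia
using DecisionTree, StableRNGs

# Toy data, generated from a version-stable RNG as the test suite now does.
rng      = StableRNG(123)
features = rand(rng, 100, 3)
labels   = rand(rng, ["a", "b"], 100)

# Passing an RNG (or an integer seed) through the `rng` keyword is assumed to
# make forest construction reproducible, including the multi-threaded case
# addressed by the thread-safety fix in #174.
forest = build_forest(labels, features, 2, 10; rng = rng)
```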
Merged pull requests:
- Fix typo in `print_tree` description (#171) (@rikhuijzer)
- Replace `L` by `<` and `R` by `≥` (#172) (@rikhuijzer)
- Round digits in `print_tree` (#173) (@rikhuijzer)
- Test multiple seeds (#174) (@rikhuijzer)
- For a 0.10.13 release (#178) (@ablaom)
v0.10.12
DecisionTree v0.10.12
- (enhancement) Add method `wrap`, which allows one to wrap a decision tree (a `Leaf` or `Node` object) in a new tree structure implementing the AbstractTrees.jl interface. Unlike raw DecisionTree.jl decision trees, the nodes of the wrapped objects can include the names of the splitting features, and any other node metadata. See the `wrap` docstring for details (#158) @roland-KA (a usage sketch follows below)
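As a rough sketch of how the wrapped trees might be used, assuming the keyword-tuple form of `wrap` described in its docstring (the data, feature names, and metadata key `featurenames` are illustrative, not taken from the release notes):

```julia
using DecisionTree
import AbstractTrees

# Build an ordinary DecisionTree.jl tree on toy data.
features = rand(50, 2)
labels   = rand(["yes", "no"], 50)
tree     = build_tree(labels, features)

# Wrap the raw Leaf/Node tree so that it implements the AbstractTrees.jl
# interface, attaching feature names as node metadata.
wt = DecisionTree.wrap(tree, (featurenames = ["height", "width"],))

# Generic AbstractTrees.jl tooling now works on the wrapped tree.
AbstractTrees.print_tree(wt)
```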
Closed issues:
- Avoid "regressor" terminology (#49)
- Support sample weights? (#141)
- TagBot trigger issue (#145)
- Splitting criterion in regression tree (#148)
- Transfer of ownership? (#152)
- Update General.jl to reflect new ownership (#153)
- Issue to trigger new releases (#162)
Merged pull requests:
- Add docstring to print_tree (#151) (@KronosTheLate)
- Update readme to reflect new ownership (#154) (@ablaom)
- Add MLJ usage instructions (#156) (@ablaom)
- Add implementation of AbstractTrees-interface (#158) (@roland-KA)
- For a 0.10.12 release (#161) (@ablaom)
- Relax some test tolerances (#163) (@ablaom)
- Add compat AbstractTrees = "0.3" (#165) (@ablaom)
v0.10.11
What's Changed
- Add GitHub actions CI by @IanButterworth in https://github.com/bensadeghi/DecisionTree.jl/pull/143
- Add option to print feature names in `print_tree` by @IanButterworth in https://github.com/bensadeghi/DecisionTree.jl/pull/142 (see the sketch after this list)
- Move `check_input` to utils to avoid duplicate code by @barucden in https://github.com/bensadeghi/DecisionTree.jl/pull/137
- Fix feature_names kwarg structure for julia 1.0 by @IanButterworth in https://github.com/bensadeghi/DecisionTree.jl/pull/144
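As a minimal sketch of the new option, assuming the `feature_names` keyword named in #142/#144 (the data and names are invented):

```julia
using DecisionTree

# Toy classification data.
features = rand(100, 2)
labels   = rand(["spam", "ham"], 100)
model    = build_tree(labels, features)

# Without names, splits are reported by feature (column) index.
print_tree(model, 3)

# With the keyword, splits are reported using the supplied names instead.
print_tree(model, 3; feature_names = ["word_count", "link_count"])
```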
New Contributors
- @IanButterworth made their first contribution in https://github.com/bensadeghi/DecisionTree.jl/pull/143
Full Changelog: bensadeghi/DecisionTree.jl@v0.10.10...v0.10.11
v0.10.10
DecisionTree v0.10.10
Merged pull requests:
- calibrate using training data for cross validation (#133) (@salbert83)
v0.10.9
DecisionTree v0.10.9
Merged pull requests:
v0.10.8
v0.10.7
v0.10.6
DecisionTree v0.10.6
Closed issues:
- Features of type AbstractVector in apply_tree? (#124)
- Forest building on parallel threads is not a good idea (#125)
- Request: option to suppress printing in nfoldCV_tree (#127)
Merged pull requests:
- seeding for reproducible multi-threaded RFs (#126) (@bensadeghi)
- use AbstractVector and AbstractMatrix (#128) (@bensadeghi)
v0.10.5
DecisionTree v0.10.5
v0.10.4
DecisionTree v0.10.4
Closed issues:
- build_adaboost_stumps produces unexpected convert errors (#121)
Merged pull requests:
- Add convert: Leaf to Node (#122) (@bensadeghi)