Releases · JuliaAI/DecisionTree.jl
v0.12.4
DecisionTree v0.12.4
- (bugfix) Fix a bug where labels would internally always be converted to `Float64`; regression now works with any `AbstractFloat` label type (#226). Thanks @xinadi!
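Not part of the release notes: a minimal sketch of what the fix enables, using hypothetical `Float32` toy data with the standard `build_tree` / `apply_tree` API.

```julia
using DecisionTree

# Hypothetical toy regression data kept in Float32.
features = rand(Float32, 100, 3)
labels   = features[:, 1] .+ 0.1f0 .* rand(Float32, 100)

# Before this fix, labels were internally converted to Float64;
# with #226, AbstractFloat label types are handled as-is.
model = build_tree(labels, features)   # regression tree
preds = apply_tree(model, features)
eltype(preds)                          # expected to be Float32 after the fix
```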
Merged pull requests:
- Apply JuliaFormatter (#219) (@rikhuijzer)
- Add `.JuliaFormatter.toml` (#221) (@rikhuijzer)
- Float64 replaced by AbstractFloat for regression (#226) (@xinadi)
- For a 0.12.4 release (#227) (@rikhuijzer)
v0.12.3
v0.12.2
v0.11.4
DecisionTree v0.11.4 (RELEASED AFTER 0.12.0)
This release bumps the compat requirement for the dependency AbstractTrees.jl, which was previously set incorrectly.
v0.12.1
DecisionTree v0.12.1
Closed issues:
- Round thresholds in display of trees (#197)
Merged pull requests:
- Improved label printing/plotting of nodes/leaves (#200) (@roland-KA)
- For a 0.12.1 release (#201) (@ablaom)
v0.12.0
DecisionTree v0.12.0
- (breaking) For each tree in a random forest model, use `seed!` to put a copy of each tree's associated RNG into a unique state, to mitigate observed correlation between trees (#194, #198) @dhanak @rikhuijzer
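Not from the original notes: a minimal sketch of what the change means in practice, assuming hypothetical toy data and the `rng` keyword of `build_forest`. With the per-tree re-seeding introduced here, trees are decorrelated while a fixed RNG should still give a reproducible forest.

```julia
using DecisionTree, Random

# Hypothetical toy classification data.
features = rand(100, 4)
labels   = rand(["a", "b"], 100)

# Each tree now works with its own uniquely re-seeded copy of the RNG,
# so trees are decorrelated; a fixed seed should remain reproducible.
forest_a = build_forest(labels, features, 2, 10; rng = MersenneTwister(42))
forest_b = build_forest(labels, features, 2, 10; rng = MersenneTwister(42))

apply_forest(forest_a, features) == apply_forest(forest_b, features)  # expected: true
```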
Closed issues:
- Citation / Reference for DecisionTree.jl (#193)
- RNG “shuffling” introduced in #174 is fundamentally flawed (#194)
v0.11.3
DecisionTree v0.11.3
- (enhancement) Adapt to the new `AbstractNode` type in AbstractTrees.jl (#195) @roland-KA
Merged pull requests:
- Adapt to new `AbstractNode` in `AbstractTrees` (#195) (@roland-KA)
- For a 0.11.3 release (#196) (@ablaom)
v0.11.2
DecisionTree v0.11.2
- (enhancement) Add an option to use multithreading when computing random forest predictions, as in `apply_forest(forest, features; use_multithreading=true)`. The default remains single-threaded. (#188) @salbert83
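A minimal usage sketch (not from the notes), assuming hypothetical toy data; `use_multithreading` is the keyword introduced by #188, and Julia must be started with threads enabled for it to have any effect.

```julia
using DecisionTree

# Hypothetical toy data.
features = rand(100, 4)
labels   = rand(["a", "b"], 100)

# Train a small random forest (2 features per split, 10 trees).
forest = build_forest(labels, features, 2, 10)

# Multithreaded prediction; launch Julia with e.g. `julia --threads=4`
# for this to actually run on multiple threads.
preds = apply_forest(forest, features; use_multithreading = true)
```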
v0.11.1
DecisionTree v0.11.1
Merged pull requests:
- Fix `print_tree` (#185) (@rikhuijzer)
- Fix type piracy on `zero` (#186) (@rikhuijzer)
- For a 0.11.1 release (#187) (@ablaom)
v0.11.0
DecisionTree v0.11.0
- Bump minimum Julia version to 1.6.
- (enhancement) Add methods `impurity_importance`, `split_importance` and `permutation_importance` for all models (#182) @yufongpeng (see the sketch after this list)
- (breaking) Use the SAMME algorithm for calculating `AdaBoostStumpClassifier` coefficients (#167) @yufongpeng
- (breaking) Do not export the functions `R2`, `mean_squared_error`, `majority_vote`, `confusion_matrix` and `ConfusionMatrix` (#183)
- Add `accuracy` to the built-in measures, but do not export it.
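Not from the notes: a minimal sketch of the new importance methods and of calling a now-unexported measure by its qualified name, assuming hypothetical toy data; keyword options of the importance functions are not shown.

```julia
using DecisionTree

# Hypothetical toy classification data.
features = rand(200, 5)
labels   = rand(["yes", "no"], 200)

forest = build_forest(labels, features, 2, 10)

impurity_importance(forest)   # impurity-based importance, one entry per feature
split_importance(forest)      # how often each feature is chosen for a split

# Measures are no longer exported; qualify them with the module name.
preds = apply_forest(forest, features)
DecisionTree.confusion_matrix(labels, preds)
```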
Closed issues:
- [Feature Request] Add a Field for Feature Importance (#170)
- Remove the `Int` as rng functionality (#177)
- Stop exporting metrics? (#181)
Merged pull requests:
- Adabooststump algorithm change (#167) (@yufongpeng)
- Bump compat julia="1.6", AbstractTrees="0.3,0.4"; adjust CI (#179) (@ablaom)
- Feature importance (#182) (@yufongpeng)
- Stop exporting measures (#183) (@ablaom)
- For a 0.11 release (#184) (@ablaom)