improve params docs (#3252)
StrikerRUS authored Jul 27, 2020
1 parent 091f41b commit b299de3
Showing 2 changed files with 12 additions and 7 deletions.
10 changes: 6 additions & 4 deletions docs/Parameters.rst
@@ -45,7 +45,7 @@ Core Parameters

- ``predict``, for prediction, aliases: ``prediction``, ``test``

-- ``convert_model``, for converting model file into if-else format, see more information in `IO Parameters <#io-parameters>`__
+- ``convert_model``, for converting model file into if-else format, see more information in `Convert Parameters <#convert-parameters>`__

- ``refit``, for refitting existing models with new data, aliases: ``refit_tree``
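
For reference, the ``convert_model`` task mentioned above is driven by a CLI config file. A minimal sketch (file names are illustrative, and ``cpp`` is assumed as the conversion target described in the Convert Parameters section):

```ini
; run as: lightgbm config=convert.conf
task = convert_model
input_model = LightGBM_model.txt
convert_model_language = cpp
convert_model = gbdt_prediction.cpp
```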

@@ -115,6 +115,8 @@ Core Parameters

- ``goss``, Gradient-based One-Side Sampling

+- **Note**: internally, LightGBM uses ``gbdt`` mode for the first ``1 / learning_rate`` iterations
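
The warm-up rule in the note above can be sketched as a tiny helper (the function name is hypothetical, not part of the LightGBM API):

```python
def goss_warmup_iterations(learning_rate: float) -> int:
    """Iterations run in plain ``gbdt`` mode before GOSS sampling
    starts, per the note above: 1 / learning_rate."""
    if learning_rate <= 0:
        raise ValueError("learning_rate must be positive")
    return int(1 / learning_rate)

# with the common default learning_rate=0.1, the first 10
# iterations are plain gbdt before one-side sampling begins
```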

- ``data`` :raw-html:`<a id="data" title="Permalink to this parameter" href="#data">&#x1F517;&#xFE0E;</a>`, default = ``""``, type = string, aliases: ``train``, ``train_data``, ``train_data_file``, ``data_filename``

- path of training data, LightGBM will train from this data
@@ -204,7 +206,7 @@ Learning Control Parameters

- the number of columns is large, or the total number of bins is large

-- ``num_threads`` is large, e.g. ``>20``
+- ``num_threads`` is large, e.g. ``> 20``

- you want to reduce memory cost

@@ -222,7 +224,7 @@ Learning Control Parameters

- the number of data points is large, and the total number of bins is relatively small

-- ``num_threads`` is relatively small, e.g. ``<=16``
+- ``num_threads`` is relatively small, e.g. ``<= 16``

- you want to use small ``bagging_fraction`` or ``goss`` boosting to speed up
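
The rules of thumb in the two lists above can be condensed into a small helper (hypothetical, for illustration only; the thresholds ``20`` and ``16`` are the ones quoted in the docs):

```python
def histogram_build_hint(num_threads: int, wide_data: bool) -> dict:
    """Suggest a LightGBM param dict following the guidance above:
    col-wise for wide data or many threads, row-wise for few threads.
    Returning neither lets LightGBM try both and keep the faster one."""
    if wide_data or num_threads > 20:
        return {"force_col_wise": True}
    if num_threads <= 16:
        return {"force_row_wise": True}
    return {}  # in-between: let LightGBM benchmark both

params = {"objective": "binary", "num_threads": 8}
params.update(histogram_build_hint(num_threads=8, wide_data=False))
```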

@@ -460,7 +462,7 @@ Learning Control Parameters

- you need to specify all features in order. For example, ``mc=-1,0,1`` means decreasing for 1st feature, non-constraint for 2nd feature and increasing for the 3rd feature

-- ``monotone_constraints_method`` :raw-html:`<a id="monotone_constraints_method" title="Permalink to this parameter" href="#monotone_constraints_method">&#x1F517;&#xFE0E;</a>`, default = ``basic``, type = string, aliases: ``monotone_constraining_method``, ``mc_method``
+- ``monotone_constraints_method`` :raw-html:`<a id="monotone_constraints_method" title="Permalink to this parameter" href="#monotone_constraints_method">&#x1F517;&#xFE0E;</a>`, default = ``basic``, type = enum, options: ``basic``, ``intermediate``, aliases: ``monotone_constraining_method``, ``mc_method``

- used only if ``monotone_constraints`` is set
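
Putting the two parameters together, a params sketch (the objective and the three-feature ordering are illustrative, echoing the ``mc=-1,0,1`` example above):

```python
# decreasing in the 1st feature, unconstrained in the 2nd,
# increasing in the 3rd, enforced with the ``intermediate`` method
params = {
    "objective": "regression",
    "monotone_constraints": [-1, 0, 1],
    "monotone_constraints_method": "intermediate",
}
```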

9 changes: 6 additions & 3 deletions include/LightGBM/config.h
@@ -100,7 +100,7 @@ struct Config {
// alias = task_type
// desc = ``train``, for training, aliases: ``training``
// desc = ``predict``, for prediction, aliases: ``prediction``, ``test``
-// desc = ``convert_model``, for converting model file into if-else format, see more information in `IO Parameters <#io-parameters>`__
+// desc = ``convert_model``, for converting model file into if-else format, see more information in `Convert Parameters <#convert-parameters>`__
// desc = ``refit``, for refitting existing models with new data, aliases: ``refit_tree``
// desc = **Note**: can be used only in CLI version; for language-specific packages you can use the correspondent functions
TaskType task = TaskType::kTrain;
@@ -145,6 +145,7 @@ struct Config {
// desc = ``rf``, Random Forest, aliases: ``random_forest``
// desc = ``dart``, `Dropouts meet Multiple Additive Regression Trees <https://arxiv.org/abs/1505.01866>`__
// desc = ``goss``, Gradient-based One-Side Sampling
+// descl2 = **Note**: internally, LightGBM uses ``gbdt`` mode for the first ``1 / learning_rate`` iterations
std::string boosting = "gbdt";

// alias = train, train_data, train_data_file, data_filename
@@ -225,7 +226,7 @@ struct Config {
// desc = set this to ``true`` to force col-wise histogram building
// desc = enabling this is recommended when:
// descl2 = the number of columns is large, or the total number of bins is large
-// descl2 = ``num_threads`` is large, e.g. ``>20``
+// descl2 = ``num_threads`` is large, e.g. ``> 20``
// descl2 = you want to reduce memory cost
// desc = **Note**: when both ``force_col_wise`` and ``force_row_wise`` are ``false``, LightGBM will firstly try them both, and then use the faster one. To remove the overhead of testing set the faster one to ``true`` manually
// desc = **Note**: this parameter cannot be used at the same time with ``force_row_wise``, choose only one of them
@@ -235,7 +236,7 @@ struct Config {
// desc = set this to ``true`` to force row-wise histogram building
// desc = enabling this is recommended when:
// descl2 = the number of data points is large, and the total number of bins is relatively small
-// descl2 = ``num_threads`` is relatively small, e.g. ``<=16``
+// descl2 = ``num_threads`` is relatively small, e.g. ``<= 16``
// descl2 = you want to use small ``bagging_fraction`` or ``goss`` boosting to speed up
// desc = **Note**: setting this to ``true`` will double the memory cost for Dataset object. If you have not enough memory, you can try setting ``force_col_wise=true``
// desc = **Note**: when both ``force_col_wise`` and ``force_row_wise`` are ``false``, LightGBM will firstly try them both, and then use the faster one. To remove the overhead of testing set the faster one to ``true`` manually
@@ -440,7 +441,9 @@ struct Config {
// desc = you need to specify all features in order. For example, ``mc=-1,0,1`` means decreasing for 1st feature, non-constraint for 2nd feature and increasing for the 3rd feature
std::vector<int8_t> monotone_constraints;

+// type = enum
// alias = monotone_constraining_method, mc_method
+// options = basic, intermediate
// desc = used only if ``monotone_constraints`` is set
// desc = monotone constraints method
// descl2 = ``basic``, the most basic monotone constraints method. It does not slow the library at all, but over-constrains the predictions
