implementation md edit
aleksiakolo committed Jun 24, 2024
1 parent 877cfd3 commit dd53a9f
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion docs/source/implementation.md
@@ -103,4 +103,4 @@ The final stage uses the processed and cached data to train an XGBoost model. Th

- **Iterator for Data Loading**: Custom iterators are designed to load sparse matrices efficiently into the XGBoost training process, which can handle sparse inputs natively, thus maintaining high computational efficiency.
- **Training and Validation**: The model is trained using the tabular data, with evaluation steps that include early stopping to prevent overfitting and tuning of hyperparameters based on validation performance.
- - **Hyperparameter Tuning**: We use [optuna](https://optuna.org/) to tune over XGBoost model parameters, aggregations, window_sizes, and the minimum code inclusion frequency.
+ - **Hyperparameter Tuning**: We use [optuna](https://optuna.org/) to tune over XGBoost model parameters, aggregations, window sizes, and the minimum code inclusion frequency.
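
The "Iterator for Data Loading" bullet above is brief, so here is a minimal sketch of how such an iterator might look, assuming the cached shards are scipy sparse `.npz` files with the label stored in the last column; the class name, shard layout, and glob pattern are hypothetical, not the repository's actual code.

```python
# Illustrative sketch only: feeds pre-cached sparse shards to XGBoost through a
# custom DataIter so the full matrix never has to be materialized in memory.
import glob

import scipy.sparse as sp
import xgboost as xgb


class SparseShardIter(xgb.DataIter):  # hypothetical class, not from the repo
    def __init__(self, shard_pattern: str):
        self._files = sorted(glob.glob(shard_pattern))
        self._idx = 0
        super().__init__()

    def next(self, input_data) -> bool:
        # Hand XGBoost one shard at a time; return False when exhausted.
        if self._idx >= len(self._files):
            return False
        shard = sp.load_npz(self._files[self._idx]).tocsr()  # assumed shard format
        input_data(data=shard[:, :-1], label=shard[:, -1].toarray().ravel())
        self._idx += 1
        return True

    def reset(self) -> None:
        self._idx = 0


# XGBoost consumes the iterator directly, keeping the sparse representation.
dtrain = xgb.QuantileDMatrix(SparseShardIter("cache/train_*.npz"))
```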

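Similarly, a hedged sketch of the tuning loop described in the "Hyperparameter Tuning" bullet: Optuna searches jointly over XGBoost hyperparameters and pipeline choices such as the window size and minimum code inclusion frequency. The `build_matrices` helper and all search ranges are illustrative assumptions, not the project's real configuration.

```python
import optuna
import xgboost as xgb


def objective(trial: optuna.Trial) -> float:
    # Pipeline-level choices tuned alongside the booster itself (illustrative ranges).
    window_size = trial.suggest_categorical("window_size", ["1d", "7d", "30d", "full"])
    min_code_freq = trial.suggest_int("min_code_inclusion_frequency", 10, 1000, log=True)

    # Booster hyperparameters.
    params = {
        "objective": "binary:logistic",
        "eta": trial.suggest_float("eta", 1e-3, 0.3, log=True),
        "max_depth": trial.suggest_int("max_depth", 3, 10),
        "subsample": trial.suggest_float("subsample", 0.5, 1.0),
    }

    # build_matrices is a hypothetical helper: it would rebuild the cached
    # train/validation DMatrix objects for the chosen aggregation settings.
    dtrain, dvalid = build_matrices(window_size, min_code_freq)

    booster = xgb.train(
        params,
        dtrain,
        num_boost_round=1000,
        evals=[(dvalid, "valid")],
        early_stopping_rounds=25,  # early stopping on the validation split
        verbose_eval=False,
    )
    return booster.best_score  # validation metric at the best iteration


study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
```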