The Higgs challenge ended recently, and XGBoost was used by many participants. This list highlights XGBoost solutions from players:
- Blog post by phunther: Winning solution of the Kaggle Higgs competition: what a single model can do
- Solution by Tianqi Chen and Tong He: Link

This folder gives an example of how to use the XGBoost Python module to run the Kaggle Higgs competition.

The script achieves an AMS score of about 3.600 on the public leaderboard. To get started, follow these steps (a rough sketch of the training step is shown after the list):
- Compile the XGBoost Python library:

```bash
cd ../..
make
```
- Put training.csv and test.csv in the folder './data' (you can create a symbolic link).
- Run ./run.sh
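
For orientation, the training step of the Python pipeline follows roughly the pattern below. This is a minimal sketch, not the exact contents of the demo scripts: the parameter values, number of boosting rounds, and output file name are assumptions made for illustration.

```python
# Minimal training sketch (illustrative; parameter values are assumptions,
# not necessarily those used by the demo scripts).
import numpy as np
import xgboost as xgb

# Higgs training.csv layout: column 0 is EventId, columns 1-30 are features,
# column 31 is the event weight, column 32 is the label ('s' or 'b').
data = np.loadtxt('./data/training.csv', delimiter=',', skiprows=1,
                  converters={32: lambda v: 1.0 if v in (b's', 's') else 0.0})
X, weight, y = data[:, 1:31], data[:, 31], data[:, 32]

# -999.0 marks missing values in the Higgs data; XGBoost handles them natively.
dtrain = xgb.DMatrix(X, label=y, missing=-999.0, weight=weight)

params = {
    'objective': 'binary:logitraw',   # raw scores, convenient for thresholding later
    'scale_pos_weight': weight[y == 0].sum() / weight[y == 1].sum(),
    'eta': 0.1,
    'max_depth': 6,
    'eval_metric': 'ams@0.15',        # AMS evaluated at a 15% selection threshold
}
bst = xgb.train(params, dtrain, num_boost_round=120, evals=[(dtrain, 'train')])
bst.save_model('higgs.model')
```

The prediction step would then score test.csv, rank events by the raw score, and label roughly the top 15% as signal ('s') when writing the submission file.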
speedtest.py compares XGBoost's speed on this dataset with sklearn's GradientBoostingClassifier.
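
The comparison is along these lines (a sketch under assumptions, not the actual speedtest.py; the number of trees, depth, and learning rate are illustrative):

```python
# Rough timing comparison sketch (assumed, not the actual speedtest.py):
# train XGBoost and sklearn's GradientBoostingClassifier on the same data.
import time
import numpy as np
import xgboost as xgb
from sklearn.ensemble import GradientBoostingClassifier

data = np.loadtxt('./data/training.csv', delimiter=',', skiprows=1,
                  converters={32: lambda v: 1.0 if v in (b's', 's') else 0.0})
X, y = data[:, 1:31], data[:, 32]

start = time.time()
xgb.train({'objective': 'binary:logistic', 'eta': 0.1, 'max_depth': 6},
          xgb.DMatrix(X, label=y), num_boost_round=100)
print('xgboost:     %.1f s' % (time.time() - start))

start = time.time()
GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, max_depth=6).fit(X, y)
print('sklearn GBM: %.1f s' % (time.time() - start))
```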
- Alternatively, you can run the R version using higgs-train.R and higgs-pred.R.