LightGBMError: GPU Tree Learner was not enabled in this build (ubuntu 18.04 - anaconda - jupyter notebook env) #3310
Comments
Have you tried this installation guide?
Please be more precise. What is the problem? What steps did you take? Post your commands and the corresponding logs.
Hi @StrikerRUS
OK. It seems that you have successfully compiled the dynamic library. Now you should install the Python wrapper.
Please pay attention to the second line:
Thanks, I solved the problem by following your instructions, but now I have a problem with the bin size and I keep getting the same error. Should I open another issue, since issue #3310 is solved?
@diego-florez Did you use categorical features? If yes, you can try to disable them.
@guolinke Yes, 11 of the 18 features are categorical, but they are set as type "categorical", and I tried the following params. The same error persists with all 3 of them.
@diego-florez Are you loading your data from a file? Because ...
The training data is a pandas DataFrame previously loaded in the notebook. Is there any other solution to the problem? Maybe changing the categorical variables to one-hot encoding?
@diego-florez Can you try to use label encoding for the categorical features, and not set 'categorical_feature'? Then LightGBM will treat them as numerical features.
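For illustration, a minimal sketch of that suggestion (hypothetical column names and data; it assumes a GPU-enabled build, otherwise drop `device="gpu"`):

```python
import pandas as pd
from lightgbm import LGBMClassifier

# Hypothetical DataFrame with one string-valued categorical column.
X = pd.DataFrame({
    "color": ["red", "blue", "red", "green", "blue", "red"] * 10,
    "size":  [1.0, 2.5, 3.0, 0.5, 1.5, 2.0] * 10,
})
y = [0, 1, 0, 1, 1, 0] * 10

# Label-encode the categorical column so it becomes plain integers;
# since 'categorical_feature' is not set, LightGBM treats it as numerical.
X["color"] = X["color"].astype("category").cat.codes

clf = LGBMClassifier(device="gpu")
clf.fit(X, y)
```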
I have done exactly what was shown in the post you replied to. The issue is that there is no increase in computation speed, and the GPU is barely being used (I see usage as low as 0.08 GB). How do I speed up the training? My dataset sizes are train=(247570, 70) and validation=(13031, 70). Here are my code and hyperparameters:
@naveen-9697 Please refer to #768 and some benchmarks in https://github.com/Laurae2/ml-perf/issues and https://github.com/szilard/GBM-perf/issues for the problem of low GPU utilization.
Closing this issue because the original problem has been solved. |
This issue has been automatically locked since there has not been any recent activity since it was closed. To start a new related discussion, open a new issue at https://github.com/microsoft/LightGBM/issues including a reference to this. |
How are you using LightGBM?
LightGBM component: python-api -- sklearn-api -- lightgbm.LGBMClassifier
Environment info
ubuntu 18.04 -- anaconda3 -- python3.7 -- jupyter notebook
Operating System: Ubuntu 18.04
CPU/GPU model: NVIDIA-SMI 390.138
C++ compiler version: gcc version 7.5.0
CMake version: cmake version 3.10.2
Python version: Python 3.7.4
LightGBM version or commit hash: LightGBM 3.0.0
Error message and/or logs
LightGBMError: GPU Tree Learner was not enabled in this build. Please recompile with CMake option -DUSE_GPU=1
I want to use the LightGBM classifier with GPU in a Jupyter notebook, and I haven't found a way to do it.
I know there is a closed issue #2222 that solves the problem for a Windows environment. I have followed it, as well as the installation guide at https://lightgbm.readthedocs.io/en/latest/GPU-Tutorial.html, but I was unable to solve the problem.
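For reference, a minimal example of the kind of call that triggers this error (toy data; the only assumption is that the installed lightgbm wheel was built without GPU support):

```python
import numpy as np
from lightgbm import LGBMClassifier

# Toy data, just enough to reach the training step.
X = np.random.rand(200, 5)
y = np.random.randint(0, 2, size=200)

# Requesting the GPU tree learner; on a CPU-only build, fit() raises:
#   LightGBMError: GPU Tree Learner was not enabled in this build.
#   Please recompile with CMake option -DUSE_GPU=1
clf = LGBMClassifier(device="gpu")
clf.fit(X, y)
```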