Generated Data Shape always 0 #9
Hi @iamamiramine, sorry that I only just saw your message. Did you solve it? The cause may be that your `max_length` is too small, so the generation never produces one complete row of data.
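For context, here is a minimal sketch of what "`max_length` too small" means in practice. It assumes a GReaT-style Tabula API; the constructor and `sample` arguments are taken from the project README as I recall it and may differ in your version (the issue text itself confirms only that the sampling function takes a `max_length` parameter):

```python
# Sketch, assuming a GReaT-style Tabula API (constructor arguments
# are assumptions, not verified against this exact commit).
import pandas as pd
from tabula import Tabula

train_df = pd.read_csv('train.csv')  # hypothetical training table

model = Tabula(llm='distilgpt2', batch_size=32, epochs=400)
model.fit(train_df)

# If max_length is shorter than one fully encoded row, every generated
# sample is truncated mid-row, fails parsing, and is filtered out, so
# the accumulated gen_data.shape[0] never grows past 0.
synthetic = model.sample(n_samples=100, max_length=400)
```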
Hello, I tried changing the `max_length` parameter and it did not work.
I am also having problems with this. I am using:

```python
# Load the imbalanced 'sick' benchmark dataset from imbalanced-learn
from imblearn.datasets import fetch_datasets

sick = fetch_datasets()['sick']
print(sick.data.shape)
```
You do not need to set `max_length` as big as 1024. You can uncomment this part of the code to see the length of your encoded row: Tabula/tabula/tabula_dataset.py, line 64 (commit 3869567).
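If you prefer not to edit the library, a rough external check is also possible. This sketch assumes a GReaT-style `column is value` serialization; the exact template Tabula uses may differ (its encoding is more compact), so treat the result as a rough estimate:

```python
# Rough estimate of one encoded row's token length (the serialization
# template here is an assumption; Tabula's actual encoding may differ).
import pandas as pd
from transformers import AutoTokenizer

train_df = pd.read_csv('train.csv')  # hypothetical training table
tokenizer = AutoTokenizer.from_pretrained('distilgpt2')

row_text = ', '.join(f'{col} is {val}' for col, val in train_df.iloc[0].items())
n_tokens = len(tokenizer(row_text)['input_ids'])
print(n_tokens)  # pick max_length comfortably above this
```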
Let me know if that helps.
Yes, I don't think it has to do with `max_length`. The issue in this case is that some numbers always fall outside the requested ranges in the predicted dataframe, so those rows are always filtered out. I have tried varying the temperature, `k`, and the number of training epochs, to no avail.
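To illustrate the failure mode this comment describes, here is a toy reproduction of range-based filtering with hypothetical bounds; it is not Tabula's actual sampling code, just a demonstration of why out-of-range values leave the result empty:

```python
# Toy reproduction of the reported failure mode: if generated numeric
# values consistently fall outside the training ranges, the validity
# filter rejects every row and the accumulated sample stays empty.
import pandas as pd

def keep_in_training_ranges(gen_df: pd.DataFrame, train_df: pd.DataFrame) -> pd.DataFrame:
    mask = pd.Series(True, index=gen_df.index)
    for col in train_df.select_dtypes(include='number').columns:
        lo, hi = train_df[col].min(), train_df[col].max()
        mask &= gen_df[col].between(lo, hi)
    return gen_df[mask]

train_df = pd.DataFrame({'age': [20, 30, 40]})
gen_df = pd.DataFrame({'age': [150, -5, 999]})  # every value out of range
print(keep_in_training_ranges(gen_df, train_df).shape[0])  # 0 -> nothing survives
```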
May I ask if your problem has been solved? I'm experiencing this problem as well.
This problem also occurs if you train with too few epochs. Train for more epochs and the sampling speed improves.
In my case it wouldn't work no matter the number of epochs (I trained for a week on an 80GB A100).
I am facing an issue when generating data using Tabula.
I trained Tabula on the following datasets:
However, when generating, the generation loop gets stuck because the generated data shape is always 0 (`num_samples` is always greater than `gen_data.shape[0]`). I tried re-training, and tried changing the `max_length` parameter in the sampling function, but it was of no help. Can you please help me figure out how to fix this issue?
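For reference, here is a sketch of the sampling pattern the symptom implies. The helper names are hypothetical, not Tabula's internals: the loop terminates only once enough valid rows have accumulated, so a filter that rejects every decoded row leaves `gen_data.shape[0]` at 0 indefinitely.

```python
import pandas as pd

def decode_valid_rows() -> pd.DataFrame:
    # Stand-in for "generate text, parse it back into rows, and drop
    # rows that are incomplete or out of range". When parsing always
    # fails, every call returns an empty frame.
    return pd.DataFrame(columns=['a', 'b'])

num_samples = 100
gen_data = pd.DataFrame(columns=['a', 'b'])
for _ in range(10):  # guard so this demo halts; the real loop may spin forever
    if gen_data.shape[0] >= num_samples:
        break
    gen_data = pd.concat([gen_data, decode_valid_rows()], ignore_index=True)

print(gen_data.shape[0])  # stays 0, so num_samples > gen_data.shape[0] forever
```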