
[Feature Request]: batch_size for notears with GPU #212

Open
1 task done
blacksnail789521 opened this issue Sep 14, 2023 · 0 comments
Labels
enhancement New feature or request

Comments

@blacksnail789521

Contact Details

[email protected]

Feature Description

When using from_pandas (imported from causalnex.structure.pytorch.notears) for causal graph learning with GPU support, there is no option to set a batch_size for PyTorch. The entire dataset is moved to the device at once, as shown by the line X_torch = torch.from_numpy(x).float().to(self.device), so the effective batch size equals the number of samples. Consequently, with data of shape (12000, 55), an 8 GB RTX 3070 runs out of memory.
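To illustrate the request, here is a minimal sketch of what a batch_size option could do: instead of transferring the full (n_samples, n_features) array to the GPU in one call, the training loop would iterate over mini-batches and move only one batch to the device at a time. The iter_batches helper below is hypothetical (it is not part of the CausalNeX API), and the torch transfer is shown only in the comment to keep the sketch framework-free.

```python
import numpy as np

def iter_batches(x: np.ndarray, batch_size: int):
    """Yield successive mini-batches of rows from x.

    Each batch would then be transferred to the GPU individually,
    e.g. torch.from_numpy(batch).float().to(device), instead of
    moving all n_samples rows at once.
    """
    for start in range(0, len(x), batch_size):
        yield x[start:start + batch_size]

# Data of the shape mentioned in the report: (12000, 55).
x = np.zeros((12000, 55), dtype=np.float32)
batches = list(iter_batches(x, batch_size=1024))
```

With batch_size=1024 this produces 12 batches (the last one holds the 736 remaining rows), so peak GPU memory scales with the batch size rather than the dataset size.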

Why is this feature needed?

An adjustable batch size would let the library handle datasets of varying sizes within a fixed GPU memory budget.

Additional Context

No response

Possible implementation

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct
@blacksnail789521 added the enhancement label Sep 14, 2023
Development

No branches or pull requests

1 participant