
Add paddle backend #313

Conversation

HydrogenSulfate
Contributor

@HydrogenSulfate HydrogenSulfate commented Sep 12, 2024

Category

  • New feature
  • Bugfix
  • Breaking change
  • Refactoring
  • Documentation
  • Other (please explain)

Description

As discussed in #285, this PR adds the PaddlePaddle framework as one of the supported backends.

Run the tests:

# cu123
python -m pip install --pre paddlepaddle-gpu -i https://www.paddlepaddle.org.cn/packages/nightly/cu123/
# cu118
# python -m pip install --pre paddlepaddle-gpu -i https://www.paddlepaddle.org.cn/packages/nightly/cu118/
cd warp/
python warp/tests/test_paddle.py

Output:

Warp 1.3.3 initialized:
   CUDA Toolkit 11.6, Driver 12.0
   Devices:
     "cpu"      : "x86_64"
     "cuda:0"   : "Tesla V100-PCIE-16GB" (16 GiB, sm_70, mempool enabled)
   Kernel cache:
     /root/.cache/warp/1.3.3
W0919 20:40:12.708083 18145 gpu_resources.cc:119] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 12.0, Runtime API Version: 11.6
W0919 20:40:12.744881 18145 gpu_resources.cc:164] device: 0, cuDNN Version: 8.4.
test_array_ctype_from_paddle_cuda_0 (__main__.TestPaddle) ... ok
test_device_conversion_cuda_0 (__main__.TestPaddle) ... ok
test_direct_cuda_0 (__main__.TestPaddle) ... Module __main__ f49fde0 load on device 'cuda:0' took 673.95 ms  (compiled)
ok
test_dtype_from_paddle (__main__.TestPaddle) ... ok
test_dtype_to_paddle (__main__.TestPaddle) ... ok
test_from_paddle_cuda_0 (__main__.TestPaddle) ... ok
test_from_paddle_slices_cuda_0 (__main__.TestPaddle) ... ok
test_from_paddle_zero_strides_cuda_0 (__main__.TestPaddle) ... ok
test_paddle_autograd_cuda_0 (__main__.TestPaddle) ... ok
test_to_paddle_cuda_0 (__main__.TestPaddle) ... ok

----------------------------------------------------------------------
Ran 10 tests in 0.805s

OK

To-do list:

Changelog

  • Add specific line-by-line info of high level changes in this PR.

Before your PR is "Ready for review"

  • Do you agree to the terms under which contributions are accepted as described in Section 9 of the Warp License?
  • Have you read the Contributor Guidelines?
  • Have you written any new necessary tests?
  • Have you added or updated any necessary documentation?
  • Have you added any files modified by compiling Warp and building the documentation to this PR (e.g. stubs.py, functions.rst)?
  • Does your code pass ruff check and ruff format --check?

@HydrogenSulfate HydrogenSulfate marked this pull request as draft September 12, 2024 13:27
@HydrogenSulfate HydrogenSulfate marked this pull request as ready for review September 19, 2024 12:41
@HydrogenSulfate HydrogenSulfate changed the title [WIP] Add paddle backend Add paddle backend Sep 19, 2024
@HydrogenSulfate
Contributor Author

@mmacklin Could you help review this code? Thanks :)

@shi-eric
Contributor

Hi @HydrogenSulfate, we are currently busy preparing for the 1.4 release so I don't think we can review your changes until after next week, but in the meantime, can you squash your changes down to a single commit? This is because the way we merge external contributions right now means that we can't squash it ourselves, or else your name disappears from the author list. Thanks!

nvlukasz and others added 26 commits September 28, 2024 11:39
update interoperability.rst related to paddle content

add Paddle in index.rst
@HydrogenSulfate HydrogenSulfate mentioned this pull request Sep 28, 2024
@HydrogenSulfate
Contributor Author

HydrogenSulfate commented Sep 28, 2024

> Hi @HydrogenSulfate, we are currently busy preparing for the 1.4 release so I don't think we can review your changes until after next week, but in the meantime, can you squash your changes down to a single commit? This is because the way we merge external contributions right now means that we can't squash it ourselves, or else your name disappears from the author list. Thanks!

Thanks for the reply; the commits will be squashed in #318.
