Commit

[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
pre-commit-ci[bot] committed Sep 19, 2024
1 parent b73760b commit 66cc6f2
Showing 1 changed file with 2 additions and 1 deletion: transformer_engine/pytorch/attention.py
@@ -120,7 +120,8 @@
 if not logger.hasHandlers():
     logger.addHandler(_stream_handler)
 logger.debug(
-    "To use flash-attn v3, please follow these steps to install the flashattn-hopper package: \n"
+    "To use flash-attn v3, please follow these steps to install the flashattn-hopper"
+    " package: \n"
     """(1) pip install "git+https://github.com/Dao-AILab/flash-attention.git#egg=flashattn-hopper&subdirectory=hopper" \n"""
     """(2) python_path=`python -c "import site; print(site.getsitepackages()[0])"` \n"""
     """(3) mkdir -p $python_path/flashattn_hopper \n"""
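The log message being re-wrapped in this hunk lists the first steps for installing the flashattn-hopper package. As a rough sketch, those visible steps could be gathered into a script like the one below; note that the hunk ends here, so any later steps in the message are not shown, and step (1) is left commented out because it requires network access and a CUDA build toolchain.

```shell
#!/usr/bin/env sh
# Step (1): build and install flashattn-hopper from the flash-attention repo.
# Commented out here: it needs network access and a CUDA toolchain to build.
# pip install "git+https://github.com/Dao-AILab/flash-attention.git#egg=flashattn-hopper&subdirectory=hopper"

# Step (2): locate the active interpreter's site-packages directory.
python_path=$(python -c "import site; print(site.getsitepackages()[0])")
echo "$python_path"

# Step (3): create the flashattn_hopper package directory there.
mkdir -p "$python_path/flashattn_hopper"
```

Whether further files must be copied into that directory depends on the steps elided below this hunk.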
