Commit 69e1dd2: update readmes

jarlsondre committed Nov 28, 2024
1 parent 6895472

Showing 3 changed files with 8 additions and 5 deletions.

docs/uv-tutorial.md (6 additions, 4 deletions)

````diff
@@ -18,7 +18,9 @@ If you wish to use the `uv sync` and/or `uv lock` commands, which is how you use uv
 to manage all your project packages, then note that these commands will only work
 with the directory called `.venv` in the project directory. Sometimes, this can be a
 bit annoying, especially with an existing venv, so we recommend using a
-[symlink](https://en.wikipedia.org/wiki/Symbolic_link).
+[symlink](https://en.wikipedia.org/wiki/Symbolic_link). If you need to have multiple
+venvs that you want to switch between, you can update the symlink to whichever of them
+you want to use at the moment. For SLURM scripts, you can hardcode them if need be.
 
 ### Symlinking .venv
 
````
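
The venv-switching workflow the added lines describe can be sketched as follows (a minimal example; the `~/venvs/...` paths are illustrative, not taken from the tutorial):

```bash
# Create the initial symlink so uv sync / uv lock find .venv
ln -s ~/venvs/project-gpu .venv

# Switch to another venv later: -f replaces the existing link,
# -n treats .venv as the link itself rather than following it
ln -sfn ~/venvs/project-cpu .venv
```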

````diff
@@ -61,7 +63,7 @@ particular, if you are a developer you would use one of the following two commands. If
 you are on HPC with cuda, you would use:
 
 ```bash
-uv sync --extra torch --extra dev --extra linux \
+uv sync --extra torch --extra dev \
     --no-cache \
     --index https://download.pytorch.org/whl/cu121
 ```
````
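
After syncing, one quick way to confirm that the CUDA build of torch landed in the environment (an illustrative check, not part of the tutorial):

```bash
# uv run executes the command inside the project's environment
uv run python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```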

````diff
@@ -93,6 +95,6 @@ can use the following command:
 uv add <package>
 ```
 
-> [!NOTE]
+> [!Warning]
 > This will add the package to your `.venv` venv, so make sure to have symlinked to
-> this directory if you haven't already.
\ No newline at end of file
+> this directory if you haven't already.
````
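
Before running `uv add`, a simple way to verify that `.venv` is the symlink you expect (an illustrative check):

```bash
# readlink prints the symlink target; no output means .venv is a real directory
readlink .venv
```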

pyproject.toml (1 addition, 0 deletions)

````diff
@@ -41,6 +41,7 @@ dependencies = [
     "matplotlib>=3.9.2",
     "pip>=24.3.1",
     "prov4ml@git+https://github.com/matbun/ProvML@new-main",
+    "ray"
 ]
 
 [project.optional-dependencies]
````

tutorials/distributed-ml/torch-scaling-test/README.md (1 addition, 1 deletion)

Expand Up @@ -34,7 +34,7 @@ python ddp_trainer.py -c config/base.yaml -c config/ddp.yaml --log-int 42
## Run a single training

Training runs are meant to be submitted via SLURM, from a unified job script file:
`slurm.sh`.You can select the distributed training algorithm and provide the command
`slurm.sh`. You can select the distributed training algorithm and provide the command
to execute setting SLURM environment variables using the `--export` option:

```bash
Expand Down
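
The command itself is truncated in this diff view; the `--export` pattern the sentence describes looks roughly like this (DIST_MODE and TRAINING_CMD are hypothetical variable names, not necessarily those used by `slurm.sh`):

```bash
# Pass the algorithm choice and the training command to the job script as env vars
sbatch --export=ALL,DIST_MODE=ddp,TRAINING_CMD="python ddp_trainer.py -c config/base.yaml -c config/ddp.yaml" slurm.sh
```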
