
Commit

upd
suzanmanasreh committed Jul 17, 2024
1 parent eb202ae commit fe3f9d3
Showing 4 changed files with 9 additions and 10 deletions.
14 changes: 7 additions & 7 deletions README.md
@@ -42,18 +42,18 @@ Then to execute and run a case, you can:
 mkdir build
 cd build
 cmake ..
-make -j 8 case # or just `make` to make common and all the cases
-cd case
-mpiexec -n 1 ./initcond
-mpiexec -n 2 ./tube # number of nodes can be changed
+make -j 8 minicase # or just `make` to make common and all the cases
+cd minicase
+mpiexec -n 1 ./minit
+mpiexec -n 2 ./mtube # number of nodes can be changed
 ```

-This will generate output files in `build/case/D`. To keep output files in `examples/case/D` and use input files in `examples/case/Input`, you can do this instead once files are built:
+This will generate output files in `build/minicase/D`. To keep output files in `examples/minicase/D` and use input files in `examples/minicase/Input`, you can do this instead once files are built in the `build` directory:

 ```shell
 cd examples/case
-mpiexec -n 1 ../../build/case/initcond
-mpiexec -n 2 ../../build/case/tube
+mpiexec -n 1 ../../build/case/minit
+mpiexec -n 2 ../../build/case/mtube
 ```

To run a case with more cells and nodes, you should use a supercomputing cluster. Instructions on how to build RBC3D on a cluster are [available here](https://github.com/comp-physics/RBC3D/blob/master/install/readme.md).
1 change: 0 additions & 1 deletion install/clusters.md
@@ -61,4 +61,3 @@ cd examples/case
 srun -n 1 ../../build/case/initcond
 srun ../../build/case/tube
 ```
3 changes: 2 additions & 1 deletion install/readme.md
@@ -9,7 +9,8 @@
## Use an appropriate computer

 * RBC3D has not been tested on WSL or Windows computers broadly. We do not recommend using WSL. Instead, use a Linux partition or a *nix-based computing cluster.
-At Georgia Tech we have several, including ICE and PACE Phoenix. Installer scripts for these clusters are available and run instructions are available in `install/clusters.md`.
+
+At Georgia Tech we have several, including ICE and PACE Phoenix. Automated installer scripts and run instructions for these clusters are available in `install/clusters.md`.
 
## Ensure you have compilers and wrappers

1 change: 0 additions & 1 deletion install/visualization.md

This file was deleted.

0 comments on commit fe3f9d3