Publish shardy docs to openxla.org/shardy
PiperOrigin-RevId: 694307625
GleasonK authored and copybara-github committed Nov 8, 2024
1 parent 0b621c3 commit f4fb5cd
Showing 3 changed files with 23 additions and 11 deletions.
22 changes: 17 additions & 5 deletions docs/_toc.yaml
@@ -1,6 +1,18 @@
 toc:
-- heading: Shardy documentation
-- title: Getting started
-  section:
-  - title: Overview
-    path: sdy_dialect.md
+- heading: Shardy documentation
+- title: Getting started
+  section:
+  - title: Introduction
+    path: /shardy
+- title: Reference documentation
+  section:
+  - title: SDY dialect
+    path: /shardy/sdy_dialect
+  - title: SDY export passes
+    path: /shardy/sdy_export_passes
+  - title: SDY import passes
+    path: /shardy/sdy_import_passes
+  - title: SDY op interfaces
+    path: /shardy/sdy_op_interfaces
+  - title: SDY propagation passes
+    path: /shardy/sdy_propagation_passes
6 changes: 3 additions & 3 deletions docs/sdy_export_passes.md
@@ -8,13 +8,13 @@ sharded operands and produce a sharded result without requiring any reshard
 communications (note that the operation might still require communication
 such as all-reduce or halo-swaps).
 
-After propagation, some opeartions may still have incompatible shardings.
+After propagation, some operations may still have incompatible shardings.
 
 Please note, when an axis (or sub-axis) is used to shard non-corresponding
 dimensions (e.g. non-contracting dimensions in matmul) across multiple
 tensors, or when an axis shards a dimension in one tensor but not the
 corresponding dimension in the other tensor, it is said that the operation
-has a sharding conflict. Hence, after this pass, the opeartions become
+has a sharding conflict. Hence, after this pass, the operations become
 conflict-free.
 
 This pass injects reshard operations explicitly so that, for each operation,
@@ -47,7 +47,7 @@ In the example above, there is a conflict since `lhs` and `rhs` tensors
 are both sharded on axis "x" on their non-contracting dimensions. Here,
 `rhs` tensor is resharded, before the dot operation, explicitly to be
 sharded only on its first dimension and on axis "x". This way, the dot
-opearation becomes compatible.
+operation becomes compatible.
 ### `-sdy-remove-sharding-groups`
 
 _Removes ShardingGroupOps after propagation._
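To ground the pass description above, here is a minimal sketch of the kind of reshard this pass injects, written in the SDY dialect. The mesh, tensor shapes, function name, and shardings below are illustrative assumptions, not the example elided from the diff:

```mlir
// Assumed mesh for illustration: a single axis "x" of size 2.
sdy.mesh @mesh = <["x"=2]>

// The input arrives sharded on axis "x" along its second dimension.
func.func @explicit_reshard_example(
    %arg0: tensor<8x8xf32> {sdy.sharding = #sdy.sharding<@mesh, [{}, {"x"}]>})
    -> tensor<8x8xf32> {
  // An explicitly injected reshard moves the sharding to the first
  // dimension, making the required communication visible in the IR rather
  // than implied by mismatched operand/result shardings.
  %0 = sdy.reshard %arg0 <@mesh, [{"x"}, {}]> : tensor<8x8xf32>
  return %0 : tensor<8x8xf32>
}
```

The identical wording in the passes.td diff below is the TableGen source from which this markdown page is generated.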
6 changes: 3 additions & 3 deletions shardy/dialect/sdy/transforms/export/passes.td
@@ -52,13 +52,13 @@ def InsertExplicitReshardsPass : Pass<"sdy-insert-explicit-reshards", "func::Fun
     communications (note that the operation might still require communication
     such as all-reduce or halo-swaps).
 
-    After propagation, some opeartions may still have incompatible shardings.
+    After propagation, some operations may still have incompatible shardings.
 
     Please note, when an axis (or sub-axis) is used to shard non-corresponding
     dimensions (e.g. non-contracting dimensions in matmul) across multiple
     tensors, or when an axis shards a dimension in one tensor but not the
     corresponding dimension in the other tensor, it is said that the operation
-    has a sharding conflict. Hence, after this pass, the opeartions become
+    has a sharding conflict. Hence, after this pass, the operations become
     conflict-free.
 
     This pass injects reshard operations explicitly so that, for each operation,
@@ -91,7 +91,7 @@ def InsertExplicitReshardsPass : Pass<"sdy-insert-explicit-reshards", "func::Fun
     are both sharded on axis "x" on their non-contracting dimensions. Here,
     `rhs` tensor is resharded, before the dot operation, explicitly to be
     sharded only on its first dimension and on axis "x". This way, the dot
-    opearation becomes compatible.
+    operation becomes compatible.
   }];
   let dependentDialects = ["mlir::sdy::SdyDialect"];
 }
