
Commit

Re-format code in comments for readthedocs.
PiperOrigin-RevId: 318963204
Lukasz Kaiser authored and copybara-github committed Jun 30, 2020
1 parent f8e474f commit a4887f7
Showing 2 changed files with 13 additions and 16 deletions.
14 changes: 6 additions & 8 deletions trax/fastmath/ops.py
@@ -18,14 +18,12 @@
 Import these operations directly from fastmath and import fastmath.numpy as np:
-```
-from trax import fastmath
-from trax.fastmath import numpy as np
-x = np.array([1.0, 2.0])  # Use like numpy.
-y = np.exp(x)  # Common numpy ops are available and accelerated.
-z = fastmath.logsumexp(y)  # Special operations (below) available from fastmath.
-```
+>>> from trax import fastmath
+>>> from trax.fastmath import numpy as np
+>>>
+>>> x = np.array([1.0, 2.0])  # Use like numpy.
+>>> y = np.exp(x)  # Common numpy ops are available and accelerated.
+>>> z = fastmath.logsumexp(y)  # Special operations available from fastmath.
 Trax uses either TensorFlow 2 or JAX as backend for accelerating operations.
 You can select which one to use (e.g., for debugging) with `use_backend`.
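For context on the docstring above: `fastmath.logsumexp` computes `log(sum(exp(x)))` in a numerically stable way. A minimal plain-NumPy sketch of that idea (an illustrative stand-in, not the trax implementation, which dispatches to the JAX or TensorFlow backend):

```python
import numpy as np

def logsumexp(x):
    # Subtracting the max before exponentiating avoids overflow for
    # large inputs; adding it back afterwards keeps the result exact.
    m = np.max(x)
    return m + np.log(np.sum(np.exp(x - m)))

x = np.array([1.0, 2.0])
y = np.exp(x)
z = logsumexp(y)  # stable even when y contains very large values
```

The max-shift is what makes this "special": a naive `np.log(np.sum(np.exp(x)))` overflows for inputs around 1000, while the shifted form returns the correct value.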
15 changes: 7 additions & 8 deletions trax/layers/combinators.py
@@ -27,14 +27,13 @@ class Serial(base.Layer):
 """Combinator that applies layers serially (by function composition).
 This combinator is commonly used to construct deep networks, e.g., like this:
-```
-mlp = tl.Serial(
-    tl.Dense(128),
-    tl.Relu(),
-    tl.Dense(10),
-    tl.LogSoftmax()
-)
-```
+>>> mlp = tl.Serial(
+>>>     tl.Dense(128),
+>>>     tl.Relu(),
+>>>     tl.Dense(10),
+>>>     tl.LogSoftmax()
+>>> )
 A Serial combinator uses stack semantics to manage data for its sublayers.
 Each sublayer sees only the inputs it needs and returns only the outputs it
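The docstring above describes `Serial` as function composition: each layer's output feeds the next layer's input. A minimal pure-Python sketch of that semantics (a hypothetical toy, not trax's `Serial`, which additionally manages weights, state, and a data stack for multi-input layers):

```python
class Serial:
    """Toy serial combinator: Serial(f, g)(x) == g(f(x))."""

    def __init__(self, *layers):
        self.layers = layers

    def __call__(self, x):
        # Thread the value through each layer in order.
        for layer in self.layers:
            x = layer(x)
        return x

double_then_inc = Serial(lambda x: 2 * x, lambda x: x + 1)
```

With no layers, the combinator is the identity function, which mirrors how an empty composition behaves.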

