DouglasOrr published a site update
DouglasOrr committed Oct 2, 2023
1 parent 03500bd commit bcd9955
Showing 4 changed files with 7,753 additions and 51 deletions.
65 changes: 40 additions & 25 deletions core.html
Original file line number Diff line number Diff line change
@@ -27,7 +27,7 @@ <h1 class="title">Module <code>tensor_tracker.core</code></h1>
as usual. Your <code><a title="tensor_tracker.core.Tracker" href="#tensor_tracker.core.Tracker">Tracker</a></code> will be filled with a list of <code><a title="tensor_tracker.core.Stash" href="#tensor_tracker.core.Stash">Stash</a></code>es, containing
copies of fwd/bwd tensors at (sub)module outputs. (Beware, this can consume
a lot of memory.)</p>
<p>Usage:</p>
<p>Usage (<a href="usage.html">notebook</a>):</p>
<pre><code>with tensor_tracker.track(model) as tracker:
model(inputs).backward()

@@ -56,8 +56,8 @@ <h1 class="title">Module <code>tensor_tracker.core</code></h1>
<code>tracker = Tracker(); tracker.register(...); tracker.unregister()</code></p>
</li>
</ul>
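<p>The manual register/unregister pattern listed above can be sketched without the library itself. In this sketch, <code>Module</code> and <code>Tracker</code> are hypothetical minimal stand-ins (a real <code>torch.nn.Module</code> and the library's <code>Tracker</code> would replace them), illustrating only the hook lifecycle: register records outputs, unregister stops recording.</p>

```python
from typing import Any, Callable, Dict, List


class Module:
    """Tiny stand-in for torch.nn.Module: callable, with removable forward hooks."""

    def __init__(self, name: str):
        self.name = name
        self._hooks: List[Callable[["Module", Any], None]] = []

    def register_forward_hook(self, hook: Callable[["Module", Any], None]) -> Callable[[], None]:
        self._hooks.append(hook)
        return lambda: self._hooks.remove(hook)  # handle that removes this hook

    def __call__(self, x: Any) -> Any:
        out = x * 2  # placeholder computation
        for hook in self._hooks:
            hook(self, out)
        return out


class Tracker:
    """Records (module name, output) pairs while its hooks are registered."""

    def __init__(self) -> None:
        self.stashes: List[Dict[str, Any]] = []
        self._handles: List[Callable[[], None]] = []

    def register(self, module: Module) -> None:
        self._handles.append(
            module.register_forward_hook(
                lambda m, out: self.stashes.append({"name": m.name, "value": out})
            )
        )

    def unregister(self) -> None:
        for remove in self._handles:
            remove()
        self._handles.clear()


tracker = Tracker()
layer = Module("layer0")
tracker.register(layer)
layer(3)              # recorded while the hook is active
tracker.unregister()
layer(5)              # not recorded: hooks were removed
assert tracker.stashes == [{"name": "layer0", "value": 6}]
```

<p>The same shape applies with real modules: <code>register_forward_hook</code> returns a removable handle, and <code>unregister</code> simply removes every handle it collected.</p>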
<p>See also: example of
<a href="example.html">visualising transformer activations &amp; gradients using UMAP</a>.</p>
<p>See also: <a href="example.html">example of
visualising transformer activations &amp; gradients using UMAP</a>.</p>
<details class="source">
<summary>
<span>Expand source code</span>
@@ -71,7 +71,7 @@ <h1 class="title">Module <code>tensor_tracker.core</code></h1>
copies of fwd/bwd tensors at (sub)module outputs. (Beware, this can consume
a lot of memory.)

Usage:
Usage ([notebook](usage.html)):

```
with tensor_tracker.track(model) as tracker:
@@ -98,8 +98,8 @@ <h1 class="title">Module <code>tensor_tracker.core</code></h1>
- Manually register/unregister hooks:
`tracker = Tracker(); tracker.register(...); tracker.unregister()`

See also: example of
[visualising transformer activations &amp; gradients using UMAP](example.html).
See also: [example of
visualising transformer activations &amp; gradients using UMAP](example.html).
&#34;&#34;&#34;

import dataclasses
@@ -278,16 +278,21 @@ <h1 class="title">Module <code>tensor_tracker.core</code></h1>
return len(self.stashes)

def to_frame(
self, stat: Callable[[Tensor], Tensor] = torch.std
self,
stat: Callable[[Tensor], Tensor] = torch.std,
stat_name: Optional[str] = None,
) -&gt; &#34;pandas.DataFrame&#34;: # type:ignore[name-defined] # NOQA: F821
import pandas

column_name = (
getattr(stat, &#34;__name__&#34;, &#34;value&#34;) if stat_name is None else stat_name
)

def to_item(stash: Stash) -&gt; Dict[str, Any]:
d = stash.__dict__.copy()
first_value = stash.first_value
d[&#34;value&#34;] = (
stat(first_value).item() if isinstance(first_value, Tensor) else None
)
d.pop(&#34;value&#34;)
v = stash.first_value
d[column_name] = stat(v).item() if isinstance(v, Tensor) else None
d[&#34;type&#34;] = f&#34;{stash.type.__module__}.{stash.type.__name__}&#34;
return d
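<p>The change above renames <code>to_frame</code>'s statistic column after the <code>stat</code> callable (falling back to <code>"value"</code>), with an optional <code>stat_name</code> override. A minimal self-contained sketch of that naming-and-aggregation logic, using a hypothetical simplified <code>Stash</code> and plain dicts in place of the library's tensor-backed classes and the pandas DataFrame:</p>

```python
import statistics
from dataclasses import dataclass
from typing import Any, Callable, Dict, List, Optional


@dataclass
class Stash:
    """Simplified stand-in for tensor_tracker's Stash (which holds real tensors)."""

    name: str
    value: Any  # first captured value; a list of floats here, a Tensor in the library


def to_frame_rows(
    stashes: List[Stash],
    stat: Callable[[List[float]], float] = statistics.stdev,
    stat_name: Optional[str] = None,
) -> List[Dict[str, Any]]:
    # Mirrors the patched logic: name the statistic column after the callable,
    # unless an explicit stat_name overrides it.
    column_name = getattr(stat, "__name__", "value") if stat_name is None else stat_name
    rows = []
    for stash in stashes:
        d: Dict[str, Any] = {"name": stash.name}
        d[column_name] = stat(stash.value) if isinstance(stash.value, list) else None
        rows.append(d)
    return rows


rows = to_frame_rows([Stash("layer0", [1.0, 2.0, 3.0])])
assert "stdev" in rows[0]  # column named after statistics.stdev

rows = to_frame_rows([Stash("layer0", [1.0, 2.0, 3.0])], stat_name="spread")
assert rows[0]["spread"] == 1.0  # stdev of [1, 2, 3]
```

<p>In the real method the rows are handed to <code>pandas.DataFrame</code>; the sketch stops at the list of dicts so it runs without pandas or torch installed.</p>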

@@ -407,7 +412,7 @@ <h2 class="section-title" id="header-functions">Functions</h2>
as usual. Your <code><a title="tensor_tracker.core.Tracker" href="#tensor_tracker.core.Tracker">Tracker</a></code> will be filled with a list of <code><a title="tensor_tracker.core.Stash" href="#tensor_tracker.core.Stash">Stash</a></code>es, containing
copies of fwd/bwd tensors at (sub)module outputs. (Beware, this can consume
a lot of memory.)</p>
<p>Usage:</p>
<p>Usage (<a href="usage.html">notebook</a>):</p>
<pre><code>with tensor_tracker.track(model) as tracker:
model(inputs).backward()

@@ -436,8 +441,8 @@ <h2 class="section-title" id="header-functions">Functions</h2>
<code>tracker = Tracker(); tracker.register(...); tracker.unregister()</code></p>
</li>
</ul>
<p>See also: example of
<a href="example.html">visualising transformer activations &amp; gradients using UMAP</a>.</p></div>
<p>See also: <a href="example.html">example of
visualising transformer activations &amp; gradients using UMAP</a>.</p></div>
<details class="source">
<summary>
<span>Expand source code</span>
@@ -671,16 +676,21 @@ <h3>Instance variables</h3>
return len(self.stashes)

def to_frame(
self, stat: Callable[[Tensor], Tensor] = torch.std
self,
stat: Callable[[Tensor], Tensor] = torch.std,
stat_name: Optional[str] = None,
) -&gt; &#34;pandas.DataFrame&#34;: # type:ignore[name-defined] # NOQA: F821
import pandas

column_name = (
getattr(stat, &#34;__name__&#34;, &#34;value&#34;) if stat_name is None else stat_name
)

def to_item(stash: Stash) -&gt; Dict[str, Any]:
d = stash.__dict__.copy()
first_value = stash.first_value
d[&#34;value&#34;] = (
stat(first_value).item() if isinstance(first_value, Tensor) else None
)
d.pop(&#34;value&#34;)
v = stash.first_value
d[column_name] = stat(v).item() if isinstance(v, Tensor) else None
d[&#34;type&#34;] = f&#34;{stash.type.__module__}.{stash.type.__name__}&#34;
return d

@@ -750,7 +760,7 @@ <h3>Methods</h3>
</details>
</dd>
<dt id="tensor_tracker.core.Tracker.to_frame"><code class="name flex">
<span>def <span class="ident">to_frame</span></span>(<span>self, stat: Callable[[torch.Tensor], torch.Tensor] = &lt;built-in method std of type object&gt;) ‑> pandas.DataFrame</span>
<span>def <span class="ident">to_frame</span></span>(<span>self, stat: Callable[[torch.Tensor], torch.Tensor] = &lt;built-in method std of type object&gt;, stat_name: Optional[str] = None) ‑> pandas.DataFrame</span>
</code></dt>
<dd>
<div class="desc"></div>
@@ -759,16 +769,21 @@ <h3>Methods</h3>
<span>Expand source code</span>
</summary>
<pre><code class="python">def to_frame(
self, stat: Callable[[Tensor], Tensor] = torch.std
self,
stat: Callable[[Tensor], Tensor] = torch.std,
stat_name: Optional[str] = None,
) -&gt; &#34;pandas.DataFrame&#34;: # type:ignore[name-defined] # NOQA: F821
import pandas

column_name = (
getattr(stat, &#34;__name__&#34;, &#34;value&#34;) if stat_name is None else stat_name
)

def to_item(stash: Stash) -&gt; Dict[str, Any]:
d = stash.__dict__.copy()
first_value = stash.first_value
d[&#34;value&#34;] = (
stat(first_value).item() if isinstance(first_value, Tensor) else None
)
d.pop(&#34;value&#34;)
v = stash.first_value
d[column_name] = stat(v).item() if isinstance(v, Tensor) else None
d[&#34;type&#34;] = f&#34;{stash.type.__module__}.{stash.type.__name__}&#34;
return d

18 changes: 9 additions & 9 deletions example.html

Large diffs are not rendered by default.

44 changes: 27 additions & 17 deletions index.html
@@ -27,7 +27,7 @@ <h1 class="title">Package <code>tensor_tracker</code></h1>
as usual. Your <code><a title="tensor_tracker.Tracker" href="#tensor_tracker.Tracker">Tracker</a></code> will be filled with a list of <code><a title="tensor_tracker.Stash" href="#tensor_tracker.Stash">Stash</a></code>es, containing
copies of fwd/bwd tensors at (sub)module outputs. (Beware, this can consume
a lot of memory.)</p>
<p>Usage:</p>
<p>Usage (<a href="usage.html">notebook</a>):</p>
<pre><code>with tensor_tracker.track(model) as tracker:
model(inputs).backward()

@@ -56,8 +56,8 @@ <h1 class="title">Package <code>tensor_tracker</code></h1>
<code>tracker = Tracker(); tracker.register(...); tracker.unregister()</code></p>
</li>
</ul>
<p>See also: example of
<a href="example.html">visualising transformer activations &amp; gradients using UMAP</a>.</p>
<p>See also: <a href="example.html">example of
visualising transformer activations &amp; gradients using UMAP</a>.</p>
<details class="source">
<summary>
<span>Expand source code</span>
@@ -159,7 +159,7 @@ <h2 class="section-title" id="header-functions">Functions</h2>
as usual. Your <code><a title="tensor_tracker.Tracker" href="#tensor_tracker.Tracker">Tracker</a></code> will be filled with a list of <code><a title="tensor_tracker.Stash" href="#tensor_tracker.Stash">Stash</a></code>es, containing
copies of fwd/bwd tensors at (sub)module outputs. (Beware, this can consume
a lot of memory.)</p>
<p>Usage:</p>
<p>Usage (<a href="usage.html">notebook</a>):</p>
<pre><code>with tensor_tracker.track(model) as tracker:
model(inputs).backward()

@@ -188,8 +188,8 @@ <h2 class="section-title" id="header-functions">Functions</h2>
<code>tracker = Tracker(); tracker.register(...); tracker.unregister()</code></p>
</li>
</ul>
<p>See also: example of
<a href="example.html">visualising transformer activations &amp; gradients using UMAP</a>.</p></div>
<p>See also: <a href="example.html">example of
visualising transformer activations &amp; gradients using UMAP</a>.</p></div>
<details class="source">
<summary>
<span>Expand source code</span>
@@ -423,16 +423,21 @@ <h3>Instance variables</h3>
return len(self.stashes)

def to_frame(
self, stat: Callable[[Tensor], Tensor] = torch.std
self,
stat: Callable[[Tensor], Tensor] = torch.std,
stat_name: Optional[str] = None,
) -&gt; &#34;pandas.DataFrame&#34;: # type:ignore[name-defined] # NOQA: F821
import pandas

column_name = (
getattr(stat, &#34;__name__&#34;, &#34;value&#34;) if stat_name is None else stat_name
)

def to_item(stash: Stash) -&gt; Dict[str, Any]:
d = stash.__dict__.copy()
first_value = stash.first_value
d[&#34;value&#34;] = (
stat(first_value).item() if isinstance(first_value, Tensor) else None
)
d.pop(&#34;value&#34;)
v = stash.first_value
d[column_name] = stat(v).item() if isinstance(v, Tensor) else None
d[&#34;type&#34;] = f&#34;{stash.type.__module__}.{stash.type.__name__}&#34;
return d

@@ -502,7 +507,7 @@ <h3>Methods</h3>
</details>
</dd>
<dt id="tensor_tracker.Tracker.to_frame"><code class="name flex">
<span>def <span class="ident">to_frame</span></span>(<span>self, stat: Callable[[torch.Tensor], torch.Tensor] = &lt;built-in method std of type object&gt;) ‑> pandas.DataFrame</span>
<span>def <span class="ident">to_frame</span></span>(<span>self, stat: Callable[[torch.Tensor], torch.Tensor] = &lt;built-in method std of type object&gt;, stat_name: Optional[str] = None) ‑> pandas.DataFrame</span>
</code></dt>
<dd>
<div class="desc"></div>
@@ -511,16 +516,21 @@ <h3>Methods</h3>
<span>Expand source code</span>
</summary>
<pre><code class="python">def to_frame(
self, stat: Callable[[Tensor], Tensor] = torch.std
self,
stat: Callable[[Tensor], Tensor] = torch.std,
stat_name: Optional[str] = None,
) -&gt; &#34;pandas.DataFrame&#34;: # type:ignore[name-defined] # NOQA: F821
import pandas

column_name = (
getattr(stat, &#34;__name__&#34;, &#34;value&#34;) if stat_name is None else stat_name
)

def to_item(stash: Stash) -&gt; Dict[str, Any]:
d = stash.__dict__.copy()
first_value = stash.first_value
d[&#34;value&#34;] = (
stat(first_value).item() if isinstance(first_value, Tensor) else None
)
d.pop(&#34;value&#34;)
v = stash.first_value
d[column_name] = stat(v).item() if isinstance(v, Tensor) else None
d[&#34;type&#34;] = f&#34;{stash.type.__module__}.{stash.type.__name__}&#34;
return d

