Commit

Update [guides][core]evaluation.ipynb
ASEM000 committed Apr 10, 2024
1 parent a1a026a commit ce6104b
Showing 1 changed file with 2 additions and 4 deletions.
6 changes: 2 additions & 4 deletions docs/notebooks/[guides][core]evaluation.ipynb
@@ -11,11 +11,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"In contrast to some machine learning frameworks that handle different behaviors of layers during training and evaluation with specific methods like `model.eval()` and `model.fit()`, `serket` offers a more explicit approach for managing this behavior. `serket` provides a function called `tree_eval`, which allows users to modify a tree of layers to disable any training-related behavior. The significant difference is that instead of modifying the layers in place `serket` replaces them with their evaluation counterparts.\n",
"Serket uses dispatching over layers to determine the evaluation counterpart of a layer.For instance, during evaluation, the `Dropout` layer is replaced by an `Identity` layer, and the `BatchNorm` layer is replaced by an `EvalBatchNorm` layer. This ensures that when evaluating a tree of layers, users can follow the **\"What you see is what you get\" (WYSIWYG)** principle, meaning that Seeing a `Dropout`/`BatchNorm` in the model have one meaning.\n",
"\n",
"For instance, during evaluation, the `Dropout` layer is replaced by an `Identity` layer, and the `BatchNorm` layer is replaced by an `EvalBatchNorm` layer. This ensures that when evaluating a tree of layers, users can follow the **\"What you see is what you get\" (WYSIWYG)** principle, meaning that Seeing a `Dropout`/`BatchNorm` in the model have one meaning.\n",
"\n",
"Moreover, when building neural networks that includes the likes of `Dropout`, By design threading a training/eval flag is no longer required."
" The dispatch design choice allow user to define layers like `Dropout` and `BatchNorm` that have different behaviors during training and evaluation without having to threading a training/eval flag through the layers."
]
},
{
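To make the dispatch behavior described in the changed cell concrete, here is a minimal sketch. Only `tree_eval` and the `Dropout`-to-`Identity` replacement are stated in the commit itself; the `serket.nn` namespace and the drop-rate constructor argument are assumptions for illustration.

```python
# Minimal sketch of the dispatch behavior described above.
# Assumed (not taken from the commit): `Dropout` lives under `serket.nn`
# and takes its drop rate as the first constructor argument.
import serket as sk

dropout = sk.nn.Dropout(0.5)    # training-time layer
layer = sk.tree_eval(dropout)   # dispatch swaps Dropout for its eval counterpart
print(type(layer).__name__)     # per the cell above, expected to be `Identity`
```

Note that `tree_eval` returns a new tree rather than mutating the original, matching the cell's point that layers are replaced with their evaluation counterparts instead of being modified in place.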
