Commit
Merge branch 'Llama-DefaultLN' of https://github.com/allenai/LLM into Llama-DefaultLN
dirkgr committed Oct 25, 2023
2 parents 06558d4 + e9c92c8 commit bb8e2f6
Showing 1 changed file with 1 addition and 1 deletion.
olmo/model.py (1 addition, 1 deletion)
@@ -948,7 +948,7 @@ def forward(
            ):
                # shape: (batch_size, seq_len, d_model)
                x, cache = self.__activation_checkpoint_fn(
-                   block, attention_bias=attention_bias, layer_past=layer_past, use_cache=use_cache
+                   block, x, attention_bias=attention_bias, layer_past=layer_past, use_cache=use_cache
                )

            if attn_key_values is not None:
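The fix is the single added argument: the hidden state x is now passed to the activation-checkpoint wrapper along with the block, whereas the old call handed the wrapper only the block and its keyword arguments. Below is a minimal sketch of why that matters, assuming the wrapper behaves like torch.utils.checkpoint.checkpoint; the Block class, tensor shapes, and use_reentrant=False setting are illustrative assumptions, not the OLMo internals.

# Minimal sketch (assumptions, not the OLMo code): a checkpoint wrapper re-runs
# the wrapped function during the backward pass, so every input the block needs
# must travel through the wrapper's argument list.
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint


class Block(nn.Module):
    """Hypothetical stand-in for a transformer block with a matching call signature."""

    def __init__(self, d_model: int = 16):
        super().__init__()
        self.ff = nn.Linear(d_model, d_model)

    def forward(self, x, attention_bias=None, layer_past=None, use_cache=False):
        x = self.ff(x)
        cache = (x,) if use_cache else None
        return x, cache


block = Block()
x = torch.randn(2, 4, 16, requires_grad=True)

# Old call (analogous to the removed line): the block never receives its input
# tensor, so invoking it raises a TypeError for the missing positional argument.
#   x, cache = checkpoint(block, attention_bias=None, use_cache=False, use_reentrant=False)

# Fixed call (analogous to the added line): the hidden state is forwarded too.
x, cache = checkpoint(block, x, attention_bias=None, use_cache=False, use_reentrant=False)
x.sum().backward()  # gradients flow; the block is re-executed during this backward pass

Because the checkpointed function is re-invoked during backward, an input that is only captured implicitly (rather than passed as an argument) never reaches the block at all, which is what the one-line change corrects.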
