Commit
remove default attention type
wenxindongwork committed Sep 10, 2024
1 parent 7631466 commit 170e1e4
Showing 3 changed files with 0 additions and 3 deletions.
1 change: 0 additions & 1 deletion MaxText/configs/models/gemma2-27b.yml
@@ -25,7 +25,6 @@ vocab_size: 256128
 decoder_block: "gemma2"
 normalization_layer_epsilon: 1.e-06
 logits_via_embedding: True
-attention: "dot_product"
 final_logits_soft_cap: 30.0
 attn_logits_soft_cap: 50.0
 sliding_window_size: 4096
1 change: 0 additions & 1 deletion MaxText/configs/models/gemma2-2b.yml
@@ -25,7 +25,6 @@ vocab_size: 256128
 decoder_block: "gemma2"
 normalization_layer_epsilon: 1.e-06
 logits_via_embedding: True
-attention: "dot_product"
 final_logits_soft_cap: 30.0
 attn_logits_soft_cap: 50.0
 sliding_window_size: 4096
1 change: 0 additions & 1 deletion MaxText/configs/models/gemma2-9b.yml
@@ -25,7 +25,6 @@ vocab_size: 256128
 decoder_block: "gemma2"
 normalization_layer_epsilon: 1.e-06
 logits_via_embedding: True
-attention: "dot_product"
 final_logits_soft_cap: 30.0
 attn_logits_soft_cap: 50.0
 sliding_window_size: 4096
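All three diffs delete the same hard-coded `attention: "dot_product"` key, so each Gemma 2 model config now inherits its attention implementation from the shared base configuration (or from a per-run override) instead of forcing dot-product attention. A minimal sketch of the resulting layering, assuming MaxText's usual pattern of a `base.yml` default overridable on the command line; the `autoselected` value and the launch command shown are assumptions for illustration, not part of this commit:

```yaml
# base.yml (shared defaults) -- hypothetical excerpt, not part of this commit
attention: "autoselected"   # assumption: backend chosen automatically per platform

# gemma2-27b.yml after this commit: no `attention:` key remains, so the
# base default applies unless overridden at launch, e.g.:
#   python MaxText/train.py MaxText/configs/base.yml \
#     model_name=gemma2-27b attention=flash
decoder_block: "gemma2"
final_logits_soft_cap: 30.0
attn_logits_soft_cap: 50.0
sliding_window_size: 4096
```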
