Commit

Update openmoe_34b_config.json
Orion-Zheng authored Jan 3, 2024
1 parent 616611f commit 50c885f
Showing 1 changed file with 1 addition and 1 deletion.
examples/language/openmoe/model/openmoe_34b_config.json (1 addition, 1 deletion):

@@ -27,7 +27,7 @@
   "max_position_embeddings": 2048,
   "mlp_gated": true,
   "model_type": "llama",
-  "moe_layer_interval": 3,
+  "moe_layer_interval": 4,
   "num_attention_heads": 24,
   "num_experts": 32,
   "num_hidden_layers": 32,
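For context on what this one-line change does: a `moe_layer_interval` setting like this typically controls how often a mixture-of-experts (MoE) layer appears among the model's hidden layers. The sketch below is a hypothetical helper (not from the OpenMoE codebase) that assumes every `moe_layer_interval`-th layer, counted 1-based, is an MoE layer; under that assumption, the change from 3 to 4 reduces the number of MoE layers in this 32-layer config from 10 to 8.

```python
def moe_layer_indices(num_hidden_layers: int, moe_layer_interval: int) -> list[int]:
    """Hypothetical helper: return the (1-based) indices of MoE layers,
    assuming every `moe_layer_interval`-th layer is a mixture-of-experts
    layer, as the config field name suggests."""
    return [i for i in range(1, num_hidden_layers + 1) if i % moe_layer_interval == 0]

# Old setting from this config (interval 3, 32 hidden layers): 10 MoE layers.
old_moe = moe_layer_indices(32, 3)
# New setting (interval 4): 8 MoE layers, evenly dividing the 32 layers.
new_moe = moe_layer_indices(32, 4)
print(old_moe)  # [3, 6, 9, 12, 15, 18, 21, 24, 27, 30]
print(new_moe)  # [4, 8, 12, 16, 20, 24, 28, 32]
```

Note that with interval 4 the MoE layers divide the 32 hidden layers evenly, whereas interval 3 does not, which is one plausible motivation for the change.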
