Add Mixture of Experts (MoE) link to llm.md
PhilipMay committed Dec 29, 2023
1 parent c9b254a commit aeda25a
Showing 1 changed file with 2 additions and 0 deletions.
source/machine-learning/llm.md
@@ -11,6 +11,8 @@
 - Paper: <https://arxiv.org/abs/2305.18290>
 - <https://plainenglish.io/community/direct-preference-optimization-dpo-a-simplified-approach-to-fine-tuning-large-language-models>
 - <https://huggingface.co/blog/dpo-trl>
+- Mixture of Experts (MoE)
+  - HF Blog: [Mixture of Experts Explained](https://huggingface.co/blog/moe)
 
 ## Specific Models
 
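For background on the topic this commit links (not part of the commit itself): an MoE layer replaces a single feed-forward block with several expert feed-forward networks plus a learned router that sends each token to only a few of them. Below is a minimal, illustrative PyTorch sketch of top-k routing; all sizes (`d_model`, `d_ff`, `num_experts`, `top_k`) are arbitrary assumptions, and this dense loop is a readability-first reference, not an efficient implementation like those discussed in the linked HF blog post.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoELayer(nn.Module):
    """Illustrative top-k Mixture of Experts layer (hypothetical sizes)."""

    def __init__(self, d_model=512, d_ff=2048, num_experts=8, top_k=2):
        super().__init__()
        # Each expert is an ordinary feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )
        # The router scores every token against every expert.
        self.router = nn.Linear(d_model, num_experts)
        self.top_k = top_k

    def forward(self, x):  # x: (batch, seq, d_model)
        scores = self.router(x)                           # (batch, seq, num_experts)
        weights, indices = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # normalize over the chosen experts
        out = torch.zeros_like(x)
        # Dense reference loop: run each selected expert on the tokens routed to it.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[..., k] == e               # tokens whose k-th choice is expert e
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out


# Usage: output shape matches the input, as with a dense feed-forward block.
layer = MoELayer()
tokens = torch.randn(4, 16, 512)   # (batch, seq, d_model)
print(layer(tokens).shape)         # torch.Size([4, 16, 512])
```

With `top_k=2` of 8 experts, each token activates only a quarter of the expert parameters per forward pass, which is the sparsity trade-off the linked post explains in detail.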
