justmogen/nGPT

Implementation of a generative language model based on the GPT (Generative Pre-trained Transformer) architecture. The model learns to generate text in the style of its training data, using a transformer built on self-attention.
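A minimal sketch of one causal (masked) self-attention head, the core mechanism mentioned above; all class names and hyperparameters here are illustrative, not taken from this repository:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttentionHead(nn.Module):
    """One head of masked self-attention (illustrative sketch)."""

    def __init__(self, n_embd: int, head_size: int, block_size: int):
        super().__init__()
        self.key = nn.Linear(n_embd, head_size, bias=False)
        self.query = nn.Linear(n_embd, head_size, bias=False)
        self.value = nn.Linear(n_embd, head_size, bias=False)
        # Lower-triangular mask: each position may attend only to the past.
        self.register_buffer("tril", torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape            # batch, time (tokens), channels (n_embd)
        k = self.key(x)              # (B, T, head_size)
        q = self.query(x)            # (B, T, head_size)
        # Scaled dot-product attention scores.
        wei = q @ k.transpose(-2, -1) * k.shape[-1] ** -0.5   # (B, T, T)
        wei = wei.masked_fill(self.tril[:T, :T] == 0, float("-inf"))
        wei = F.softmax(wei, dim=-1)
        v = self.value(x)            # (B, T, head_size)
        return wei @ v               # (B, T, head_size)
```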
About
Building a Generatively Pretrained Transformer (GPT), following the paper "Attention Is All You Need".
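Text generation with such a model is a standard autoregressive sampling loop. The sketch below assumes a model that maps token indices of shape (B, T) to logits of shape (B, T, vocab_size); it illustrates the idea and is not this repository's actual API:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

@torch.no_grad()
def generate(model: nn.Module, idx: torch.Tensor,
             max_new_tokens: int, block_size: int) -> torch.Tensor:
    """Autoregressively extend idx (B, T) one sampled token at a time."""
    for _ in range(max_new_tokens):
        idx_cond = idx[:, -block_size:]          # crop to the context window
        logits = model(idx_cond)                 # assumed output: (B, T, vocab_size)
        logits = logits[:, -1, :]                # keep only the last time step
        probs = F.softmax(logits, dim=-1)        # logits -> probabilities
        idx_next = torch.multinomial(probs, 1)   # sample the next token id
        idx = torch.cat((idx, idx_next), dim=1)  # append and repeat
    return idx
```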