justmogen/nGPT

Building a Generatively Pretrained Transformer (GPT), following the paper "Attention Is All You Need"


Implementation of a generative language model based on the GPT (Generative Pre-trained Transformer) architecture. The model learns to generate text in the style of its training data, using a transformer with self-attention mechanisms.
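For reference, the core building block of such a model is masked (causal) self-attention, in which each token position attends only to earlier positions in the sequence. The following is a minimal PyTorch sketch of a single attention head, not this repository's actual code; the names `CausalSelfAttentionHead`, `n_embd`, `head_size`, and `block_size` are illustrative assumptions:

```python
import torch
import torch.nn as nn
from torch.nn import functional as F

class CausalSelfAttentionHead(nn.Module):
    """One head of masked (causal) self-attention — an illustrative sketch."""

    def __init__(self, n_embd, head_size, block_size):
        super().__init__()
        self.key = nn.Linear(n_embd, head_size, bias=False)
        self.query = nn.Linear(n_embd, head_size, bias=False)
        self.value = nn.Linear(n_embd, head_size, bias=False)
        # Lower-triangular mask so each position attends only to the past.
        self.register_buffer("tril", torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x):
        B, T, C = x.shape                                    # batch, time, channels
        k = self.key(x)                                      # (B, T, head_size)
        q = self.query(x)                                    # (B, T, head_size)
        # Scaled dot-product attention scores.
        wei = q @ k.transpose(-2, -1) * k.shape[-1] ** -0.5  # (B, T, T)
        wei = wei.masked_fill(self.tril[:T, :T] == 0, float("-inf"))
        wei = F.softmax(wei, dim=-1)
        v = self.value(x)                                    # (B, T, head_size)
        return wei @ v                                       # (B, T, head_size)
```

A quick smoke test of the assumed interface:

```python
head = CausalSelfAttentionHead(n_embd=32, head_size=16, block_size=8)
out = head(torch.randn(4, 8, 32))   # -> shape (4, 8, 16)
```

In a full GPT, several such heads run in parallel, and their concatenated outputs are projected back to the embedding dimension before a position-wise feed-forward layer.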
