
transformer-model

Here are 9 public repositories matching this topic...


Inspired by Andrej Karpathy's lecture "Let's Build GPT: From Scratch, in Code, Spelled Out", this project is a hands-on tour of the inner workings of GPT. Step by step, it constructs a GPT model from the ground up, demystifying the architecture and bringing its mechanics to life in code.

  • Updated Dec 1, 2024
  • Jupyter Notebook
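The central building block such a from-scratch GPT implements is causal self-attention. As a rough illustration (a minimal pure-Python sketch, not code from this repository), a single attention head with a causal mask looks like this:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def causal_attention(q, k, v):
    """Single-head scaled dot-product attention with a causal mask.

    q, k, v: lists of token vectors (lists of floats).
    Position i attends only to positions 0..i, which is the
    masking a decoder-only GPT applies in every layer.
    """
    d = len(q[0])
    out = []
    for i, qi in enumerate(q):
        # Scores against keys up to and including position i (causal mask).
        scores = [sum(a * b for a, b in zip(qi, k[j])) / math.sqrt(d)
                  for j in range(i + 1)]
        weights = softmax(scores)
        # Weighted sum of the visible value vectors.
        out.append([sum(w * v[j][t] for j, w in enumerate(weights))
                    for t in range(d)])
    return out
```

Because of the mask, the first token can only attend to itself, so its output is exactly its own value vector; later tokens mix the values of everything before them.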

This project demonstrates text summarization with the BART (Bidirectional and Auto-Regressive Transformers) model. BART is a transformer pretrained as a denoising autoencoder, which makes it effective for text-generation tasks such as summarization.

  • Updated Sep 7, 2024
  • Python
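Denoising-autoencoder pretraining means the model sees corrupted text and learns to reconstruct the original. A simplified sketch of BART-style text-infilling noise (illustrative only; the real objective samples Poisson-distributed span lengths and can mask several spans per sequence):

```python
import random

def text_infilling(tokens, span_len=2, mask_token="<mask>", rng=None):
    """Corrupt a token sequence BART-style: replace one contiguous
    span of tokens with a single mask token. During pretraining the
    model must reconstruct the original sequence from this input.

    Simplified sketch: real BART draws span lengths from a Poisson
    distribution and masks multiple spans.
    """
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    if len(tokens) <= span_len:
        return [mask_token]
    start = rng.randrange(len(tokens) - span_len + 1)
    return tokens[:start] + [mask_token] + tokens[start + span_len:]
```

Note that the masked span collapses to a single `<mask>` token, so the model must also infer how many tokens are missing, not just which ones.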

BLIP Image Captioning + GPT-2 "Happy" Model: generate joyful responses to image captions using state-of-the-art NLP and computer-vision models. Pretrained models and data preprocessing are included for seamless integration, exploring the intersection of deep learning, sentiment analysis, and language generation.

  • Updated Jun 4, 2023
  • Python
