
deep-matter/ViT-Pytorch


TransformersNumpy-Version

Welcome to the repository for the TransformersNumpy-Version project! This is a from-scratch implementation of the Transformer, the architecture that all modern LLMs are based on, built with the numpy library. Working through the code helps develop a correct conceptual understanding of how this groundbreaking model for natural language processing and machine learning is put together.

Introduction

Transformers have revolutionized the way we process and understand natural language, enabling breakthroughs in tasks such as machine translation, sentiment analysis, and question-answering systems. This repository aims to provide a comprehensive implementation of Transformers using numpy, showcasing the core concepts and functionalities of this powerful model.

Key Features

  • Numpy Implementation: The code relies entirely on the numpy library, allowing for efficient computations and easy-to-follow code.
  • Full Implementation: The repository provides a complete implementation of the Transformer, including attention mechanisms, positional encoding, and feed-forward networks.
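To illustrate the kind of building block the repository implements, here is a minimal sketch of scaled dot-product attention in plain numpy. This is an illustrative example, not the repository's actual code; the function name and shapes are assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for a single sequence."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # (seq_len, seq_len)
    # numerically stable softmax over the last axis
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V                                   # (seq_len, d_v)

# toy example: 4 positions, model dimension 8
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

The scaling by sqrt(d_k) keeps the dot products from growing with the model dimension, which would otherwise push the softmax into regions with vanishing gradients.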

Contents

Here's an overview of the contents you'll find in this repository:

  • Decoder.py and Encoder.py: These files contain the main implementation of the Transformer model using numpy, including classes and functions for attention mechanisms, positional encoding, and the overall Transformer architecture.
  • LayersNumpy.py: This file provides utility functions for data preprocessing and handling, enabling seamless integration with different datasets.
  • transfomers-explained-mathematic-and-code-in-depth.ipynb: This Jupyter Notebook shows how to use the Transformer model implemented in this repository, with a step-by-step walkthrough of training and evaluating the model on a specific task.
  • main.py: This script combines the building blocks into the complete Transformer model.
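As a taste of the positional-encoding component mentioned above, here is a short numpy sketch of the standard sinusoidal encoding. This is an assumption about the general technique, not code copied from the repository's files.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Sinusoidal positional encodings; assumes an even d_model."""
    positions = np.arange(seq_len)[:, None]              # (seq_len, 1)
    # frequencies 1 / 10000^(2i / d_model) for each pair of dimensions
    div_terms = np.exp(-np.log(10000.0) * np.arange(0, d_model, 2) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(positions * div_terms)          # even dimensions
    pe[:, 1::2] = np.cos(positions * div_terms)          # odd dimensions
    return pe

pe = sinusoidal_positional_encoding(10, 16)
print(pe.shape)  # (10, 16)
```

These encodings are added to the token embeddings so the otherwise order-agnostic attention layers can distinguish positions in the sequence.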
