
Masked Word Prediction using BERT

Overview

This project builds an AI that predicts a masked word in a text sequence using the BERT language model. The work is divided into two main parts:

  1. Implementing a program to predict masked words using BERT (a sketch follows this list).
  2. Analyzing the attention diagrams generated by the model to understand what BERT's attention heads focus on.
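
Part 1 boils down to tokenizing the input, locating the [MASK] token, and asking BERT for its highest-scoring replacements at that position. The sketch below shows one way to do this, assuming the Hugging Face transformers library with TensorFlow and the bert-base-uncased checkpoint; the actual mask.py may be organized differently.

import tensorflow as tf
from transformers import AutoTokenizer, TFBertForMaskedLM

MODEL = "bert-base-uncased"   # assumed checkpoint
K = 3                         # number of predictions to show

tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = TFBertForMaskedLM.from_pretrained(MODEL)

text = "We turned down a narrow lane and passed through a small [MASK]."
inputs = tokenizer(text, return_tensors="tf")

# Find the position of the [MASK] token in the tokenized input.
input_ids = inputs["input_ids"][0].numpy().tolist()
mask_index = input_ids.index(tokenizer.mask_token_id)

# Run BERT; output_attentions=True also returns the attention weights
# that part 2 visualizes as diagrams.
output = model(**inputs, output_attentions=True)

# Take the K highest-scoring vocabulary tokens for the masked position.
mask_logits = output.logits[0, mask_index]
top_tokens = tf.math.top_k(mask_logits, K).indices.numpy()

for token_id in top_tokens:
    print(text.replace(tokenizer.mask_token, tokenizer.decode([token_id])))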

Requirements

  • Python Version: Python 3.12 is the latest version you should use.
  • Dependencies: Install the required dependencies using pip3 install -r requirements.txt.

How to Run

To run the program, use the following command:

$ python mask.py

Usage

The program prompts for a text containing the [MASK] token, then prints the sentence with the mask replaced by each of BERT's top predictions (three in this example):

Text: We turned down a narrow lane and passed through a small [MASK].

We turned down a narrow lane and passed through a small field.
We turned down a narrow lane and passed through a small clearing.
We turned down a narrow lane and passed through a small park.
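
Beyond the predictions above, the overview's second part deals with the attention diagrams the model generates: each attention head's token-to-token weights can be rendered as a grid of gray cells, where a lighter cell means the row token attends more strongly to the column token. Below is a minimal rendering sketch, assuming Pillow (PIL); the draw_attention_grid helper, cell size, and file naming are illustrative rather than taken from mask.py.

from PIL import Image, ImageDraw

GRID_SIZE = 40  # illustrative cell size in pixels

def get_color_for_attention_score(score):
    # Map an attention score in [0, 1] to a shade of gray: 0 -> black, 1 -> white.
    shade = round(float(score) * 255)
    return (shade, shade, shade)

def draw_attention_grid(tokens, weights, filename):
    # Render one attention head's token-to-token weights as a grayscale grid.
    # weights[i][j] is assumed to be the attention token i pays to token j.
    n = len(tokens)
    img = Image.new("RGB", (n * GRID_SIZE, n * GRID_SIZE), "black")
    draw = ImageDraw.Draw(img)
    for i in range(n):
        for j in range(n):
            x, y = j * GRID_SIZE, i * GRID_SIZE
            draw.rectangle(
                (x, y, x + GRID_SIZE, y + GRID_SIZE),
                fill=get_color_for_attention_score(weights[i][j]),
            )
    img.save(filename)

Comparing these grids across layers and heads is the basis for the analysis of what each attention head focuses on.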
