Fine-tuning the pretrained T5 transformer model for abstractive summary of text

This is based on a tutorial video on YouTube by Venelin Valkov (https://www.youtube.com/watch?v=KMyZUIraHio&t=1287s). It fine-tunes the T5 transformer, which is already pretrained on the 750 GB C4 text dataset, for the abstractive summarization task. Please check the notebook for the full code.
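As a rough illustration of the approach (a minimal sketch, not the notebook's exact code): T5 is a text-to-text model, so summarization is framed by prefixing each input document with `summarize: ` and training the model to emit the reference summary. The helper name below is hypothetical; the commented-out fine-tuning calls assume the Hugging Face `transformers` library.

```python
def build_example(document: str, summary: str, prefix: str = "summarize: "):
    """Format one training pair the way T5 expects: the task is encoded
    as a text prefix on the input, and the target is plain text."""
    return {"input_text": prefix + document, "target_text": summary}

example = build_example(
    "The quick brown fox jumped over the lazy dog near the river bank.",
    "A fox jumped over a dog.",
)
print(example["input_text"])

# Sketch of the fine-tuning step itself (requires `pip install transformers torch`;
# "t5-base" is one of several available checkpoint sizes):
#
#   from transformers import T5ForConditionalGeneration, T5Tokenizer
#   tokenizer = T5Tokenizer.from_pretrained("t5-base")
#   model = T5ForConditionalGeneration.from_pretrained("t5-base")
#   batch = tokenizer(example["input_text"], return_tensors="pt",
#                     truncation=True, max_length=512)
#   labels = tokenizer(example["target_text"], return_tensors="pt",
#                      truncation=True, max_length=128).input_ids
#   loss = model(**batch, labels=labels).loss  # cross-entropy over summary tokens
#   loss.backward()  # then step an optimizer such as AdamW
```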