BERTdeploy

This is a small app I built using HuggingFace Transformers and FastAPI to perform text classification with the pre-trained DistilBERT model. I mostly relied on Venelin's excellent tutorial (ref 1), with a few key changes to his approach (a minimal sketch of the endpoint follows the list below):

  • Used the pre-trained model as-is instead of fine-tuning it
  • Used requirements.txt with pip instead of pipenv
  • Dropped the extra code-style packages
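
To give a sense of how the pieces fit together, here is a minimal sketch of what the classification endpoint could look like, assuming the Hugging Face sentiment-analysis pipeline; the function and request-model names are illustrative, not the repo's actual code:

```python
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()

# Pre-trained DistilBERT sentiment model, so no fine-tuning step is needed.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

class ClassifyRequest(BaseModel):
    text: str

@app.post("/classify")
def classify(request: ClassifyRequest):
    # The pipeline returns e.g. [{"label": "POSITIVE", "score": 0.9998}].
    result = classifier(request.text)[0]
    score = result["score"]
    sentiment = result["label"].lower()
    return {
        "confidence": score,
        "probabilities": {
            "positive": score if sentiment == "positive" else 1 - score,
            "negative": score if sentiment == "negative" else 1 - score,
        },
        "sentiment": sentiment,
    }
```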

How to use?

  • pip install -r requirements.txt
  • bash bin/run_server

Then make your API call:

```sh
http POST http://127.0.0.1:8000/classify text="Pre-trained DistilBERT seems to work quite well!"
```

You'll get an output like:

```json
{
    "confidence": 0.9998160004615784,
    "probabilities": {
        "negative": 0.00018407008610665798,
        "positive": 0.9998160004615784
    },
    "sentiment": "positive"
}
```
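
The same call can be made from Python with the requests library (a sketch; it assumes the server is running locally on port 8000, as started by bin/run_server):

```python
import requests

# Equivalent of the httpie command above: POST JSON to the /classify endpoint.
response = requests.post(
    "http://127.0.0.1:8000/classify",
    json={"text": "Pre-trained DistilBERT seems to work quite well!"},
)
print(response.json())  # {"confidence": ..., "probabilities": {...}, "sentiment": "positive"}
```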

TO-DO

  • Deploy on Heroku: this will need a workaround for downloading the pre-trained model, since Heroku's file system is ephemeral (one possible approach is sketched after this list)
  • Add tests for API calls
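
One possible direction for the Heroku item (a sketch only, not part of this repo): pull the pre-trained weights from the Hugging Face Hub when the process starts, so nothing needs to survive dyno restarts. The environment variable and default model name below are assumptions for illustration:

```python
import os
from transformers import pipeline

# Assumed environment variable; defaults to the pre-trained DistilBERT sentiment model.
MODEL_NAME = os.environ.get("MODEL_NAME", "distilbert-base-uncased-finetuned-sst-2-english")

def load_classifier():
    # Weights are downloaded to the dyno's temporary filesystem on each cold start,
    # so nothing is assumed to persist between restarts.
    return pipeline("sentiment-analysis", model=MODEL_NAME)
```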

References

  1. Venelin's tutorial on deploying a BERT sentiment classifier as a REST API with FastAPI (cited above as ref 1)
