In-a-rush sentiment analysis service using Hugging Face 🤗 and FastAPI. The underlying model is a BERT-based sentiment analysis model. 🧠
Install Docker 🐳
Build the Docker image:
docker build -t sentiment-container .
Run the Docker container with the service. The container may take some time to download the model before it starts serving.
docker run -dp 8000:8000 sentiment-container
Install conda and PyTorch according to your OS and hardware
Create the environment
conda create -n sentiment_analysis python=3.8 pip
Activate the environment
conda activate sentiment_analysis
Install requirements
pip install -r requirements.txt
Run the server
cd src
uvicorn main:app --port 8000
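For orientation, the app served by uvicorn is expected to look roughly like the sketch below: a FastAPI app that wraps a Hugging Face sentiment-analysis pipeline behind a /sentiment route. This is an illustrative sketch, not the repository's actual src/main.py; the model choice and names such as SentimentQuery are assumptions.

```python
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()

# Load the BERT-based sentiment model once at startup
# (the weights are downloaded on the first run).
classifier = pipeline("sentiment-analysis")

class SentimentQuery(BaseModel):
    query_string: str

@app.post("/sentiment")
def sentiment(query: SentimentQuery):
    # The pipeline returns a list with one result per input,
    # e.g. [{"label": "POSITIVE", "score": 0.999}].
    return classifier(query.query_string)[0]
```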
To learn more about the other endpoints, consult the OpenAPI documentation at http://127.0.0.1:8000/redoc or try out the interactive Swagger documentation at http://127.0.0.1:8000/docs
- Sentiment analysis endpoint
| Field | Value |
|---|---|
| Method | POST |
| Path | /sentiment |
| Body | {"query_string": "String"} |
| Code | Body | Description |
|---|---|---|
| 200 | {"label": "Sentiment label", "score": "Sentiment score (confidence)"} | Response with the sentiment label (POSITIVE or NEGATIVE) and the confidence level |
| 422 | {"detail": {"loc": [], "msg": "Message", "type": "Type"}} | Validation error with validation error details |
Replace the query_string parameter with the sentence you want to analyze:
curl -X 'POST' \
'http://127.0.0.1:8000/sentiment' \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
"query_string": "You are my best friend"
}'
You will get a response similar to the following:
{
"label": "POSITIVE",
"score": 0.9998573064804077
}
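The same request can also be made from Python, for example with the requests library (not part of the project's requirements; install it separately if you want to try this sketch):

```python
import requests

response = requests.post(
    "http://127.0.0.1:8000/sentiment",
    json={"query_string": "You are my best friend"},
    timeout=30,
)

if response.status_code == 200:
    result = response.json()
    print(result["label"], result["score"])  # e.g. POSITIVE 0.9998...
else:
    # A 422 status means the request body failed validation.
    print(response.status_code, response.json()["detail"])
```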