Containerized Question Answering Service

Simple Flask REST service for English question answering. It leverages the Hugging Face Transformers API and a RoBERTa model with a question-answering head, pretrained by deepset on the SQuAD 2.0 dataset. The service runs on CPU only.

Setup and Start Service

The service can either be set up in a Python 3.8 environment or built as a Docker image.

Python Environment

Install all necessary requirements in your Python environment via:

pip install -r requirements.txt

To start the server run:

python server.py -p <PORT>

The optional parameter -p can be used to change the port from the default of 5000.
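For orientation, a minimal sketch of how such a server could be wired together with Flask, argparse, and the Transformers question-answering pipeline is shown below. This is an illustrative assumption, not the actual contents of server.py; the model id deepset/roberta-base-squad2 is likewise an assumption about which deepset checkpoint is used.

# Hypothetical sketch; the real server.py in this repository may differ.
import argparse

from flask import Flask, jsonify, request
from transformers import pipeline

app = Flask(__name__)
# Assumed model id for the deepset RoBERTa SQuAD 2.0 checkpoint on the Hugging Face Hub.
qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

@app.route("/inference", methods=["POST"])
def inference():
    body = request.get_json()
    results = []
    # Questions are grouped by their shared context, as described in the API section.
    for item in body["data"]:
        for question in item["questions"]:
            results.append(qa(question=question, context=item["context"]))
    return jsonify(results)

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("-p", "--port", type=int, default=5000)
    args = parser.parse_args()
    app.run(host="0.0.0.0", port=args.port)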

Docker Container

To build the Docker image, run the following command:

docker build -t qaservice:final .

To start a container with the previously built image, run the following command or use your preferred deployment framework.

docker run -d -p <PORT>:5000 --name qaservice qaservice:final

Unlike the [Python Environment](#python-environment) setup, the parameter -p is mandatory here, since the container's internal port 5000 has to be mapped to a port on the host.
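For example, to expose the service on host port 8080 (a placeholder port chosen for illustration):

docker run -d -p 8080:5000 --name qaservice qaservice:final

The endpoint is then reachable at http://localhost:8080/inference.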

API Usage

The service exposes a single REST endpoint, POST /inference, which expects a JSON object of the following format:

{
    "impossible": boolean [optional, default: true],
    "top_k": integer [optional, default: 1],
    "data": [
        {
            "questions": [
                "question1",
                "question2",
                ...
            ],
            "context": "context"
        },
        ...
    ]
}

The parameter impossible can be set to false if all questions are answerable from the given context and the model should be forced to return a valid answer. The parameter top_k can be set to any integer k >= 1 to return the best k answers for each question.

The questions and contexts are provided under the parameter data in nested form, with all questions grouped by their corresponding context.
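For illustration, a request like the following could be sent with Python's requests library. The host, port, questions, and context are placeholders, and since the response schema is not documented here, the raw JSON is simply printed.

# Illustrative client call against a locally running instance on the default port.
import requests

payload = {
    "impossible": True,
    "top_k": 2,
    "data": [
        {
            "questions": [
                "Where is the Eiffel Tower located?",
                "How tall is the Eiffel Tower?",
            ],
            "context": "The Eiffel Tower is a wrought-iron lattice tower in Paris, France. It is 330 metres tall.",
        }
    ],
}

response = requests.post("http://localhost:5000/inference", json=payload)
response.raise_for_status()
print(response.json())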

Example requests can be found in the folder requests and can be sent with python requests.py -p <PORT>. The parameter -p has to be set if the port on the host differs from the default port 5000. Responses are logged to an automatically created build folder.
