Assignment 1 - Updated The Base Model and Added Comments #13

Open
wants to merge 4 commits into master
8 changes: 8 additions & 0 deletions README.md
@@ -0,0 +1,8 @@
# mgmt590-lec1

Code For Answering Questions

In this version, we fit a different model (xlm-roberta-large-squad2) and added comments for better readability.

References -
https://huggingface.co/deepset/xlm-roberta-large-squad2
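Assuming the `transformers` package is installed, swapping in the xlm-roberta reader is a one-line change to the `pipeline` call. A minimal sketch (`load_reader` is an illustrative name, not part of this PR; the import is deferred so nothing downloads until the function is actually called):

```python
def load_reader(model_name="deepset/xlm-roberta-large-squad2"):
    # Deferred import: requires the `transformers` package, and the first
    # call downloads the model weights from the Hugging Face hub
    from transformers.pipelines import pipeline
    return pipeline('question-answering', model=model_name, tokenizer=model_name)
```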
47 changes: 40 additions & 7 deletions answer.py
@@ -1,12 +1,45 @@
 import pandas as pd
 from transformers.pipelines import pipeline
+from flask import Flask
+from flask import request
+from flask import jsonify
 
-hg_comp = pipeline('question-answering', model="distilbert-base-uncased-distilled-squad", tokenizer="distilbert-base-uncased-distilled-squad")
+# Create my flask app
+app = Flask(__name__)
 
-data = pd.read_csv('examples.csv')
+# Define a handler for the / path, which
+# returns "Hello World"
+@app.route("/")
+def hello_world():
+    return "<p>Hello, World!</p>"
 
-for idx, row in data.iterrows():
-    context = row['context']
-    question = row['question']
-    answer = hg_comp({'question': question, 'context': context})['answer']
-    print(answer)
+# Define a handler for the /answer path, which
+# processes a JSON payload with a question and
+# context and returns an answer using a Hugging
+# Face model.
+@app.route("/answer", methods=['POST'])
+def answer():
+
+    # Get the request body data
+    data = request.json
+
+    # Load the model
+    hg_comp = pipeline('question-answering', model="distilbert-base-uncased-distilled-squad", tokenizer="distilbert-base-uncased-distilled-squad")
+
+    # Answer the question
+    answer = hg_comp({'question': data['question'], 'context': data['context']})['answer']
+
+    # Create the response body.
+    out = {
+        "question": data['question'],
+        "context": data['context'],
+        "answer": answer
+    }
+
+    return jsonify(out)
+
+# Run if running "python answer.py"
+if __name__ == '__main__':
+
+    # Run our Flask app and start listening for requests!
+    app.run(host='0.0.0.0', port=8000, threaded=True)
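The new /answer endpoint can be exercised with a small client. A sketch using only the standard library, assuming the Flask app is running locally on port 8000 as configured above (`build_payload` and `ask` are illustrative names, not part of this PR):

```python
import json
import urllib.request

def build_payload(question, context):
    # Mirrors the JSON body that the answer() handler reads via request.json
    return json.dumps({"question": question, "context": context}).encode("utf-8")

def ask(question, context, url="http://localhost:8000/answer"):
    # POST the payload and return the "answer" field from the JSON response
    req = urllib.request.Request(
        url,
        data=build_payload(question, context),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))["answer"]
```

With the server running, `ask("Who won?", "The home team won the match.")` would return just the extracted answer string, while the raw response also echoes the question and context.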