Performing inference with pretrained natural language processing models
Here, you will see how to use pretrained natural language processing models to do various things. First, create a new Python file called gpt-neo.py. In that file, let's start by importing the necessary function:
from quickai import gpt_neo
Now, let's call the gpt_neo function:
generated_text = gpt_neo("Hello", "2.7B")
print(generated_text)
The gpt_neo function has two required parameters. The first is the prompt used to start the generation. The second is which GPT-Neo model to use. Above, we have specified the 2.7B model because it performs the best. However, do keep in mind that the 2.7B model is a 10 GB one-time download. There are also optional parameters, such as max_length and temp. max_length determines how much text to generate and defaults to 100. temp is the temperature of the generation, which is set to 0.9 by default.
Once you run the above code and the generation is done (the time this takes depends on the hardware you are running on), you will see the generated text in the console.
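If you want shorter or more conservative output, the optional parameters described above can be passed as keyword arguments. This is a minimal sketch; the keyword names max_length and temp follow the description above, so check the quickai documentation if your version differs:
# Generate a shorter, lower-temperature completion (assumes max_length and temp
# are accepted as keyword arguments, as described above)
generated_text = gpt_neo("Hello", "2.7B", max_length=50, temp=0.7)
print(generated_text)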
Now, let's try sentiment analysis. Create a new Python file called sentiment_analysis.py and add the following import:
from quickai import sentiment_analysis
The sentiment_analysis function has only one required parameter, which is the text to perform sentiment analysis on:
sentiment = sentiment_analysis("I love pizza")
print(sentiment)
Go ahead and run that file; you will see the result in the console.
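Because sentiment_analysis takes a single string, you can reuse it for several inputs by calling it in a loop. A minimal sketch (the example sentences are arbitrary):
# Run sentiment analysis over a small list of example sentences
texts = ["I love pizza", "This movie was terrible", "The weather is okay"]
for text in texts:
    print(text, "->", sentiment_analysis(text))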
Let's try question answering now. Create a new Python file, q_a.py, and add the following import:
from quickai import q_and_a
The q_and_a function has two required parameters: context, which is the text to get answers from, and question, which is the question to ask:
answer = q_and_a("The bridge was blue. ", "What color was the bridge?")
print(answer)
Once you run the file, you will see the answer to the question in the console.
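The context is not limited to a single sentence; the same two-argument call works with longer text. A quick sketch with a made-up context:
# A slightly longer, made-up context; the function call stays the same
context = (
    "The bridge was built in 1932. It spans the river near the old mill "
    "and was painted blue in 1978."
)
print(q_and_a(context, "When was the bridge painted blue?"))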
quickai can also perform summarization and named entity recognition (NER). All of these tasks follow the same basic structure and share most of their parameters.
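If the summarization and NER helpers in your version of quickai follow the same single-text pattern, their use would look roughly like the sketch below. The function names summarization and ner are assumptions here, not confirmed by this page; check the package documentation for the exact names and parameters:
# NOTE: summarization and ner are assumed function names; verify them against
# the quickai documentation before running this
from quickai import summarization, ner

article = "Your long article text goes here."
print(summarization(article))  # a shortened version of the text
print(ner("quickai was released in 2021."))  # named entities found in the text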