From ca50493dc548a69e0d71f1e86abcb64fe05a8ea5 Mon Sep 17 00:00:00 2001
From: Sayan Roy
Date: Fri, 14 Apr 2023 01:41:54 +0530
Subject: [PATCH 1/5] fixing the typo #990

---
 website/docs/Use-Cases/Auto-Generation.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/Use-Cases/Auto-Generation.md b/website/docs/Use-Cases/Auto-Generation.md
index 94d9742dcf..214ee87c37 100644
--- a/website/docs/Use-Cases/Auto-Generation.md
+++ b/website/docs/Use-Cases/Auto-Generation.md
@@ -100,7 +100,7 @@ The returned `config` contains the optimized configuration and `analysis` contai
 
 ### Perform inference with the tuned config
 
-One can use [`flaml.oai.Completion.create`](../reference/autogen/oai/completion#create) to performance inference. It materializes a prompt using a given context. For example,
+One can use [`flaml.oai.Completion.create`](../reference/autogen/oai/completion#create) to perform inference. It materializes a prompt using a given context. For example,
 
 ```python
 response = oai.Completion.create(problme=problem, **config)

From 690ce2981e7867dfad6954f6e31f37134ad75fbb Mon Sep 17 00:00:00 2001
From: Sayan Roy
Date: Wed, 19 Apr 2023 20:12:27 +0530
Subject: [PATCH 2/5] Update website/docs/Use-Cases/Auto-Generation.md

Co-authored-by: Chi Wang
---
 website/docs/Use-Cases/Auto-Generation.md | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/website/docs/Use-Cases/Auto-Generation.md b/website/docs/Use-Cases/Auto-Generation.md
index e7427e27ff..931dae9a0c 100644
--- a/website/docs/Use-Cases/Auto-Generation.md
+++ b/website/docs/Use-Cases/Auto-Generation.md
@@ -100,8 +100,7 @@ The returned `config` contains the optimized configuration and `analysis` contai
 
 ## Perform inference with the tuned config
 
-
-One can use [`flaml.oai.Completion.create`](../reference/autogen/oai/completion#create) to perform inference. It materializes a prompt using a given context. For example,
+One can use [`flaml.oai.Completion.create`](../reference/autogen/oai/completion#create) to perform inference.
 
 There are a number of benefits of using `flaml.oai.Completion.create` to perform inference.

From 8a8f64d1f7b9851343376dea2f347cb3b7cbd548 Mon Sep 17 00:00:00 2001
From: Sayan Roy
Date: Wed, 19 Apr 2023 20:12:53 +0530
Subject: [PATCH 3/5] removing extra space : Update website/docs/Use-Cases/Auto-Generation.md

Co-authored-by: Chi Wang
---
 website/docs/Use-Cases/Auto-Generation.md | 1 -
 1 file changed, 1 deletion(-)

diff --git a/website/docs/Use-Cases/Auto-Generation.md b/website/docs/Use-Cases/Auto-Generation.md
index 931dae9a0c..1478b6f176 100644
--- a/website/docs/Use-Cases/Auto-Generation.md
+++ b/website/docs/Use-Cases/Auto-Generation.md
@@ -101,7 +101,6 @@ The returned `config` contains the optimized configuration and `analysis` contai
 ## Perform inference with the tuned config
 
 One can use [`flaml.oai.Completion.create`](../reference/autogen/oai/completion#create) to perform inference.
-
 There are a number of benefits of using `flaml.oai.Completion.create` to perform inference.
 A template is either a format str, or a function which produces a str from several input fields.
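The hunk above ends on the docs' note that a template is either a format str or a function producing a str from several input fields. As a rough illustration of that idea (not taken from the patches; it assumes the function form receives the context as a dict, which may vary by FLAML version):

```python
# Illustrative sketch of the two template forms described in the docs.
# Assumption: a function-style template receives the context as a dict.

# 1) Format-str template: "{problem}" is filled in from the context.
str_template = "{problem} Solve the problem carefully."

# 2) Function template: builds the prompt string from the context fields.
def fn_template(context):
    return f"{context['problem']} Solve the problem carefully. Explain each step."

context = {"problem": "What is 37 * 12?"}
print(str_template.format(**context))  # materialized format-str prompt
print(fn_template(context))            # materialized function prompt
```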
From 5b9c600a52c01613590784573e2239cce0146945 Mon Sep 17 00:00:00 2001
From: Sayan Roy
Date: Wed, 19 Apr 2023 20:13:06 +0530
Subject: [PATCH 4/5] Update website/docs/Use-Cases/Auto-Generation.md

Co-authored-by: Chi Wang
---
 website/docs/Use-Cases/Auto-Generation.md | 1 -
 1 file changed, 1 deletion(-)

diff --git a/website/docs/Use-Cases/Auto-Generation.md b/website/docs/Use-Cases/Auto-Generation.md
index 1478b6f176..aea3b39d29 100644
--- a/website/docs/Use-Cases/Auto-Generation.md
+++ b/website/docs/Use-Cases/Auto-Generation.md
@@ -121,7 +121,6 @@ It is easy to hit error when calling OpenAI APIs, due to connection, rate limit,
 
 If the provided prompt or message is a template, it will be automatically materialized with a given context. For example,
 
-
 ```python
 response = oai.Completion.create(problme=problem, prompt="{problem} Solve the problem carefully.", **config)
 ```

From 917ebf3f1c286e8d24724e72cf63ae3a308d1cf8 Mon Sep 17 00:00:00 2001
From: Chi Wang
Date: Tue, 25 Apr 2023 21:56:34 -0700
Subject: [PATCH 5/5] Update website/docs/Use-Cases/Auto-Generation.md

---
 website/docs/Use-Cases/Auto-Generation.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/Use-Cases/Auto-Generation.md b/website/docs/Use-Cases/Auto-Generation.md
index aea3b39d29..ba1047173c 100644
--- a/website/docs/Use-Cases/Auto-Generation.md
+++ b/website/docs/Use-Cases/Auto-Generation.md
@@ -100,7 +100,7 @@ The returned `config` contains the optimized configuration and `analysis` contai
 
 ## Perform inference with the tuned config
 
-One can use [`flaml.oai.Completion.create`](../reference/autogen/oai/completion#create) to perform inference.
+One can use [`flaml.oai.Completion.create`](../reference/autogen/oai/completion#create) to perform inference.
 There are a number of benefits of using `flaml.oai.Completion.create` to perform inference.
 A template is either a format str, or a function which produces a str from several input fields.
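Taken together, the series leaves the docs describing inference with the tuned config via `flaml.oai.Completion.create`, with template prompts materialized from a given context. A rough usage sketch follows, mirroring the call pattern shown in the hunks; the `problem` keyword, model name, and config values are hypothetical, an OpenAI API key is required, and the exact signature may differ across FLAML versions:

```python
from flaml import oai

# Hypothetical tuned config (e.g. as returned by tuning); values are made up.
config = {"model": "gpt-3.5-turbo", "max_tokens": 256, "temperature": 0.7}

problem = "A train travels 60 km in 45 minutes. What is its average speed in km/h?"

# Following the pattern in the patched docs: the "{problem}" field in the
# prompt template is filled in from the context before the request is sent.
response = oai.Completion.create(
    problem=problem,
    prompt="{problem} Solve the problem carefully.",
    **config,
)
print(response)
```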