From 5efd40a51520adf2df1132d42f99f683ad6717ff Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?F=C3=A9lix=20Marty?= <9808326+fxmarty@users.noreply.github.com>
Date: Mon, 16 Oct 2023 10:08:35 +0200
Subject: [PATCH 1/3] fix broken link

---
 docs/source/onnxruntime/usage_guides/pipelines.mdx | 5 ++---
 1 file changed, 2 insertions(+), 3 deletions(-)

diff --git a/docs/source/onnxruntime/usage_guides/pipelines.mdx b/docs/source/onnxruntime/usage_guides/pipelines.mdx
index c12185b560d..e3f327e66ee 100644
--- a/docs/source/onnxruntime/usage_guides/pipelines.mdx
+++ b/docs/source/onnxruntime/usage_guides/pipelines.mdx
@@ -55,11 +55,10 @@ There are tags on the Model Hub that allow you to filter for a model you'd like
 
-To be able to load the model with the ONNX Runtime backend, the export to ONNX needs
-to be supported for the considered architecture.
+To be able to load the model with the ONNX Runtime backend, the export to ONNX needs to be supported for the considered architecture.
 
 You can check the list of supported architectures
-[here](/exporters/onnx/package_reference/configuration#Supported-architectures).
+[here](/exporters/onnx/overview#overview).
From d400bbe813126c8db5645b79c74732f8ab1505a5 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?F=C3=A9lix=20Marty?= <9808326+fxmarty@users.noreply.github.com>
Date: Mon, 16 Oct 2023 16:39:10 +0200
Subject: [PATCH 2/3] more fixes

---
 docs/source/bettertransformer/overview.mdx         | 2 ++
 docs/source/onnxruntime/usage_guides/pipelines.mdx | 3 +--
 2 files changed, 3 insertions(+), 2 deletions(-)

diff --git a/docs/source/bettertransformer/overview.mdx b/docs/source/bettertransformer/overview.mdx
index 0c79d8c7551..861a8751903 100644
--- a/docs/source/bettertransformer/overview.mdx
+++ b/docs/source/bettertransformer/overview.mdx
@@ -50,6 +50,7 @@ The list of supported model below:
 - [DeiT](https://arxiv.org/abs/2012.12877)
 - [Electra](https://arxiv.org/abs/2003.10555)
 - [Ernie](https://arxiv.org/abs/1904.09223)
+- [Falcon](https://arxiv.org/abs/2306.01116)
 - [FSMT](https://arxiv.org/abs/1907.06616)
 - [GPT2](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf)
 - [GPT-j](https://huggingface.co/EleutherAI/gpt-j-6B)
@@ -58,6 +59,7 @@ The list of supported model below:
 - [GPT BigCode](https://arxiv.org/abs/2301.03988) (SantaCoder, StarCoder)
 - [HuBERT](https://arxiv.org/pdf/2106.07447.pdf)
 - [LayoutLM](https://arxiv.org/abs/1912.13318)
+- [Llama & Llama2](https://arxiv.org/abs/2302.13971)
 - [MarkupLM](https://arxiv.org/abs/2110.08518)
 - [Marian](https://arxiv.org/abs/1804.00344)
 - [MBart](https://arxiv.org/abs/2001.08210)
diff --git a/docs/source/onnxruntime/usage_guides/pipelines.mdx b/docs/source/onnxruntime/usage_guides/pipelines.mdx
index e3f327e66ee..6e01849accf 100644
--- a/docs/source/onnxruntime/usage_guides/pipelines.mdx
+++ b/docs/source/onnxruntime/usage_guides/pipelines.mdx
@@ -57,8 +57,7 @@ There are tags on the Model Hub that allow you to filter for a model you'd like
 
 To be able to load the model with the ONNX Runtime backend, the export to ONNX needs to be supported for the considered architecture.
 
-You can check the list of supported architectures
-[here](/exporters/onnx/overview#overview).
+You can check the list of supported architectures [here](optimum/exporters/onnx/overview#overview).

From bfbee9e6cee86ca0f69669de66d5e3ec32a9e11b Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?F=C3=A9lix=20Marty?= <9808326+fxmarty@users.noreply.github.com>
Date: Tue, 17 Oct 2023 10:50:24 +0200
Subject: [PATCH 3/3] let's just hardcode the link

---
 docs/source/onnxruntime/usage_guides/pipelines.mdx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/onnxruntime/usage_guides/pipelines.mdx b/docs/source/onnxruntime/usage_guides/pipelines.mdx
index 6e01849accf..cdd85564783 100644
--- a/docs/source/onnxruntime/usage_guides/pipelines.mdx
+++ b/docs/source/onnxruntime/usage_guides/pipelines.mdx
@@ -57,7 +57,7 @@ There are tags on the Model Hub that allow you to filter for a model you'd like
 
 To be able to load the model with the ONNX Runtime backend, the export to ONNX needs to be supported for the considered architecture.
 
-You can check the list of supported architectures [here](optimum/exporters/onnx/overview#overview).
+You can check the list of supported architectures [here](https://huggingface.co/docs/optimum/exporters/onnx/overview#overview).