From 314f31ada201e3ead1362bf34526939fbed7e15f Mon Sep 17 00:00:00 2001
From: fxmarty <9808326+fxmarty@users.noreply.github.com>
Date: Tue, 17 Oct 2023 10:50:43 +0200
Subject: [PATCH] Fix broken ORT doc link (#1452)

* fix broken link

* more fixes

* let's just hardcode the link
---
 docs/source/bettertransformer/overview.mdx         | 2 ++
 docs/source/onnxruntime/usage_guides/pipelines.mdx | 6 ++----
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/source/bettertransformer/overview.mdx b/docs/source/bettertransformer/overview.mdx
index 0c79d8c7551..861a8751903 100644
--- a/docs/source/bettertransformer/overview.mdx
+++ b/docs/source/bettertransformer/overview.mdx
@@ -50,6 +50,7 @@ The list of supported model below:
 - [DeiT](https://arxiv.org/abs/2012.12877)
 - [Electra](https://arxiv.org/abs/2003.10555)
 - [Ernie](https://arxiv.org/abs/1904.09223)
+- [Falcon](https://arxiv.org/abs/2306.01116)
 - [FSMT](https://arxiv.org/abs/1907.06616)
 - [GPT2](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf)
 - [GPT-j](https://huggingface.co/EleutherAI/gpt-j-6B)
@@ -58,6 +59,7 @@ The list of supported model below:
 - [GPT BigCode](https://arxiv.org/abs/2301.03988) (SantaCoder, StarCoder)
 - [HuBERT](https://arxiv.org/pdf/2106.07447.pdf)
 - [LayoutLM](https://arxiv.org/abs/1912.13318)
+- [Llama & Llama2](https://arxiv.org/abs/2302.13971)
 - [MarkupLM](https://arxiv.org/abs/2110.08518)
 - [Marian](https://arxiv.org/abs/1804.00344)
 - [MBart](https://arxiv.org/abs/2001.08210)
diff --git a/docs/source/onnxruntime/usage_guides/pipelines.mdx b/docs/source/onnxruntime/usage_guides/pipelines.mdx
index c12185b560d..cdd85564783 100644
--- a/docs/source/onnxruntime/usage_guides/pipelines.mdx
+++ b/docs/source/onnxruntime/usage_guides/pipelines.mdx
@@ -55,11 +55,9 @@ There are tags on the Model Hub that allow you to filter for a model you'd like
-To be able to load the model with the ONNX Runtime backend, the export to ONNX needs 
-to be supported for the considered architecture.
+To be able to load the model with the ONNX Runtime backend, the export to ONNX needs to be supported for the considered architecture.

-You can check the list of supported architectures
-[here](/exporters/onnx/package_reference/configuration#Supported-architectures).
+You can check the list of supported architectures [here](https://huggingface.co/docs/optimum/exporters/onnx/overview#overview).