Timeline support markdown #1808

Open
Weaxs opened this issue Oct 22, 2024 · 5 comments

@Weaxs (Contributor) commented Oct 22, 2024

Is your feature request related to a problem? Please describe.
The Timeline shortcode does not support Markdown at the moment.

(screenshot attached showing the unrendered Markdown inside the timeline)

Describe the solution you'd like

Describe alternatives you've considered

Additional context

@wermos (Contributor) commented Feb 25, 2025

Can you share a MWE (minimal working example)?

It might have something to do with a PR I made (#1515).

But it still seems to work normally on the Blowfish docs page.

Can you share what version of Blowfish and Hugo you were using?

@Weaxs (Contributor, Author) commented Feb 25, 2025

> Can you share a MWE (minimal working example)?
>
> It might have something to do with a PR I made (#1515).
>
> But it still seems to work normally on the Blowfish docs page.
>
> Can you share what version of Blowfish and Hugo you were using?

like this:

{{< timeline >}}

{{< timelineItem icon="star" header="2023.07" badge="Jina Embeddings" >}}
[[2023.07] Jina Embeddings: A Novel Set of High-Performance Sentence Embedding Models](https://arxiv.org/abs/2307.11224):<br/> The first version of Jina's embedding model is pre-trained on the **T5 model** ([*Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer. [2019.10]*](https://arxiv.org/abs/1910.10683)); in addition, the Jina team designed a new dataset specifically for training the embedding model.
{{< /timelineItem >}}

{{< timelineItem icon="star" header="2023.10" badge="Jina Embeddings 2" >}}
[[2023.10] Jina Embeddings 2: 8192-Token General-Purpose Text Embeddings for Long Documents](https://arxiv.org/abs/2310.19923):<br/> Jina's second version of the embedding model is pre-trained on the **BERT model**, using linear-bias attention ([**ALiBi**](https://arxiv.org/abs/2108.12409)) to increase the contextual information the model can take as input; next, text data is used to fine-tune the model's embedding ability; finally, **adversarial example fine-tuning** is added to further improve the model.
{{< /timelineItem >}}

{{< timelineItem icon="star" header="2024.09" badge="jina-embeddings-v3" >}}
[[2024.09] jina-embeddings-v3: Multilingual Embeddings With Task LoRA](https://arxiv.org/abs/2409.10173):<br/> Jina's third version of the embedding model is relatively new. It is trained and fine-tuned on the **XLM-RoBERTa model** (here, BGE-M3 is borrowed), and FlashAttention 2 is introduced to improve computational efficiency. <br/> One of the more innovative points is that the Jina team introduced five **task-specific LoRA adapters**, each trained independently: ① **retrieval.passage**: embeds documents in query-document retrieval tasks, mainly passage embedding in asymmetric retrieval; ② **retrieval.query**: embeds queries in query-document retrieval tasks, mainly query embedding in asymmetric retrieval; ③ **separation**: cluster analysis of documents for clustering and reranking tasks; ④ **classification**: text classification for classification tasks; ⑤ **text-matching**: semantic text similarity matching for tasks built around semantic similarity, such as symmetric retrieval.
{{< /timelineItem >}}

{{< /timeline >}}

Preview at v2.78.0:

(screenshot attached showing the Markdown rendered as plain text inside the timeline)

@wermos (Contributor) commented Feb 27, 2025

Can you share which version number it used to work at? I can confirm that it is appearing broken for me as well, as you've shown.

@Weaxs (Contributor, Author) commented Feb 28, 2025

> Can you share which version number it used to work at? I can confirm that it is appearing broken for me as well, as you've shown.

I'm not sure which version it used to work at.

Maybe it has never worked correctly? 😶

@wermos (Contributor) commented Feb 28, 2025

> Maybe it has never worked correctly?

I believe that may be the case. I can think of a fix, but then we lose the ability to embed GitHub cards inside timelines. I'm not sure how fond the theme's creator would be of that fix.
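For concreteness, here is a rough, hypothetical sketch of what such a fix might look like, assuming the current template prints its inner content verbatim: piping `.Inner` through Hugo's Markdown renderer would make Markdown work inside timeline items. The file path, markup, and class names below are placeholders, not Blowfish's actual template.

```html
{{/* Hypothetical sketch of a layouts/shortcodes/timelineItem.html override.
     Not Blowfish's actual template; markup and class names are placeholders. */}}
<div class="timeline-item">
  <div class="timeline-item-header">
    <span>{{ .Get "header" }}</span>
    {{ with .Get "badge" }}<span class="badge">{{ . }}</span>{{ end }}
  </div>
  <div class="timeline-item-content">
    {{/* Render the inner content as Markdown instead of emitting it raw.
         markdownify would also work here. */}}
    {{ .Inner | .Page.RenderString }}
  </div>
</div>
```

The trade-off mentioned above would show up exactly here: nested shortcodes such as GitHub cards already emit HTML, and pushing that output back through the Markdown renderer can escape or re-wrap it. A user-side workaround that may be worth testing is calling the shortcode with `{{% timelineItem %}} … {{% /timelineItem %}}` delimiters, which tells Hugo to treat the inner content as Markdown without any template change.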
