Commit

update readme
liuxu77 committed Nov 1, 2024
1 parent bdebc12 commit 1e37bb9
Showing 2 changed files with 4 additions and 4 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -6,7 +6,7 @@ Related reading: [Moirai Paper](https://arxiv.org/abs/2402.02592), [Moirai Sales

## 🎉 What's New

- * Oct 2024: A new model Moirai-MoE! The preprint is available on [arXiv](https://arxiv.org/abs/2410.10469), along with model weights of [small](https://huggingface.co/Salesforce/moirai-moe-1.0-R-small) and [base](https://huggingface.co/Salesforce/moirai-moe-1.0-R-base), and [simple example](https://github.com/SalesforceAIResearch/uni2ts/project/moirai-moe-1) to get started.
+ * Oct 2024: A new model Moirai-MoE! The preprint is available on [arXiv](https://arxiv.org/abs/2410.10469), along with model weights of [small](https://huggingface.co/Salesforce/moirai-moe-1.0-R-small) and [base](https://huggingface.co/Salesforce/moirai-moe-1.0-R-base), and [simple example](https://github.com/SalesforceAIResearch/uni2ts/tree/main/project/moirai-moe-1) to get started.

* Sep 2024: Released [Evaluation Code](https://github.com/SalesforceAIResearch/uni2ts/tree/main/project/benchmarks) of [TimesFM](https://arxiv.org/abs/2310.10688), [Chronos](https://arxiv.org/abs/2403.07815) and [VisionTS](https://arxiv.org/abs/2408.17253) on Monash, LSF and PF benchmarks.

6 changes: 3 additions & 3 deletions project/moirai-moe-1/README.md
@@ -5,7 +5,7 @@ Our paper [Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixt
The figure below presents the major difference between Moirai-MoE and Moirai. Compared to Moirai using multi-heuristic-defined input/output projection layers to model time series with different frequencies, Moirai-MoE utilizes a single input/output projection layer while delegating the task of capturing diverse time series patterns to the sparse mixture of experts Transformers. With these designs, the specialization of Moirai-MoE is achieved in a data-driven manner and operates at the token level.

<p align="center">
- <img src="./img/framework.png" height="200" alt="" align=center />
+ <img src="./img/framework.png" height="300" alt="" align=center />
</p>
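
To make the contrast described above more concrete, here is a minimal, hypothetical PyTorch sketch of a token-level sparse mixture-of-experts feed-forward layer sitting behind a single shared projection. All class names, dimensions, and routing details are illustrative assumptions and are not taken from the uni2ts codebase or the Moirai-MoE release.

```python
# Hypothetical sketch only: names, sizes, and routing details are assumptions,
# not code from uni2ts or Moirai-MoE.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoEFeedForward(nn.Module):
    """Route each token to a small subset of expert feed-forward networks."""

    def __init__(self, d_model: int, d_ff: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts)  # per-token router scores
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, tokens, d_model)
        weights, idx = self.gate(x).topk(self.top_k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., k] == e  # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out


class MoETransformerBlock(nn.Module):
    """Attention + sparse-MoE feed-forward. A single shared input/output
    projection would wrap a stack of these blocks, rather than using one
    projection per frequency."""

    def __init__(self, d_model: int = 384, n_heads: int = 6):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1, self.norm2 = nn.LayerNorm(d_model), nn.LayerNorm(d_model)
        self.moe = SparseMoEFeedForward(d_model, 4 * d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.norm1(x)
        x = x + self.attn(h, h, h, need_weights=False)[0]
        return x + self.moe(self.norm2(x))
```

In a layout like this, specialization emerges from how the router assigns tokens to experts during training, rather than from frequency-specific projection layers, which matches the data-driven, token-level specialization described above.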


@@ -85,13 +85,13 @@ plot_next_multi(
Extensive experiments on 39 datasets demonstrate the superiority of Moirai-MoE over existing foundation models in both in-distribution and zero-shot scenarios.

<p align="center">
- <img src="./img/in-dist.png" height="250" alt="" align=center />
+ <img src="./img/in-dist.png" height="300" alt="" align=center />
</p>

The above figure presents the in-distribution evaluation using a total of 29 datasets from the Monash benchmark. The evaluation results show that Moirai-MoE beats all competitors.

<p align="center">
- <img src="./img/zero-shot.png" height="410" alt="" align=center />
+ <img src="./img/zero-shot.png" height="450" alt="" align=center />
</p>

The above table shows a zero-shot forecasting evaluation on 10 datasets and Moirai-MoE-Base achieves the best zero-shot performance.
