
complete BigST with preprocess #206

Merged (53 commits into GestaltCogTeam:master, Dec 5, 2024)
Conversation

superarthurlx
Contributor

Taking PEMS08 as an example: in the configuration, first select PreprocesPEMS08.py. This generates a BigSTPreprocess .pt file under the checkpoints directory. Write that file's path into PEMS08.py, and you can then use the use-long parameter.
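Since the workflow above requires copying the generated checkpoint path by hand, a small helper can locate it programmatically. This is a hypothetical sketch (the function name and the assumption that the checkpoint directory name contains "BigSTPreprocess" are mine, not part of the repository):

```python
import glob
import os


def find_preprocess_checkpoint(ckpt_root: str) -> str:
    """Return the newest .pt file under ckpt_root whose path mentions
    BigSTPreprocess.

    Assumption: the preprocessing run writes its checkpoint somewhere
    under the checkpoints/ directory in a folder named after the model;
    the exact layout may differ in practice.
    """
    candidates = [
        p
        for p in glob.glob(os.path.join(ckpt_root, "**", "*.pt"), recursive=True)
        if "BigSTPreprocess" in p
    ]
    if not candidates:
        raise FileNotFoundError(f"no BigSTPreprocess .pt file under {ckpt_root}")
    # If several checkpoints exist, prefer the most recently written one.
    return max(candidates, key=os.path.getmtime)
```

The returned path is what you would paste into PEMS08.py before enabling use-long.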

@ChengqingYu

I tested this algorithm on PEMS08, with an experimental setup of 2016 historical steps predicting the next 12 steps, and obtained the following accuracy:
MAE: 12.8374, MAPE: 0.0849, RMSE: 21.4005
This is broadly consistent with the algorithm's main performance characteristics.
Note that you must first pretrain with PreprocesPEMS08.py and then run PEMS08.py, changing the "PREPROCESSED_FILE" path to the BigSTPreprocess .pt file under the checkpoints directory.

@zezhishao zezhishao merged commit 7e6412b into GestaltCogTeam:master Dec 5, 2024
2 checks passed