It is `finetune.sh`; I want to change `--horizon-len` to finetune on the dataset.
```sh
export TF_CPP_MIN_LOG_LEVEL=2 XLA_PYTHON_CLIENT_PREALLOCATE=false
python3 finetune.py \
  --model-name="google/timesfm-1.0-200m" \
  --backend="cpu" \
  --horizon-len=512 \
  --context-len=1440 \
  --freq="15min" \
  --data-path="../datasets/ETT-small/ETTm2.csv" \
  --num-epochs=100 \
  --learning-rate=1e-3 \
  --adam-epsilon=1e-7 \
  --adam-clip-threshold=1e2 \
  --early-stop-patience=10 \
  --datetime-col="date" \
  --use-lora \
  --lora-rank=2 \
  --lora-target-modules="all" \
  --use-dora \
  --cos-initial-decay-value=1e-4 \
  --cos-decay-steps=40000 \
  --cos-final-decay-value=1e-5 \
  --ema-decay=0.9999
```
And I got this error:

```
TypeError: sub got incompatible shapes for broadcasting: (112, 128), (112, 512)
```

How do I fix it?
Is `--horizon-len` fixed when finetuning?
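The shapes in the traceback suggest the loss subtracts a forecast of length 128 from labels of length 512. A plausible cause (an assumption, not confirmed by the source): the checkpoint's output patch length is 128, so one forward pass emits 128 horizon steps, while `--horizon-len=512` makes the label windows 512 steps long. A minimal NumPy sketch of the mismatch and one hypothetical workaround, truncating labels to the model's output length:

```python
import numpy as np

# Hypothetical shapes from the traceback: the model emits one output
# patch of 128 steps per window, while the labels were built with
# --horizon-len=512.
batch = 112
forecast = np.zeros((batch, 128))  # model output (output patch length 128)
targets = np.zeros((batch, 512))   # labels built with horizon_len = 512

# Subtraction follows the same broadcasting rule that jnp's `sub` uses:
# trailing dims 128 vs 512 are incompatible, so this raises an error.
try:
    _ = targets - forecast
except ValueError as e:
    print("broadcast error:", e)

# Workaround sketch: score only the first output patch, i.e. slice the
# labels to the model's output length (or pass --horizon-len=128).
loss = np.mean((targets[:, : forecast.shape[1]] - forecast) ** 2)
print("loss over first 128 steps:", loss)
```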