Fine tuned w1 and w2 for pretraining (#158)
L-M-Sherlock authored Jan 6, 2025
1 parent 333651f commit 24dfb16
Showing 2 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

[project]
name = "FSRS-Optimizer"
version = "5.6.3"
version = "5.6.4"
readme = "README.md"
dependencies = [
"matplotlib>=3.7.0",
4 changes: 2 additions & 2 deletions src/fsrs_optimizer/fsrs_optimizer.py
@@ -1104,8 +1104,8 @@ def loss(stability):
else:
    rating_stability[small_rating] = rating_stability[big_rating]

-w1 = 3 / 5
-w2 = 3 / 5
+w1 = 0.41
+w2 = 0.54

if len(rating_stability) == 0:
    raise Exception("Not enough data for pretraining!")
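For context on the change: the surrounding code suggests that w1 and w2 are weights used during pretraining when initial stability estimates exist for only some of the four first ratings, so the missing ones are filled in from their neighbors. The sketch below is only a hypothetical illustration of that kind of log-linear (geometric) blending, not code from this repository; the helper interpolate_missing_stability and the sample stabilities are made up, while 0.54 is the new w2 value from this commit.

import math

# Hypothetical sketch: blend two neighboring rating stabilities on a log scale.
# With w = 0 the result equals s_low; with w = 1 it equals s_high.
def interpolate_missing_stability(s_low: float, s_high: float, w: float) -> float:
    return math.exp((1.0 - w) * math.log(s_low) + w * math.log(s_high))

# Example: initial stabilities were fitted for ratings 2 and 4 only,
# so rating 3 is filled in between them using the new w2 = 0.54.
rating_stability = {2: 1.2, 4: 9.8}
w2 = 0.54
rating_stability[3] = interpolate_missing_stability(
    rating_stability[2], rating_stability[4], w2
)
print(rating_stability)  # rating 3 lands at roughly 3.7, between 1.2 and 9.8 on a log scale

Whatever the exact formulas in pretrain are, lowering the weights from 3/5 to 0.41 and 0.54 nudges those interpolated estimates; as the commit title "Fine tuned w1 and w2 for pretraining" suggests, the new values were presumably obtained by tuning against evaluation data.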
