Sorry for the very late answer... I won't make excuses for this mistake.
Let me explain the questions first.
total_formula_path: This is a dataset that is the union of train_dataset and test_dataset. It is required for building a tokenizer with the full vocabulary, because the code in this repository does not save the tokenizer object.
beam_search_k: This is the depth of the beam search algorithm. It is required at the inference stage.
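Based on the two keys described above, a minimal config might look like the sketch below. Note that the file paths and any other key names here are assumptions for illustration; check the repository's config loader for the exact set of required keys.

```json
{
  "total_formula_path": "data/total_formulas.txt",
  "beam_search_k": 5
}
```

Here `data/total_formulas.txt` is a hypothetical path to the combined train + test dataset used to build the tokenizer vocabulary, and `beam_search_k` controls the beam search at inference time.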
I tried to train your model using configs/draft.json, but it is missing many required keys, such as
Can you provide a correct JSON file?