Hello, if we are using a random-mask approach, how should we handle generation of 1D variable-length sequences? Since we cannot rely on an EOS (end-of-sequence) token, as GPT does, to determine when generation ends, how can we manage this?
Modified the MAR class to support variable-length sequence generation by introducing a max_seq_len parameter and adjusting the masking and sampling process accordingly.
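One common way to realize this idea is a sketch like the following (this is an illustration of the general technique, not the repository's actual code): pad every training sequence to `max_seq_len` with a dedicated pad token so the model learns to predict pad positions after the true end of the sequence, then truncate generated output at the first pad token. The names `PAD`, `MASK`, `dummy_predict`, and the scheduling constants here are all hypothetical stand-ins for the real model.

```python
import random

PAD = 0            # hypothetical pad token id, learned like any vocab token
MASK = -1          # hypothetical placeholder for not-yet-generated positions
MAX_SEQ_LEN = 16   # fixed canvas size; actual sequences may be shorter
VOCAB = list(range(1, 10))

def dummy_predict(tokens, positions):
    """Stand-in for the trained model: fills each masked position with a
    random token. A real model would sample from predicted logits and,
    having seen padded training data, emit PAD past the sequence end."""
    return [random.choice(VOCAB + [PAD]) for _ in positions]

def generate(steps=4, seed=0):
    random.seed(seed)
    tokens = [MASK] * MAX_SEQ_LEN
    masked = list(range(MAX_SEQ_LEN))
    # Random-order iterative decoding: unmask a random subset each step.
    for step in range(steps):
        k = max(1, len(masked) // (steps - step))
        random.shuffle(masked)
        chosen, masked = masked[:k], masked[k:]
        for pos, tok in zip(chosen, dummy_predict(tokens, chosen)):
            tokens[pos] = tok
    # Variable length: truncate at the first PAD token, if any appeared.
    length = tokens.index(PAD) if PAD in tokens else MAX_SEQ_LEN
    return tokens[:length]
```

The design choice here is that the pad token plays the role EOS plays in left-to-right models, but it works under random-order decoding because length is recovered by truncation after the full canvas is filled, rather than by stopping generation early.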
Disclaimer: the concept of this solution was generated by AI; verify the correctness of the generated code before copying it. The solution may be incomplete, so treat it as inspiration only.