[Minor; noob question] Uniform distribution instead of normal #232

Open
p0p4k opened this issue Jan 21, 2024 · 0 comments


p0p4k commented Jan 21, 2024

From the paper:
[screenshot from the paper describing the token-masking scheme]

```python
if self.mask_prob > 0.:
    rand = torch.randn(inp.shape, device = x.device)
    rand[:, 0] = -torch.finfo(rand.dtype).max # first token should not be masked out
    num_mask = min(int(seq * self.mask_prob), seq - 1)
    indices = rand.topk(num_mask, dim = -1).indices
    mask = ~torch.zeros_like(inp).scatter(1, indices, 1.).bool()
    kwargs.update(self_attn_kv_mask = mask)
```

I am still trying to understand the code:

rand = torch.randn(inp.shape, device = x.device) ---> draws an i.i.d. standard-normal score for every token position
rand[:, 0] = -torch.finfo(rand.dtype).max ---> pins the first (<bos>) position to the most negative representable value, so it can never land in the topk and is never masked
num_mask = min(int(seq * self.mask_prob), seq - 1) ---> we want to mask each token with probability mask_prob == we can just randomly pick a mask_prob fraction of the positions, capped at seq - 1 so at least one token always survives
indices = rand.topk(num_mask, dim = -1).indices ---> the positions with the num_mask largest scores are the ones masked (so shouldn't these scores come from a uniform distribution, per the paper?)
mask = ~torch.zeros_like(inp).scatter(1, indices, 1.).bool() ---> scatters 1s at the chosen positions and inverts, giving a boolean mask that is True for the tokens to keep
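
To sanity-check my reading, here is a self-contained toy version of that block (inp, seq, and mask_prob here are stand-ins I made up, not the wrapper's actual inputs):

```python
import torch

torch.manual_seed(0)

batch, seq = 2, 8
mask_prob = 0.25
inp = torch.zeros(batch, seq)  # stand-in for the token tensor; only its shape matters

rand = torch.randn(inp.shape)                  # i.i.d. standard-normal score per position
rand[:, 0] = -torch.finfo(rand.dtype).max      # first token can never win topk -> never masked
num_mask = min(int(seq * mask_prob), seq - 1)  # fixed number of masked tokens per row
indices = rand.topk(num_mask, dim=-1).indices  # positions of the num_mask largest scores
mask = ~torch.zeros_like(inp).scatter(1, indices, 1.).bool()  # True = keep, False = masked

print(num_mask)  # 2
print(mask)      # exactly num_mask False entries per row; column 0 is always True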

I have 2 questions:
(1) Given that we already assert mask_prob < 1. earlier in the code, can int(seq * self.mask_prob) ever exceed seq - 1? If not, the min(..., seq - 1) cap looks redundant.
(2) Masking with a probability suggests each token is masked independently, so sometimes the model would get to see more than a (1 - mask_prob) fraction of the tokens; here the ratio is forced to be exactly the same for every sequence. And does using normal vs. uniform scores make any real difference? (See the sketch below for what I mean.)
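
To make (2) concrete, here is a sketch (toy shapes again) of the two regimes, plus a uniform variant of the topk trick; since topk only depends on the ordering of the scores, and i.i.d. samples from any continuous distribution are equally likely to be in any order, torch.rand and torch.randn should, as far as I can tell, select the same uniformly random subset:

```python
import torch

torch.manual_seed(0)
batch, seq, mask_prob = 4, 8, 0.25

# (a) independent Bernoulli masking: each token is masked with prob mask_prob,
# so the number of visible tokens varies from sequence to sequence
keep_bernoulli = torch.rand(batch, seq) >= mask_prob
keep_bernoulli[:, 0] = True                # never mask the first token
print(keep_bernoulli.sum(dim=-1))          # fluctuates around seq * (1 - mask_prob)

# (b) fixed-ratio masking with uniform scores in place of normal ones
scores = torch.rand(batch, seq)            # uniform on [0, 1)
scores[:, 0] = -1.                         # below the support -> never in the topk
num_mask = min(int(seq * mask_prob), seq - 1)
indices = scores.topk(num_mask, dim=-1).indices
keep_fixed = ~torch.zeros(batch, seq).scatter(1, indices, 1.).bool()
print(keep_fixed.sum(dim=-1))              # always exactly seq - num_mask
```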
Thanks!
