Does using ColossalAI's native fp16 make clip_grad_norm ineffective? #2252
yhcc asked this question in Community | Q&A
As the title says: judging from the code below, it seems that the native fp16 mode cannot clip gradients? ColossalAI/colossalai/amp/naive_amp/naive_amp.py, Lines 42 to 43 in 8897b8f
Answered by 1SAA on Jan 3, 2023
@yhcc
Calling NaiveAMPOptimizer's clip_grad_norm function directly is not the correct usage. If you want gradient clipping, you need to request it explicitly in amp_config, as in the sketch below. After that, your training code does not need to call clip_grad_norm at all; FP16Optimizer applies the clipping itself inside its step() function.
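The code snippet referenced in the reply is not reproduced above, so here is a minimal sketch of the approach it describes, assuming the 2022-era legacy API (colossalai.launch_from_torch and convert_to_naive_amp from colossalai.amp.naive_amp) and assuming clip_grad_norm is the amp_config key forwarded to FP16Optimizer; the model, data, and threshold are placeholders for illustration only.

```python
# Sketch: enabling gradient clipping for ColossalAI's naive fp16 via amp_config,
# instead of calling optimizer.clip_grad_norm() manually.
# Assumes the script is started with torchrun so launch_from_torch can read the
# distributed environment variables.
import colossalai
import torch
import torch.nn as nn
from colossalai.amp.naive_amp import convert_to_naive_amp

colossalai.launch_from_torch(config={})

model = nn.Linear(16, 4).cuda()                       # placeholder model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# The clipping threshold lives in amp_config; the wrapped FP16Optimizer stores
# it and applies the clipping itself inside step(), after unscaling gradients.
model, optimizer = convert_to_naive_amp(
    model,
    optimizer,
    amp_config=dict(clip_grad_norm=1.0),              # assumed key name
)

for _ in range(10):
    x = torch.randn(8, 16).cuda()
    y = torch.randn(8, 4).cuda()
    loss = nn.functional.mse_loss(model(x), y)

    optimizer.zero_grad()
    optimizer.backward(loss)   # the AMP optimizer handles loss scaling
    optimizer.step()           # clipping happens in here; no clip_grad_norm() call needed
```

Under this setup, the clip_grad_norm method on NaiveAMPOptimizer that the question points to (lines 42 to 43 of naive_amp.py) is never invoked from user code; per the reply, the clipping is handled entirely inside FP16Optimizer.step().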