This mainly causes a size error like "Calculated padded input size per channel: (160 x 3). Kernel size: (8 x 8). Kernel size can't be greater than actual input size...". So why not adjust the state's dimensions, e.g. with `state = np.transpose(state, (2, 0, 1))`, or adjust the network? Either way, we need to pay attention to this. 0.0
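For context, a minimal sketch of the transpose fix suggested above (assuming the raw Atari observation arrives in HWC order, e.g. (210, 160, 3), while PyTorch's Conv2d expects channels first):

```python
import numpy as np

# Raw Atari frame from Gym arrives in HWC order: (height, width, channels)
state = np.zeros((210, 160, 3), dtype=np.uint8)  # dummy observation for illustration

# Move the channel axis to the front so Conv2d sees (C, H, W)
state = np.transpose(state, (2, 0, 1))
print(state.shape)  # (3, 210, 160)
```

Without this step, the conv layer treats the trailing 3-channel axis as the image width, which is exactly why the error reports an input size of (160 x 3) against an (8 x 8) kernel.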
In the script common/wrappers.py, state.shape is changed to (3, 160, 210) when wrap_pytorch() is called. Hope this answers your question.
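For anyone else hitting this, a minimal sketch of what such a wrapper typically looks like (the class name `ImageToPyTorch` is illustrative, assuming a `gym.ObservationWrapper`; the actual code in common/wrappers.py may differ):

```python
import gym
import numpy as np

class ImageToPyTorch(gym.ObservationWrapper):
    """Move the channel axis to the front so the observation suits Conv2d."""

    def __init__(self, env):
        super().__init__(env)
        old = self.observation_space.shape  # e.g. (210, 160, 3)
        self.observation_space = gym.spaces.Box(
            low=0, high=255,
            shape=(old[-1], old[1], old[0]),  # (3, 160, 210)
            dtype=np.uint8,
        )

    def observation(self, observation):
        # swapaxes(2, 0) turns (210, 160, 3) into (3, 160, 210),
        # matching the shape reported above
        return np.swapaxes(observation, 2, 0)

def wrap_pytorch(env):
    return ImageToPyTorch(env)
```

Note that `np.swapaxes(observation, 2, 0)` yields (3, 160, 210), i.e. width and height end up swapped relative to `np.transpose(state, (2, 0, 1))`, which would give (3, 210, 160); either works for the conv net as long as the network and replay buffer agree on one layout.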