
Commit

Add missing colons
wesselb committed Aug 23, 2024
1 parent d1ca223 commit da93aff
Showing 1 changed file with 2 additions and 2 deletions.
aurora/model/aurora.py: 2 additions & 2 deletions
@@ -63,14 +63,14 @@ def __init__(
             window_size (tuple[int, int, int], optional): Vertical height, height, and width of the
                 window of the underlying Swin transformer.
             encoder_depths (tuple[int, ...], optional): Number of blocks in each encoder layer.
-            encoder_num_heads (tuple[int, ...], optional) Number of attention heads in each encoder
+            encoder_num_heads (tuple[int, ...], optional): Number of attention heads in each encoder
                 layer. The dimensionality doubles after every layer. To keep the dimensionality of
                 every head constant, you want to double the number of heads after every layer. The
                 dimensionality of attention head of the first layer is determined by `embed_dim`
                 divided by the value here. For all cases except one, this is equal to `64`.
             decoder_depths (tuple[int, ...], optional): Number of blocks in each decoder layer.
                 Generally, you want this to be the reversal of `encoder_depths`.
-            decoder_num_heads (tuple[int, ...], optional) Number of attention heads in each decoder
+            decoder_num_heads (tuple[int, ...], optional): Number of attention heads in each decoder
                 layer. Generally, you want this to be the reversal of `encoder_num_heads`.
             latent_levels (int, optional): Number of latent pressure levels.
             patch_size (int, optional): Patch size.
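The arithmetic the docstring describes can be checked directly: the per-layer embedding dimensionality doubles, so doubling the head count keeps the per-head dimensionality constant at `embed_dim` divided by the first entry of `encoder_num_heads`. The sketch below is illustrative only; the values are assumptions for the example, not necessarily the defaults shipped with the Aurora model.

    # Minimal sketch of the head-dimension relationship from the docstring.
    # embed_dim and the head counts below are illustrative assumptions.
    embed_dim = 512
    encoder_num_heads = (8, 16, 32)   # doubles after every layer
    decoder_num_heads = (32, 16, 8)   # reversal of encoder_num_heads

    for layer, num_heads in enumerate(encoder_num_heads):
        layer_dim = embed_dim * 2**layer   # dimensionality doubles per layer
        head_dim = layer_dim // num_heads  # constant when heads double too
        print(f"layer {layer}: dim={layer_dim}, heads={num_heads}, head_dim={head_dim}")

    # layer 0: dim=512,  heads=8,  head_dim=64
    # layer 1: dim=1024, heads=16, head_dim=64
    # layer 2: dim=2048, heads=32, head_dim=64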

