Actions: lucidrains/x-transformers

Showing runs from all workflows
355 workflow runs

expose AdaptiveRMSNorm for another project
Python package #37: Commit 6fb0520 pushed by lucidrains
July 12, 2024 15:31 56s main
1.31.10
Upload Python Package #390: Release 1.31.10 published by lucidrains
July 6, 2024 14:40 38s
go with Gemma2 softclamp values
Python package #36: Commit 80be134 pushed by lucidrains
July 6, 2024 14:40 48s main
1.31.9
Upload Python Package #389: Release 1.31.9 published by lucidrains
July 6, 2024 13:56 38s
handle key padding mask directly passed into Attend
Python package #35: Commit 832464f pushed by lucidrains
July 6, 2024 13:56 48s main
1.31.8
Upload Python Package #388: Release 1.31.8 published by lucidrains
July 2, 2024 16:36 42s
cleanup unit offset again
Python package #34: Commit 65e9d36 pushed by lucidrains
July 2, 2024 16:36 53s main
Upload Python Package
Upload Python Package #387: published by lucidrains
July 2, 2024 16:31 46s
cleanup unit offset again
Python package #33: Commit cbc4982 pushed by lucidrains
July 2, 2024 16:31 57s main
1.31.6
Upload Python Package #386: Release 1.31.6 published by lucidrains
June 30, 2024 03:04 46s
set a default value for the transformer output softclamp value, and a…
Python package #32: Commit 59cee27 pushed by lucidrains
June 30, 2024 03:04 49s main
1.31.5
Upload Python Package #385: Release 1.31.5 published by lucidrains
June 30, 2024 02:35 44s
softclamping of attention logits needs to happen before masking
Python package #31: Commit 9078611 pushed by lucidrains
June 30, 2024 02:35 54s main
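
The two runs above concern soft-clamping of attention logits (Gemma2-style tanh soft-cap). A minimal sketch of the ordering point in the commit message, i.e. clamp the logits first, then apply the mask, so masked positions keep their large negative fill value instead of being clamped back into range. The `softclamp`/`attend` names and the default value of 50. are illustrative assumptions, not x-transformers' actual code:

```python
# Minimal sketch (assumed, not x-transformers' actual code): tanh soft-clamping
# of attention logits, applied *before* the key padding mask.
import torch

def softclamp(t, value):
    # smoothly squash logits into (-value, value)
    return (t / value).tanh() * value

def attend(q, k, v, mask=None, logit_softclamp_value=50.):
    scale = q.shape[-1] ** -0.5
    sim = torch.einsum('b h i d, b h j d -> b h i j', q, k) * scale

    sim = softclamp(sim, logit_softclamp_value)      # clamp first ...

    if mask is not None:
        # ... then mask, so masked positions stay effectively -inf
        sim = sim.masked_fill(~mask, -torch.finfo(sim.dtype).max)

    attn = sim.softmax(dim=-1)
    return torch.einsum('b h i j, b h j d -> b h i d', attn, v)
```
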
1.31.4
Upload Python Package #384: Release 1.31.4 published by lucidrains
June 28, 2024 12:19 51s
cleanup gamma unit offset in norms
Python package #30: Commit af345a3 pushed by lucidrains
June 28, 2024 12:19 53s main
1.31.3
Upload Python Package #383: Release 1.31.3 published by lucidrains
June 27, 2024 17:58 44s
this should always be on to save beginners from weight decay issues
Python package #29: Commit 2e74ed0 pushed by lucidrains
June 27, 2024 17:58 55s main
1.31.2
Upload Python Package #382: Release 1.31.2 published by lucidrains
June 27, 2024 17:51 44s
this should always be on to save beginners from weight decay issues
Python package #28: Commit 4f0fc67 pushed by lucidrains
June 27, 2024 17:51 47s main
1.31.1
Upload Python Package #381: Release 1.31.1 published by lucidrains
June 27, 2024 15:52 47s
account for unit offset in layerscale too, also fix init of gamma
Python package #27: Commit 044a62f pushed by lucidrains
June 27, 2024 15:52 49s main
add the cool trick that makes norm gammas weight decayable, from @Oha…
Python package #26: Commit cc3b663 pushed by lucidrains
June 27, 2024 15:33 53s main
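
The "unit offset" commits above appear to refer to the trick that makes norm gammas weight-decayable: store the norm's scale as an offset from 1, initialized at zero, so generic weight decay pulls the effective scale toward 1 rather than toward 0. A minimal sketch under that assumption; illustrative only, not x-transformers' actual implementation:

```python
# Minimal sketch (assumed, not x-transformers' actual code): RMSNorm whose
# learned scale is stored with a "unit offset", so weight decay on `gamma`
# decays the effective scale toward 1 instead of 0.
import torch
from torch import nn

class RMSNorm(nn.Module):
    def __init__(self, dim, eps=1e-8):
        super().__init__()
        self.eps = eps
        self.gamma = nn.Parameter(torch.zeros(dim))  # init at 0 -> effective scale starts at 1

    def forward(self, x):
        normed = x * torch.rsqrt(x.pow(2).mean(dim=-1, keepdim=True) + self.eps)
        return normed * (self.gamma + 1.)            # unit offset applied at use time
```
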
1.31.0
Upload Python Package #380: Release 1.31.0 published by lucidrains
June 27, 2024 15:31 51s
add the cool trick that makes norm gammas weight decayable, from @Oha…
Python package #25: Commit f933207 pushed by lucidrains
June 27, 2024 15:31 1m 2s main
allow for layers_execute_order to be overridden on forward
Python package #24: Commit 7e73791 pushed by lucidrains
June 23, 2024 03:20 49s main