# HyperMAE: Modulating Implicit Neural Representations for MAE Training [pdf]

NeurIPS 2023, 4th Workshop on Self-Supervised Learning: Theory and Practice

This codebase is modified from MAE.

## Installing packages

- Please follow the installation instructions from MAE; a minimal environment sketch is given below.
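
The following is a minimal sketch, assuming HyperMAE uses the same dependencies as the original MAE repo (PyTorch and timm); the exact versions are not pinned here, so defer to the MAE README.

```sh
# Assumed setup: MAE-style environment (PyTorch + timm).
# Versions below are illustrative, not pinned by this repo.
conda create -n hypermae python=3.8 -y
conda activate hypermae
pip install torch torchvision   # install a CUDA build appropriate for your system
pip install timm                # MAE historically pins an older timm; check the MAE README
```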

## Datasets

## Run

- Use the bash files in the `scripts` folder to pretrain and finetune models (see the example invocation below).
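
An illustrative invocation; the script names and flags below are hypothetical, so check the `scripts` folder for the actual file names.

```sh
# Hypothetical script names -- look inside scripts/ for the real ones.
bash scripts/pretrain.sh    # masked-image-modeling pretraining
bash scripts/finetune.sh    # end-to-end finetuning of the pretrained encoder
```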