
Conversation

ankitade (Contributor) commented Jun 16, 2022

Create a flava directory under models and move flava.py there, in preparation for separating out the FLAVA text and image encoders in that directory.

Stack from ghstack (oldest at bottom):

Differential Revision: D37284910
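
The restructure is just a file move; here is a minimal sketch of the equivalent layout change, simulated in a scratch directory (the scratch root is illustrative; the repo paths come from this PR's description and coverage report):

```python
# Simulate the file move described in the PR inside a temporary directory.
# Old path: torchmultimodal/models/flava.py
# New path: torchmultimodal/models/flava/flava_model.py
import shutil
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())

old = root / "torchmultimodal" / "models" / "flava.py"
old.parent.mkdir(parents=True)
old.write_text("# flava model code\n")

new = root / "torchmultimodal" / "models" / "flava" / "flava_model.py"
new.parent.mkdir(parents=True)

# Move the module into the new flava/ package directory.
shutil.move(str(old), str(new))
print(new.relative_to(root))  # torchmultimodal/models/flava/flava_model.py
```

In the actual repo this would be done with git mv so history follows the file.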

ankitade added a commit that referenced this pull request Jun 16, 2022
ghstack-source-id: b6f4983
Pull Request resolved: #96
@facebook-github-bot added the "CLA Signed" label Jun 16, 2022. (This label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed.)
@codecov-commenter

Codecov Report

Merging #96 (915259e) into gh/ankitade/2/base (3beffd9) will decrease coverage by 0.11%.
The diff coverage is n/a.

@@                  Coverage Diff                   @@
##           gh/ankitade/2/base      #96      +/-   ##
======================================================
- Coverage               88.96%   88.85%   -0.12%     
======================================================
  Files                      33       33              
  Lines                    1722     1722              
======================================================
- Hits                     1532     1530       -2     
- Misses                    190      192       +2     
Impacted Files                                 Coverage Δ
torchmultimodal/models/flava/flava_model.py    87.02% <ø> (ø)

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 3beffd9...915259e. Read the comment docs.

ankitade added a commit that referenced this pull request Jun 18, 2022
ghstack-source-id: b6f4983
Pull Request resolved: #96
@ankitade marked this pull request as ready for review June 20, 2022 16:53
@ankitade
Contributor Author

@ankitade has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

with self.assertRaises(ValueError):
_ = FLAVASelfAttention(hidden_size=3, num_attention_heads=2)

def test_flava_transformer_without_embeddings_value_error(self):
Contributor


Just wondering, why remove this?

Contributor Author


It's an unrelated test that got checked in by mistake while separating out the transformers.
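
For context on the quoted test: multi-head self-attention splits hidden_size evenly across attention heads, so hidden_size=3 cannot be divided among 2 heads and construction should fail. A minimal sketch of this kind of validation (the class below is illustrative, not the actual FLAVASelfAttention implementation):

```python
class SelfAttentionSketch:
    """Illustrative stand-in for a self-attention module's constructor check."""

    def __init__(self, hidden_size: int, num_attention_heads: int) -> None:
        # Reject configurations where heads cannot evenly split the hidden dim.
        if hidden_size % num_attention_heads != 0:
            raise ValueError(
                f"hidden_size ({hidden_size}) must be divisible by "
                f"num_attention_heads ({num_attention_heads})"
            )
        # Each head attends over an equal slice of the hidden dimension.
        self.attention_head_size = hidden_size // num_attention_heads


SelfAttentionSketch(hidden_size=4, num_attention_heads=2)  # ok: 2 dims per head
try:
    SelfAttentionSketch(hidden_size=3, num_attention_heads=2)
except ValueError as e:
    print("raised:", e)
```

This mirrors what the quoted assertRaises test exercises: hidden_size=3 with num_attention_heads=2 leaves no integer per-head size.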

ankitade added a commit to ankitade/multimodal that referenced this pull request Jun 24, 2022
Summary:
Pull Request resolved: facebookresearch#96

Create a flava directory under models and move flava.py there, in preparation for separating out the FLAVA text and image encoders in that directory.

Test Plan: Imported from OSS

Differential Revision: D37284910

fbshipit-source-id: 4638403d7338c0678e0aab7f6b40c7b3672da0e4
@facebook-github-bot facebook-github-bot deleted the gh/ankitade/2/head branch June 27, 2022 14:15