Add with torch.no_grad() to DistilBERT integration test forward pass (huggingface#14979)

* refactor: wrap forward pass around no_grad context

* Update tests/test_modeling_distilbert.py

* fix: rm `no_grad` from non-integration tests

* chore: rm whitespace change
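For context, here is the pattern in isolation: a minimal, standalone sketch (reusing the checkpoint and inputs from the diff below) of running an inference-only forward pass under torch.no_grad(). Inside the context manager, autograd records no operations, so no computation graph is built and the output tensor carries no grad_fn, which is all an assertion-only integration test needs.

import torch
from transformers import DistilBertModel

# Load the same checkpoint the integration test exercises.
model = DistilBertModel.from_pretrained("distilbert-base-uncased")
model.eval()

input_ids = torch.tensor([[0, 345, 232, 328, 740, 140, 1695, 69, 6078, 1588, 2]])
attention_mask = torch.tensor([[0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]])

# Inside no_grad, autograd tracking is disabled, which saves the memory
# that would otherwise be spent storing activations for backpropagation.
with torch.no_grad():
    output = model(input_ids, attention_mask=attention_mask)[0]

print(output.shape)          # torch.Size([1, 11, 768])
print(output.requires_grad)  # False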
jaketae authored Jan 12, 2022
1 parent 021f2ea commit 97f3bee
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion tests/test_modeling_distilbert.py
@@ -284,7 +284,8 @@ def test_inference_no_head_absolute_embedding(self):
         model = DistilBertModel.from_pretrained("distilbert-base-uncased")
         input_ids = torch.tensor([[0, 345, 232, 328, 740, 140, 1695, 69, 6078, 1588, 2]])
         attention_mask = torch.tensor([[0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]])
-        output = model(input_ids, attention_mask=attention_mask)[0]
+        with torch.no_grad():
+            output = model(input_ids, attention_mask=attention_mask)[0]
         expected_shape = torch.Size((1, 11, 768))
         self.assertEqual(output.shape, expected_shape)
         expected_slice = torch.tensor(
