feat: added bernoulli_ #23680

Closed · wants to merge 19 commits
ivy/functional/frontends/torch/tensor.py (4 additions, 0 deletions)

@@ -1167,6 +1167,10 @@ def dot(self, tensor):
    def bernoulli(self, *, generator=None, out=None):
        return torch_frontend.bernoulli(self._ivy_array, generator=generator, out=out)

    @with_supported_dtypes({"2.0.1 and below": ("float32", "float64")}, "torch")
    def bernoulli_(self, *, generator=None):
        self.ivy_array = self.bernoulli(generator=generator)._ivy_array

    # Special Methods #
    # -------------------#

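For context, bernoulli_ is the in-place counterpart of bernoulli: each element of the tensor is read as a probability p and overwritten with a draw from Bernoulli(p). A minimal sketch of the behavior the frontend mirrors, using PyTorch directly (illustration only, not part of this diff):

import torch

# Each element of p is interpreted as the probability of drawing 1,
# and is overwritten in place with a 0./1. sample.
p = torch.tensor([0.0, 0.25, 0.5, 1.0])
p.bernoulli_()
print(p)  # e.g. tensor([0., 0., 1., 1.]); the 0.0 and 1.0 entries are deterministic

In the frontend method above, the sampling itself is delegated to torch_frontend.bernoulli and the result is written back through self.ivy_array, which is what makes the operation in-place from the caller's point of view.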
ivy_tests/test_ivy/test_frontends/test_torch/test_tensor.py (34 additions, 0 deletions)

@@ -4235,6 +4235,40 @@ def test_torch_tensor_bernoulli(
)


# bernoulli_
@handle_frontend_method(
    class_tree=CLASS_TREE,
    init_tree="torch.tensor",
    method_name="bernoulli_",
    dtype_and_x=helpers.dtype_and_values(
        available_dtypes=helpers.get_dtypes("valid"),
    ),
    test_with_out=st.just(True),
[Review thread on the test_with_out=st.just(True) line]
umairjavaid (Contributor), Sep 16, 2023: Remove this line.
Contributor: Let me know if that works.
jaskiratsingh2000 (Contributor, Author): @umairjavaid you mean removing the "test_with_out=st.just(True)" line, right?
umairjavaid (Contributor): @jaskiratsingh2000 Yup, we'll have to remove this.

)
def test_torch_tensor_bernoulli_(
    dtype_and_x,
    frontend,
    frontend_method_data,
    init_flags,
    method_flags,
    backend_fw,
):
    input_dtype, x = dtype_and_x
    helpers.test_frontend_method(
        init_input_dtypes=input_dtype,
        backend_to_test=backend_fw,
        init_all_as_kwargs_np={
            "input": x[0],
        },
        method_input_dtypes=input_dtype,
        method_all_as_kwargs_np={"generator": x[1], "out": x[2]},
[Review comment on the method_all_as_kwargs_np line]
Contributor: The "out" kwarg should also be removed here. Thanks.

        frontend_method_data=frontend_method_data,
        init_flags=init_flags,
        method_flags=method_flags,
        frontend=frontend,
    )
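Putting the two review comments together, the requested change is to drop test_with_out from the decorator and drop the "out" entry from the method kwargs. One way the test could look after those removals (an illustrative sketch only, not the merged code; it reuses the imports and helpers already present in test_tensor.py, and passing generator=None is an assumption, since helpers.dtype_and_values as written draws only a single array, so x[1] and x[2] would not exist):

# bernoulli_ (sketch with the reviewers' requested removals applied)
@handle_frontend_method(
    class_tree=CLASS_TREE,
    init_tree="torch.tensor",
    method_name="bernoulli_",
    dtype_and_x=helpers.dtype_and_values(
        available_dtypes=helpers.get_dtypes("valid"),
    ),
)
def test_torch_tensor_bernoulli_(
    dtype_and_x,
    frontend,
    frontend_method_data,
    init_flags,
    method_flags,
    backend_fw,
):
    input_dtype, x = dtype_and_x
    helpers.test_frontend_method(
        init_input_dtypes=input_dtype,
        backend_to_test=backend_fw,
        init_all_as_kwargs_np={
            "input": x[0],
        },
        method_input_dtypes=input_dtype,
        # "out" dropped per review; only x[0] is drawn, so generator is left as None
        method_all_as_kwargs_np={"generator": None},
        frontend_method_data=frontend_method_data,
        init_flags=init_flags,
        method_flags=method_flags,
        frontend=frontend,
    )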


# bitwise_and
@handle_frontend_method(
    class_tree=CLASS_TREE,