This repository has been archived by the owner on Nov 7, 2024. It is now read-only.

Added power function to new backends #889

Open

wants to merge 16 commits into master

Conversation

LuiGiovanni
Contributor

This PR implements the power function for the symmetric and PyTorch backends, with their respective tests.

The changes were implemented in the files:

tensornetwork/backends/pytorch/*
tensornetwork/backends/symmetric/*

This addresses issue #840.
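
For context, a minimal sketch of the kind of method such a backend addition involves (class and module names here are illustrative; the PR's actual code may differ):

# Illustrative sketch only, assuming the backend simply defers to torch's
# elementwise exponentiation; the PR's PyTorch backend code may differ.
import torch


class PyTorchBackend:

  def power(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    # Elementwise exponentiation; torch broadcasts b against a.
    return a**b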

google-cla bot added the cla: yes label Dec 18, 2020
@LuiGiovanni
Contributor Author

@mganahl @alewis Hello! Here is the new PR where you may find the new changes and the tensordot issue. I await your feedback, and thank you for the opportunity.

@LuiGiovanni
Contributor Author

Hello @alewis @mganahl! Running the pylint check seems to cause an issue which I cannot resolve: it states that python does not exist in module tensorflow. I assume the issue is in __init__.py, though updating it with a small change such as adding a . does not seem to solve it. Also, my pylint returns a 10/10 both locally and in GitHub's CI; the difference is that I do not get this issue locally.

In my search I found that this is not an isolated issue; I found some people raising it in other places as well. Here are some examples:

I will continue my search for a solution. My main problem is that we can't resolve the tensordot issue without resolving this first, so I would love some help or feedback. Thank you so much in advance for all the help and support you have given me!


def power(self, a: Tensor, b: Tensor) -> Tensor:
  """
  Returns the power of tensor a to the value of b.

Contributor

The wording of this docstring seems somewhat imprecise. Can you fix this?

Contributor Author

"""
Returns the exponentiation of tensor a raised to b.
In the case b is a tensor, then the power is by element
with a as the base and b as the exponent.

Args:
  a: The tensor that contains the base.
  b: The tensor that contains a single scalar.
"""

This is the change I made to it; once we resolve the other request I will push the changes.


def power(self, a: Tensor, b: Tensor) -> Tensor:
  """
  Returns the power of tensor a to the value of b.

Contributor

The __pow__ method of BlockSparseTensor does not currently support broadcasting of any type, hence b has to be a scalar. Please modify the docstring accordingly.
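
A revised docstring reflecting that restriction might read as follows (a sketch only, not the wording that ultimately landed in the PR; Tensor is assumed to be the Any alias used in the backend modules):

# Sketch of a scalar-only docstring; names and typing are illustrative.
from typing import Any

Tensor = Any  # the backend modules alias Tensor to Any


def power(self, a: Tensor, b: Tensor) -> Tensor:
  """
  Returns tensor a raised elementwise to the power b.
  BlockSparseTensor.__pow__ does not support broadcasting,
  so b must be a scalar (e.g. a Python float or a numpy scalar).

  Args:
    a: The base BlockSparseTensor.
    b: A scalar exponent.
  """
  return a**b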

Contributor Author

@LuiGiovanni Dec 29, 2020

I'm a bit confused by this request, since this test works fine for me with no failures. Could you elaborate on what the problem is, please?

Contributor

The reason the test passes is due to a bug in the test (I had actually missed that in the earlier review, see below).
In numpy, a**b below does different things depending on what a and b look like. Generally, numpy applies broadcasting rules (https://numpy.org/doc/stable/user/basics.broadcasting.html) when computing this expression, and this approach is widely used by other libraries as well (e.g. tensorflow or pytorch). Supporting broadcasting for BlockSparseTensor is however more complicated, because this class has some non-trivial internal structure that needs to be respected by such operations. Currently we just don't support it. The only case supported by BlockSparseTensor is when b is a scalar-type object.
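
For illustration, a small numpy example of the two behaviors described above (a sketch for clarity, not code from the PR):

import numpy as np

a = np.arange(6.0).reshape(2, 3)

# Scalar exponent: the only case BlockSparseTensor.__pow__ supports.
print(a ** 2.0)                  # elementwise square

# Tensor exponent: numpy broadcasts b against a's shape.
b = np.array([1.0, 2.0, 3.0])    # shape (3,) against a's shape (2, 3)
print(a ** b)                    # column j of a is raised to b[j]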

Contributor

Please add some test ensuring that a and b are BlockSparseTensors.

R = 4
backend = symmetric_backend.SymmetricBackend()
a = get_tensor(R, num_charges, dtype)
b = BlockSparseTensor.random(a.sparse_shape)

Contributor

Please modify the test so that b is a numpy scalar.
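
A sketch of the requested adjustment (get_tensor, num_charges, and dtype come from the surrounding test fixtures; comparing via .data assumes the flat-data comparison pattern used elsewhere in these tests):

# Sketch: use a numpy scalar as the exponent instead of a BlockSparseTensor.
import numpy as np

R = 4
backend = symmetric_backend.SymmetricBackend()
a = get_tensor(R, num_charges, dtype)
b = np.float64(2.0)
actual = backend.power(a, b)
expected = a**b  # BlockSparseTensor.__pow__ with a scalar exponent
np.testing.assert_allclose(actual.data, expected.data)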

Contributor

backend.power should only take BlockSparseTensor objects; please fix.
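
One way such a check could look (a sketch under assumptions; the import path and error handling in the actual PR may differ):

# Sketch: restrict power to BlockSparseTensor bases and scalar exponents.
import numpy as np
from tensornetwork.block_sparse.blocksparsetensor import BlockSparseTensor


def power(self, a: BlockSparseTensor, b) -> BlockSparseTensor:
  if not isinstance(a, BlockSparseTensor):
    raise TypeError(f"power expects a BlockSparseTensor, got {type(a)}")
  if not np.isscalar(b):
    raise TypeError("the exponent b must be a scalar")
  return a**b  # BlockSparseTensor.__pow__ handles the scalar case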

Contributor

@mganahl left a comment

Thanks again for the PR! I left some comments to be fixed. Thanks!

