Extension of channels last converter to models with branches #114

Open
wants to merge 11 commits into base: main
Conversation

nghielme

I extended the channels last converter so that it also works with models with branches.
I tested it with a dummy model composed of a single fork node and with a complete UNet model containing multiple fork nodes.
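
To illustrate the kind of test topology described above, here is a minimal sketch (not the PR's actual test code) of an ONNX model with a single fork node, i.e. one tensor consumed by two branches that merge again; all tensor and node names are illustrative:

import onnx
from onnx import helper, TensorProto

inp = helper.make_tensor_value_info("inp", TensorProto.FLOAT, [1, 3, 8, 8])
out = helper.make_tensor_value_info("out", TensorProto.FLOAT, [1, 3, 8, 8])

relu = helper.make_node("Relu", ["inp"], ["forked"])      # fork node: "forked" has two consumers
branch_a = helper.make_node("Sigmoid", ["forked"], ["a"])
branch_b = helper.make_node("Neg", ["forked"], ["b"])
join = helper.make_node("Add", ["a", "b"], ["out"])       # join node closes the branches

graph = helper.make_graph([relu, branch_a, branch_b, join], "fork_test", [inp], [out])
model = helper.make_model(graph)
onnx.checker.check_model(model)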

…e problem now seems to be to move the transpose upwards before the fork and properly reconnect the network. I am also not sure what I should do concerning the model tensor shape, since I see that for the base case it is modified.
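
As a rough sketch of the reconnection step mentioned above: hoisting a Transpose above a fork means every existing consumer of the forked tensor must be rewired to the transposed tensor. This is only an illustration using plain onnx graph editing, with hypothetical helper names; removal of the original downstream Transpose nodes and the related shape updates are omitted.

from onnx import helper

def find_consumers(graph, tensor_name):
    # All nodes reading tensor_name; more than one consumer means a fork.
    return [n for n in graph.node if tensor_name in n.input]

def hoist_transpose_above_fork(graph, fork_output, perm):
    # Insert a single Transpose right after the producer of fork_output and
    # rewire every existing consumer to read the transposed tensor instead.
    consumers = find_consumers(graph, fork_output)
    transposed = fork_output + "_transposed"
    t_node = helper.make_node("Transpose", [fork_output], [transposed], perm=list(perm))
    prod_idx = next(i for i, n in enumerate(graph.node) if fork_output in n.output)
    graph.node.insert(prod_idx + 1, t_node)   # keeps the node list topologically sorted
    for consumer in consumers:
        for idx, name in enumerate(consumer.input):
            if name == fork_output:
                consumer.input[idx] = transposed
    return graph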
@nghielme nghielme requested review from maltanar and jmitrevs April 19, 2024 09:29
_channelsLast_node_types = list(channels_last.custom_op.keys())

# Nodes, which do not modify the shape of the tensor
# And modify all values in the same way.
-_move_through_nodes = ["Quant", "Relu"]
+_move_through_nodes = ["Quant", "Relu", "LeakyRelu", "Resize"]

@nghielme nghielme Apr 23, 2024

Not sure that Resize should be in the _move_through_nodes list, since it actually modifies the shape of the tensor.
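
A small numpy check of this concern: element-wise ops such as Relu commute with a Transpose, but Resize scales are tied to the axis order, so moving a Transpose through it would also require permuting the scales (and the output shape changes):

import numpy as np

x = np.random.rand(1, 3, 4, 5).astype(np.float32)
perm = (0, 2, 3, 1)   # NCHW -> NHWC

# Element-wise ops commute with Transpose, so they are safe to move through.
assert np.array_equal(np.maximum(x, 0).transpose(perm), np.maximum(x.transpose(perm), 0))

# Resize scales refer to specific axes; in NHWC they would need reordering too.
scales_nchw = np.array([1.0, 1.0, 2.0, 2.0])   # upsample H and W in NCHW layout
scales_nhwc = scales_nchw[list(perm)]          # becomes [1., 2., 2., 1.] for NHWC
print(scales_nhwc)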

@heborras heborras self-requested a review May 29, 2024 13:41
nghielme added 3 commits May 29, 2024 16:41
…e in which the transposes pass the special nodes. I added a cleanup transformation for the domain field; it is not very elegant, but I think it is strictly necessary.
a removal of any input and output transposes
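
As a hedged sketch of what removing an input-side Transpose involves (the output side is symmetric), again using plain onnx graph editing with illustrative names; note that the graph input's value_info shape would also have to be updated to the new layout, which relates to the open question above about the model tensor shape:

def remove_input_transposes(graph):
    graph_inputs = {vi.name for vi in graph.input}
    for node in list(graph.node):
        if node.op_type == "Transpose" and node.input[0] in graph_inputs:
            # Rewire every consumer of the Transpose output to read the graph
            # input directly, then drop the node. Changing the input value_info
            # to the transposed layout is a separate step, omitted here.
            for consumer in graph.node:
                for idx, name in enumerate(consumer.input):
                    if name == node.output[0]:
                        consumer.input[idx] = node.input[0]
            graph.node.remove(node)
    return graph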