I noticed that CLIP is already present in the Hailo Model Zoo, which suggests that conversion is possible. link
I need help converting a model I trained myself. How can I parse CLIP to HAR?
After converting a ResNet-based CLIP model to ONNX, I encountered the following error when parsing torch.nn.functional.multi_head_attention_forward from ONNX to HAR.
/local/workspace/hailo_virtualenv/bin/python /local/shared_with_docker/pycharm_codes/hailo_model_zoo/tutorials/clip-reid_parsing.py
Model has been exported to attention_pool2d.onnx
[info] Translation started on ONNX model mha_test
[info] Restored ONNX model mha_test (completion time: 00:00:00.13)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.14)
[info] Attempting to retry parsing on a simplified model, using onnx simplifier
[info] Simplified ONNX model for a retry attempt (completion time: 00:00:01.65)
Traceback (most recent call last):
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 176, in translate_onnx_model
parsing_results = self._parse_onnx_model_to_hn(onnx_model, valid_net_name, start_node_names,
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 231, in _parse_onnx_model_to_hn
return self.parse_model_to_hn(onnx_model, None, net_name, start_node_names, end_node_names,
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 257, in parse_model_to_hn
fuser = HailoNNFuser(converter.convert_model(), net_name, converter.end_node_names)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/model_translator/translator.py", line 63, in convert_model
self._create_layers()
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py", line 26, in _create_layers
self._add_direct_layers()
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py", line 131, in _add_direct_layers
raise ParsingWithRecommendationException(
hailo_sdk_client.model_translator.exceptions.ParsingWithRecommendationException: Parsing failed. The errors found in the graph are:
UnsupportedShuffleLayerError in op /Reshape_2: Unable to create shuffle layer at /Reshape_2
UnsupportedShuffleLayerError in op /Reshape_3: Unable to create shuffle layer at /Reshape_3
UnsupportedShuffleLayerError in op /Reshape_1: Unable to create shuffle layer at /Reshape_1
UnsupportedShuffleLayerError in op /Reshape_5: Failed to determine type of layer to create in node /Reshape_5
UnsupportedShuffleLayerError in op /Reshape_6: Failed to determine type of layer to create in node /Reshape_6
UnsupportedShuffleLayerError in op /Reshape_4: Failed to determine type of layer to create in node /Reshape_4
UnsupportedShuffleLayerError in op /Transpose_5: Failed to determine type of layer to create in node /Transpose_5
UnsupportedShuffleLayerError in op /Reshape_7: Failed to determine type of layer to create in node /Reshape_7
Please try to parse the model again, using these end node names: /Add_3, /Add_2, /Constant_19, /Constant_22, /Constant_20, /Constant_15, /Add_1
Reproduction Code
The code to reproduce the error is as follows. AttentionPool2d is taken from the OpenAI CLIP code: link
Hi @jayong-sv,
In the version you are working with, the CLIP model is not supported. In the current release (DFC 3.28.0), only the clip_resnet image encoder is supported. More CLIP variants, including the text encoder, will be supported in future releases.
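In the meantime, the parser error itself suggests a workaround: cut the graph before the unsupported attention reshapes by passing the recommended end nodes. A rough sketch of that retry, assuming the DFC `ClientRunner` API and using the node names reported in the error message above, would be:

```python
from hailo_sdk_client import ClientRunner

# Parse the ONNX model again, stopping at the end nodes the parser
# recommended so the unsupported Reshape/Transpose ops are excluded.
# The attention pooling cut off here would then have to run on the host.
runner = ClientRunner(hw_arch="hailo8")
hn, npz = runner.translate_onnx_model(
    "attention_pool2d.onnx",
    "mha_test",
    end_node_names=["/Add_3", "/Add_2", "/Add_1"],
)
runner.save_har("mha_test.har")
```

Note that this only sidesteps the parsing failure; the multi-head attention itself stays outside the HAR and must be executed in post-processing on the host.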