This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

Convert model to CoreML #162

Open
tama4ma opened this issue Dec 7, 2022 · 0 comments
Comments

@tama4ma
tama4ma commented Dec 7, 2022

System info
- torch 1.12.1
- coremltools 6.1
- python 3.8.15

I tried to convert the VoteNet model to the Core ML format, following this guide:
https://coremltools.readme.io/docs/pytorch-conversion
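
For context, the workflow from that guide boils down to tracing the model and passing the trace to coremltools. Here is a minimal sketch with a toy stand-in module; ToyPointNet, the input shape, and the input name are placeholders, not the real VoteNet interface:

import torch
import torch.nn as nn
import coremltools as ct

# Toy stand-in module, just to illustrate the documented trace-then-convert flow.
class ToyPointNet(nn.Module):
    def forward(self, points):          # points: (B, N, 3)
        return points.mean(dim=1)       # (B, 3)

toy_model = ToyPointNet().eval()

# 1) Trace the PyTorch model with an example input.
example_input = torch.rand(1, 20000, 3)
traced = torch.jit.trace(toy_model, example_input)

# 2) Convert the traced model to Core ML.
mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(name="point_clouds", shape=example_input.shape)],
    convert_to="mlprogram",
)
mlmodel.save("toy_pointnet.mlpackage")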

After defining the model, I get the error below when converting it to TorchScript:

RuntimeError                              Traceback (most recent call last)
<ipython-input-37-277f5d3f2bc7> in <module>
----> 1 trace = torch.jit.trace(model, dummy_input)

1 frames
/usr/local/lib/python3.8/dist-packages/torch/jit/_trace.py in trace(func, example_inputs, optimize, check_trace, check_inputs, check_tolerance, strict, _force_outplace, _module_class, _compilation_unit)
    748 
    749     if isinstance(func, torch.nn.Module):
--> 750         return trace_module(
    751             func,
    752             {"forward": example_inputs},

/usr/local/lib/python3.8/dist-packages/torch/jit/_trace.py in trace_module(mod, inputs, optimize, check_trace, check_inputs, check_tolerance, strict, _force_outplace, _module_class, _compilation_unit)
    965             example_inputs = make_tuple(example_inputs)
    966 
--> 967             module._c._create_method_from_trace(
    968                 method_name,
    969                 func,

RuntimeError: Encountering a dict at the output of the tracer might cause the trace to be incorrect, this is only valid if the container structure does not change based on the module's inputs. Consider using a constant container instead (e.g. for `list`, use a `tuple` instead. for `dict`, use a `NamedTuple` instead). If you absolutely need this and know the side effects, pass strict=False to trace() to allow this behavior.

Here is my code:

import numpy as np
import torch
# VoteNet comes from the votenet repo (the exact import path depends on your setup).

model = VoteNet(10, 12, 10, np.random.random((10, 3))).to('cuda')
model.eval()
model.load_state_dict(torch.load('checkpoint.tar'), strict=False)
dummy_input = {'point_clouds': torch.rand((20000, 3)).unsqueeze(0).cuda()}
trace = torch.jit.trace(model, dummy_input)
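
For reference, the error message itself points to two possible workarounds, sketched roughly below; the wrapper name TraceableVoteNet and the output keys are my assumptions, not taken from the votenet code:

import torch
import torch.nn as nn

# Workaround 1: allow dict outputs during tracing, as the error message suggests.
# trace = torch.jit.trace(model, dummy_input, strict=False)

# Workaround 2 (hypothetical sketch): wrap the model so tracing sees a tuple
# output instead of a dict.
class TraceableVoteNet(nn.Module):
    def __init__(self, votenet):
        super().__init__()
        self.votenet = votenet

    def forward(self, point_clouds):
        end_points = self.votenet({'point_clouds': point_clouds})
        # Placeholder keys; return whichever outputs you actually need.
        return end_points['center'], end_points['objectness_scores']

# wrapped = TraceableVoteNet(model).eval()
# trace = torch.jit.trace(wrapped, torch.rand(1, 20000, 3).cuda())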

I searched for this problem and found huggingface/transformers#9095.

Based on the comments in that issue, I changed my code like this:

model = VoteNet(10,12,10,np.random.random((10,3))).to('cuda')
model.eval()
model.load_state_dict(torch.load('checkpoint.tar'), return_dict=False) 
dummy_input = {'point_clouds': torch.rand((20000,3)).unsqueeze(0).cuda()}
trace = torch.jit.trace(model, dummy_input)

But this time I got the following error:

TypeError                                 Traceback (most recent call last)
<ipython-input-31-0811dab20119> in <module>
----> 1 model.load_state_dict(torch.load('checkpoint.tar'), return_dict= False)

TypeError: load_state_dict() got an unexpected keyword argument 'return_dict'
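
As far as I can tell, return_dict is a Hugging Face Transformers argument (it controls whether a transformers model's forward returns a dict), so torch.nn.Module.load_state_dict does not accept it; it only takes state_dict and strict. A sketch of how the loading might look instead, assuming checkpoint.tar stores a dict with a 'model_state_dict' entry (as votenet's training script usually saves; adjust the key if yours differs):

import torch

# load_state_dict only accepts (state_dict, strict); `return_dict` belongs to
# Hugging Face Transformers models, not to torch.nn.Module.
checkpoint = torch.load('checkpoint.tar', map_location='cuda')

# Assumption: the checkpoint is a dict with a 'model_state_dict' key; fall back
# to the raw object otherwise. `model` is the VoteNet instance defined above.
state_dict = checkpoint.get('model_state_dict', checkpoint)
model.load_state_dict(state_dict, strict=False)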

I'm a beginner, so I may be stumbling over something basic.
Could you help me?
