convert yolact to ONNX #74
See #59. You'll have to put some elbow grease in if you want to get YOLACT traceable (i.e., exportable to ONNX) since I use a lot of pythonic code. I hear @Wilber529 was able to do it following these steps: #59 (comment). You have to rewrite how I pass around variables (dictionaries are not supported I think) and you'll have to rewrite anything after Yolact's forward function (starting with self.detect) in your target language because I wrote it in a super pythonic way to make the model faster.
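The dictionary-passing problem described above can be illustrated with a minimal sketch (toy classes, not YOLACT's actual code): wrap the model so its forward returns a fixed tuple of tensors instead of a dict, which is the form `torch.jit.trace` and the ONNX exporter handle cleanly.

```python
import torch
import torch.nn as nn

# Toy stand-in for a model whose forward() returns a dict (awkward to trace/export).
class DictModel(nn.Module):
    def forward(self, x):
        return {"loc": x * 2, "conf": x + 1}

# Wrapper that unpacks the dict into a fixed tuple order before returning,
# so tracing / ONNX export only ever sees tensors.
class TraceableWrapper(nn.Module):
    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, x):
        out = self.model(x)
        return out["loc"], out["conf"]

wrapped = TraceableWrapper(DictModel())
traced = torch.jit.trace(wrapped, torch.randn(1, 3))
loc, conf = traced(torch.ones(1, 3))
```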
Hi @dbolya, thanks for your answer!
Yolact does not support conversion to ONNX, which is why you get an error. You'd need to change a lot of things to get conversion to ONNX to work, as outlined by @Wilber529 in #59 (comment). I'm not making these changes to the main branch because they'd make the Python version run slower and make it harder to develop.
Thanks.
I have converted YOLACT to ONNX without the Detect part, and also modified some of the upsampling code.
@Ma-Dan Thanks for sharing the reference code. I shall look into this process and get back to you if I have queries.
@Ma-Dan Thank you very much for sharing your work.
@Ma-Dan Thank you for your code! I converted the model to ONNX, but the results are different from the PyTorch outputs, such as loc, mask, and proto; only conf is the same. Have you seen this problem?
@abhigoku10 Actually I just used the onnx branch from Ma-Dan to create an onnx file. Do you get an error while converting?
@aweissen1 I was facing some package issues; I shall look into it in more depth and solve it. Was there any difference in the generated output?
@Ma-Dan Hi, I converted to ONNX successfully, but I found the results are not correct. Can you share the versions of PyTorch and onnxruntime you are using? Thanks.
@Ma-Dan Can you give more information about the package dependencies for your Yolact-ONNX implementation?
@ABlueLight and @aweissen1 Should we use the base code given by @Ma-Dan and train the model, or just load a trained model with this code? What is the command to be used? Please share the process.
I converted to ONNX successfully today, and the results are correct. Thanks @Ma-Dan.
@ABlueLight After conversion to ONNX, which platform are you going to deploy it on? And did you convert it to a TensorFlow-based model?
@abhigoku10 TensorFlow, and it runs correctly.
Sorry for the delayed reply, I just fixed the code on my repo to use the correct ONNX output.
The environment I used: Run
@Ma-Dan Thanks for the response, I have a few queries.
@Ma-Dan Thank you! Great job.
@ABlueLight How did you import it into TensorFlow?
I was able to convert the model to .onnx format, but loading it fails:
file_name= yolact_base_0_4000.onnx
Loading model... Traceback (most recent call last):
@aweissen1 @ABlueLight Hi guys, I am facing the same issue as above in my inference after conversion (file_name= yolact_base_0_4000.onnx, Loading model... Traceback (most recent call last):). Any suggestions?
@Ma-Dan @aweissen1 @ABlueLight How are you guys able to load the ONNX model using the torch.load() function? Only onnx.load() can be used, right?
@ABlueLight, do you have a huge difference in speed of inference? I used @Ma-Dan 's helpful work to generate yolact.onnx and I load it through onnx load and onnx_tf.backend import prepare. All other post-processing is still torch based. It takes 2 minutes per image inference (compared to a couple of seconds in PyTorch). Also, were you able to convert it to pure TensorFlow? (use a TensorFlow pb file instead of onnx)
@ridasalam I converted it to pure TensorFlow and it costs about 400–500 ms on an i5 CPU.
@sicarioakki The ONNX model should be loaded with onnx.load(), I think.
Can you share the TensorFlow project?
Could you please tell me the steps for converting the yolact.pth model to a .onnx model, and mention which script should be used for converting, so that it can be helpful to me? My idea is to convert the model to TensorRT, so I am trying the path yolact → onnx → tensorRT.
Hi, I found that you are using the eval.py script for converting the YOLACT model to an ONNX model. I have a doubt; it shows:
So what I did is add a few lines; now it is in dictionary format, and by using the keys I can take the values of the detection. But when I cross-checked the detection (which is in dictionary format), it has the keys:
What value can I assign for the
In my understanding 'conf' means score, 'mask' means mask, and 'proto' means proto; what about 'loc' and 'priors'?
I tried like this
Hi @Ma-Dan, I'm trying to convert the YOLACT model to TensorRT and facing a number of issues. Here is my working environment:
There are links to the YOLACT model in ONNX format. With opset version == 9, the error is: With opset version == 11: Thank you in advance for your help.
Thanks for your great job. I followed your code and converted to ONNX successfully, but converting ONNX to CoreML shows errors about the upsample layer (you mentioned you modified some upsampling code; could you please share the modification? Do you mean the function def _convert_upsample(builder, node, graph, err): in /home/jiapy/virtualEnv/py3.6torch1.2/lib/python3.6/site-packages/onnx_coreml/_operators.py?): Process finished with exit code 1
Hi. Unexpected error while evaluating model output 783:
System.ArgumentException: Cannot reshape array of size 62208 into shape with multiple of 1024 elements
I used opset=9, input=(1,550,550,3), model=resnet50-54.
Hi @Ma-Dan, do you know how to convert yolact_plus_base_54_800000.pth to ONNX? I ran $python eval.py --trained_model=weights/yolact_plus_base_54_800000.pth --score_threshold=0.3 --top_k=100 --cuda=False --image=dog.jpg and got an error: Traceback (most recent call last):
Thank you so much @Ma-Dan. Have you tried converting it to tflite?
give config file
Done :)
@amitkumar-delhivery Were you able to convert it to tflite? If so, did you use it on mobile devices (Android)?
RuntimeError: Only tuples, lists and Variables supported as JIT inputs/outputs. Dictionaries and strings are also accepted but their usage is not recommended. But got unsupported type Yolact
@ABlueLight You said you were successful in converting YOLACT to ONNX and deploying it on TensorFlow. I was wondering if you could share your code? I am still trying to figure out how to convert YOLACT to ONNX and then deploy it on TensorFlow. Thanks!
Hi all, I used @Ma-Dan's repo to convert the .pth file to a .onnx file. Now I want to run this using C++. What should I do next? Can someone point me in a good direction? What I needed initially was to convert a .pt file to a .pth file, but from this issue I realized that I can convert it to ONNX and then to C++. Am I thinking rightly? Should I now look for a solution to convert .onnx to .pth, or can .onnx be called from C++ (in a C++ implementation)?
Thank you very much for your work. Can anyone give me some advice? In addition, setting the above problem aside, I successfully converted ONNX to TensorRT.
Has anyone successfully converted a Yolact++ model to onnx? For example, yolact_plus_base_54_800000.pth to yolact_plus_base_54_800000.onnx? I'm not sure that @Ma-Dan has updated his repo to support the Yolact++ models. When I try to run eval.py as
I get:
@amitkumar-delhivery suggested above that for this error you add --config=custom_config, but this leads to
Any suggestions would be appreciated.
@rbgreenway
@sdimantsd
I thought it might be a version issue, but I've replicated what people have mentioned above with no change.
That seems like a PyTorch problem, but I'm still digging...
My computer: 32 GB RAM, GeForce GTX 1060 6 GB.
@Ma-Dan Thank you very much for your open source work and help. I used your GitHub repo to convert the model to ncnn: https://github.com/Ma-Dan/yolact/tree/onnx. The environment I used: python 3.7.9, torch 1.5.0, torchvision 0.6.0, onnx 1.8.1
It works for me. Convert YOLACT to ONNX following this way: https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_convert_model_pytorch_specific_Convert_YOLACT.html
I converted the YOLACT model to ONNX following the instructions at https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_convert_model_pytorch_specific_Convert_YOLACT.html and https://github.com/Ma-Dan/yolact/tree/onnx. Both are much slower compared with PyTorch because the output needs to be copied from CUDA to CPU (125 ms vs 30 ms). I tried to perform NMS with torchvision.nms and convert that to ONNX, but the performance was even slower. Does anyone know how to speed things up when running YOLACT via ONNX? Thanks.
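One generic way to attack the CUDA-to-CPU copy cost mentioned above (a sketch under assumptions, not a YOLACT-specific fix): copy outputs into pinned host buffers with non_blocking=True so the transfers can overlap with remaining GPU work, and synchronize only once after all copies are queued.

```python
import torch

def fetch_outputs(device_tensors):
    """Copy a list of (possibly CUDA) tensors to host, overlapping the copies."""
    use_cuda = torch.cuda.is_available()
    host = []
    for t in device_tensors:
        h = torch.empty_like(t, device="cpu")
        if use_cuda:
            h = h.pin_memory()  # pinned memory enables truly async D2H copies
        h.copy_(t, non_blocking=True)
        host.append(h)
    if use_cuda:
        torch.cuda.synchronize()  # wait once, after all copies are in flight
    return host
```

On a CPU-only machine this degenerates to plain copies, which is why the pinning is guarded.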
For Resnet101-FPN, change the yolact/yolact.py line https://github.com/Ma-Dan/yolact/blob/onnx/yolact.py#L344 from "sizes = [(69, 69), (35, 35)]" to "sizes = [(88, 88), (44, 44)]" to match the output shapes.
I figured this came up due to a different YOLACT version, I guess? It should be changed to
But now I'm getting this problem; can anyone help?
If I use onnxeval.py then I get this error: File "onnxeval.py", line 1065, in
After converting to ONNX, my model looks like this (I don't know whether it is correct or not; please help me out, thank you): 7767517
I'm facing the same error. @chingi071, did you solve it? If so, please help me out. Thank you in advance.
@PhuowngNam I had the same issue. Were you ever able to resolve it?
Hello again,
I'm trying to convert YOLACT to ONNX with the following code:
error msg: