how to convert mnet.25.prototxt and mnet.25.caffemodel to mnet-deconv.prototxt and mnet-deconv.caffemodel #4
Comments
What's your problem? |
The error message is here: and I find there are many differences between the converted prototxt and mnet-deconv.prototxt, so why can't the converted model be used directly?
|
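To pin down what "many differences" actually means, one quick check is to diff the layer names of the two prototxt files. A minimal sketch using only the Python standard library; the file names at the bottom are illustrative, not from this repo:

```python
import re

def layer_names(prototxt_text):
    # Each layer block (and the net itself) declares:  name: "..."
    return re.findall(r'name:\s*"([^"]+)"', prototxt_text)

def diff_layers(a_text, b_text):
    # Returns (names only in a, names only in b), each sorted
    a, b = set(layer_names(a_text)), set(layer_names(b_text))
    return sorted(a - b), sorted(b - a)

if __name__ == "__main__":
    # Illustrative paths: point these at your converted model and
    # the mnet-deconv.prototxt shipped with the project.
    with open("mnet.25.converted.prototxt") as f:
        converted = f.read()
    with open("mnet-deconv.prototxt") as f:
        target = f.read()
    only_converted, only_target = diff_layers(converted, target)
    print("only in converted:", only_converted)
    print("only in mnet-deconv:", only_target)
```

If the second list contains upsampling/deconvolution layers that the first lacks, the converter did not produce the deconv variant and the prototxt (and matching weights) need to be adapted by hand.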
Can you use the Caffe model directly instead of the TensorRT model? |
How can I convert a model I trained myself into the mnet-deconv form? I get an error during conversion: |
@Royzon Hi, where did you download the mobilenet25 MXNet model? Could you give me the link? Thank you |
@clancylian, I encountered this error when I tried using your mnet25.caffemodel and mnet25.prototxt for inference on DeepStream. DeepStream automatically converts the Caffe model to TensorRT by generating an engine plan file. The Caffe model itself works fine; the error occurs at the TensorRT stage. Any thoughts on what the reason might be? Thanks |
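For reference, DeepStream's nvinfer element drives the Caffe-to-TensorRT conversion through its config file. A sketch of the relevant `[property]` keys, with placeholder paths; RetinaFace-style outputs additionally need a custom bounding-box parser, whose function and library names depend on your own plugin and are left as placeholders here:

```ini
[property]
gpu-id=0
# preprocessing must match what the model was trained with
net-scale-factor=1.0
model-file=/path/to/mnet25.caffemodel
proto-file=/path/to/mnet25.prototxt
# cached engine; delete it to force a rebuild after changing the model
model-engine-file=/path/to/mnet25.caffemodel_b1_fp32.engine
batch-size=1
# 0 = FP32, 1 = INT8, 2 = FP16
network-mode=0
# RetinaFace heads are not a standard detector output, so a custom
# parser is typically required (names below are placeholders):
#parse-bbox-func-name=<custom parser function>
#custom-lib-path=<custom parser .so>
```

If the engine build fails, the nvinfer log usually names the first unsupported layer, which narrows down which part of the prototxt TensorRT cannot convert.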
@JaydonChion, have you solved this? |
Thanks for your sharing. I converted the mnet.25 MXNet model to a Caffe model successfully with your tools, but I find this Caffe model can't be used in your project directly. Could you please help me solve this?