
how to load custom model for inference #5872

Answered by glenn-jocher
e101sg asked this question in Q&A

@e101sg you have a typo in your arguments: Flase should be False.

In any case, I don't know why you would attempt to pass pretrained=False if you are expecting trained results.
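
For reference, a minimal sketch of loading custom-trained weights for inference via PyTorch Hub; the checkpoint and image paths below are assumptions, so point them at your own files:

```python
import torch

# Load custom-trained YOLOv5 weights through PyTorch Hub.
# The checkpoint path is an assumption -- replace it with your own best.pt.
model = torch.hub.load('ultralytics/yolov5', 'custom', path='runs/train/exp/weights/best.pt')

# Inference accepts a local path, URL, PIL image, or numpy array.
results = model('data/images/zidane.jpg')

results.print()   # print a summary of detections
results.xyxy[0]   # detections for the first image as a tensor: (x1, y1, x2, y2, conf, cls)
```

With this approach there is no pretrained argument at all: the weights come entirely from the checkpoint you pass in.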
