Greedy decoding? You can simply search for the sequence of peaks in the NN output. Alternatively, you could create a fake ARPA-format LM that assigns unity transition probabilities to all words, effectively acting as an unconstrained grammar.
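For concreteness, here is a minimal sketch of what the greedy (best-path) option could look like, assuming the network emits a `(T, num_labels)` matrix of per-frame posteriors with the CTC blank at index 0 (the blank index and array layout are assumptions, not part of this repo's API):

```python
import numpy as np

def greedy_decode(posteriors: np.ndarray, blank: int = 0) -> list:
    """Pick the highest-scoring label per frame, collapse repeats, drop blanks."""
    best_path = np.argmax(posteriors, axis=1)   # peak label at each frame
    decoded = []
    prev = None
    for label in best_path:
        if label != prev and label != blank:    # collapse repeats, skip blanks
            decoded.append(int(label))
        prev = label
    return decoded

# Example: 5 frames over 3 labels (blank, "a", "b")
probs = np.array([[0.1, 0.8, 0.1],
                  [0.1, 0.7, 0.2],
                  [0.9, 0.05, 0.05],
                  [0.2, 0.1, 0.7],
                  [0.2, 0.1, 0.7]])
print(greedy_decode(probs))   # -> [1, 2]
```

This gives the single most likely label sequence per frame without any lexicon or LM constraint; it won't match a full WFST or beam-search decode, but it is often enough for inspecting what the acoustic model has learned.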
Is there a way to decode without considering the influence of the language model?