
ConverterError: input resource[0] expected type resource != float, the type of contextnet_greedy_while_contextnet_decoder_contextnet_prediction_embedding_embedding_lookup_24797_0[0] #236

Open
MhmudAlpurd opened this issue Nov 13, 2021 · 2 comments
Labels: bug (Something isn't working), need to reproduce (Need a code or time to reproduce the issue)

Comments


MhmudAlpurd commented Nov 13, 2021

Hi everyone, I'm trying to convert the ContextNet model from .h5 to TFLite by running tflite.py in the examples/contextnet/ directory, and I get several errors. I ran these commands in order:

git clone https://github.com/TensorSpeech/TensorFlowASR.git
cd TensorFlowASR
pip3 install -e ".[tf2.x]"  # or ".[tf2.x-gpu]"
pip3 install -U "TensorFlowASR[tf2.x]"

This error occurs:
File "tflite.py", line 65, in <module> tflite_model = converter.convert() File "/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/lite.py", line 1682, in convert return super(TFLiteConverterV2, self).convert() File "/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/lite.py", line 782, in wrapper return self._convert_and_export_metrics(convert_func, *args, **kwargs) File "/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/lite.py", line 768, in _convert_and_export_metrics result = convert_func(self, *args, **kwargs) File "/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/lite.py", line 1352, in convert self).convert(graph_def, input_tensors, output_tensors) File "/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/lite.py", line 956, in convert **converter_kwargs) File "/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/convert_phase.py", line 213, in wrapper raise converter_error from None # Re-throws the exception. File "/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/convert_phase.py", line 206, in wrapper return func(*args, **kwargs) File "/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/convert.py", line 828, in toco_convert_impl enable_mlir_converter=enable_mlir_converter) File "/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/convert.py", line 311, in toco_convert_protos

raise converter_error tensorflow.lite.python.convert_phase.ConverterError: input resource[0] expected type resource != float, the type of contextnet_greedy_while_contextnet_decoder_contextnet_prediction_embedding_embedding_lookup_24797_0[0] In {{node contextnet_greedy/while/contextnet_decoder/contextnet_prediction_embedding/embedding_lookup}} Failed to functionalize Control Flow V1 ops. Consider using Control Flow V2 ops instead. See https://www.tensorflow.org/api_docs/python/tf/compat/v1/enable_control_flow_v2.
I then ran the TFLite conversion code from the README.md, but it has not succeeded so far. How can I convert the Transducer or ContextNet .h5 models to TFLite directly?
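Note for readers hitting the same wall: the failing path is the usual Keras model → concrete function → TFLite flow. Below is a minimal sketch of that flow, including the Control Flow V2 switch the error message itself suggests. The model path, wrapper function, and input signature are illustrative assumptions, not the actual contents of examples/contextnet/tflite.py, and the single-argument from_concrete_functions call here is exactly the line that the fix further down in this thread changes.

```python
import tensorflow as tf

# The error message suggests using Control Flow V2 ops; enable them before
# building the graph that will be converted (assumption: TF 2.x environment).
tf.compat.v1.enable_control_flow_v2()

# Assumed: a plain Keras model saved as .h5. The real script instead builds
# the ContextNet model from its config and loads the trained weights.
model = tf.keras.models.load_model("contextnet.h5", compile=False)

# Wrap inference in a tf.function and take a concrete function, since this
# converter path works on concrete functions. The signature is illustrative.
@tf.function(input_signature=[tf.TensorSpec([None], tf.float32, name="signal")])
def infer(signal):
    return model(signal, training=False)

concrete_func = infer.get_concrete_function()

converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # built-in TFLite kernels
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TF ops for unsupported ones
]
tflite_model = converter.convert()  # this is where the ConverterError surfaces

with open("contextnet.tflite", "wb") as f:
    f.write(tflite_model)
```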

MhmudAlpurd changed the title from "How to convert transducer.h5 or contextnet.h5 to tflite, directly?" to "ConverterError: input resource[0] expected type resource != float, the type of contextnet_greedy_while_contextnet_decoder_contextnet_prediction_embedding_embedding_lookup_24797_0[0]" on Nov 15, 2021
nglehuy added the bug (Something isn't working) and need to reproduce (Need a code or time to reproduce the issue) labels on Sep 2, 2022
liuyibox commented:

Any updated solution here? I encountered the same issue with TF 2.9.

liuyibox commented:

This bug has been resolved in tensorflow/tensorflow#42410 (comment). Basically, we just need to pass the model along when loading the concrete function into the converter, i.e. change this line to converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func], model).
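For clarity, a hedged before/after sketch of that one-line change is below; concrete_func and model are assumed to be the concrete function and the loaded model already built in examples/contextnet/tflite.py. Recent TF 2.x releases let from_concrete_functions take the model as a second, trackable-object argument so the variables captured by the function stay alive and tracked during conversion.

```python
# Before (raises the "resource != float" ConverterError shown above):
# converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])

# After: also pass the model that owns the variables as the trackable object
# (second argument), so the converter can keep them alive while converting.
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func], model)
tflite_model = converter.convert()
```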
