I'm getting an error while trying to run the Longformer model conversion in Google Colab. Does anyone have a solution?
HuggingFace_in_Spark_NLP_LongformerForQuestionAnswering.ipynb.txt
```
All model checkpoint layers were used when initializing TFLongformerForQuestionAnswering.

All the layers of TFLongformerForQuestionAnswering were initialized from the model checkpoint at valhalla/longformer-base-4096-finetuned-squadv1.
If your task is similar to the task the model of the checkpoint was trained on, you can already use TFLongformerForQuestionAnswering for predictions without further training.
WARNING:tensorflow:From /usr/local/lib/python3.8/dist-packages/tensorflow/python/autograph/pyct/static_analysis/liveness.py:83: Analyzer.lamba_check (from tensorflow.python.autograph.pyct.static_analysis.liveness) is deprecated and will be removed after 2023-09-23.
Instructions for updating:
Lambda fuctions will be no more assumed to be used in the statement where they are used, or at least in the same block. tensorflow/tensorflow#56089
There should be exactly three separator tokens: 2 in every sample for questions answering. You might also consider to set global_attention_mask manually in the forward function to avoid this. This is most likely an error. The global attention is disabled for this forward pass.
WARNING:tensorflow:AutoGraph could not transform <bound method Socket.send of <zmq.Socket(zmq.PUSH) at 0x7f884d569d00>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, export AUTOGRAPH_VERBOSITY=10) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING: AutoGraph could not transform <bound method Socket.send of <zmq.Socket(zmq.PUSH) at 0x7f884d569d00>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, export AUTOGRAPH_VERBOSITY=10) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
There should be exactly three separator tokens: 2 in every sample for questions answering. You might also consider to set global_attention_mask manually in the forward function to avoid this. This is most likely an error. The global attention is disabled for this forward pass.
There should be exactly three separator tokens: 2 in every sample for questions answering. You might also consider to set global_attention_mask manually in the forward function to avoid this. This is most likely an error. The global attention is disabled for this forward pass.

OperatorNotAllowedInGraphError            Traceback (most recent call last)
in
     11 model = TFLongformerForQuestionAnswering.from_pretrained(MODEL_NAME, from_pt=True)
     12
---> 13 model.save_pretrained("./{}".format(MODEL_NAME), saved_model=True)

4 frames
/usr/local/lib/python3.8/dist-packages/transformers/models/longformer/modeling_tf_longformer.py in call(self, input_ids, attention_mask, head_mask, global_attention_mask, token_type_ids, position_ids, inputs_embeds, output_attentions, output_hidden_states, return_dict, start_positions, end_positions, training)
   2267         # set global attention on question tokens
   2268         if global_attention_mask is None and input_ids is not None:
-> 2269             if shape_list(tf.where(input_ids == self.config.sep_token_id))[0] != 3 * shape_list(input_ids)[0]:
   2270                 logger.warning(
   2271                     f"There should be exactly three separator tokens: {self.config.sep_token_id} in every sample for"

OperatorNotAllowedInGraphError: Using a symbolic tf.Tensor as a Python bool is not allowed: AutoGraph did convert this function. This might indicate you are trying to use an unsupported feature.
```
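For reference, the failing pattern at line 2269 of modeling_tf_longformer.py can be reproduced in isolation: during graph tracing, using a symbolic tensor in a Python `if` raises the same OperatorNotAllowedInGraphError. This is only a minimal sketch of the failure mode, not a fix for the notebook; the model's own warning suggests passing `global_attention_mask` explicitly as a possible way around the branch.

```python
import tensorflow as tf

# Minimal reproduction of the failure mode only (not the notebook's code):
# inside a traced tf.function (here with AutoGraph disabled), a Python `if`
# on a symbolic tensor forces Tensor.__bool__, which is exactly what
# OperatorNotAllowedInGraphError complains about.
@tf.function(autograph=False)
def branch_on_tensor(x):
    if x > 0:  # symbolic tf.Tensor used as a Python bool
        return x
    return -x

try:
    branch_on_tensor(tf.constant(1.0))
except Exception as err:
    print(type(err).__name__)  # OperatorNotAllowedInGraphError
```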