
"mm = TrajPatternMiner(traj_list, learning_rate, training_epochs, \ display_step, units, batch_size, max_n_steps, frame_dim, window_size)" #1

Closed
whalienvita opened this issue Apr 7, 2023 · 5 comments

Comments

@whalienvita

When I run the code "mm = TrajPatternMiner(traj_list, learning_rate, training_epochs,
display_step, units, batch_size, max_n_steps, frame_dim, window_size)"

there's an error:
TypeCheckError Traceback (most recent call last)
Cell In[10], line 1
----> 1 mm = TrajPatternMiner(traj_list, learning_rate, training_epochs,
2 display_step, units, batch_size, max_n_steps, frame_dim, window_size)

File D:\Users\duhit\Desktop\323\FYP_trajectory_clustering-main\FYP_trajectory_clustering-main\model_2\traj_pattern_miner.py:20, in TrajPatternMiner.__init__(self, traj, learning_rate, training_epochs, display_step, units, batch_size, max_n_steps, frame_dim, window_size)
18 self.window_size = window_size
19 self.txc = TrajectoryBehaviorExtraction()
---> 20 self.t2v = Traj2Vec(self.learning_rate, self.training_epochs, self.display_step, self.units, self.batch_size, self.frame_dim)

File D:\Users\duhit\Desktop\323\FYP_trajectory_clustering-main\FYP_trajectory_clustering-main\model_2\traj2vec.py:27, in Traj2Vec.__init__(self, learning_rate, training_epochs, display_step, units, batch_size, frame_dim)
25 self.frame_dim = frame_dim
26 self.encoder = Encoder(self.units, self.batch_size)
---> 27 self.decoder = Decoder(self.units, self.batch_size, self.max_n_steps)
28 self.optimizer = tf.keras.optimizers.experimental.RMSprop(self.learning_rate)
29 self.checkpoint_dir = './training_checkpoints'

File D:\Users\duhit\Desktop\323\FYP_trajectory_clustering-main\FYP_trajectory_clustering-main\model_2\decoder.py:21, in Decoder.__init__(self, dec_units, batch_sz, max_length_input, attention_type)
18 self.sampler = tfa.seq2seq.sampler.TrainingSampler()
20 # Create attention mechanism with memory = None
---> 21 self.attention_mechanism = self.build_attention_mechanism(self.dec_units,
22 None, self.batch_sz*[self.max_length_input], self.attention_type)
24 # Wrap attention mechanism with the fundamental rnn cell of decoder
25 self.rnn_cell = self.build_rnn_cell(batch_sz)

File D:\Users\duhit\Desktop\323\FYP_trajectory_clustering-main\FYP_trajectory_clustering-main\model_2\decoder.py:46, in Decoder.build_attention_mechanism(self, dec_units, memory, memory_sequence_length, attention_type)
44 return tfa.seq2seq.BahdanauAttention(units=dec_units, memory=memory, memory_sequence_length=memory_sequence_length)
45 else:
---> 46 return tfa.seq2seq.LuongAttention(units=dec_units, memory=memory, memory_sequence_length=memory_sequence_length)

File D:\duhit\anaconda\envs\FYP\lib\site-packages\tensorflow_addons\seq2seq\attention_wrapper.py:543, in __init__(self, units, memory, memory_sequence_length, scale, probability_fn, dtype, name, **kwargs)
540 # For LuongAttention, we only transform the memory layer; thus
541 # num_units must match expected the query depth.
542 self.probability_fn_name = probability_fn
--> 543 probability_fn = self._process_probability_fn(self.probability_fn_name)
545 def wrapped_probability_fn(score, _):
546 return probability_fn(score)

File D:\duhit\anaconda\envs\FYP\lib\site-packages\typeguard\_functions.py:264, in check_variable_assignment(value, expected_annotations, memo)
262 iterated_values.append(obj)
263 try:
--> 264 check_type_internal(obj, expected_type, memo)
265 except TypeCheckError as exc:
266 exc.append_path_element(argname)

File D:\duhit\anaconda\envs\FYP\lib\site-packages\typeguard\_checkers.py:671, in check_type_internal(value, annotation, memo)
668 return
670 if not isinstance(value, origin_type):
--> 671 raise TypeCheckError(f"is not an instance of {qualified_name(origin_type)}")

TypeCheckError: probability_fn is not an instance of str

Does anyone know how to fix this? It's urgent!

@AgnesYe1029
Owner

When I ran this in a local Jupyter notebook, there was no such error message. Are you running on Google Colab?

@whalienvita
Author

I am also on a local Jupyter notebook, but I still get this error. I ran the original code directly; do I need to edit it?

@AgnesYe1029
Owner

AgnesYe1029 commented Apr 7, 2023

We don't touch the probability_fn argument anywhere in our code. Based on the trace you pasted, this looks like a type-check dependency issue inside site-packages\tensorflow_addons\seq2seq\attention_wrapper.py.

Update: please check out this issue raised against tfa's source code; I think it's the same as your situation. While that issue remains open on the tfa side, here's a workaround to bypass the type check: comment out the @typechecked decorator inside LuongAttention(AttentionMechanism).
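If you would rather not edit files under site-packages by hand, another workaround (an assumption on my part, based on tensorflow-addons pinning typeguard to the 2.x series in its own dependencies) is to downgrade typeguard below 3.0, the major version that introduced variable-assignment checking:

```shell
# Downgrade typeguard to the 2.x series that tensorflow-addons was written
# against; typeguard 3.x added assignment checks that trip this error.
pip install "typeguard>=2.7,<3.0"
```

After reinstalling, restart the Jupyter kernel so the new version is picked up.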

@whalienvita
Author

Thank you very much! I removed the code containing TypeCheckError from site-packages\typeguard\_checkers.py and the problem was solved.

@AgnesYe1029
Owner

AgnesYe1029 commented Apr 9, 2023

While removing TypeCheckError from site-packages\typeguard\_checkers.py works, it is not a sustainable practice: that check enforces type consistency for every package that uses typeguard.
Commenting out the @typechecked decorator inside site-packages\tensorflow_addons\seq2seq\attention_wrapper.py would be a better option with far less impact.
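For illustration, here is a toy stand-in (stdlib only, not typeguard itself) for what a @typechecked-style decorator does. It only validates call arguments, whereas typeguard 3.x additionally checks reassignments such as probability_fn = self._process_probability_fn(...) inside the function body, but it shows why removing the decorator silences the check while leaving the function itself untouched:

```python
import functools
import inspect
import typing

def typechecked(fn):
    """Toy stand-in for a @typechecked decorator: validates call
    arguments against the function's annotations at call time."""
    hints = typing.get_type_hints(fn)
    sig = inspect.signature(fn)

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        bound.apply_defaults()
        for name, value in bound.arguments.items():
            expected = hints.get(name)
            if isinstance(expected, type) and not isinstance(value, expected):
                raise TypeError(f"{name} is not an instance of {expected.__name__}")
        return fn(*args, **kwargs)

    return wrapper

@typechecked  # commenting out this line disables all of the checks below
def luong_attention(units: int, probability_fn: str = "softmax"):
    # tfa internally rebinds the str-annotated probability_fn to a
    # callable; typeguard 3.x type-checks that reassignment too, which
    # produces "probability_fn is not an instance of str".
    return units, probability_fn

luong_attention(8)         # "softmax" satisfies the str annotation
# luong_attention(8, max)  # would raise TypeError: not a str
```

The names luong_attention and the decorator body here are hypothetical; the real check lives in typeguard and is applied to tfa's classes via @typechecked.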
