
Error while loading weights for damage_inference.py #18

Open
nka77 opened this issue Oct 18, 2021 · 5 comments

Comments


nka77 commented Oct 18, 2021

I am getting a shape-mismatch error while loading weights in damage_inference.py, which uses the 'classification.hdf5' file:
ValueError: Cannot assign to variable conv3_block1_0_conv/kernel:0 due to variable shape (1, 1, 256, 512) and value shape (1, 1, 128, 512) are incompatible
(The weights for 'localization.h5' load fine.)

Below is the full traceback:

File "./damage_inference.py", line 93, in run_inference
    model.load_weights(model_weights)
  File "/Users/navjotkaur/miniforge3/envs/tfmacos/lib/python3.8/site-packages/tensorflow/python/keras/engine/training.py", line 2234, in load_weights
    hdf5_format.load_weights_from_hdf5_group(f, self.layers)
  File "/Users/navjotkaur/miniforge3/envs/tfmacos/lib/python3.8/site-packages/tensorflow/python/keras/saving/hdf5_format.py", line 710, in load_weights_from_hdf5_group
    K.batch_set_value(weight_value_tuples)
  File "/Users/navjotkaur/miniforge3/envs/tfmacos/lib/python3.8/site-packages/tensorflow/python/util/dispatch.py", line 201, in wrapper
    return target(*args, **kwargs)
  File "/Users/navjotkaur/miniforge3/envs/tfmacos/lib/python3.8/site-packages/tensorflow/python/keras/backend.py", line 3745, in batch_set_value
    x.assign(np.asarray(value, dtype=dtype(x)))
  File "/Users/navjotkaur/miniforge3/envs/tfmacos/lib/python3.8/site-packages/tensorflow/python/ops/resource_variable_ops.py", line 888, in assign
    raise ValueError(
ValueError: Cannot assign to variable conv3_block1_0_conv/kernel:0 due to variable shape (1, 1, 256, 512) and value shape (1, 1, 128, 512) are incompatible
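The error means the checkpoint stores a (1, 1, 128, 512) kernel where the freshly built model expects (1, 1, 256, 512) at conv3_block1_0_conv, i.e. the architecture constructed by damage_inference.py differs from the one the weights were saved with (a TF/Keras version mismatch is a common cause). As a minimal sketch (the helper name is made up, not part of the repo), the differing axis can be located by comparing the two shapes from the message:

```python
def find_mismatched_axes(var_shape, value_shape):
    """Return the axes on which a model variable's shape and the
    checkpoint value's shape disagree (hypothetical diagnostic helper)."""
    if len(var_shape) != len(value_shape):
        # Ranks differ: every axis is suspect.
        return list(range(max(len(var_shape), len(value_shape))))
    return [i for i, (a, b) in enumerate(zip(var_shape, value_shape)) if a != b]

# Shapes copied from the ValueError above:
variable_shape = (1, 1, 256, 512)  # what the freshly built model expects
value_shape = (1, 1, 128, 512)     # what classification.hdf5 stores
print(find_mismatched_axes(variable_shape, value_shape))  # → [2]
```

A result of [2] points at the input-channel axis of the 1×1 convolution: the saved backbone had 128 input channels where the rebuilt one has 256, so the weights were produced by a differently configured model.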

@suhail017

Hi @nka77 , I am having the same issue as yours. Were you able to resolve that problem?


juka19 commented Apr 11, 2023

Had the same problem. I managed to load the weights with Python 3.6, tensorflow==1.14.0, and keras==2.3.1. I also had to downgrade h5py to 2.10.0.
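For reference, a minimal environment sketch matching those pins (the env name and the use of conda are assumptions for illustration, not from this thread):

```shell
# Hypothetical setup reproducing the versions reported to work above;
# the environment name "xview2-legacy" is made up.
conda create -n xview2-legacy python=3.6 -y
conda activate xview2-legacy
pip install tensorflow==1.14.0 keras==2.3.1 h5py==2.10.0
```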

@rongtongxueya

I've also encountered this problem. Additionally, when I run the process_data.py file, there are no test and train groupings in the output_dir folder, although train.csv and test.csv are generated. I'm wondering whether, in the command:

usage: python damage_classification.py [-h] --train_data TRAIN_DATA_PATH
                                       --train_csv TRAIN_CSV
                                       --test_data TEST_DATA_PATH
                                       --test_csv TEST_CSV
                                       [--model_in MODEL_WEIGHTS_PATH]
                                       --model_out MODEL_WEIGHTS_OUT

the test_data and train_data can be the same directory? Has anyone managed to solve this? I've been stuck on it for almost a week.
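In case it helps, a sketch of an invocation with separate train and test directories; only the flag names come from the usage string above, and every path here is a placeholder to adjust to whatever process_data.py actually produced:

```shell
# Hypothetical paths under output_dir; the flags are from the usage string.
python damage_classification.py \
    --train_data output_dir/train \
    --train_csv output_dir/train.csv \
    --test_data output_dir/test \
    --test_csv output_dir/test.csv \
    --model_out classification_weights.hdf5
```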

@TangSL1ping

> I've also encountered this problem. Additionally, when I run the process_data.py file, there are no test and train groupings in the output_dir folder. […] I'm wondering whether the test_data and train_data can be the same directory?

Hello, I have the same problem as you. Can you please fix it?


juka19 commented Nov 2, 2023

As mentioned above, try this with Python 3.6 and tf 1.14.0. For me, it worked.
