
Several modules erroring out one after another, I'm honestly at my wit's end #148

Open
yuganxia opened this issue Jul 16, 2024 · 1 comment

@yuganxia

At first onnxruntime reported an error, which I fixed by reinstalling it.
Then it reported the error below.
Outside the virtual environment I installed CUDA 12.1 and cuDNN 8.2.
Inside the virtual environment I installed onnxruntime and nvidia-cudnn-cu12.
Testing inside the virtual environment, torch.cuda.is_available() = True.
The GPU is a 4070 Ti, and I genuinely can't figure out why it still complains that CUDA can't be found.

Error occurred when executing InsightFaceLoader_Zho:

D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:891 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.

File "E:\database\Desktop\ComfyUI_windows_portable\ComfyUI\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\ComfyUI_windows_portable\ComfyUI\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\ComfyUI_windows_portable\ComfyUI\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-InstantID\InstantIDNode.py", line 71, in load_insight_face_antelopev2
model = FaceAnalysis(name="antelopev2", root=current_directory, providers=[provider + 'ExecutionProvider',])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\ComfyUI_windows_portable\python_embeded\Lib\site-packages\insightface\app\face_analysis.py", line 31, in init
model = model_zoo.get_model(onnx_file, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\ComfyUI_windows_portable\python_embeded\Lib\site-packages\insightface\model_zoo\model_zoo.py", line 96, in get_model
model = router.get_model(providers=providers, provider_options=provider_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\ComfyUI_windows_portable\python_embeded\Lib\site-packages\insightface\model_zoo\model_zoo.py", line 40, in get_model
session = PickableInferenceSession(self.onnx_file, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\ComfyUI_windows_portable\python_embeded\Lib\site-packages\insightface\model_zoo\model_zoo.py", line 25, in init
super().init(model_path, **kwargs)
File "E:\database\Desktop\ComfyUI_windows_portable\python_embeded\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 432, in init
raise fallback_error from e
File "E:\database\Desktop\ComfyUI_windows_portable\python_embeded\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 427, in init
self._create_inference_session(self._fallback_providers, None)
File "E:\database\Desktop\ComfyUI_windows_portable\python_embeded\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 483, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
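
Note that the traceback runs through python_embeded, so a torch.cuda.is_available() check in a separate virtual environment does not exercise the same interpreter or the same onnxruntime install. A minimal diagnostic sketch, meant to be run with the portable build's own python.exe (the file name is arbitrary; the exact CUDA/cuDNN pairing required is whatever the requirements page linked in the error lists — cuDNN 8.2 targets CUDA 11.x, so the CUDA 12.1 + cuDNN 8.2 combination is a likely mismatch):

```python
# Check the onnxruntime that ComfyUI actually uses; run with the portable
# build's interpreter, e.g.
#   E:\database\Desktop\ComfyUI_windows_portable\python_embeded\python.exe check_ort.py
import onnxruntime as ort

print("onnxruntime:", ort.__version__)
print("device:", ort.get_device())                  # "GPU" only if onnxruntime-gpu is installed
print("providers:", ort.get_available_providers())  # should include "CUDAExecutionProvider"

if "CUDAExecutionProvider" not in ort.get_available_providers():
    print("CUDA provider not compiled in: the CPU-only onnxruntime package is installed.")
else:
    print("Provider is present; if session creation still fails, the CUDA/cuDNN DLL "
          "versions on PATH do not match this onnxruntime build.")
```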

@yuganxia
Author

I reinstalled ComfyUI entirely, hit a few more errors I had never seen before, and after working through them this is the last one remaining:

Error occurred when executing IDGenerationNode:

Allocation on device

File "E:\database\Desktop\Comfyui\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui\custom_nodes\ComfyUI-InstantID\InstantIDNode.py", line 288, in id_generate_image
output = pipe(
^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\torch\utils_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui\custom_nodes\ComfyUI-InstantID\pipeline_stable_diffusion_xl_instantid.py", line 732, in call
image = self.vae.decode(latents / self.vae.config.scaling_factor, return_dict=False)[0]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\diffusers\utils\accelerate_utils.py", line 46, in wrapper
return method(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\diffusers\models\autoencoders\autoencoder_kl.py", line 302, in decode
decoded = self._decode(z).sample
^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\diffusers\models\autoencoders\autoencoder_kl.py", line 273, in _decode
dec = self.decoder(z)
^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\diffusers\models\autoencoders\vae.py", line 338, in forward
sample = up_block(sample, latent_embeds)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\diffusers\models\unets\unet_2d_blocks.py", line 2615, in forward
hidden_states = resnet(hidden_states, temb=temb, scale=scale)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\diffusers\models\resnet.py", line 338, in forward
hidden_states = self.nonlinearity(hidden_states)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\torch\nn\modules\activation.py", line 396, in forward
return F.silu(input, inplace=self.inplace)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\database\Desktop\Comfyui.ext\Lib\site-packages\torch\nn\functional.py", line 2102, in silu
return torch._C._nn.silu(input)
^^^^^^^^^^^^^^^^^^^^^^^^
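
"Allocation on device" raised inside self.vae.decode(...) is a CUDA out-of-memory during the full-resolution SDXL VAE decode, not a missing-CUDA problem. A memory-saving sketch using the standard diffusers switches; the stand-in pipeline and model id below are assumptions for illustration, since the node builds its own InstantID pipeline, but that pipeline is a StableDiffusionXL* subclass and exposes the same helpers:

```python
import torch
from diffusers import StableDiffusionXLPipeline

# Stand-in pipeline for illustration; the actual pipeline is constructed in
# InstantIDNode.py, but these memory helpers are inherited from diffusers.
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",  # assumed model id, not from this issue
    torch_dtype=torch.float16,
)

pipe.enable_vae_tiling()         # decode latents in tiles instead of one full pass
pipe.enable_vae_slicing()        # decode a batch one image at a time
pipe.enable_model_cpu_offload()  # keep idle sub-models in system RAM (needs `accelerate`)

image = pipe("a portrait photo", num_inference_steps=20).images[0]
image.save("out.png")
```

On a 12 GB card like the 4070 Ti, enabling VAE tiling alone is often enough to get past the decode step; the other two switches trade speed for further VRAM savings.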

@yuganxia yuganxia reopened this Jul 16, 2024