Following the "Applying the XORs" instructions at https://huggingface.co/PygmalionAI/pygmalion-7b, I obtained the checkpoints and verified the final model files with rhash -M *
When I run
python app.py --model-name /path/to/pygmalion-7b
the error log is as follows:
Traceback (most recent call last):
File "/root/anaconda3/envs/python310/lib/python3.10/site-packages/gradio/routes.py", line 414, in run_predict
output = await app.get_blocks().process_api(
File "/root/anaconda3/envs/python310/lib/python3.10/site-packages/gradio/blocks.py", line 1323, in process_api
result = await self.call_function(
File "/root/anaconda3/envs/python310/lib/python3.10/site-packages/gradio/blocks.py", line 1051, in call_function
prediction = await anyio.to_thread.run_sync(
File "/root/anaconda3/envs/python310/lib/python3.10/site-packages/anyio/to_thread.py", line 31, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
File "/root/anaconda3/envs/python310/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
return await future
File "/root/anaconda3/envs/python310/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 867, in run
result = context.run(func, *args)
File "/root/pythonfile/gradio-ui/src/gradio_ui.py", line 72, in _run_inference
inference_result = inference_fn(model_history, user_input,
File "/root/pythonfile/gradio-ui/src/app.py", line 62, in inference_fn
model_output = run_raw_inference(model, tokenizer, prompt,
File "/root/pythonfile/gradio-ui/src/model.py", line 64, in run_raw_inference
logits = model.generate(stopping_criteria=stopping_criteria_list,
File "/root/anaconda3/envs/python310/lib/python3.10/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
return func(*args, **kwargs)
File "/root/anaconda3/envs/python310/lib/python3.10/site-packages/transformers/generation/utils.py", line 1267, in generate
self._validate_model_kwargs(model_kwargs.copy())
File "/root/anaconda3/envs/python310/lib/python3.10/site-packages/transformers/generation/utils.py", line 1140, in _validate_model_kwargs
raise ValueError(
ValueError: The following `model_kwargs` are not used by the model: ['token_type_ids'] (note: typos in the generate arguments will also show up in this list)
I have successfully run PygmalionAI/pygmalion-6b and PygmalionAI/pygmalion-2.7b. Could there be a problem with the LLaMA tokenizer? I am not sure what is wrong.
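For what it's worth, since the ValueError says the model does not accept token_type_ids, one possible workaround is to drop that key from the tokenizer output before passing it to generate. This is only a sketch under the assumption that the inputs are built from the tokenizer's dict-like output; the helper name strip_unused_kwargs is hypothetical, and the stand-in dict below substitutes plain lists for the real tensors:

```python
# Hypothetical workaround: LLaMA-based models do not accept `token_type_ids`,
# so remove that key from the tokenizer encoding before calling `generate`.
def strip_unused_kwargs(encoding, unused=("token_type_ids",)):
    """Return a copy of the encoding without keys the model rejects."""
    return {k: v for k, v in encoding.items() if k not in unused}

# Stand-in for a real tokenizer encoding (which would hold tensors):
encoding = {
    "input_ids": [[1, 2, 3]],
    "attention_mask": [[1, 1, 1]],
    "token_type_ids": [[0, 0, 0]],
}
model_inputs = strip_unused_kwargs(encoding)
# model.generate(**model_inputs, ...)  # should no longer trigger the ValueError
```

Alternatively, if the encoding is produced by calling the tokenizer directly, passing return_token_type_ids=False to that call may avoid the key being emitted in the first place, though I have not verified this against the pygmalion-7b tokenizer specifically.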