Full error traceback:
Traceback (most recent call last):
File "/data/user/BMLU-use/src/English_chat/qwen1.5.py", line 97, in <module>
main(model_path=args.model_path,max_length=args.max_length,name=args.name)
File "/data/user/BMLU-use/src/English_chat/qwen1.5.py", line 76, in main
model = AutoModelForCausalLM.from_pretrained(
File "/data/user/zwh_llm/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 562, in from_pretrained
model_class = _get_model_class(config, cls._model_mapping)
File "/data/user/zwh_llm/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 383, in _get_model_class
supported_models = model_mapping[type(config)]
File "/data/user/zwh_llm/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 734, in __getitem__
return self._load_attr_from_module(model_type, model_name)
File "/data/user/zwh_llm/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 748, in _load_attr_from_module
return getattribute_from_module(self._modules[module_name], attr)
File "/data/user/zwh_llm/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 692, in getattribute_from_module
if hasattr(module, attr):
File "/data/user/zwh_llm/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1525, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/data/user/zwh_llm/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1537, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.models.qwen2.modeling_qwen2 because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
No module named 'torch.distributed.checkpoint.format_utils'
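The root cause is a torch installation that is too old for the installed transformers release: the Qwen2/generation code in transformers imports torch.distributed.checkpoint.format_utils, a submodule that only ships with newer torch versions (roughly 2.2 and later; this threshold is an assumption, not stated in the error). A quick diagnostic sketch before upgrading:

# Diagnostic sketch: print the installed torch version and check whether the
# submodule that transformers tries to import is actually present.
import importlib.util
import torch

print("torch version:", torch.__version__)

# transformers fails because this submodule is missing in older torch builds.
spec = importlib.util.find_spec("torch.distributed.checkpoint.format_utils")
print("torch.distributed.checkpoint.format_utils available:", spec is not None)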
Solution: upgrade torch and torchvision (upgrading both together keeps the torchvision build matched to the new torch version):
pip install --upgrade torch torchvision
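After upgrading, a quick verification sketch (the model path below is a hypothetical placeholder, not the path used in qwen1.5.py):

# Verify that the previously missing module now imports, then retry loading the model.
import torch
import torch.distributed.checkpoint.format_utils  # raised the error above before the upgrade

print("torch version:", torch.__version__)

from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "Qwen/Qwen1.5-7B-Chat"  # hypothetical placeholder for the local model path
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype="auto",   # take the dtype from the checkpoint config
    device_map="auto",    # requires accelerate; places weights on available devices
)
print("loaded:", type(model).__name__)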
References:
https://stackoverflow.com/questions/59012616/problem-importing-module-modulenotfounderror-no-module-named-torch-utils-chec