Description
Following the recommended install method of cloning the client repository and running ./play-rocm.sh, the install goes smoothly, but as soon as it starts to load Kobold itself it immediately crashes with the error below, which I can't find any reason for. huggingface_hub is installed, but looking online I can only find 'split_torch_state_dict_into_shards' on their GitHub under src/serialisation/_torch.py, which might mean there is a missing submodule in the ROCm requirements. This is unfortunate because it means I can't use Kobold at all, as it won't even start.
Traceback (most recent call last):
File "aiserver.py", line 58, in
from utils import debounce
File "/home/dey/ai/koboldai-client/utils.py", line 12, in
from transformers import PreTrainedModel
File "", line 1039, in _handle_fromlist
File "/home/dey/ai/koboldai-client/runtime/envs/koboldai-rocm/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1066, in getattr
module = self._get_module(self._class_to_module[name])
File "/home/dey/ai/koboldai-client/runtime/envs/koboldai-rocm/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1078, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.modeling_utils because of the following error (look up to see its traceback):
cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/home/dey/ai/koboldai-client/runtime/envs/koboldai-rocm/lib/python3.8/site-packages/huggingface_hub/init.py)
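The error looks like a version mismatch: the transformers build inside the env expects huggingface_hub to export 'split_torch_state_dict_into_shards', which only newer huggingface_hub releases provide. As a quick check, something like the sketch below can be run with the env's own Python to confirm which huggingface_hub version got installed and whether the symbol is importable. The interpreter path and the upgrade command at the end are guesses at a workaround based on the traceback paths, not KoboldAI's recommended fix.

# Minimal diagnostic sketch -- run with the env's interpreter, e.g.:
#   ./runtime/envs/koboldai-rocm/bin/python check_hub.py
# (interpreter path is a guess inferred from the traceback; adjust to your install)
import importlib.metadata

try:
    hub_version = importlib.metadata.version("huggingface_hub")
except importlib.metadata.PackageNotFoundError:
    hub_version = "not installed"
print(f"huggingface_hub version: {hub_version}")

try:
    # transformers.modeling_utils expects this name at the package top level
    from huggingface_hub import split_torch_state_dict_into_shards  # noqa: F401
    print("split_torch_state_dict_into_shards is importable")
except ImportError as exc:
    print(f"import failed: {exc}")
    # A newer huggingface_hub exports this function; upgrading inside the env
    # may work around the crash, though it could conflict with KoboldAI's pins:
    print("possible workaround: pip install --upgrade huggingface_hub")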