davidkim205 committed
Commit bf0838a
1 Parent(s): e5acd1c

Update README.md

Files changed (1)
  1. README.md +1 -39
README.md CHANGED
@@ -27,45 +27,7 @@ This model was created using the REJECTION SAMPLING technique to create a data s
 If the undefined symbol error below occurs, install torch as follows.
 
 ```
-Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
-Traceback (most recent call last):
-  File "/home/david/anaconda3/envs/spaces/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1535, in _get_module
-    return importlib.import_module("." + module_name, self.__name__)
-  File "/home/david/anaconda3/envs/spaces/lib/python3.10/importlib/__init__.py", line 126, in import_module
-    return _bootstrap._gcd_import(name[level:], package, level)
-  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
-  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
-  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
-  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
-  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
-  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
-  File "/home/david/anaconda3/envs/spaces/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 54, in <module>
-    from flash_attn import flash_attn_func, flash_attn_varlen_func
-  File "/home/david/anaconda3/envs/spaces/lib/python3.10/site-packages/flash_attn/__init__.py", line 3, in <module>
-    from flash_attn.flash_attn_interface import (
-  File "/home/david/anaconda3/envs/spaces/lib/python3.10/site-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
-    import flash_attn_2_cuda as flash_attn_cuda
-ImportError: /home/david/anaconda3/envs/spaces/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi
-
-The above exception was the direct cause of the following exception:
-
-Traceback (most recent call last):
-  File "/work/spaces/models/llama3_templeate.py", line 8, in <module>
-    model = AutoModelForCausalLM.from_pretrained(
-  File "/home/david/anaconda3/envs/spaces/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 562, in from_pretrained
-    model_class = _get_model_class(config, cls._model_mapping)
-  File "/home/david/anaconda3/envs/spaces/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 383, in _get_model_class
-    supported_models = model_mapping[type(config)]
-  File "/home/david/anaconda3/envs/spaces/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 734, in __getitem__
-    return self._load_attr_from_module(model_type, model_name)
-  File "/home/david/anaconda3/envs/spaces/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 748, in _load_attr_from_module
-    return getattribute_from_module(self._modules[module_name], attr)
-  File "/home/david/anaconda3/envs/spaces/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 692, in getattribute_from_module
-    if hasattr(module, attr):
-  File "/home/david/anaconda3/envs/spaces/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1525, in __getattr__
-    module = self._get_module(self._class_to_module[name])
-  File "/home/david/anaconda3/envs/spaces/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1537, in _get_module
-    raise RuntimeError(
+...
 RuntimeError: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
 /home/david/anaconda3/envs/spaces/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi
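For context (not part of this commit): `_ZN3c104cuda9SetDeviceEi` demangles to `c10::cuda::SetDevice(int)`, a symbol in torch's `libc10`, so the undefined-symbol error in the removed traceback typically means the installed flash-attn wheel was compiled against a different torch version than the one in the environment. The README's actual install command falls outside this hunk; the commands below are only an assumed sketch of the usual remedy, rebuilding flash-attn against the currently installed torch.

```shell
# Assumed remedy (not taken from this commit): reinstall flash-attn so it
# compiles against the torch already present in the environment, giving the
# extension a matching libc10/libtorch ABI.
pip uninstall -y flash-attn
pip install flash-attn --no-build-isolation
```

`--no-build-isolation` matters here because flash-attn's build must see the environment's own torch rather than a fresh one pulled into an isolated build env.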