Runtime error

Exit code: 1. Reason:
The value of RUN_AWS_FUNCTIONS is 0
The value of RUN_LOCAL_MODEL is 1
The value of GRADIO_OUTPUT_FOLDER is output/
The value of AWS_REGION is eu-west-2

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 75, in _load_shared_library
    return ctypes.CDLL(str(_lib_path), **cdll_args)  # type: ignore
  File "/usr/local/lib/python3.10/ctypes/__init__.py", line 374, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: libcuda.so.1: cannot open shared object file: No such file or directory

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 6, in <module>
    from tools.llm_api_call import extract_topics, load_in_data_file, load_in_previous_data_files, sample_reference_table_summaries, summarise_output_topics, batch_size_default, deduplicate_topics
  File "/home/user/app/tools/llm_api_call.py", line 24, in <module>
    from tools.chatfuncs import LlamaCPPGenerationConfig, call_llama_cpp_model, load_model, RUN_LOCAL_MODEL
  File "/home/user/app/tools/chatfuncs.py", line 5, in <module>
    from llama_cpp import Llama
  File "/usr/local/lib/python3.10/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/usr/local/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 88, in <module>
    _lib = _load_shared_library(_lib_base_name)
  File "/usr/local/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 77, in _load_shared_library
    raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
RuntimeError: Failed to load shared library '/usr/local/lib/python3.10/site-packages/llama_cpp/lib/libllama.so': libcuda.so.1: cannot open shared object file: No such file or directory
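What this traceback means: the installed llama-cpp-python wheel was built with CUDA support, so its native library (libllama.so) is dynamically linked against libcuda.so.1. That file ships with the NVIDIA driver and does not exist on a CPU-only container, so the dlopen fails the moment tools/chatfuncs.py executes "from llama_cpp import Llama", and the whole import chain (app.py -> tools/llm_api_call.py -> tools/chatfuncs.py) aborts before the app can start.

On CPU-only hardware the usual remedy is to reinstall a CPU build, e.g. "pip install --force-reinstall --no-cache-dir llama-cpp-python" (the default build targets CPU; CUDA builds are opt-in via CMAKE_ARGS, with the exact flag varying by version). Alternatively, the import can be made defensive so a missing GPU degrades gracefully instead of crashing the container. Below is a minimal sketch of such a guard, assuming tools/chatfuncs.py is free to skip the local model when the native library cannot load; the fallback behaviour is hypothetical, and only the RUN_LOCAL_MODEL name comes from the log above:

    # tools/chatfuncs.py (sketch) -- import llama_cpp only when a local model is
    # requested, and fall back cleanly if its native library cannot be loaded.
    import os

    RUN_LOCAL_MODEL = os.environ.get("RUN_LOCAL_MODEL", "0") == "1"
    Llama = None

    if RUN_LOCAL_MODEL:
        try:
            # llama_cpp dlopens libllama.so at import time; a CUDA build pulls in
            # libcuda.so.1 and raises here if no NVIDIA driver is present.
            from llama_cpp import Llama
        except (OSError, RuntimeError) as err:
            print(f"llama-cpp-python native library failed to load: {err}")
            RUN_LOCAL_MODEL = False  # assumed fallback: disable the local-model path

With a guard like this the app would at least boot and the AWS/API code paths could still run; the local model itself will only work once a build matching the hardware is installed.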
