build(deps): Bump llama-cpp-python[server] from 0.2.79 to 0.2.88 #51

Triggered via pull request August 20, 2024 08:04
Status: Failure
Total duration: 3h 49m 46s
Artifacts

pr-check.yaml

on: pull_request
Matrix: Build image

Annotations

1 error
Build image (./chat/vulkan/amd64/Containerfile, ai-lab-playground-chat-vulkan, amd64)
Error: buildah exited with code 1

Trying to pull registry.access.redhat.com/ubi9/python-311:1-72.1722518949...
Getting image source signatures
Copying blob sha256:c4660e853dd713329dc8ce01d6d31e9bf08d8773b0bb31c21ccfae5fcab3dc5d
Copying blob sha256:db22e630b1c7cf081461536c489254a8d1b39ceda32f8f3025314f032860d984
Copying blob sha256:9f2fbf79f82326e925352def0dbdccc800cb9da2fe0125c4d1c33a9cbfc1b629
Copying blob sha256:cc296d75b61273dcb0db7527435a4c3bd03f7723d89a94d446d3d52849970460
Copying blob sha256:c4660e853dd713329dc8ce01d6d31e9bf08d8773b0bb31c21ccfae5fcab3dc5d
Copying blob sha256:cc296d75b61273dcb0db7527435a4c3bd03f7723d89a94d446d3d52849970460
Copying blob sha256:9f2fbf79f82326e925352def0dbdccc800cb9da2fe0125c4d1c33a9cbfc1b629
Copying blob sha256:db22e630b1c7cf081461536c489254a8d1b39ceda32f8f3025314f032860d984
Copying config sha256:4147737e222a93ee41758c40abfb2ded77ecd626a9a94604fa5ccb2640cc0d2a
Writing manifest to image destination
Storing signatures

Enabling a Copr repository. Please note that this repository is not part of the main distribution, and quality may vary. The Fedora Project does not exercise any power over the contents of this repository beyond the rules outlined in the Copr FAQ at <https://docs.pagure.org/copr.copr/user_documentation.html#what-i-can-build-in-copr>, and packages are not held to any quality or security level. Please do not file bug reports about these packages in Fedora Bugzilla. In case of problems, contact the owner of this repository.

Importing GPG key 0xFA8FEACD:
 Userid     : "ligenix_enterprise-sandbox (None) <ligenix#[email protected]>"
 Fingerprint: 4845 01A6 F5CF C114 350F ED13 8DA2 CDEA FA8F EACD
 From       : https://download.copr.fedorainfracloud.org/results/ligenix/enterprise-sandbox/pubkey.gpg

Cloning into 'shaderc'...

CMake Warning at third_party/abseil_cpp/CMakeLists.txt:202 (message):
  The default and system-level install directories are unsupported except in
  LTS releases of Abseil.  Please set CMAKE_INSTALL_PREFIX to install Abseil
  in your source or build tree directly.

error: subprocess-exited-with-error

× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [35 lines of output]
    *** scikit-build-core 0.10.3 using CMake 3.26.5 (wheel)
    *** Configuring CMake...
    loading initial cache file /tmp/tmpqtim2sln/build/CMakeInit.txt
    -- The C compiler identification is GNU 11.4.1
    -- The CXX compiler identification is GNU 11.4.1
    -- Detecting C compiler ABI info
    -- Detecting C compiler ABI info - done
    -- Check for working C compiler: /usr/bin/gcc - skipped
    -- Detecting C compile features
    -- Detecting C compile features - done
    -- Detecting CXX compiler ABI info
    -- Detecting CXX compiler ABI info - done
    -- Check for working CXX compiler: /usr/bin/g++ - skipped
    -- Detecting CXX compile features
    -- Detecting CXX compile features - done
    -- Found Git: /usr/bin/git (found version "2.43.5")
    -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
    -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
    -- Found Threads: TRUE
    -- Found OpenMP_C: -fopenmp (found version "4.5")
    -- Found OpenMP_CXX: -fopenmp (found version "4.5")
    -- Found OpenMP: TRUE (found version "4.5")
    -- OpenMP found
    -- Using llamafile
    CMake Error at /usr/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:230 (message):
      Could NOT find Vulkan (missing: Vulkan_LIBRARY) (found version "1.3.261")
    Call Stack (most recent call first):
      /usr/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:600 (_FPHSA_FAILURE_MESSAGE)
      /usr/share/cmake/Modules/FindVulkan.cmake:597 (find_package_handle_standard_args)
      vendor/llama.cpp/ggml/src/CMakeLists.txt:580 (find_package)
    -- Configuring incomplete, errors occurred!
    *** CMake configuration failed
    [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
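The decisive line in the log is the CMake error: FindVulkan detects the Vulkan headers (version "1.3.261") but cannot set Vulkan_LIBRARY, meaning libvulkan.so is absent from the build image, so the llama-cpp-python wheel build aborts at the configure step. A minimal fix sketch, assuming the UBI 9 base image can pull the Fedora/RHEL package names shown (vulkan-headers, vulkan-loader-devel) from its enabled repositories, would install the Vulkan loader development package in chat/vulkan/amd64/Containerfile before pip builds the wheel:

```dockerfile
# Hypothetical fragment for chat/vulkan/amd64/Containerfile.
# Assumption: vulkan-loader-devel (which provides libvulkan.so, the file
# CMake's FindVulkan needs to populate Vulkan_LIBRARY) and vulkan-headers
# are installable from the image's enabled repositories.
USER root
RUN dnf install -y vulkan-headers vulkan-loader-devel && \
    dnf clean all
USER 1001

# With libvulkan.so present, the wheel's CMake configure step should get
# past the failing find_package(Vulkan) call in the vendored llama.cpp.
RUN pip install --no-cache-dir "llama-cpp-python[server]==0.2.88"
```

This is a sketch, not the repository's actual fix: the exact package names and the point in the Containerfile where the install belongs depend on the base image and build stages used.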