
Update Auto merged updates #889

Triggered via pull request July 3, 2024 12:45
Status: Failure
Total duration: 3m 31s
Artifacts

model_servers.yaml

on: pull_request
Matrix: model-servers-buid-and-push

Annotations

8 errors
model-servers-buid-and-push (llamacpp-python-cuda, granite, cuda, llamacpp_python, linux/amd64,li...
Error: buildah exited with code 1
Trying to pull quay.io/opendatahub/workbench-images:cuda-ubi9-python-3.9-20231206...
Getting image source signatures
Copying blob sha256:bee86f8257632eeaa07cebb2436ccab03b967017e1ef485a4525ae5991f0ee33
Copying blob sha256:b824f4b30c465e487e640bdc22e46bafd6983e4e0eabf30085cacf945c261160
Copying blob sha256:5f7d6ade1ce7871adb550730033e6696e928aaafea518b98a7f2c7cb89eda124
Copying blob sha256:2c6c6493f94e4d1f481a0976dec432e3b7c95f1ba764f7a0033b995670112ad7
Copying blob sha256:a64827a24ae8ee62038a21834d13766a06c1526b54c86ac6b260c699820220bc
Copying blob sha256:7cb554c593ec96d7901a6f99e4c4d3b45976d92e0aa4f24db8f876ba68903fcb
Copying config sha256:1c79c1fc89c6ffbe76e2b62ee55f965fc87fdca64203dda12444b33b9bb1e147
Writing manifest to image destination
error: subprocess-exited-with-error

× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [28 lines of output]
    *** scikit-build-core 0.9.8 using CMake 3.29.6 (wheel)
    *** Configuring CMake...
    loading initial cache file /tmp/tmp2isof5cx/build/CMakeInit.txt
    -- The C compiler identification is GNU 13.2.1
    -- The CXX compiler identification is GNU 13.2.1
    -- Detecting C compiler ABI info
    -- Detecting C compiler ABI info - done
    -- Check for working C compiler: /opt/rh/gcc-toolset-13/root/usr/bin/gcc - skipped
    -- Detecting C compile features
    -- Detecting C compile features - done
    -- Detecting CXX compiler ABI info
    -- Detecting CXX compiler ABI info - done
    -- Check for working CXX compiler: /opt/rh/gcc-toolset-13/root/usr/bin/g++ - skipped
    -- Detecting CXX compile features
    -- Detecting CXX compile features - done
    -- Found Git: /usr/bin/git (found version "2.39.3")
    CMake Error at vendor/llama.cpp/CMakeLists.txt:95 (message):
      LLAMA_CUBLAS is deprecated and will be removed in the future.
      Use GGML_CUDA instead
    Call Stack (most recent call first):
      vendor/llama.cpp/CMakeLists.txt:100 (llama_option_depr)
    -- Configuring incomplete, errors occurred!
    *** CMake configuration failed
    [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects

[notice] A new release of pip available: 22.2.2 -> 24.1.1
[notice] To update, run: pip install --upgrade pip

Error: building at STEP "RUN CC="/opt/rh/gcc-toolset-13/root/usr/bin/gcc" CXX="/opt/rh/gcc-toolset-13/root/usr/bin/g++" pip install --no-cache-dir -r ./requirements.txt": while running runtime: exit status 1
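The root cause is the CMake error above: newer llama.cpp sources reject the LLAMA_CUBLAS option and ask for GGML_CUDA instead, so any build that still passes `-DLLAMA_CUBLAS=on` (typically via the `CMAKE_ARGS` environment variable that llama-cpp-python forwards to CMake) fails during wheel configuration. A minimal sketch of the likely fix, assuming the CUDA Containerfile sets `CMAKE_ARGS` before the failing `pip install` step (the exact file and variable placement in this repo may differ):

```shell
# Hypothetical Containerfile excerpt -- only the CMAKE_ARGS change is the point.

# Before (rejected by current llama.cpp: "LLAMA_CUBLAS is deprecated"):
# ENV CMAKE_ARGS="-DLLAMA_CUBLAS=on"

# After: use the renamed GGML_CUDA option.
ENV CMAKE_ARGS="-DGGML_CUDA=on"

RUN CC="/opt/rh/gcc-toolset-13/root/usr/bin/gcc" \
    CXX="/opt/rh/gcc-toolset-13/root/usr/bin/g++" \
    pip install --no-cache-dir -r ./requirements.txt
```

Alternatively, pinning llama-cpp-python in requirements.txt to a release whose vendored llama.cpp still accepts LLAMA_CUBLAS would unblock the build without touching the Containerfile, at the cost of staying on an older version.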
model-servers-buid-and-push (whispercpp, whisper-small, base, whispercpp, linux/amd64,linux/arm64...
The job was canceled because "llamacpp-python-cuda_gran" failed.
model-servers-buid-and-push (llamacpp-python-vulkan-arm, granite, vulkan/arm64, llamacpp_python, ...
The job was canceled because "llamacpp-python-cuda_gran" failed.
model-servers-buid-and-push (object_detection_python, facebook-detr-resnet-101, base, object_dete...
The job was canceled because "llamacpp-python-cuda_gran" failed.
model-servers-buid-and-push (llamacpp_python, granite, base, llamacpp_python, linux/amd64,linux/a...
The job was canceled because "llamacpp-python-cuda_gran" failed.
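Note that the four jobs above were canceled only because they share a matrix with the failed llamacpp-python-cuda_gran job: by default, GitHub Actions cancels all in-progress matrix jobs as soon as one fails. If each model-server variant should build independently so a single CUDA failure does not mask results for the others, the workflow could disable fail-fast (a sketch inferred from the job names, not the actual model_servers.yaml):

```yaml
jobs:
  model-servers-buid-and-push:
    strategy:
      # Let each matrix variant run to completion even if a sibling fails.
      fail-fast: false
      matrix:
        # ... existing matrix entries unchanged ...
```

With `fail-fast: false`, the whispercpp, vulkan, and object_detection_python variants would have reported their own pass/fail status instead of "canceled".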