This repository was archived by the owner on Oct 25, 2024. It is now read-only.

Commit 590024d

remove optimum-intel version limit (#1651)
Signed-off-by: changwangss <[email protected]>
Co-authored-by: chensuyue <[email protected]>
1 parent aecd109 commit 590024d

File tree

4 files changed (+5, −4 lines)


.github/workflows/script/install_binary.sh (+2, −1)

```diff
@@ -1,4 +1,5 @@
 #!/bin/bash
+
 source /intel-extension-for-transformers/.github/workflows/script/change_color.sh
 
 cd /intel-extension-for-transformers
@@ -10,7 +11,7 @@ git config --global --add safe.directory "*"
 git submodule update --init --recursive
 
 
-$BOLD_YELLOW && echo "---------------- run python setup.py sdist bdist_wheel -------------" && $RESET
+$BOLD_YELLOW && echo "---------------- run python setup.py bdist_wheel -------------" && $RESET
 python setup.py bdist_wheel
 
 
```

.github/workflows/unit-test-optimize.yml (+1, −1)

```diff
@@ -67,7 +67,7 @@ jobs:
       with:
         submodules: "recursive"
         ref: ${{ matrix.test_branch }}
-        fetch-tags: true
+        fetch-depth: 0
 
     - name: Docker Build
       run: |
```
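The workflow change above swaps `fetch-tags: true` for `fetch-depth: 0`. In `actions/checkout`, `fetch-depth: 0` performs a full clone (all history for all branches and tags), so tags are fetched as a side effect. A minimal sketch of such a checkout step (the step name and action version here are illustrative assumptions, not taken from this commit):

```yaml
- name: Checkout
  uses: actions/checkout@v4
  with:
    submodules: "recursive"
    fetch-depth: 0   # full clone: entire history, all branches and tags
```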

examples/huggingface/pytorch/text-generation/quantization/requirements_sq.txt (+1, −1)

```diff
@@ -7,7 +7,7 @@ sentencepiece != 0.1.92
 torch==2.3.0+cpu
 transformers==4.38.1
 intel_extension_for_pytorch==2.3.0
-optimum-intel==1.16.1
+optimum-intel
 bitsandbytes #baichuan
 transformers_stream_generator
 tiktoken #qwen
```
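The effect of dropping the `==1.16.1` pin can be illustrated with the `packaging` library (which pip builds on). This sketch is not part of the commit; it only shows how pip-style requirement specifiers treat a pinned versus an unpinned package:

```python
# Illustrative sketch (not from this commit): pinned vs. unpinned
# requirement specifiers, using the `packaging` library.
from packaging.requirements import Requirement

pinned = Requirement("optimum-intel==1.16.1")
unpinned = Requirement("optimum-intel")

# The pin accepts exactly one version...
print(pinned.specifier.contains("1.16.1"))    # True
print(pinned.specifier.contains("1.17.0"))    # False

# ...while the unpinned requirement accepts any release,
# letting the resolver pick the newest compatible one.
print(unpinned.specifier.contains("1.17.0"))  # True
```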

tests/requirements.txt (+1, −1)

```diff
@@ -16,7 +16,7 @@ mlflow
 nlpaug==1.1.9
 onnx
 onnxruntime
-optimum-intel==1.16.1
+optimum-intel
 peft==0.6.2
 py-cpuinfo
 sacremoses
```
