fix: set use_reentrant to True to fix Mixtral-7b bug (#3928) #57

Triggered via push February 11, 2024 02:22
Status Failure
Total duration 27s
Artifacts
Matrix: release_pull_request
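
For context (the change itself is not shown on this page): the commit title refers to PyTorch's gradient-checkpointing flag use_reentrant. Below is a minimal, hedged sketch of what setting that flag to True typically looks like; the call site and the example layer are illustrative assumptions, not the actual Ludwig change from PR #3928.

```python
# Hedged sketch only: shows where PyTorch's `use_reentrant` flag is set.
# The real fix lives in Ludwig PR #3928; the call site below is illustrative.
import torch
from torch.utils.checkpoint import checkpoint


def forward_with_checkpointing(block: torch.nn.Module, hidden: torch.Tensor) -> torch.Tensor:
    # Low-level form: torch.utils.checkpoint.checkpoint accepts use_reentrant directly.
    return checkpoint(block, hidden, use_reentrant=True)


# Higher-level form used with recent Hugging Face transformers releases, which
# Ludwig builds on for LLMs such as Mixtral (assumption: this is the relevant layer):
#
#   model.gradient_checkpointing_enable(
#       gradient_checkpointing_kwargs={"use_reentrant": True}
#   )

if __name__ == "__main__":
    layer = torch.nn.Linear(8, 8)
    x = torch.randn(2, 8, requires_grad=True)
    out = forward_with_checkpointing(layer, x)
    out.sum().backward()
```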

Annotations

3 errors, 20 warnings, and 6 notices
cherry_pick_into_release-0.7
Input required and not supplied: github-token
cherry_pick_into_release-0.8
Input required and not supplied: github-token
cherry_pick_into_release-0.7
Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: shioyang/[email protected]. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.
cherry_pick_into_release-0.7
The following actions uses node12 which is deprecated and will be forced to run on node16: shioyang/[email protected]. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
cherry_pick_into_release-0.8
Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: shioyang/[email protected]. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.
cherry_pick_into_release-0.8
The following actions uses node12 which is deprecated and will be forced to run on node16: shioyang/[email protected]. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
All 2 runs failed: test_check_llm_input_features (tests.ludwig.config_validation.test_checks): tests.ludwig.config_validation.test_checks#L0
artifacts/Unit Test Results (Python 3.8 not distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 not distributed)/pytest.xml
All 2 runs failed: test_retrieval_config_none_type (tests.ludwig.config_validation.test_checks): tests.ludwig.config_validation.test_checks#L0
artifacts/Unit Test Results (Python 3.8 not distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 not distributed)/pytest.xml
All 2 runs failed: test_retrieval_config_random_type (tests.ludwig.config_validation.test_checks): tests.ludwig.config_validation.test_checks#L0
artifacts/Unit Test Results (Python 3.8 not distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 not distributed)/pytest.xml
All 2 runs failed: test_retrieval_config_semantic_type (tests.ludwig.config_validation.test_checks): tests.ludwig.config_validation.test_checks#L0
artifacts/Unit Test Results (Python 3.8 not distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 not distributed)/pytest.xml
All 2 runs failed: test_check_qlora (tests.ludwig.config_validation.test_checks): tests.ludwig.config_validation.test_checks#L0
artifacts/Unit Test Results (Python 3.8 not distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 not distributed)/pytest.xml
All 2 runs failed: test_check_prompt_requirements (tests.ludwig.config_validation.test_checks): tests.ludwig.config_validation.test_checks#L0
artifacts/Unit Test Results (Python 3.8 not distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 not distributed)/pytest.xml
All 2 runs failed: test_dataset_fallback_mirror[mercedes_benz_greener-shape0] (tests.ludwig.datasets.test_datasets): tests.ludwig.datasets.test_datasets#L0
artifacts/Unit Test Results (Python 3.8 not distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 not distributed)/pytest.xml
All 2 runs failed: test_dataset_fallback_mirror[ames_housing-shape1] (tests.ludwig.datasets.test_datasets): tests.ludwig.datasets.test_datasets#L0
artifacts/Unit Test Results (Python 3.8 not distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 not distributed)/pytest.xml
All 2 runs failed: test_ad_hoc_dataset_download[consumer_complaints-38000] (tests.ludwig.datasets.test_datasets): tests.ludwig.datasets.test_datasets#L0
artifacts/Unit Test Results (Python 3.8 not distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 not distributed)/pytest.xml
1 out of 2 runs failed: test_hf_dataset_loading (tests.ludwig.datasets.test_datasets): tests.ludwig.datasets.test_datasets#L0
artifacts/Unit Test Results (Python 3.9 not distributed)/pytest.xml
All 2 runs failed: test_chunking[None-s3://ludwig-tests/datasets/synthetic_1k.csv] (tests.ludwig.utils.test_data_utils): tests.ludwig.utils.test_data_utils#L0
artifacts/Unit Test Results (Python 3.8 not distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 not distributed)/pytest.xml
All 2 runs failed: test_chunking[None-s3://ludwig-tests/datasets/synthetic_1k.parquet] (tests.ludwig.utils.test_data_utils): tests.ludwig.utils.test_data_utils#L0
artifacts/Unit Test Results (Python 3.8 not distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 not distributed)/pytest.xml
All 2 runs failed: test_chunking[100-s3://ludwig-tests/datasets/synthetic_1k.csv] (tests.ludwig.utils.test_data_utils): tests.ludwig.utils.test_data_utils#L0
artifacts/Unit Test Results (Python 3.8 not distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 not distributed)/pytest.xml
All 2 runs failed: test_chunking[100-s3://ludwig-tests/datasets/synthetic_1k.parquet] (tests.ludwig.utils.test_data_utils): tests.ludwig.utils.test_data_utils#L0
artifacts/Unit Test Results (Python 3.8 not distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 not distributed)/pytest.xml
All 2 runs failed: test_performance[ames_housing.gbm.yaml] (tests.regression_tests.benchmark.test_model_performance): tests.regression_tests.benchmark.test_model_performance#L0
artifacts/Unit Test Results (Python 3.8 distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 distributed)/pytest.xml
All 2 runs failed: test_performance[mercedes_benz_greener.gbm.yaml] (tests.regression_tests.benchmark.test_model_performance): tests.regression_tests.benchmark.test_model_performance#L0
artifacts/Unit Test Results (Python 3.8 distributed)/pytest.xml artifacts/Unit Test Results (Python 3.9 distributed)/pytest.xml
12 skipped tests found
There are 12 skipped tests, see "Raw output" for the full list of skipped tests.
2980 tests found (test 1 to 538)
There are 2980 tests, see "Raw output" for the list of tests 1 to 538.
2980 tests found (test 539 to 1238)
There are 2980 tests, see "Raw output" for the list of tests 539 to 1238.
2980 tests found (test 1239 to 1958)
There are 2980 tests, see "Raw output" for the list of tests 1239 to 1958.
2980 tests found (test 1959 to 2702)
There are 2980 tests, see "Raw output" for the list of tests 1959 to 2702.
2980 tests found (test 2703 to 2980)
There are 2980 tests, see "Raw output" for the list of tests 2703 to 2980.