Auto-merge updates from auto-update branch
mlcommons-bot committed Jan 31, 2025
2 parents 1dcad10 + ee3e109 commit 412dfd6
Showing 47 changed files with 1,259 additions and 1,454 deletions.
@@ -1,3 +1,3 @@
-| Model | Scenario | Accuracy | Throughput | Latency (in ms) |
-|----------|------------|------------|--------------|-------------------|
-| resnet50 | offline | 76 | 22.935 | - |
+| Model | Scenario | Accuracy | Throughput | Latency (in ms) |
+|-----------|------------|------------|--------------|-------------------|
+| retinanet | offline | () | 0.425 | - |
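The summary tables in this diff are plain pipe-delimited markdown, so a column such as Throughput can be pulled out with standard text tools. A minimal sketch (the table text is inlined below for illustration; the field position and row-skip count are assumptions based on the table shape above, not part of the MLC tooling):

```shell
# Extract the Throughput column (5th pipe-delimited field) from a
# markdown results table; NR>2 skips the header and separator rows.
table='| Model | Scenario | Accuracy | Throughput | Latency (in ms) |
|-----------|------------|------------|--------------|-------------------|
| retinanet | offline | () | 0.425 | - |'
printf '%s\n' "$table" | awk -F'|' 'NR > 2 { gsub(/ /, "", $5); print $5 }'
# prints: 0.425
```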
@@ -5,7 +5,7 @@
* OS version: Linux-6.8.0-1020-azure-x86_64-with-glibc2.39
* CPU version: x86_64
* Python version: 3.12.8 (main, Dec 4 2024, 06:20:31) [GCC 13.2.0]
-* MLC version: 0.1.0
+* MLC version: unknown

## CM Run Command

@@ -16,16 +16,16 @@ pip install -U mlcflow

mlc rm cache -f

-mlc pull repo anandhu-eng@mlperf-automations --checkout=522dda5f5d277b5943abef849a421c25ecea9d4e
+mlc pull repo mlcommons@mlperf-automations --checkout=02683cf5e8beb0cc5baaf27802daafc08fe42e67


```
*Note that if you want to use the [latest automation recipes](https://docs.mlcommons.org/inference) for MLPerf,
-you should simply reload anandhu-eng@mlperf-automations without checkout and clean MLC cache as follows:*
+you should simply reload mlcommons@mlperf-automations without checkout and clean MLC cache as follows:*

```bash
-mlc rm repo anandhu-eng@mlperf-automations
-mlc pull repo anandhu-eng@mlperf-automations
+mlc rm repo mlcommons@mlperf-automations
+mlc pull repo mlcommons@mlperf-automations
mlc rm cache -f

```
@@ -39,4 +39,4 @@ Model Precision: fp32
### Accuracy Results

### Performance Results
-`Samples per second`: `0.42546`
+`Samples per second`: `0.425185`
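For context, the `Samples per second` figure is Offline-scenario throughput: samples completed divided by benchmark run time. A minimal sketch of that arithmetic (the sample count and duration below are made-up illustrative numbers, not values from this run):

```shell
# Throughput = samples / seconds. Both values here are hypothetical.
samples=50
seconds=117.5
awk -v s="$samples" -v t="$seconds" 'BEGIN { printf "%.6f\n", s / t }'
# prints: 0.425532
```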
@@ -1,7 +1,7 @@
-MLPerf Conf path: /home/runner/MLC/repos/local/cache/get-git-repo_e71c543f/inference/mlperf.conf
-User Conf path: /home/runner/MLC/repos/anandhu-eng@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/acd6959766d64bfcb2a9a589bec1d67b.conf
-Dataset Preprocessed path: /home/runner/MLC/repos/local/cache/get-preprocessed-dataset-openimages_7ac6cb19
-Dataset List filepath: /home/runner/MLC/repos/local/cache/get-preprocessed-dataset-openimages_7ac6cb19/annotations/openimages-mlperf.json
+MLPerf Conf path: /home/runner/MLC/repos/local/cache/get-git-repo_638313d0/inference/mlperf.conf
+User Conf path: /home/runner/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/125d994af984462d96652017f6862e85.conf
+Dataset Preprocessed path: /home/runner/MLC/repos/local/cache/get-preprocessed-dataset-openimages_9a4e81ce
+Dataset List filepath: /home/runner/MLC/repos/local/cache/get-preprocessed-dataset-openimages_9a4e81ce/annotations/openimages-mlperf.json
Scenario: Offline
Mode: AccuracyOnly
Batch size: 1
@@ -8,20 +8,14 @@ graph TD
app-mlperf-inference,d775cac873ee4231_(_cpp,_retinanet,_onnxruntime,_cpu,_test,_r5.0-dev_default,_offline_) --> get,mlcommons,inference,src
pull-git-repo,c23132ed65c4421d --> detect,os
app-mlperf-inference,d775cac873ee4231_(_cpp,_retinanet,_onnxruntime,_cpu,_test,_r5.0-dev_default,_offline_) --> pull,git,repo
-get-mlperf-inference-src,4b57186581024797 --> detect,os
-get-mlperf-inference-src,4b57186581024797 --> get,python3
-get-mlperf-inference-src,4b57186581024797 --> get,git,repo,_branch.master,_repo.https://github.com/mlcommons/inference
get-mlperf-inference-utils,e341e5f86d8342e5 --> get,mlperf,inference,src
app-mlperf-inference,d775cac873ee4231_(_cpp,_retinanet,_onnxruntime,_cpu,_test,_r5.0-dev_default,_offline_) --> get,mlperf,inference,utils
-app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_onnxruntime,_retinanet,_cpu,_offline_) --> detect,os
+app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_retinanet,_offline,_cpu,_onnxruntime_) --> detect,os
detect-cpu,586c8a43320142f7 --> detect,os
-app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_onnxruntime,_retinanet,_cpu,_offline_) --> detect,cpu
+app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_retinanet,_offline,_cpu,_onnxruntime_) --> detect,cpu
get-sys-utils-cm,bc90993277e84b8e --> detect,os
get-mlperf-inference-loadgen,64c3d98d0ba04950 --> detect,os
get-mlperf-inference-loadgen,64c3d98d0ba04950 --> get,python3
-get-mlperf-inference-src,4b57186581024797 --> detect,os
-get-mlperf-inference-src,4b57186581024797 --> get,python3
-get-mlperf-inference-src,4b57186581024797 --> get,git,repo,_branch.master,_repo.https://github.com/mlcommons/inference
get-mlperf-inference-loadgen,64c3d98d0ba04950 --> get,mlcommons,inference,src
get-mlperf-inference-loadgen,64c3d98d0ba04950 --> get,compiler
detect-cpu,586c8a43320142f7 --> detect,os
@@ -50,13 +44,10 @@ graph TD
get-generic-python-lib,94b62a682bc44791_(_pip_) --> get,python3
get-generic-python-lib,94b62a682bc44791_(_package.setuptools_) --> get,generic-python-lib,_pip
get-mlperf-inference-loadgen,64c3d98d0ba04950 --> get,generic-python-lib,_package.setuptools
-app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_onnxruntime,_retinanet,_cpu,_offline_) --> get,loadgen
-get-mlperf-inference-src,4b57186581024797 --> detect,os
-get-mlperf-inference-src,4b57186581024797 --> get,python3
-get-mlperf-inference-src,4b57186581024797 --> get,git,repo,_branch.master,_repo.https://github.com/mlcommons/inference
-app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_onnxruntime,_retinanet,_cpu,_offline_) --> get,mlcommons,inference,src
+app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_retinanet,_offline,_cpu,_onnxruntime_) --> get,loadgen
+app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_retinanet,_offline,_cpu,_onnxruntime_) --> get,mlcommons,inference,src
get-onnxruntime-prebuilt,be02c84ff57c4244_(_cpu_) --> detect,os
-app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_onnxruntime,_retinanet,_cpu,_offline_) --> get,lib,onnxruntime,lang-cpp,_cpu
+app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_retinanet,_offline,_cpu,_onnxruntime_) --> get,lib,onnxruntime,lang-cpp,_cpu
get-preprocessed-dataset-openimages,9842f1be8cba4c7b_(_validation,_NCHW_) --> get,python3
get-dataset-openimages,0a9d49b644cf4142_(_validation,_50,_default-annotations_) --> get,python3
get-generic-python-lib,94b62a682bc44791_(_requests_) --> detect,os
@@ -66,9 +57,6 @@ graph TD
get-generic-python-lib,94b62a682bc44791_(_pip_) --> get,python3
get-generic-python-lib,94b62a682bc44791_(_requests_) --> get,generic-python-lib,_pip
get-dataset-openimages,0a9d49b644cf4142_(_validation,_50,_default-annotations_) --> get,generic-python-lib,_requests
-get-mlperf-inference-src,4b57186581024797 --> detect,os
-get-mlperf-inference-src,4b57186581024797 --> get,python3
-get-mlperf-inference-src,4b57186581024797 --> get,git,repo,_branch.master,_repo.https://github.com/mlcommons/inference
get-dataset-openimages,0a9d49b644cf4142_(_validation,_50,_default-annotations_) --> mlperf,inference,source
get-generic-python-lib,94b62a682bc44791_(_boto3_) --> detect,os
detect-cpu,586c8a43320142f7 --> detect,os
@@ -113,9 +101,6 @@ graph TD
get-generic-python-lib,94b62a682bc44791_(_pycocotools_) --> get,generic-python-lib,_pip
get-dataset-openimages,0a9d49b644cf4142_(_validation,_50,_default-annotations_) --> get,generic-python-lib,_pycocotools
get-preprocessed-dataset-openimages,9842f1be8cba4c7b_(_validation,_NCHW_) --> get,dataset,object-detection,openimages,original,_validation,_50,_default-annotations
-get-mlperf-inference-src,4b57186581024797 --> detect,os
-get-mlperf-inference-src,4b57186581024797 --> get,python3
-get-mlperf-inference-src,4b57186581024797 --> get,git,repo,_branch.master,_repo.https://github.com/mlcommons/inference
get-preprocessed-dataset-openimages,9842f1be8cba4c7b_(_validation,_NCHW_) --> mlperf,mlcommons,inference,source,src
get-generic-python-lib,94b62a682bc44791_(_pycocotools_) --> get,python3
get-preprocessed-dataset-openimages,9842f1be8cba4c7b_(_validation,_NCHW_) --> get,generic-python-lib,_pycocotools
@@ -139,30 +124,27 @@ graph TD
get-preprocessed-dataset-openimages,9842f1be8cba4c7b_(_validation,_NCHW_) --> get,generic-python-lib,_numpy
get-generic-python-lib,94b62a682bc44791_(_numpy_) --> get,python3
get-preprocessed-dataset-openimages,9842f1be8cba4c7b_(_validation,_NCHW_) --> get,generic-python-lib,_numpy
-app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_onnxruntime,_retinanet,_cpu,_offline_) --> get,dataset,preprocessed,openimages,_validation,_NCHW
+app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_retinanet,_offline,_cpu,_onnxruntime_) --> get,dataset,preprocessed,openimages,_validation,_NCHW
download-file,9cdc8dc41aae437e_(_cmutil,_url.https://zenodo.org/record/6617879/files/resnext50_32x4d_fpn.onnx_) --> detect,os
download-and-extract,c67e81a4ce2649f5_(_url.https://zenodo.org/record/6617879/files/resnext50_32x4d_fpn.onnx_) --> download,file,_cmutil,_url.https://zenodo.org/record/6617879/files/resnext50_32x4d_fpn.onnx
get-ml-model-retinanet,427bc5665e4541c2_(_onnx,_fp32_) --> download-and-extract,_url.https://zenodo.org/record/6617879/files/resnext50_32x4d_fpn.onnx
-app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_onnxruntime,_retinanet,_cpu,_offline_) --> get,ml-model,retinanet,_onnx,_fp32
+app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_retinanet,_offline,_cpu,_onnxruntime_) --> get,ml-model,retinanet,_onnx,_fp32
generate-mlperf-inference-user-conf,3af4475745964b93 --> detect,os
detect-cpu,586c8a43320142f7 --> detect,os
generate-mlperf-inference-user-conf,3af4475745964b93 --> detect,cpu
generate-mlperf-inference-user-conf,3af4475745964b93 --> get,python
-get-mlperf-inference-src,4b57186581024797 --> detect,os
-get-mlperf-inference-src,4b57186581024797 --> get,python3
-get-mlperf-inference-src,4b57186581024797 --> get,git,repo,_branch.master,_repo.https://github.com/mlcommons/inference
generate-mlperf-inference-user-conf,3af4475745964b93 --> get,mlcommons,inference,src
get-mlperf-inference-sut-configs,c2fbf72009e2445b --> get,cache,dir,_name.mlperf-inference-sut-configs
generate-mlperf-inference-user-conf,3af4475745964b93 --> get,sut,configs
-app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_onnxruntime,_retinanet,_cpu,_offline_) --> generate,user-conf,mlperf,inference
+app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_retinanet,_offline,_cpu,_onnxruntime_) --> generate,user-conf,mlperf,inference
detect-cpu,586c8a43320142f7 --> detect,os
compile-program,c05042ba005a4bfa --> detect,cpu
compile-program,c05042ba005a4bfa --> get,compiler
detect-cpu,586c8a43320142f7 --> detect,os
get-compiler-flags,31be8b74a69742f8 --> detect,cpu
compile-program,c05042ba005a4bfa --> get,compiler-flags
-app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_onnxruntime,_retinanet,_cpu,_offline_) --> compile,cpp-program
+app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_retinanet,_offline,_cpu,_onnxruntime_) --> compile,cpp-program
detect-cpu,586c8a43320142f7 --> detect,os
benchmark-program,19f369ef47084895 --> detect,cpu
benchmark-program-mlperf,cfff0132a8aa4018 --> benchmark-program,program
-app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_onnxruntime,_retinanet,_cpu,_offline_) --> benchmark-mlperf
+app-mlperf-inference-mlcommons-cpp,bf62405e6c7a44bf_(_retinanet,_offline,_cpu,_onnxruntime_) --> benchmark-mlperf