Commit 7411827

update readme for mindspore version of 2.3.1
WongGawa committed Nov 14, 2024
1 parent f88bbd2 commit 7411827
Showing 8 changed files with 201 additions and 190 deletions.
2 changes: 1 addition & 1 deletion GETTING_STARTED.md
@@ -29,7 +29,7 @@ to understand their behavior. Some common arguments are:
* Prepare your dataset in YOLO format. If trained with COCO (YOLO format), prepare it from [yolov5](https://github.com/ultralytics/yolov5) or the darknet.

<details onclose>

<summary><b>View More</b></summary>
```
coco/
  {train,val}2017.txt
  ...
```
14 changes: 7 additions & 7 deletions README.md
@@ -14,13 +14,13 @@

MindYOLO implements state-of-the-art YOLO series algorithms based on MindSpore.
The following table shows the corresponding `mindyolo` versions and supported `mindspore` versions.
| mindyolo | mindspore |
| :--: | :--: |
| master | master |
| 0.4 | 2.3.0 |
| 0.3 | 2.2.10 |
| 0.2 | 2.0 |
| 0.1 | 1.8 |
| mindyolo | mindspore |
| :------: | :---------: |
| master | master |
| 0.4 | 2.3.0/2.3.1 |
| 0.3 | 2.2.10 |
| 0.2 | 2.0 |
| 0.1 | 1.8 |
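
To confirm that the installed MindSpore build matches one of the rows above, a minimal check (assuming MindSpore is already installed) is:

```shell
# Print the installed MindSpore version and run its built-in installation check.
python -c "import mindspore as ms; print(ms.__version__)"
python -c "import mindspore as ms; ms.run_check()"
```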

<img src="https://raw.githubusercontent.com/mindspore-lab/mindyolo/master/.github/000000137950.jpg" />

54 changes: 28 additions & 26 deletions configs/yolov3/README.md
@@ -9,38 +9,20 @@ We present some updates to YOLO! We made a bunch of little design changes to mak
<img src="https://raw.githubusercontent.com/zhanghuiyao/pics/main/mindyolo202304071143644.png"/>
</div>

## Results
## Requirements

<details open markdown>
<summary><b>performance tested on Ascend 910(8p) with graph mode</b></summary>

| Name | Scale | BatchSize | ImageSize | Dataset | Box mAP (%) | Params | Recipe | Download |
|--------| :---: | :---: | :---: |--------------| :---: | :---: | :---: | :---: |
| YOLOv3 | Darknet53 | 16 * 8 | 640 | MS COCO 2017 | 45.5 | 61.9M | [yaml](./yolov3.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov3/yolov3-darknet53_300e_mAP455-adfb27af.ckpt) |
</details>

<details open markdown>
<summary><b>performance tested on Ascend 910*(8p)</b></summary>

| Name | Scale | BatchSize | ImageSize | Dataset | Box mAP (%) | ms/step | Params | Recipe | Download |
|--------| :---: | :---: | :---: |--------------| :---: | :---: | :---: | :---: | :---: |
| YOLOv3 | Darknet53 | 16 * 8 | 640 | MS COCO 2017 | 46.6 | 396.60 | 61.9M | [yaml](./yolov3.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindyolo/yolov3/yolov3-darknet53_300e_mAP455-81895f09-910v2.ckpt) |
</details>

<br>

#### Notes

- Box mAP: Accuracy reported on the validation set.
- We referred to a commonly used third-party [YOLOv3](https://github.com/ultralytics/yolov3) implementation.
| mindspore | ascend driver | firmware | cann toolkit/kernel |
| :-------: | :-----------: | :---------: | :-----------------: |
| 2.3.1 | 24.1.RC2 | 7.3.0.1.231 | 8.0.RC2.beta1 |
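
The ascend driver and firmware columns refer to the host environment; a rough way to inspect them (the `npu-smi` tool ships with the Ascend driver, so this assumes a standard driver install) is:

```shell
# The header of npu-smi's output reports the installed driver version; the table lists NPU status.
npu-smi info
```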

## Quick Start

Please refer to the [GETTING_STARTED](https://github.com/mindspore-lab/mindyolo/blob/master/GETTING_STARTED.md) in MindYOLO for details.

### Training

<details open>
<details open markdown>
<summary><b>View More</b></summary>

#### - Pretraining Model

@@ -85,9 +67,29 @@ To validate the accuracy of the trained model, you can use `test.py` and parse t
python test.py --config ./configs/yolov3/yolov3.yaml --device_target Ascend --weight /PATH/TO/WEIGHT.ckpt
```
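
A minimal sketch of an 8-NPU training run matching the settings in the tables below (assuming MindYOLO's standard `train.py` entry point and its `--is_parallel` flag):

```shell
# Distributed training on 8 Ascend NPUs; the per-NPU batch size comes from configs/yolov3/yolov3.yaml.
mpirun --allow-run-as-root -n 8 python train.py --config ./configs/yolov3/yolov3.yaml --is_parallel True
```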

### Deployment
## Performance

Experiments were run on Ascend 910*(8p) with MindSpore 2.3.1 in graph mode.

| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | mAP | recipe | weight |
| :------: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| YOLOv3 | 8 | 16 | 640x640 | O2 | 274.32s | 383.68 | 333.61 | 46.6% | [yaml](./yolov3.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindyolo/yolov3/yolov3-darknet53_300e_mAP455-81895f09-910v2.ckpt) |


Experiments were run on Ascend 910(8p) with MindSpore 2.3.1 in graph mode.

| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | mAP | recipe | weight |
| :------: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| YOLOv3 | 8 | 16 | 640x640 | O2 | 160.80s | 409.66 | 312.45 | 45.5% | [yaml](./yolov3.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov3/yolov3-darknet53_300e_mAP455-adfb27af.ckpt) |


<br>

### Notes

- Box mAP: Accuracy reported on the validation set.
- We referred to a commonly used third-party [YOLOv3](https://github.com/ultralytics/yolov3) implementation.

See [here](../../deploy/README.md).

## References

51 changes: 26 additions & 25 deletions configs/yolov4/README.md
@@ -23,30 +23,11 @@ AP (65.7% AP50) for the MS COCO dataset at a realtime speed of 65 FPS on Tesla V
<img src="https://github.com/yuedongli1/images/raw/master/mindyolo20230509.png"/>
</div>

## Results
## Requirements

<details open markdown>
<summary><b>performance tested on Ascend 910(8p) with graph mode</b></summary>

| Name | Scale | BatchSize | ImageSize | Dataset | Box mAP (%) | Params | Recipe | Download |
|--------| :---: | :---: | :---: |--------------| :---: | :---: | :---: | :---: |
| YOLOv4 | CSPDarknet53 | 16 * 8 | 608 | MS COCO 2017 | 45.4 | 27.6M | [yaml](./yolov4.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov4/yolov4-cspdarknet53_320e_map454-50172f93.ckpt) |
| YOLOv4 | CSPDarknet53(silu) | 16 * 8 | 608 | MS COCO 2017 | 45.8 | 27.6M | [yaml](./yolov4-silu.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov4/yolov4-cspdarknet53_silu_320e_map458-bdfc3205.ckpt) |
</details>

<details open markdown>
<summary><b>performance tested on Ascend 910*(8p)</b></summary>

| Name | Scale | BatchSize | ImageSize | Dataset | Box mAP (%) | ms/step | Params | Recipe | Download |
|--------| :---: | :---: | :---: |--------------| :---: | :---: | :---: | :---: | :---: |
| YOLOv4 | CSPDarknet53 | 16 * 8 | 608 | MS COCO 2017 | 46.1 | 337.25 | 27.6M | [yaml](./yolov4.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindyolo/yolov4/yolov4-cspdarknet53_320e_map454-64b8506f-910v2.ckpt) |
</details>

<br>

#### Notes

- Box mAP: Accuracy reported on the validation set.
| mindspore | ascend driver | firmware | cann toolkit/kernel |
| :-------: | :-----------: | :---------: | :-----------------: |
| 2.3.1 | 24.1.RC2 | 7.3.0.1.231 | 8.0.RC2.beta1 |

## Quick Start

@@ -55,6 +36,7 @@ Please refer to the [GETTING_STARTED](https://github.com/mindspore-lab/mindyolo/
### Training

<details open>
<summary><b>View More</b></summary>

#### - Pretraining Model

@@ -104,9 +86,28 @@ To validate the accuracy of the trained model, you can use `test.py` and parse t
python test.py --config ./configs/yolov4/yolov4-silu.yaml --device_target Ascend --iou_thres 0.6 --weight /PATH/TO/WEIGHT.ckpt
```
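
For a quick visual check of a trained checkpoint, single-image inference can be run through the demo script; a sketch (assuming MindYOLO's `demo/predict.py` interface, with placeholder paths) is:

```shell
# Run detection on one image with a trained YOLOv4 checkpoint and visualize the result.
python demo/predict.py --config ./configs/yolov4/yolov4.yaml --weight /PATH/TO/WEIGHT.ckpt --image_path /PATH/TO/IMAGE.jpg
```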

### Deployment
## Performance

See [here](../../deploy/README.md).
Experiments were run on Ascend 910*(8p) with MindSpore 2.3.1 in graph mode.

| model name | backbone | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | mAP | recipe | weight |
| :--------: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| YOLOv4 | CSPDarknet53 | 8 | 16 | 608x608 | O2 | 467.47s | 308.43 | 415.01 | 46.1% | [yaml](./yolov4.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindyolo/yolov4/yolov4-cspdarknet53_320e_map454-64b8506f-910v2.ckpt) |


Experiments were run on Ascend 910(8p) with MindSpore 2.3.1 in graph mode.

| model name | backbone | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | mAP | recipe | weight |
| :--------: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| YOLOv4 | CSPDarknet53 | 8 | 16 | 608x608 | O2 | 188.52s | 505.98 | 252.97 | 45.4% | [yaml](./yolov4.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov4/yolov4-cspdarknet53_320e_map454-50172f93.ckpt) |
| YOLOv4 | CSPDarknet53(silu) | 8 | 16 | 608x608 | O2 | 274.18s | 443.21 | 288.80 | 45.8% | [yaml](./yolov4-silu.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov4/yolov4-cspdarknet53_silu_320e_map458-bdfc3205.ckpt) |


<br>

### Notes

- Box mAP: Accuracy reported on the validation set.

## References

63 changes: 32 additions & 31 deletions configs/yolov5/README.md
@@ -6,36 +6,11 @@ YOLOv5 is a family of object detection architectures and models pretrained on th
<img src="https://raw.githubusercontent.com/zhanghuiyao/pics/main/mindyolo20230407113509.png"/>
</div>

## Results

<details open markdown>
<summary><b>performance tested on Ascend 910(8p) with graph mode</b></summary>

| Name | Scale | BatchSize | ImageSize | Dataset | Box mAP (%) | Params | Recipe | Download |
|--------| :---: | :---: | :---: |--------------| :---: | :---: | :---: | :---: |
| YOLOv5 | N | 32 * 8 | 640 | MS COCO 2017 | 27.3 | 1.9M | [yaml](./yolov5n.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov5/yolov5n_300e_mAP273-9b16bd7b.ckpt) |
| YOLOv5 | S | 32 * 8 | 640 | MS COCO 2017 | 37.6 | 7.2M | [yaml](./yolov5s.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov5/yolov5s_300e_mAP376-860bcf3b.ckpt) |
| YOLOv5 | M | 32 * 8 | 640 | MS COCO 2017 | 44.9 | 21.2M | [yaml](./yolov5m.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov5/yolov5m_300e_mAP449-e7bbf695.ckpt) |
| YOLOv5 | L | 32 * 8 | 640 | MS COCO 2017 | 48.5 | 46.5M | [yaml](./yolov5l.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov5/yolov5l_300e_mAP485-a28bce73.ckpt) |
| YOLOv5 | X | 16 * 8 | 640 | MS COCO 2017 | 50.5 | 86.7M | [yaml](./yolov5x.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov5/yolov5x_300e_mAP505-97d36ddc.ckpt) |
</details>

<details open markdown>
<summary><b>performance tested on Ascend 910*(8p)</b></summary>

| Name | Scale | BatchSize | ImageSize | Dataset | Box mAP (%) | ms/step | Params | Recipe | Download |
|--------| :---: | :---: | :---: |--------------| :---: | :---: | :---: | :---: | :---: |
| YOLOv5 | N | 32 * 8 | 640 | MS COCO 2017 | 27.4 | 736.08 | 1.9M | [yaml](./yolov5n.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindyolo/yolov5/yolov5n_300e_mAP273-bedf9a93-910v2.ckpt) |
| YOLOv5 | S | 32 * 8 | 640 | MS COCO 2017 | 37.6 | 787.34 | 7.2M | [yaml](./yolov5s.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindyolo/yolov5/yolov5s_300e_mAP376-df4a45b6-910v2.ckpt) |
</details>

<br>
## Requirements

#### Notes

- Box mAP: Accuracy reported on the validation set.
- We refer to the official [YOLOV5](https://github.com/ultralytics/yolov5) to reproduce the P5 series models, with the following differences:
  1. We use 8 NPUs (Ascend 910) for training, with a per-NPU batch size of 32; this differs from the official code.
| mindspore | ascend driver | firmware | cann toolkit/kernel |
| :-------: | :-----------: | :---------: | :-----------------: |
| 2.3.1 | 24.1.RC2 | 7.3.0.1.231 | 8.0.RC2.beta1 |

## Quick Start

@@ -44,6 +19,7 @@ Please refer to the [GETTING_STARTED](https://github.com/mindspore-lab/mindyolo/
### Training

<details open>
<summary><b>View More</b></summary>

#### - Distributed Training

@@ -79,9 +55,34 @@ To validate the accuracy of the trained model, you can use `test.py` and parse t
python test.py --config ./configs/yolov5/yolov5n.yaml --device_target Ascend --weight /PATH/TO/WEIGHT.ckpt
```
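
For quick experiments on a single device, standalone training is also possible; a minimal sketch (assuming `train.py` defaults; the yaml hyper-parameters are tuned for the 8-NPU setup shown below) is:

```shell
# Standalone (single-device) training of YOLOv5n; lower the batch size in the yaml if memory is limited.
python train.py --config ./configs/yolov5/yolov5n.yaml
```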

### Deployment
## Performance

Experiments were run on Ascend 910*(8p) with MindSpore 2.3.1 in graph mode.

See [here](../../deploy/README.md).
| model name | scale | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | mAP | recipe | weight |
| :--------: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| YOLOv5 | N | 8 | 32 | 640x640 | O2 | 377.81s | 520.79 | 491.56 | 27.4% | [yaml](./yolov5n.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindyolo/yolov5/yolov5n_300e_mAP273-bedf9a93-910v2.ckpt) |
| YOLOv5 | S | 8 | 32 | 640x640 | O2 | 378.18s | 526.49 | 486.30 | 37.6% | [yaml](./yolov5s.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindyolo/yolov5/yolov5s_300e_mAP376-df4a45b6-910v2.ckpt) |


Experiments were run on Ascend 910(8p) with MindSpore 2.3.1 in graph mode.

| model name | scale | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | mAP | recipe | weight |
| :--------: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| YOLOv5 | N | 8 | 32 | 640x640 | O2 | 233.25s | 650.57 | 393.50 | 27.3% | [yaml](./yolov5n.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov5/yolov5n_300e_mAP273-9b16bd7b.ckpt) |
| YOLOv5 | S | 8 | 32 | 640x640 | O2 | 166.00s | 650.14 | 393.76 | 37.6% | [yaml](./yolov5s.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov5/yolov5s_300e_mAP376-860bcf3b.ckpt) |
| YOLOv5 | M | 8 | 32 | 640x640 | O2 | 256.51s | 712.31 | 359.39 | 44.9% | [yaml](./yolov5m.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov5/yolov5m_300e_mAP449-e7bbf695.ckpt) |
| YOLOv5 | L | 8 | 32 | 640x640 | O2 | 274.15s | 723.35 | 353.91 | 48.5% | [yaml](./yolov5l.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov5/yolov5l_300e_mAP485-a28bce73.ckpt) |
| YOLOv5 | X | 8 | 16 | 640x640 | O2 | 436.18s | 569.96 | 224.58 | 50.5% | [yaml](./yolov5x.yaml) | [weights](https://download.mindspore.cn/toolkits/mindyolo/yolov5/yolov5x_300e_mAP505-97d36ddc.ckpt) |


<br>

### Notes

- Box mAP: Accuracy reported on the validation set.
- We refer to the official [YOLOV5](https://github.com/ultralytics/yolov5) to reproduce the P5 series models, with the following differences:
  1. We use 8 NPUs (Ascend 910) for training, with a per-NPU batch size of 32; this differs from the official code.

## References
