# Hard-normal Example-aware Template Mutual Matching for Industrial Anomaly Detection

Zixuan Chen, Xiaohua Xie, Lingxiao Yang, Jian-Huang Lai

International Journal of Computer Vision (IJCV), 2024.

TL;DR: HETMM is a simple yet effective template-matching framework for industrial anomaly detection that accurately detects and localizes unknown anomalies in a training-free manner.


Paper | ArXiv | Google Drive | Baidu Cloud


## Motivation


Visualization of training data (balls) and queries (cubes) via t-SNE. Existing methods' decision boundaries are dominated by the overwhelming number of easy-normal examples (blue balls), so normal queries (green cubes) near the hard-normal examples (orange balls) are prone to being misidentified as anomalies (purple cubes), leading to high false-positive or missed-detection rates. To address this issue, we propose HETMM, which constructs a robust prototype-based decision boundary that accurately distinguishes hard-normal examples from anomalies.

## Framework


The overall framework of our method. In stage I, the original template set $\mathcal{T}^{(j)}$ aggregates the features extracted by feeding $N$ collected normal images $\mathcal{Z}$ into the pre-trained backbone $\Phi$ with $M$ layers, where each color in $\mathcal{T}^{(j)}$ indicates the normal image from which a feature was extracted. To streamline $\mathcal{T}^{(j)}$ into a tiny set $\mathcal{T}^{(j)}_{K}$ with $K$ sheets ($N \ge K$), PTS selects $K$ significant prototypes from $\mathcal{T}^{(j)}$ at each pixel coordinate through sliding windows. In stage II, given a query image $q$, we first extract its features with the same pre-trained backbone $\Phi$ and then employ ATMM to obtain hierarchical anomaly maps $S^{(j)}$, where each $S^{(j)}$ is generated at the $j$-th layer. The final output $S^\dagger$ is obtained from these hierarchical maps.
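To make the two-stage pipeline concrete, here is a minimal, self-contained PyTorch sketch of the idea, not the repository's implementation: PTS is approximated by keeping, at each pixel, the $K$ template features closest to the per-pixel mean, and ATMM's mutual matching is reduced to a one-directional nearest-prototype distance. The function names (`build_template`, `select_prototypes`, `anomaly_map`), tensor shapes, and selection rule are all illustrative assumptions.

```python
import torch

def build_template(feats):
    """Stage I: stack per-layer features of N normal images into a template set T^(j)."""
    return feats  # (N, C, H, W)

def select_prototypes(template, k):
    """Toy stand-in for PTS: at each pixel, keep the k features closest to the per-pixel mean."""
    mean = template.mean(dim=0, keepdim=True)               # (1, C, H, W)
    dist = (template - mean).pow(2).sum(dim=1)              # (N, H, W)
    idx = dist.topk(k, dim=0, largest=False).indices        # (k, H, W)
    idx = idx.unsqueeze(1).expand(-1, template.size(1), -1, -1)
    return template.gather(0, idx)                          # tiny set: (k, C, H, W)

def anomaly_map(query_feats, prototypes):
    """Stage II (simplified): per-pixel distance to the nearest prototype."""
    dist = (prototypes - query_feats.unsqueeze(0)).pow(2).sum(dim=1)  # (k, H, W)
    return dist.min(dim=0).values                                     # S^(j): (H, W)

# Toy usage with random tensors standing in for real backbone features.
feats = torch.randn(100, 64, 28, 28)        # N=100 normal images, one backbone layer
prototypes = select_prototypes(build_template(feats), k=10)
score_map = anomaly_map(torch.randn(64, 28, 28), prototypes)
print(score_map.shape)                       # torch.Size([28, 28])
```

In the actual code, `run.py` orchestrates these two stages via the `--mode temp` and `--mode test` entry points shown below.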

## Code Usage

### 1) Getting started

- Python 3.9.x
- CUDA 11.1 or higher
- NVIDIA RTX 3090
- PyTorch 1.8.0 or higher

Create a Python environment using conda:

```bash
conda create -n hetmm python=3.9 -y
conda activate hetmm
```

Install the required libraries:

```bash
bash setup.sh
```

### 2) Template Generation

Original template set on MVTec AD:

```bash
python run.py --mode temp --ttype ALL --dataset MVTec_AD --datapath <data_path>
```

Tiny set formed by PTS (60 sheets) on MVTec AD:

```bash
python run.py --mode temp --ttype PTS --tsize 60 --dataset MVTec_AD --datapath <data_path>
```

Since generating the pixel-level OPTICS clusters is time-consuming (see the sketch after the folder layout below), you can instead download the "template" folder from Google Drive / Baidu Cloud and copy it into the main folder as follows:

```
HETMM/
    ├── configs/
    ├── template/
    ├── src/
    ├── run.py
    └── ...
```
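For context on the runtime, pixel-level clustering of the kind mentioned above amounts to one clustering run per pixel coordinate over the $N$ template features. A hedged sketch using scikit-learn's OPTICS (illustrative only; the shapes, `min_samples` value, and flat double loop are assumptions, not the repository's code):

```python
import numpy as np
from sklearn.cluster import OPTICS

N, C, H, W = 100, 64, 28, 28                       # assumed template dimensions
template = np.random.randn(N, C, H, W).astype(np.float32)

labels = np.empty((H, W, N), dtype=int)
for y in range(H):                                 # one OPTICS run per pixel:
    for x in range(W):                             # H * W = 784 runs in total
        feats = template[:, :, y, x]               # (N, C) features at pixel (y, x)
        labels[y, x] = OPTICS(min_samples=5).fit(feats).labels_
```

This per-pixel clustering is what makes template generation slow, and why the precomputed "template" folder is provided.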

### 3) Anomaly Prediction

Original template set on MVTec AD:

```bash
python run.py --mode test --ttype ALL --dataset MVTec_AD --datapath <data_path>
```

Tiny set formed by PTS (60 sheets) on MVTec AD:

```bash
python run.py --mode test --ttype PTS --tsize 60 --dataset MVTec_AD --datapath <data_path>
```

Please see `run.sh` and `run.py` for more details.
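If you want a quick sanity check of image-level performance on predicted anomaly maps, a common recipe is to score each image by the maximum of its map and compute AUROC against ground-truth image labels. A minimal sketch under assumed inputs (placeholder maps and labels; this is not the repository's evaluation code):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# maps: one (H, W) anomaly map per test image; labels: 1 = anomalous, 0 = normal.
maps = [np.random.rand(224, 224) for _ in range(10)]   # placeholder predictions
labels = np.array([0, 1] * 5)                          # placeholder ground truth
scores = np.array([m.max() for m in maps])             # image score = peak response
print("image-level AUROC:", roc_auc_score(labels, scores))
```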

## Citation

```bibtex
@article{Chen_2024_hetmm,
    author    = {Chen, Zixuan and Xie, Xiaohua and Yang, Lingxiao and Lai, Jianhuang},
    title     = {Hard-normal Example-aware Template Mutual Matching for Industrial Anomaly Detection},
    journal   = {International Journal of Computer Vision (IJCV)},
    publisher = {Springer},
    year      = {2024},
    doi       = {10.1007/s11263-024-02323-0},
}
```
