Transforming Neural Architecture Search (NAS) into multi-objective optimization problems. A benchmark suite for testing evolutionary algorithms in deep learning.
Neural Architecture Search as Multiobjective Optimization Benchmarks: Problem Formulation and Performance Assessment arXiv
EvoXBench is a platform offering instant benchmarking of evolutionary multi-objective optimization (EMO) algorithms in neural architecture search (NAS), with ready-to-use test suites. It enables efficient performance assessment with NO requirement for GPUs or PyTorch/TensorFlow, making it accessible to a broader range of research applications. Its test suites cover a variety of datasets (CIFAR10, ImageNet, Cityscapes, etc.), search spaces (NASBench101, NASBench201, NATS, DARTS, ResNet50, Transformer, MNV3, MoSegNAS, etc.), and hardware devices (Eyeriss, GPUs, Samsung Note10, etc.). It also provides a versatile interface compatible with multiple programming languages (Java, Matlab, Python, etc.).
📢 Latest News & Updates
EvoXBench has been updated to version 1.0.5! This release fixes bugs in CitySeg/MOP10 and in the HV calculation of the CitySeg/MOP test suite.
If you're already on board with EvoXBench, give this command a spin: pip install evoxbench==1.0.5.
⭐️ Key Features
📐 General NAS Problem Formulation
Formulating NAS tasks into general multi-objective optimization problems.
Exploring NAS's nuanced traits through the prism of evolutionary optimization.
🛠️ Efficient Benchmarking Pipeline
Presenting an end-to-end workflow for instant benchmark assessments of EMO algorithms.
Providing fitness evaluations as instant as in numerical optimization (see the sketch after this list).
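As a rough illustration, the snippet below treats a NAS benchmark as a plain numerical optimization problem: sample a few architecture encodings and query their objective values, with no GPU or deep-learning framework involved. The c10mop helper and the search_space.sample / evaluate calls are assumptions about the Python interface rather than verbatim API; consult the documentation for the exact signatures.

```python
# Minimal sketch: instant fitness evaluation as a numerical query.
# NOTE: the helper `c10mop` and the `search_space.sample` / `evaluate`
# methods are assumed names based on typical EvoXBench usage; verify
# against the official documentation before relying on them.
from evoxbench.test_suites import c10mop

benchmark = c10mop(1)                  # first problem of the CIFAR-10 test suite (assumed)
X = benchmark.search_space.sample(10)  # 10 random architecture encodings (assumed)
F = benchmark.evaluate(X)              # objective values -- no GPU, no PyTorch/TensorFlow
print(F)
```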
📊 Comprehensive Test Suites
Encompassing a wide spectrum of datasets, search spaces, and hardware devices.
Ready-to-use multi-objective optimization test suites with up to eight objectives (see the sketch below).
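For a quick look at problem dimensions across suites, the sketch below instantiates one problem each from the CIFAR-10 and Cityscapes test suites and prints the number of decision variables and objectives. The helper names (c10mop, citysegmop) and the n_var / n_obj attributes are assumptions; the actual interface may differ.

```python
# Sketch: inspecting problem sizes across test suites.
# NOTE: `c10mop`/`citysegmop` and the `n_var`/`n_obj` attributes are
# assumed names; check the documentation for the exact interface.
from evoxbench.test_suites import c10mop, citysegmop

for make_problem, pid in [(c10mop, 1), (citysegmop, 10)]:
    problem = make_problem(pid)
    print(type(problem).__name__, "n_var =", problem.n_var, "n_obj =", problem.n_obj)
```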
Get Started
Tap the image to embark on the introductory video voyage.
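A minimal setup sketch is shown below: install the package, point EvoXBench at the downloaded database and data directories, and you are ready to query benchmarks. The config import path and the local directories are assumptions (the paths are placeholders for wherever you unpack the database archive); follow the official instructions for the actual download links and paths.

```python
# Minimal setup sketch (paths are placeholders; the `config` import path is assumed).
#   pip install evoxbench==1.0.5
from evoxbench.database.init import config

# Point EvoXBench to the unpacked database and data folders downloaded
# from the project page (hypothetical local paths shown here).
config("/path/to/database", "/path/to/data")
```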
EvoX: A computing framework for distributed GPU-acceleration of evolutionary computation, supporting a wide spectrum of evolutionary algorithms and test problems. Check it out here.
Citing EvoXBench
If you use EvoXBench in your research and want to cite it in your work, please use:
@article{EvoXBench,
title={Neural Architecture Search as Multiobjective Optimization Benchmarks: Problem Formulation and Performance Assessment},
author={Lu, Zhichao and Cheng, Ran and Jin, Yaochu and Tan, Kay Chen and Deb, Kalyanmoy},
journal={IEEE Transactions on Evolutionary Computation},
year={2023},
publisher={IEEE}
}
Acknowledgements
A big shoutout to the following projects that have made EvoXBench possible: