The official evaluation suite and dynamic data release for MixEval.
GERBIL - General Entity annotatoR Benchmark
Python Multi-Process Execution Pool: a concurrent, asynchronous execution pool with custom resource constraints (memory, timeouts, CPU affinity and core limits, caching), load balancing, and profiling of external apps on NUMA architectures
MLOS is a project to enable autotuning for systems.
Benchmarking framework for compute-in-memory-based deep neural network accelerators (focused on on-chip training chips)
NPBench - A Benchmarking Suite for High-Performance NumPy
A toolkit for auto-generation of OpenAI Gym environments from RDDL description files.
Guidance for the design and evaluation of motion planners for quadrotors in environments of varying complexity
SustainDC is a set of Python environments for data center simulation and control using heterogeneous multi-agent reinforcement learning. It includes customizable environments for workload scheduling, cooling optimization, and battery management, with Gymnasium integration (see the Gymnasium sketch after this list).
The Arline Benchmarks platform allows various quantum circuit mapping/compression algorithms to be benchmarked against each other on a set of predefined hardware types and target circuit classes
The greatest collection of the worst code
Benchmarking machine learning inferencing on embedded hardware.
Benchmarking framework for compute-in-memory-based deep neural network accelerators (focused on inference engines)
Telco pIPeline benchmarking SYstem
Benchmarking framework for Feature Selection and Feature Ranking algorithms 🚀
Command execution time meter
Write Benchmarks like Tests (see the benchmark-as-test sketch after this list)
PHP Micro & Router Framework Benchmark
CLAMH (Cross-LAnguage Microbenchmark Harness) is a language-independent benchmark harness design, together with implementations of that design for different languages.
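
For the Gym/Gymnasium-based entries above (the RDDL toolkit and SustainDC), interaction follows the standard reset/step loop. Below is a minimal sketch using only the stock Gymnasium package; "CartPole-v1" is a stand-in environment id, since the ids those projects actually register are not listed here.

import gymnasium as gym

# Stand-in environment id; substitute the id registered by the toolkit in use
# (e.g. a SustainDC or RDDL-generated environment).
env = gym.make("CartPole-v1")
obs, info = env.reset(seed=0)

for _ in range(100):
    action = env.action_space.sample()  # replace with a trained or heuristic policy
    obs, reward, terminated, truncated, info = env.step(action)
    if terminated or truncated:
        obs, info = env.reset()

env.close()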
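
The "Write Benchmarks like Tests" entry describes the benchmark-as-test pattern: timing code lives in the test suite and runs under the test runner. The sketch below illustrates the pattern with pytest-benchmark as a well-known stand-in; the repository above may expose a different API.

# test_fib_benchmark.py -- run with: pytest test_fib_benchmark.py
def fib(n: int) -> int:
    # Deliberately naive recursive Fibonacci, used only as a workload to measure.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

def test_fib_benchmark(benchmark):
    # `benchmark` is the pytest-benchmark fixture: it calls fib repeatedly,
    # records timings, and the function still asserts like an ordinary test.
    result = benchmark(fib, 15)
    assert result == 610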