This repository contains code implementing the algorithms proposed in the paper Generalizing Gaussian Smoothing for Random Search, Gao and Sener (ICML 2022).
In particular, we provide the code used to obtain the experimental results on linear regression and the Nevergrad benchmark. For the online RL experiments, we used the ARS repository; our proposed algorithms can be implemented by modifying the sampling distribution used to fill the shared noise table. Please see the paper for additional details and the hyperparameters used.
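To illustrate where the sampling distribution enters, here is a minimal sketch of an antithetic random-search update in which the perturbation distribution is a pluggable function. This is an illustrative skeleton, not the paper's implementation: the function names (`random_search_step`, `sample_fn`) are our own, and the specific non-Gaussian distributions and scalings studied in the paper are not reproduced here.

```python
import numpy as np

def random_search_step(f, x, sample_fn, sigma=0.1, lr=0.05, num_dirs=8, rng=None):
    """One antithetic random-search update of x on objective f.

    sample_fn(rng, size) draws the perturbation directions; swapping this
    function (Gaussian vs. another distribution) is the knob that
    generalized smoothing methods tune.
    """
    rng = rng if rng is not None else np.random.default_rng()
    deltas = sample_fn(rng, (num_dirs, x.size))
    grad = np.zeros_like(x)
    for d in deltas:
        # Antithetic (two-sided) finite-difference estimate along direction d.
        grad += (f(x + sigma * d) - f(x - sigma * d)) * d
    grad /= 2.0 * sigma * num_dirs
    return x - lr * grad

# Classical Gaussian smoothing corresponds to standard normal directions;
# replacing this sampler yields a different smoothing distribution.
gaussian = lambda rng, size: rng.standard_normal(size)
```

For example, running this step repeatedly with the `gaussian` sampler on a simple quadratic objective drives the iterate toward the minimizer; substituting a different `sample_fn` changes only the distribution of the perturbations, not the update rule.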
The code is written in Python 3. Aside from the standard libraries, NumPy and Matplotlib are required. The linear regression experiments additionally require SciPy, and the benchmark experiments require the nevergrad package.
Please see the READMEs in the LinearRegression and benchmarks folders for further instructions.
To cite this repository in your research, please reference the following paper:
Gao, Katelyn, and Ozan Sener. "Generalizing Gaussian Smoothing for Random Search." International Conference on Machine Learning. PMLR, 2022.
@inproceedings{gao2022generalizing,
title={Generalizing Gaussian Smoothing for Random Search},
author={Gao, Katelyn and Sener, Ozan},
booktitle={International Conference on Machine Learning},
pages={7077--7101},
year={2022},
organization={PMLR}
}
If you have questions, please contact [email protected].