The main contribution we aimed for with this paper is to propose a new approach to optimizing the parameters of Stochastic Local Search (SLS), together with a better understanding of the correlation between the restart and noise parameters.

In 3.4, the assumption of a uniform distribution on both parameters is a conservative one, as we have no reason to believe that one value of delta or mu should be more likely than another. In practice, mu can be roughly estimated during the search process, but estimating delta is far less feasible. This deliberately weak assumption shows how an effective optimization can still be carried out in a real case if we know a little more about the distribution of our Markov chain.
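
To make this concrete, here is a minimal sketch of evaluating the expected search cost under such a uniform prior; run_sls, the parameter ranges, and the sample count are hypothetical stand-ins for illustration, not the procedure from the paper:

    import random

    def expected_cost_under_uniform_prior(run_sls, n_samples=1000,
                                          delta_range=(0.0, 1.0),
                                          mu_range=(0.0, 1.0)):
        # Monte Carlo estimate of the expected search cost when the restart
        # parameter delta and the noise parameter mu are drawn from
        # independent uniform priors (no value favored a priori).
        total = 0.0
        for _ in range(n_samples):
            delta = random.uniform(*delta_range)
            mu = random.uniform(*mu_range)
            total += run_sls(delta, mu)  # hypothetical: one SLS run, returns its cost
        return total / n_samples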
As you can see in 3.5, the purpose was to propose a tutorial and a framework for optimizing the parameters of SLS using SoftSLS.
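
As a rough illustration of what such a framework-level optimization could look like, the sketch below scores each (delta, mu) pair by its average cost over repeated runs and keeps the best; this simple grid search is only a stand-in for the SoftSLS-based optimization of 3.5, and run_sls is again hypothetical:

    def optimize_parameters(run_sls, deltas, mus, runs_per_setting=50):
        # Average several runs per setting to damp the randomness of SLS,
        # then return the best-scoring (delta, mu) pair.
        best_cost, best_params = float("inf"), None
        for delta in deltas:
            for mu in mus:
                avg = sum(run_sls(delta, mu)
                          for _ in range(runs_per_setting)) / runs_per_setting
                if avg < best_cost:
                    best_cost, best_params = avg, (delta, mu)
        return best_params, best_cost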

The first experiment was about showing the rough equivalence between classic SLS and the proposed SoftSLS. Although this is intuitive, we thought it useful to make this precise, so that a parameter estimated in SoftSLS can be carried over to the original SLS. In the limiting case, the two fully converge: as the probabilistic restart concentrates on its expected value, SoftSLS becomes equivalent to classic SLS.
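
The limiting equivalence can be illustrated with a toy comparison of the two restart schedules; all names here are illustrative, and the paper's actual SoftSLS definition may differ:

    import random

    def restarts_classic(delta, horizon):
        # Classic SLS: restart deterministically every round(1/delta) steps.
        period = max(1, round(1 / delta))
        return [t for t in range(1, horizon + 1) if t % period == 0]

    def restarts_soft(delta, horizon, seed=0):
        # SoftSLS-style: restart at each step independently with probability
        # delta, so the expected interval between restarts is 1/delta.
        rng = random.Random(seed)
        return [t for t in range(1, horizon + 1) if rng.random() < delta]

Over a long horizon both schedules trigger roughly delta * horizon restarts (e.g., exactly 1000 for restarts_classic(0.01, 100000) and close to 1000 for restarts_soft), which is the sense in which the soft, per-step restart matches the classic schedule in expectation.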