Dreaming of MetaheuristicsThoughts about metaheuristics for hard optimizationtag:nojhan.free.fr,2019:/metah/index.php2007-12-24T10:14:27+01:00DotCleardaily12007-12-24T10:14:27+01:00Blog moving2007-12-24T10:14:27+01:00tag:nojhan.free.fr,2007-12-24:/metah/20nojhanDue to some problems with my web hosting (unmanaged spam, arbitrarily deleted tables, etc.), this blog is moving to the following URL: http://metah.nojhan.net
The new RSS feed is located at: http://metah.nojhan.net/feed/rss2... <p>Due to some problems with my web hosting (unmanaged spam, arbitrarily deleted tables, etc.), this blog is moving to the following URL: http://metah.nojhan.net</p>
<p>The new RSS feed is located at: http://metah.nojhan.net/feed/rss2</p>Classification of metaheuristics2007-10-12T16:35:51+02:00tag:nojhan.free.fr,2007-10-12:/metah/19nojhanI eventually found some time to try a graphical representation of how metaheuristics could be classified.
Here is a slidified version, which shows each class independently:
And a static image version:
Note that some metaheuristics are not completely contained in certain classes, this... <p>I eventually found some time to try a graphical representation of how metaheuristics could be classified.</p>
<p>Here is a <em>slidified</em> version, which shows each class independently:</p>
<object type="application/x-shockwave-flash" data="http://s3.amazonaws.com/slideshare/ssplayer.swf?id=118457&doc=metaheuristics-classifications1993" width="425" height="348"><param name="movie" value="http://s3.amazonaws.com/slideshare/ssplayer.swf?id=118457&doc=metaheuristics-classifications1993" /></object>
<p>And a static image version:
<a href="http://nojhan.free.fr/metah/images/metaheuristics_classification.jpeg"><img src="/metah/images/metaheuristics_classification.TN__jpeg" alt="Graphical classification of metaheuristics" /></a></p>
<p>Note that some metaheuristics are not completely contained in certain classes; this indicates that the method may or may not be considered part of the class, depending on your point of view.</p>
<p>I have reported the following metaheuristics:</p>
<ul>
<li>genetic programming, genetic algorithms, differential evolution,</li>
<li>evolution strategies,</li>
<li>estimation of distribution algorithms,</li>
<li>particle swarm optimization,</li>
<li>ant colony optimization,</li>
<li>simulated annealing,</li>
<li>tabu search,</li>
<li>GRASP,</li>
<li>variable neighborhood search,</li>
<li>iterated, stochastic and guided local search.</li>
</ul>
<p>And the following classes:</p>
<ul>
<li>metaheuristics <em>vs</em> local search,</li>
<li>population metaheuristics <em>vs</em> trajectory-based ones,</li>
<li>evolutionary computation or not,</li>
<li>nature-inspired methods or not,</li>
<li>dynamic objective function <em>vs</em> static ones,</li>
<li>memory-based algorithms <em>vs</em> memory-less,</li>
<li>implicit, explicit or direct metaheuristics.</li>
</ul>
<p>I proposed the last class myself, so it may not be well known. You will find more information about it in the following paper:
<em><a href="http://www.nojhan.net/pro/spip.php?article19" hreflang="en">Adaptive Learning Search, a new tool to help comprehending metaheuristics</a>, J. Dreo, J.-P. Aumasson, W. Tfaili, P. Siarry, International Journal on Artificial Intelligence Tools, Vol. 16, No. 3, June 2007</em></p>
<p>I didn't place a <em>stochastic</em> category, as it seems a bit difficult to represent graphically. Indeed, many methods could be "stochasticized" or "derandomized" in several ways.</p>
<p>There are surely some omissions or errors; feel free to give your point of view with a trackback, an email, or by modifying the SVG source version (comments are disabled due to spam that I haven't had time to fight properly).</p>Network of books about metaheuristics2007-09-04T15:06:54+02:00tag:nojhan.free.fr,2007-09-04:/metah/18nojhanPlaying with Touchgraph, I have tried to search on Amazon for books about metaheuristics. The results are interesting, as one can see on the following graphs:
This one only shows the books themselves. Note the many unconnected books on the bottom left and the highly connected books about... <p>Playing with <a href="http://www.touchgraph.com" hreflang="en">Touchgraph</a>, I have tried to <a href="http://www.touchgraph.com/TGAmazonBrowser.html" hreflang="en">search on Amazon</a> for books about metaheuristics. The results are interesting, as one can see on the following graphs:</p>
<p>This one only shows the books themselves. Note the many unconnected books on the bottom left and the highly connected books about evolutionary computation. I also find the machine-learning topic at the top and the connections with computer programming on the right interesting. It surprises me that my "Metaheuristics for Hard Optimization" book is the only one connected with such topics... maybe because of the case studies.</p>
<p><a href="http://nojhan.free.fr/metah/images/touchgraph_books.png" hreflang="en"><img src="/metah/images/touchgraph_books.TN__.png" alt="" /></a></p>
<p>Check also the network of authors and subjects:</p>
<p><a href="http://nojhan.free.fr/metah/images/touchgraph_subjects-authors.png" hreflang="en"><img src="/metah/images/touchgraph_subjects-authors.TN__.png" alt="" /></a></p>
<p>The graph for authors and publishers (Springer is the biggest, as one would expect):</p>
<p><a href="http://nojhan.free.fr/metah/images/touchgraph_publishers-authors.png" hreflang="en"><img src="/metah/images/touchgraph_publishers-authors.TN__.png" alt="" /></a></p>
<p>And finally, all the items on one graph:</p>
<p><a href="http://nojhan.free.fr/metah/images/touchgraph_all.png" hreflang="en"><img src="/metah/images/touchgraph_all.TN__.png" alt="" /></a></p>Hybridization : estimation of distribution as a meta-model filter generator for metaheuristics ?2007-07-27T15:56:41+02:00tag:nojhan.free.fr,2007-07-27:/metah/17nojhanAn interesting idea is to use a meta-model (an a priori representation of the problem) as a filter to bias the sample produced by metaheuristics. This approach seems especially promising for engineering problems, where computing the objective function is very expensive.
One simple form of meta-model is... <p>An interesting idea is to use a meta-model (an a priori representation of the problem) as a filter to bias the sample produced by metaheuristics. This approach seems especially promising for engineering problems, where computing the objective function is very expensive.</p>
<p>One simple form of meta-model is a probability density function (PDF) approximating the shape of the objective function. This PDF can then be used to filter out bad points <em>before</em> evaluation.</p>
<p>Why not, then, directly use an EDA to generate the sample? Because one can imagine that the problem's shape is not well known, and that using a complex PDF is impossible (too expensive to compute, for example). Then, using a classical indirect metaheuristic (say, an evolutionary algorithm) would be preferable (computationally inexpensive) for the sample generation. If one knows a good approximation to use for the distribution of the EDA (not too computationally expensive), one can imagine using the best of both worlds.</p>
<p>An example could be a problem with real variables: using an EDA with a multivariate normal distribution is computationally expensive (mainly due to the estimation of the covariance), and using a mixture of Gaussian kernels makes it difficult to have an <em>a priori</em> on the problem. Thus, why not use an indirect metaheuristic to handle the sample generation, and a meta-model whose parameters are estimated from the previous sample, according to a chosen distribution?</p>
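A minimal Python sketch of this filtering idea, assuming a univariate problem and a single-Gaussian meta-model (the function names and the acceptance threshold are hypothetical illustration choices, not a published method):

```python
import math
import random

def fit_gaussian(sample):
    """Estimate the meta-model's parameters from the previously selected sample."""
    n = len(sample)
    mu = sum(x for x in sample) / n
    var = sum((x - mu) ** 2 for x in sample) / n
    return mu, max(math.sqrt(var), 1e-12)

def density(x, mu, sigma):
    """Gaussian PDF: a cheap a priori approximation of the objective's shape."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def filtered_sample(generate, previous_good, n, threshold=0.1):
    """Let the metaheuristic's operator `generate` propose candidates, but keep
    only those the meta-model finds promising, so the expensive objective
    function is evaluated on fewer bad points."""
    mu, sigma = fit_gaussian(previous_good)
    peak = density(mu, mu, sigma)
    kept = []
    while len(kept) < n:
        x = generate()
        if density(x, mu, sigma) / peak >= threshold:  # filter before evaluation
            kept.append(x)
    return kept

random.seed(0)
previous_good = [1.9, 2.1, 2.0, 1.8, 2.2]  # sample selected at the last iteration
candidates = filtered_sample(lambda: random.uniform(-10.0, 10.0), previous_good, 5)
print(candidates)
```

Here the generator stands in for any variation operator (mutation, crossover...); only candidates whose meta-model density is within a fraction of the modal density survive to the costly evaluation step.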
<p>One more hybridization to try...</p>Error metrics2007-07-05T12:04:13+02:00tag:nojhan.free.fr,2007-07-05:/metah/16nojhanMany metrics are used to assess the quality of the approximations found by metaheuristics. Two of them are used really often: distance to the true optimum according to its position and to its value.
Unfortunately, the objective function's shape can vary a lot in real-world problems, making these metrics... <p>Many metrics are used to assess the quality of the approximations found by metaheuristics. Two of them are used really often: distance to the true optimum according to its <em>position</em> and to its <em>value</em>.</p>
<p>Unfortunately, the objective function's shape can vary a lot in real-world problems, making these metrics difficult to interpret. For example, if the optimum is in a very deep valley (in value), a solution close to it in position does not necessarily mean that the algorithm has learned the shape of that valley well. Conversely, a solution close to the optimum in value does not necessarily lie in the same valley.</p>
<p>One metric that can counter these drawbacks is a distance that takes into account the problem's parameters as well as the value dimension.</p>
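For illustration, such a combined metric can be sketched in Python by treating the solution and the optimum as points in the joint (position, value) space; the weights are hypothetical and would have to be tuned to the problem's scales:

```python
import math

def combined_distance(x, fx, x_opt, f_opt, w_pos=1.0, w_val=1.0):
    """Euclidean distance to the optimum over both the decision variables
    (position) and the objective value, so that a solution close in only
    one of the two dimensions still scores a large error."""
    pos2 = sum((a - b) ** 2 for a, b in zip(x, x_opt))
    val2 = (fx - f_opt) ** 2
    return math.sqrt(w_pos * pos2 + w_val * val2)

# Close in value (0.1) but far in position: the metric stays large.
d = combined_distance([3.0, 4.0], 0.1, [0.0, 0.0], 0.0)
print(round(d, 3))  # → 5.001
```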
<p>In any case, the choice of distance to use depends on the problem.</p>Random draw in a sphere2007-03-28T19:00:23+02:00tag:nojhan.free.fr,2007-03-28:/metah/15nojhanWhen adapting combinatorial metaheuristics to continuous problems, one sometimes uses a sphere as an approximation of the "neighborhood". The idea is thus to draw the neighbours around a solution, for example in order to apply a simple mutation in a genetic algorithm.
Sometimes, one chooses to use... <p>When adapting combinatorial metaheuristics to continuous problems, one sometimes uses a sphere as an approximation of the "neighborhood". The idea is thus to draw the neighbours around a solution, for example in order to apply a simple mutation in a genetic algorithm.</p>
<p>Sometimes, one chooses to use a uniform law, but how does one draw random vectors <em>uniformly</em> in a hyper-sphere?</p>
<p>The first idea that comes to mind is to use a polar coordinate system and draw the radius <i>r</i> and the angles <i>a<sub>1</sub>, a<sub>2</sub>, ..., a<sub>N</sub></i> with a uniform law. Then, one converts the coordinates to the rectangular system, <i>x<sub>1</sub>, x<sub>2</sub>, ..., x<sub>N</sub></i>.</p>
<p>The result may be interesting for metaheuristic design, but it is not a uniform distribution:</p>
<img src="/metah/images/randHS_false.png" />
<p>The correct method is to draw each <i>x<sub>i</sub></i> according to:
<i>x<sub>i</sub> = (r<sup>1/N</sup>·a<sub>i</sub>) / √(∑<sub>j=1</sub><sup>N</sup> a<sub>j</sub><sup>2</sup>)</i>
<br />
(in L<sup>A</sup>T<sub>E</sub>X :
<code>$x_{i}=\frac{r^{1/N}\cdot a_{i}}{\sqrt{{\displaystyle \sum _{j=1}^{N}a_{j}^{2}}}}$</code>)</p>
<p>with <i>r</i> drawn uniformly from <i>U<sub>0,1</sub></i> (one draw per vector) and each <i>a<sub>i</sub></i> following a normal law <i>N<sub>0,1</sub></i>.</p>
<p>The result is then a true uniform distribution:</p>
<img src="/metah/images/randHS_ok.png" />
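This method can be sketched in a few lines of Python (standard library only; `dim` and `radius` are illustrative parameters):

```python
import math
import random

def uniform_in_sphere(dim, radius=1.0):
    """Draw one point uniformly inside a dim-dimensional hyper-sphere:
    a vector of N(0,1) draws gives a uniform direction once normalized,
    and scaling the radius by U(0,1)**(1/dim) gives the correct
    radial density."""
    a = [random.gauss(0.0, 1.0) for _ in range(dim)]
    norm = math.sqrt(sum(v * v for v in a))
    r = random.random() ** (1.0 / dim)
    return [radius * r * v / norm for v in a]

random.seed(42)
point = uniform_in_sphere(dim=3)
print(math.sqrt(sum(v * v for v in point)) <= 1.0)  # the point lies inside the unit sphere
```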
<p>Credit goes to <a href="http://www.mauriceclerc.net/">Maurice Clerc</a> for detecting and solving the problem.</p>Evolving Objects 1.0 is out2007-01-07T20:10:31+01:00tag:nojhan.free.fr,2007-01-07:/metah/14nojhanThe EO framework has just reached the 1.0 version. This is one of the most interesting libraries for metaheuristics.
It is a templatized C++ framework, with a component-based architecture. EO is focused on "evolutionary computing" (a synonym of metaheuristics, imho) and can be used for any... <p>The EO framework has just reached <a href="http://sourceforge.net/project/showfiles.php?group_id=9775" hreflang="en">the 1.0 version</a>. This is one of the most interesting libraries for metaheuristics.</p>
<p>It is a templatized C++ framework, with a component-based architecture. EO is focused on "evolutionary computing" (a synonym of metaheuristics, imho) and can be used for any population-based metaheuristic. There exist versions for local search, multi-objective optimization, and parallel architectures... a really good piece of software :-)</p>Random and learning2006-12-21T09:00:43+01:00tag:nojhan.free.fr,2006-12-21:/metah/11nojhanThe Machine Learning (Theory) blog, by John Langford, has a very interesting post on "Explicit Randomization in Learning algorithms".
The post and its comments talk about machine learning, but largely apply to metaheuristics. The page lists several reasons for using... <p>The <em><a href="http://hunch.net/" hreflang="en">Machine Learning (Theory)</a></em> blog, by <a href="http://hunch.net/~jl/" hreflang="en">John Langford</a>, has a very interesting post on "<a href="http://hunch.net/wp-trackback.php?p=239" hreflang="en">Explicit Randomization in Learning algorithms</a>".</p>
<p>The post and its comments talk about machine learning, but can largely be applied to metaheuristics as well. The page lists several reasons for using randomization, some of which are of special interest for metaheuristics:</p>
<ol>
<li>symmetry breaking as a way to make decisions, which is of great importance for metaheuristics, which must learn and choose where the "promising regions" are;</li>
<li>overfit avoidance, which is related to the intensification/diversification balance problem;</li>
<li>adversary defeating and bias suppression, which can be interpreted as trying to design a true <em>meta</em>-heuristic (i.e. one that can be applied to several problems without major changes).</li>
</ol>
<p>Of course, it should be possible to design a completely deterministic algorithm that makes decisions, achieves a correct intensification/diversification balance, and can tackle all problems... Even if this forces one to integrate the problems themselves into the algorithm, it <em>should</em> be possible. The drawback is that it would be computationally intractable.</p>
<p>In fact, metaheuristics (and, as far as I understand, machine-learning algorithms) are located somewhere between random search algorithms and deterministic ones. The compromise between these two tendencies depends on the problem and on the available computational effort.</p>Metaheuristics and experimental research2006-12-20T09:52:07+01:00tag:nojhan.free.fr,2006-12-20:/metah/13nojhanSpringer has just published a book on "Experimental Research in Evolutionary Computation", written by Thomas Bartz-Beielstein.
Thomas Bartz-Beielstein works on the statistical analysis of the behaviour of metaheuristics (see his tutorials at GECCO and CEC), and the publication of his book is... <p>Springer has just published a book on "<a href="http://www.springer.com/east/home/computer/foundations?SGWID=5-156-22-142872140-detailsPage=ppmmedia" hreflang="en">Experimental Research in Evolutionary Computation</a>", written by <a href="http://ls11-www.cs.uni-dortmund.de/people/tom/" hreflang="en">Thomas Bartz-Beielstein</a>.</p>
<p>Thomas Bartz-Beielstein works on the statistical analysis of the behaviour of metaheuristics (see <a href="http://ls11-www.cs.uni-dortmund.de/people/tom/#Talks_and_Tutorials" hreflang="en">his tutorials</a> at GECCO and CEC), and the publication of his book is a really great thing. I haven't read it yet, but the table of contents seems really promising. There is a true need for such work in the metaheuristics community, and in stochastic optimization in general.</p>
<p>A friend told me that the lack of experimental culture in the computer science community was a form of consensus, perhaps because the theoretical side of mathematics was seen as the "only way to do true science". This is a real problem when you deal with stochastic algorithms applied to real-world problems. Despite several papers calling early on for more rigorous experimental studies of metaheuristics (<a href="http://mistic.heig-vd.ch/taillard/" hreflang="en">E.D. Taillard</a> wrote papers on this problem several years ago, for example), the community does not seem to react quickly.</p>
<p>Yet things are changing: after the series of CEC special sessions on benchmarks for metaheuristics, there are more and more papers on how to test stochastic optimization algorithms and report the results. I think this book comes at the right time... the next step will be to promote the dissemination of result data (and code!), in an open format, along with the papers.</p>Metaheuristics and machine-learning2006-12-19T10:34:49+01:00tag:nojhan.free.fr,2006-12-19:/metah/12nojhanMetaheuristics and machine-learning algorithms share a large number of characteristics, like stochastic processes, manipulation of probability density functions, etc.
One of the interesting evolutions of metaheuristics research in recent years is the increasing bridge-building with... <p>Metaheuristics and machine-learning algorithms share a large number of characteristics, like stochastic processes, manipulation of probability density functions, etc.</p>
<p>One of the interesting evolutions of metaheuristics research in recent years is the increasing bridge-building with machine learning. I see at least two interesting pathways: the use of metaheuristics in machine learning and the use of machine learning in metaheuristics.</p>
<p>The first point is not really new: machine learning heavily uses optimization, and it was natural to try stochastic algorithms where local search or exact algorithms failed. Nevertheless, there is now enough literature to organize special sessions at symposia. For 2007, there will be a <em><a href="http://seal.tst.adfa.edu.au/~alar/gbml2007/" hreflang="en">special session on Genetics-Based Machine Learning</a></em> at CEC, and a <em><a href="http://www.sigevo.org/gecco-2007/organizers-tracks.html" hreflang="en">track on Genetics-Based Machine Learning and Learning Classifier Systems</a></em> at GECCO. These events are centered around "genetic" algorithms (see the posts on the IlliGAL blog: <a href="http://www-illigal.ge.uiuc.edu/system/components/com_jd-wp/wp-trackback.php?p=745" hreflang="en">1</a>, <a href="http://www-illigal.ge.uiuc.edu/system/components/com_jd-wp/wp-trackback.php?p=746" hreflang="en">2</a>), even though several papers use other metaheuristics, like simulated annealing; this is a common drawback, and does not affect the interest of the subject.</p>
<p>The second point is less exploited, but I find it of great interest. A simple example of what can be done with machine learning inside a metaheuristic is given by estimation of distribution algorithms. In these metaheuristics, a probability density function is used to explicitly build a new sample of the objective function (a "population", in evolutionary computation terminology) at each iteration. It is then crucial to build a probability density function that is related to the structure of the objective function (the "fitness landscape"). There, it would be really interesting to build the model of the PDF itself from a selected sample, using a machine-learning algorithm. There are some interesting papers on that topic.</p>
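As a toy illustration of this loop, here is a minimal EDA in Python with one univariate Gaussian model per variable (a UMDA-like sketch under simplifying assumptions, not a specific published algorithm), minimizing the sphere function:

```python
import math
import random

def sphere(x):
    """Objective function to minimize (the 'fitness landscape')."""
    return sum(v * v for v in x)

def eda(dim=2, pop=60, keep=20, iters=40):
    """Estimation of distribution algorithm: at each iteration, fit one
    Gaussian per variable on the selected sample, then draw the next
    population from the fitted model instead of mutating individuals."""
    mus = [0.0] * dim
    sigmas = [5.0] * dim
    best = None
    for _ in range(iters):
        population = [[random.gauss(mus[d], sigmas[d]) for d in range(dim)]
                      for _ in range(pop)]
        population.sort(key=sphere)
        selected = population[:keep]          # truncation selection
        if best is None or sphere(selected[0]) < sphere(best):
            best = selected[0]
        for d in range(dim):                  # re-estimate the model from the sample
            mus[d] = sum(s[d] for s in selected) / keep
            var = sum((s[d] - mus[d]) ** 2 for s in selected) / keep
            sigmas[d] = max(math.sqrt(var), 1e-6)
    return best

random.seed(1)
best = eda()
print(sphere(best))  # a value very close to zero
```

The machine-learning idea discussed above would replace the per-variable Gaussian fit with a richer model learned from the selected sample.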
<p>If you mix these approaches with the problem of estimating a Boltzmann distribution (the basis of simulated annealing), you should have an awesome research field...</p>