Dreaming of Metaheuristics

Thoughts about metaheuristics for hard optimization


Wednesday 30 August 2006

Metaheuristics & benchmarks at CEC

The IEEE Congress on Evolutionary Computation (CEC) is a well-known event that takes place every year.

Since 2005, it has featured an interesting group of special sessions, organized by Ponnuthurai Nagaratnam Suganthan.

What makes these sessions really interesting is the systematic presence of an implemented, general-purpose benchmark, built after discussion among researchers.

This practice is sorely needed, yet unfortunately not widespread. Indeed, it is the first step toward a rigorous performance assessment of metaheuristics (the second being a true statistical approach, and the third a considered presentation of the data).
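As an illustration of that second step, here is a minimal sketch (the optimizer, objective and parameter values are mine, for illustration only, and have nothing to do with the CEC benchmarks): a stochastic method is run many times, and its results are reported with robust statistics rather than a single lucky best value.

```python
import random
import statistics

def random_search(objective, dim, evaluations, seed):
    """A toy stochastic optimizer: keep the best of uniform random samples."""
    rng = random.Random(seed)
    best = float("inf")
    for _ in range(evaluations):
        x = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
        best = min(best, objective(x))
    return best

def sphere(x):
    """A classic toy objective: minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

# Repeat the run: a single result says nothing about a stochastic method.
results = sorted(random_search(sphere, dim=2, evaluations=200, seed=s)
                 for s in range(30))

# Report robust statistics (median and quartiles), not just one best value.
median = statistics.median(results)
q1, q3 = results[len(results) // 4], results[3 * len(results) // 4]
print(f"median = {median:.4f}, interquartile range = [{q1:.4f}, {q3:.4f}]")
```

The same reporting scheme applies to any stochastic optimizer, which is exactly what a common benchmark makes comparable across papers.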

Saturday 26 August 2006

Ant Colony Optimization Authorship

One of the key policies of scientists is always to credit the people who did the work first (as pointed out by the "hard blogging scientists" manifesto). This stems from the fact that researchers want to share science freely with everybody (or at least at little cost), and that recognition is a form of remuneration (Eric S. Raymond describes a similar mechanism among hackers in his essay "The Cathedral and the Bazaar").

Recently, a (rather small, but interesting) controversy has been growing about the authorship of the Ant Colony Optimization (ACO) idea.

The "orthodox" seminal paper on ACO is a technical report written by Dorigo, Maniezzo and Colorni, submitted in 1991. As a technical report is not a rigorous publication, perhaps a more pertinent citation would be the corresponding paper published in the proceedings of the First European Conference on Artificial Life, in 1992, or Dorigo's PhD thesis, completed in 1991. Dorigo et al. refer to a work by Deneubourg, Pasteels and Verhaeghe ("Probabilistic Behaviour in Ants: a Strategy of Errors?", published in 1983) as the key paper that gave them the ACO idea.

Nowadays, M. Dorigo has become the reference researcher on ACO; for instance, the ANTS conference is held every two years in his lab in Brussels.

Where is the controversy?

It seems that there is an older paper about using ant colony behavior to design algorithms. This paper, written by Manderick and Moyson ("The collective behavior of ants: An example of self-organization in massive parallelism") and published in 1988 in the proceedings of the AAAI Spring Symposium on Parallel Models of Intelligence, also refers to the work by Deneubourg et al., but is not cited by Dorigo et al.

Well, this does not seem to be a big problem; everyone can miss a paper in the huge amount of ideas published every day.

I first read about the question of ACO's authorship in the "habilitation" thesis of Pierre Collet (published in 2004), and it recently rose up again with the modification of the ACO article on Wikipedia on the 11th of August by an anonymous editor (under the nick "MoyMan"), who added the Manderick paper to the references. On the 15th of August, Daniel Angus posted an email on the ACO mailing list (and a post on his laboratory's blog), asking "why this paper has not received more attention from the field?". The same day, a contributor called "MDorigo" deleted the reference on Wikipedia, so as to give M. Dorigo the authorship. But a few hours later, the addition was reintroduced.

In my opinion, M. Dorigo simply saw the addition as a kind of vandalism, an attempt to introduce a fake reference, as sometimes occurs on Wikipedia, and reverted it without verifying the validity of the source. Note that M. Dorigo did not enter an edit war after the revert, which proves his good faith.

The funniest (well, I should say "saddest") thing about this story is that people around me are already talking about it... Why such gossiping? I think it is because M. Dorigo has a strong reputation. I personally think such a reputation is inevitable once you begin to be known for your work. And M. Dorigo is certainly known for his work, as it is very good work, acknowledged by his peers.

Finally, let us say that the controversy has no real basis, as M. Dorigo pointed out in a message on the ACO mailing list, replying to D. Angus's question. Firstly, while the Moyson and Manderick paper is about using ant colony behavior in the AI field (about using the theory of complex dynamical systems to design massively parallel systems, to be exact [1]), it is not about using it to tackle optimization problems. Secondly, M. Dorigo cites the papers that gave him the key idea, and the Moyson & Manderick one is not among them, even if it is interesting. And finally, Bernard Manderick read the paper before its submission and did not feel it necessary to tell M. Dorigo that his own paper should be cited.

This silly story (when people gossip without having all the information, it is always silly) is about an insignificant event. Only ten people in the world will feel concerned. But I find it interesting for the following reasons:

  • it shows that scientists can have egos, and are ultimately as subjective as any other humans: when researchers gain popularity, they also gain criticism, objectivity notwithstanding;
  • it illustrates how decisive a reputation is when judging others' work;
  • and how necessary transparency is;
  • finally, it shows that, despite criticism, Wikipedia can be a place for researchers to exchange information, even if in a rather rough manner.


[1] Here is the abstract of the Moyson & Manderick paper:
Massive parallelism needs a new methodology. The theory of complex dynamical systems provides such a methodology in those cases where massively parallel systems can be modelled as complex dynamical systems.
In particular for AI, complex dynamical systems are interesting in another respect: they have the ability to self-organize. Moreover, this self-organizational behavior can be adaptive. What is important is that the emergence of self-organization is not programmed or a consequence of external instructions, but results from local interactions at the microscopic level and the interplay between the system and its environment.
The new methodology and the emergence of self-organization are illustrated by the adaptive response of ant colonies to their environment. This behavior is simulated by an intrinsically parallel algorithm. Also, a mathematical model of the behavior is proposed and used to fix the parameters in the simulation. For these parameter values, self-organization is observed in the ant colony.

Friday 25 August 2006

Some blogs about metaheuristics

As this blog is just starting, I have searched for other people talking about metaheuristics, and found some that are really interesting.

  • IlliGAL Blogging: the blog of David E. Goldberg... well, he is a great researcher and everybody knows his work. His blog is really interesting.
  • Pensive Pondering: an awesome blog where Jason Brownlee talks about biologically inspired computation. Surprisingly, I found an old post on his blog about the definition of metaheuristics, where one can see that we share a lot of ideas on the subject (see the previous post).
  • Adaptare: Julián García writes on "Evolutionary computation, evolutionary dynamics, cultural evolution and the like", an interesting and active blog.

These are the best blogs I have found; you will find some other (less interesting) blogs in my OPML feed list.

Wednesday 23 August 2006

What are metaheuristics ?

Despite the title of this blog, the term metaheuristic is not really well defined.

One of the first occurrences of the term can (of course) be found in a paper by Fred Glover[1]: Future Paths for Integer Programming and Links to Artificial Intelligence[2]. In the section concerning tabu search, he speaks of a meta-heuristic:

Tabu search may be viewed as a "meta-heuristic" superimposed on another heuristic. The approach undertakes to transcend local optimality by a strategy of forbidding (or, more broadly, penalizing) certain moves.

In the AI field, a heuristic is a specific method that helps solve a problem (from the Greek for "to find"), but how should we understand the word meta? In Greek it means "after" or "beyond" (as in metaphysics), or "about" (as in metadata). Reading Glover, metaheuristics seem to be heuristics beyond heuristics, which seems like a good old definition; but what is the definition nowadays? The literature is really prolific on this subject, and definitions are numerous.
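To make Glover's picture concrete, here is a minimal sketch of a tabu search (a toy problem and illustrative parameters of my own, not Glover's original formulation): the underlying heuristic is a simple neighbourhood descent over bit strings, and the "meta" layer temporarily forbids recently used moves so that the search can transcend local optimality.

```python
import random

def tabu_search(objective, n_bits, iterations, tabu_tenure, seed=0):
    """Maximize objective over bit strings: a descent heuristic whose
    moves (single bit flips) are temporarily forbidden once used."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n_bits)]
    best, best_value = x[:], objective(x)
    tabu = {}  # bit index -> iteration until which flipping it is forbidden
    for it in range(iterations):
        # Evaluate all single-bit-flip neighbours that are not tabu,
        # unless they improve on the best known (aspiration criterion).
        candidates = []
        for i in range(n_bits):
            y = x[:]
            y[i] ^= 1
            v = objective(y)
            if tabu.get(i, -1) < it or v > best_value:
                candidates.append((v, i, y))
        if not candidates:
            continue
        v, i, x = max(candidates)  # best admissible move, even if worsening
        tabu[i] = it + tabu_tenure  # forbid reversing this move for a while
        if v > best_value:
            best, best_value = x[:], v
    return best, best_value

# Maximize the number of ones: the descent alone would also solve this toy
# problem, but the tabu list shows the "meta" layer at work.
best, value = tabu_search(lambda b: sum(b), n_bits=12, iterations=50,
                          tabu_tenure=4)
print(best, value)
```

Note that the tabu layer knows nothing about the problem itself; it only manipulates moves and objective values, which is exactly what makes it "superimposed on another heuristic".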

There are at least three tendencies:

  1. one that considers that the most important aspect of metaheuristics is the combination of several heuristics,
  2. another that emphasizes that metaheuristics are designed as general-purpose methods, able to tackle several problems without major changes in their design,
  3. a last one that reserves the term for evolutionary algorithms hybridized with local searches (methods that are called memetic algorithms from the other points of view).

The last tendency is quite marginal in the general literature and can mainly be found in the field of evolutionary computation; separating out the two other tendencies is more difficult.

Here are some definitions gathered in more or less generalistic papers:

"iterative generation process which guides a subordinate heuristic by combining intelligently different concepts for exploring and exploiting the search space" (Osman and Laporte, 1996[3])

"(metaheuristics) combine basic heuristic methods in higher level frameworks aimed at efficiently and effectively exploring a search space" (Blum and Roli, 2003[4])

"a metaheuristic can be seen as a general-purpose heuristic method designed to guide an underlying problem-specific heuristic (...) A metaheuristic is therefore a general algorithmic framework which can be applied to different optimization problems with relatively few modifications to make them adapted to a specific problem." (Dorigo and Stützle, 2004[5])

"(metaheuristics) apply to all kinds of problems (...) are, at least to some extent, stochastic (...) direct, i.e. they do not resort to the calculation of the gradients of the objective function (...) inspired by analogies: with physics, biology or ethology" (Dréo, Siarry, Petrowski and Taillard, 2006[6])

One can summarize by enumerating the expected characteristics:

  • optimization algorithms,
  • with an iterative design,
  • combining low level heuristics,
  • aiming to tackle a broad range of "hard" problems.

As the last reference points out, a large majority of metaheuristics (not to say all of them) use at least one stochastic (probabilistic) process, and use no more information than the solution and the associated value(s) of the objective function.

Talking about combining heuristics seems appropriate for Ant Colony Optimization, which specifically needs one (following Dorigo's point of view); it is less obvious for Evolutionary Algorithms. One can consider that mutation, or even the method's strategy itself, is a heuristic, but isn't that too general to be called a heuristic?
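To illustrate the ACO side of this point, here is a minimal sketch of an Ant System-like algorithm on a toy travelling salesman problem (the instance, parameter names and values are mine, for illustration, not Dorigo's exact algorithm): the pheromone trails form the "meta" layer that guides an underlying constructive heuristic preferring short edges.

```python
import random

def aco_tsp(dist, n_ants=8, iterations=30, rho=0.5, alpha=1.0, beta=2.0,
            seed=0):
    """Minimal ant colony algorithm for a symmetric TSP: pheromone trails
    bias a greedy constructive heuristic (preference for short edges)."""
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]  # pheromone on each edge

    def tour_length(tour):
        return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

    best, best_len = None, float("inf")
    for _ in range(iterations):
        tours = []
        for _ in range(n_ants):
            tour = [rng.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                choices = [j for j in range(n) if j not in tour]
                # Combine pheromone (tau) with the heuristic value (1/dist).
                weights = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                           for j in choices]
                tour.append(rng.choices(choices, weights=weights)[0])
            tours.append(tour)
        # Evaporate, then reinforce edges in proportion to tour quality.
        for i in range(n):
            for j in range(n):
                tau[i][j] *= 1.0 - rho
        for tour in tours:
            length = tour_length(tour)
            if length < best_len:
                best, best_len = tour, length
            for i in range(n):
                a, b = tour[i], tour[(i + 1) % n]
                tau[a][b] += 1.0 / length
                tau[b][a] += 1.0 / length
    return best, best_len

# Four cities: the optimal tour follows the "perimeter" 0-1-2-3 (length 4).
dist = [[0, 1, 2, 1],
        [1, 0, 1, 2],
        [2, 1, 0, 1],
        [1, 2, 1, 0]]
best, length = aco_tsp(dist)
print(best, length)
```

Here the constructive heuristic (favouring short edges) is clearly a separate ingredient that the colony "guides", whereas in an evolutionary algorithm it is harder to point at the subordinate heuristic.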

If we set aside the difficulty of demarcating what can be called a heuristic, and what the scope of the term meta is, one can simply look at how specialists use the term. Although the definition could apply to several fields (data mining, machine learning, etc.), the term is reserved for optimization algorithms. This is perhaps the best reason of all: the term serves to separate a research field from others, thus adding a little bit of marketing...

I would thus use this definition:

Metaheuristics are algorithms designed to tackle "hard" optimization problems with the help of iterative stochastic processes. These methods manipulate direct samples of the objective function, and can be applied to several problems without major changes in their design.
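As a minimal sketch of this definition (the function names and parameter values are mine, for illustration only), simulated annealing fits each clause: it is iterative and stochastic, it uses only direct samples of the objective (no gradients), and nothing in the code is specific to one problem.

```python
import math
import random

def simulated_annealing(objective, x0, step=0.5, t0=1.0, cooling=0.95,
                        iterations=500, seed=0):
    """An iterative stochastic method that only samples the objective:
    no gradients, and nothing problem-specific beyond the step size."""
    rng = random.Random(seed)
    x, fx = x0[:], objective(x0)
    best, best_f = x[:], fx
    t = t0
    for _ in range(iterations):
        # Propose a random neighbour (stochastic, direct sampling).
        y = [xi + rng.gauss(0.0, step) for xi in x]
        fy = objective(y)
        # Accept improvements always, and degradations with a temperature-
        # dependent probability (the Metropolis rule).
        if fy < fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
        if fx < best_f:
            best, best_f = x[:], fx
        t *= cooling  # gradually reduce the acceptance of bad moves
    return best, best_f

# The same code applies unchanged to any objective taking a list of floats:
def rosenbrock(v):
    return sum(100 * (v[i + 1] - v[i] ** 2) ** 2 + (1 - v[i]) ** 2
               for i in range(len(v) - 1))

best, value = simulated_annealing(rosenbrock, [0.0, 0.0])
print(value)
```

Swapping rosenbrock for any other objective requires no change to the method itself, which is the "without major changes in their design" clause of the definition.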


[1] A recurrent joke says that whatever your new idea is, it has already been written down by Glover.

[2] Computers & Operations Research, Vol. 13, No. 5, pp. 533-549, 1986

[3] Metaheuristics: A bibliography, Annals of Operations Research, vol. 63, pp. 513-623, 1996

[4] Metaheuristics in combinatorial optimization: Overview and conceptual comparison, ACM Computing Surveys, vol. 35, issue 3, 2003

[5] Ant Colony Optimization, MIT Press, 2004

[6] Metaheuristics for Hard Optimization, Springer, 2006

Tuesday 22 August 2006

A new book about "stigmergic optimization"

Springer has published a new book in their Studies in Computational Intelligence series.

Edited by Abraham, Grosan and Ramos, it covers "Stigmergic Optimization". Looking at the table of contents, one can see that it talks about Ant Colony Optimization.

Of course, it also gets onto the so-called swarm intelligence and self-organization, but also onto Particle Swarm Optimization, a method that has been gathering more and more researchers these past months.

Wednesday 2 August 2006

Introductive bibliography to metaheuristics for hard optimization

Note that this bibliography is quite old (2003) and is aimed at French students.

If you need only one reference, it is (of course, because I'm one of the authors) this one:

  • Dréo, J. ; Petrowski, A. ; Taillard, E. ; Siarry, P. ; Metaheuristics for Hard Optimization: Methods and Case Studies, Springer, 2006, XII, 369 p., 140 illus., Hardcover. ISBN: 3-540-23022-X



  • Glover, F. W. ; Kochenberger, G. A. ; 2003 : Handbook of Metaheuristics, Kluwer Academic Publishers, International series in operations research and management science, Boston Hardbound.
  • Teghem, J. ; Pirlot, M. ; 2002 : Optimisation approchée en recherche opérationnelle. Recherches locales, réseaux neuronaux et satisfaction de contraintes, Hermès.
  • Pham, D.T. ; Karaboga, D. ; 2000 : Intelligent optimisation techniques. Genetic Algorithms, Tabu Search, Simulated Annealing and Neural Networks, Springer.
  • Saït, S.M. ; Youssef, H. ; 1999 : Iterative computer algorithms with applications in engineering, IEEE Computer Society Press.
  • Reeves, C.R., 1995 : Modern Heuristic Techniques for Combinatorial Problems, Mc Graw-Hill, Advanced topics in computer science.


Simulated Annealing

  • Siarry, P.; Dreyfus, G. ; 1989 : La méthode du recuit simulé : théorie et applications, ESPCI -- IDSET, 10 rue Vauquelin, 75005 Paris.

Tabu Search

  • Glover, F. ; Laguna, M. ; 1997 : Tabu search, Kluwer Academic Publishers, Dordrecht.
  • Glover, F. ; 1989 : Tabu search --- part I, ORSA Journal on Computing, vol. 1, 190--206.
  • Glover, F. ; 1990 : Tabu search --- part II, ORSA Journal on Computing, vol. 2, 4--32.

Evolutionary Algorithms (aka Genetic Algorithms)

  • Baeck, T. ; Fogel, D. B. ; Michalewicz, Z. ; 2000 : Evolutionary computation 1: basic algorithms and operators, Institute of Physics Publishing.
  • Baeck, T. ; Fogel, D. B. ; Michalewicz, Z. ; 2000 : Evolutionary computation 2: advanced algorithms and operators, Institute of Physics Publishing.
  • Goldberg, D. E. ; 1994 : Algorithmes génétiques. exploration, optimisation et apprentissage automatique, Addison-Wesley France.
  • Koza, J. R. ; 1992 : Genetic programming I: on the programming of computers by means of natural selection, MIT Press.
  • Koza, J. R. ; 1994 : Genetic programming II: automatic discovery of reusable programs, MIT Press.

Ant Colony Algorithms

  • Bonabeau, E. ; Dorigo, M. ; Theraulaz, G. ; 1999 : Swarm Intelligence, From Natural to Artificial Systems, Oxford University Press.

Greedy Randomized Adaptive Search Procedure (GRASP)

Tech Reports
  • Resende, M.G.C. ; 2000 : Greedy randomized adaptive search procedures (GRASP), AT&T Labs-Research, TR 98.41.1.

Particle Swarm Optimization

  • Eberhart, R.C. ; Kennedy, J. ; Shi, Y. ; 2001 : Swarm Intelligence, Morgan Kaufmann, Evolutionary Computation.

Estimation of Distribution Algorithms

  • Larrañaga, P. ; Lozano, J.A. ; 2002 : Estimation of Distribution Algorithms, A New Tool for Evolutionary Computation, Kluwer Academic Publishers, Genetic Algorithms and Evolutionary Computation.

Related Topics

Multi-Objective Optimization

  • Collette, Y. ; Siarry, P. ; 2002 : Optimisation multiobjectif, Eyrolles.
  • Deb, K. ; 2001 : Multi-objective optimization using evolutionary algorithms, John Wiley and sons.

Constrained Optimization

  • Michalewicz, Z. ; 1996 : Genetic algorithms + data structures = evolution programs, Springer Verlag, third revised edition.


Self-Organization

  • Camazine, S. ; Deneubourg, J.L. ; Franks, N. ; Sneyd, J. ; Theraulaz, G. ; Bonabeau, E. ; 2000 : Self-Organization in Biological Systems, Princeton University Press.

Tuesday 1 August 2006

About this blog

This blog is an attempt to publish thoughts about metaheuristics and to share them with others. Indeed, blogs are fun, blogs are popular, okay... but above all, blogs can be very useful for researchers, who constantly need to communicate and to share ideas and information.

Metaheuristics are (well, that's one definition among others, but in my opinion the best one) iterative stochastic algorithms for "hard" optimization. Well-known metaheuristics are the so-called "genetic algorithms" (let's call them evolutionary algorithms), but they are not the only class: don't forget simulated annealing, tabu search, ant colony algorithms, estimation of distribution algorithms, etc.

This blog will try to focus on the theory, design, understanding, application, implementation and use of metaheuristics. I hope it will be profitable to other people (researchers as well as users), and will be a place to share thoughts.

Welcome aboard, and let's sleep with metaheuristics.