The development of MAs

1st generation

The first generation of MA refers to hybrid algorithms: a marriage between a population-based global search (often in the form of an evolutionary algorithm) and a cultural evolutionary stage. Although this first generation of MA incorporates characteristics of cultural evolution (in the form of local refinement) into the search cycle, it may not qualify as a true evolving system according to Universal Darwinism, since the core principles of inheritance/memetic transmission, variation, and selection are missing at the memetic level. This suggests why the term MA stirred up criticism and controversy among researchers when it was first introduced by Moscato; the notion of the meme it builds on had been coined earlier by Dawkins in The Selfish Gene.
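
A minimal sketch of this first-generation pattern in Python: an evolutionary loop whose offspring are each passed through a local refinement step before rejoining the population (Lamarckian write-back). The objective function, operators, and all parameter values here are illustrative assumptions, not taken from any particular MA in the literature.

```python
import random

def sphere(x):
    """Toy objective: minimise the sum of squares."""
    return sum(xi * xi for xi in x)

def hill_climb(x, f, steps=20, sigma=0.1):
    """Individual learning stage: a simple stochastic hill climber."""
    best, best_f = list(x), f(x)
    for _ in range(steps):
        cand = [xi + random.gauss(0, sigma) for xi in best]
        cand_f = f(cand)
        if cand_f < best_f:
            best, best_f = cand, cand_f
    return best

def first_gen_ma(f, dim=5, pop_size=20, generations=50):
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        offspring = []
        for _ in range(pop_size):
            # Global evolutionary step: binary tournaments + blend crossover.
            p1 = min(random.sample(pop, 2), key=f)
            p2 = min(random.sample(pop, 2), key=f)
            child = [(a + b) / 2 + random.gauss(0, 0.05) for a, b in zip(p1, p2)]
            # Cultural stage: local refinement, written back into the child.
            offspring.append(hill_climb(child, f))
        pop = offspring
    return min(pop, key=f)

best = first_gen_ma(sphere)
print(best, sphere(best))
```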


2nd generation

Multi-meme, hyper-heuristic, and meta-Lamarckian MAs are referred to as second generation MAs, exhibiting the principles of memetic transmission and selection in their design. In multi-meme MA, the memetic material is encoded as part of the genotype; the decoded meme of each individual/chromosome is then used to perform local refinement, and the memetic material is transmitted from parent to offspring through a simple inheritance mechanism. In hyper-heuristic and meta-Lamarckian MA, on the other hand, a pool of candidate memes compete on the basis of their past merit in generating local improvements, and a reward mechanism decides which meme is selected for future local refinements. Memes with a higher reward have a greater chance of being replicated or copied. For a review of second generation MAs, i.e., MAs that consider multiple individual learning methods within an evolutionary system, the reader is referred to the literature.
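
A hedged sketch of the reward mechanism in the meta-Lamarckian spirit: two competing memes (hill climbers with different step sizes) are chosen by roulette selection over their accumulated rewards, and each reward is credited in proportion to the improvement the meme produced. The meme pool, credit rule, and parameters are illustrative assumptions, not the published schemes.

```python
import random

def sphere(x):
    return sum(xi * xi for xi in x)

def make_hill_climber(sigma):
    """A 'meme': stochastic hill climber with a characteristic step size."""
    def climb(x, f, steps=15):
        best, best_f = list(x), f(x)
        for _ in range(steps):
            cand = [xi + random.gauss(0, sigma) for xi in best]
            if f(cand) < best_f:
                best, best_f = cand, f(cand)
        return best, best_f
    return climb

memes = {"fine": make_hill_climber(0.02), "coarse": make_hill_climber(0.5)}
reward = {name: 1.0 for name in memes}          # optimistic initial merit

def choose_meme():
    """Roulette selection over past rewards: higher reward, higher chance."""
    total = sum(reward.values())
    r, acc = random.uniform(0, total), 0.0
    for name, w in reward.items():
        acc += w
        if r <= acc:
            return name
    return name

x = [random.uniform(-5, 5) for _ in range(5)]
for _ in range(100):
    name = choose_meme()
    before = sphere(x)
    x, after = memes[name](x, sphere)
    reward[name] += max(0.0, before - after)    # credit proportional to improvement
print(reward, sphere(x))
```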

3rd generation

Co-evolution and self-generation MAs may be regarded as third generation MAs, in which all three principles satisfying the definition of a basic evolving system have been considered. In contrast to second generation MAs, which assume that the pool of memes to be used is known a priori, third generation MAs co-adapt a rule-based representation of local search alongside the candidate solutions within the evolutionary system, thus capturing regularly repeated features or patterns in the problem space.
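
The rule-based representations used in third generation MAs are considerably richer than what follows, but a minimal self-adaptive sketch conveys the co-adaptation idea: each individual inherits and mutates its own local-search rule (here reduced to a single step size) alongside its solution, so the meme itself evolves. All names and parameter values are illustrative assumptions.

```python
import math
import random

def sphere(x):
    return sum(xi * xi for xi in x)

def coevolving_ma(dim=5, pop_size=20, generations=100):
    # Each individual is a (solution, rule) pair; the rule is a step size
    # that is inherited and varied along with the solution it refines.
    pop = [([random.uniform(-5, 5) for _ in range(dim)],
            random.uniform(0.01, 1.0)) for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            x, sigma = min(random.sample(pop, 2),
                           key=lambda ind: sphere(ind[0]))
            sigma *= math.exp(random.gauss(0, 0.2))   # the meme itself varies
            child = [xi + random.gauss(0, sigma) for xi in x]
            # Local refinement governed by the inherited, co-adapted rule.
            for _ in range(5):
                cand = [xi + random.gauss(0, sigma) for xi in child]
                if sphere(cand) < sphere(child):
                    child = cand
            nxt.append((child, sigma))
        pop = nxt
    return min(pop, key=lambda ind: sphere(ind[0]))

best, sigma = coevolving_ma()
print(sphere(best), sigma)
```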

Some design notes
The frequency and intensity of individual learning directly determine the balance between evolution (exploration) and individual learning (exploitation) in the MA search, for a given fixed computational budget. More intense individual learning provides a greater chance of convergence to local optima, but limits the amount of evolution that can be carried out without incurring excessive computational resources. Care should therefore be taken when setting these two parameters to balance the available computational budget and achieve maximum search performance. When only a portion of the population undergoes learning, the question of which subset of individuals to improve needs to be considered in order to maximize the utility of the MA search. Finally, each individual learning procedure/meme favors a different neighborhood structure, so a decision on which meme or memes to use for a given optimization problem is also required.
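
The budget trade-off can be made concrete with a simple accounting of fitness evaluations. The cost model below is a hypothetical simplification (it assumes one evaluation per offspring and one per local-search step), but it shows how the same budget can be spent on more learning or more evolution.

```python
def total_evaluations(generations, pop_size, p_learn, learn_steps):
    """Evaluations consumed: one per offspring for the evolutionary step,
    plus learn_steps for each individual selected for learning."""
    return generations * pop_size * (1 + p_learn * learn_steps)

# Two hypothetical configurations consuming the same 55,000-evaluation budget:
print(total_evaluations(100, 50, 0.5, 20))  # 55000: fewer generations, intense learning
print(total_evaluations(200, 50, 0.5, 9))   # 55000: twice the evolution, lighter learning
```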

How often should individual learning be applied?
One of the first issues pertinent to memetic algorithm design is how often individual learning should be applied, i.e., the individual learning frequency. One line of work considered the effect of individual learning frequency on MA search performance, investigating various configurations of the frequency at different stages of the MA search. Conversely, it has been shown elsewhere that it may be worthwhile to apply individual learning to every individual when the computational complexity of the individual learning is relatively low.
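
As a sketch of the frequency knob, the loop below applies the learning stage only every learn_every generations; setting it to 1 corresponds to learning on every individual each generation. The (1+1)-style evolutionary step and all parameter values are illustrative assumptions.

```python
import random

def sphere(x):
    return sum(xi * xi for xi in x)

def mutate(x, sigma=0.3):
    return [xi + random.gauss(0, sigma) for xi in x]

def local_search(x, f, steps=10, sigma=0.05):
    for _ in range(steps):
        cand = mutate(x, sigma)
        if f(cand) < f(x):
            x = cand
    return x

def run_ma(f, dim=5, pop_size=20, generations=60, learn_every=5):
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for gen in range(generations):
        # Evolutionary step: keep the better of parent vs. mutated child.
        pop = [min((p, mutate(p)), key=f) for p in pop]
        if gen % learn_every == 0:             # the learning-frequency knob
            pop = [local_search(p, f) for p in pop]
    return min(pop, key=f)

print(sphere(run_ma(sphere, learn_every=5)))   # sparse individual learning
print(sphere(run_ma(sphere, learn_every=1)))   # learn on every generation
```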

On which solutions should individual learning be used?
On the issue of selecting appropriate individuals from the EA population to undergo individual learning, fitness-based and distribution-based strategies have been studied for adapting the probability of applying individual learning to the population of chromosomes in continuous parametric search problems, with Land extending the work to combinatorial optimization problems. Bambha et al. introduced a simulated heating technique for systematically integrating parameterized individual learning into evolutionary algorithms to achieve maximum solution quality.
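
A hedged sketch of the two families of selection criteria (the cited works use more sophisticated, probability-adapting versions): a fitness-based strategy concentrates the learning budget on the most promising individuals, while a distribution-based one might instead favor individuals in under-explored regions. The centroid-distance criterion below is an illustrative stand-in, not a published method.

```python
import random

def fitness_based_subset(pop, f, budget):
    """Spend the learning budget on the `budget` fittest individuals
    (minimisation assumed)."""
    return sorted(pop, key=f)[:budget]

def distribution_based_subset(pop, budget):
    """Prefer individuals far from the population centroid, i.e. those
    representing sparsely sampled regions of the search space."""
    dim = len(pop[0])
    centroid = [sum(ind[d] for ind in pop) / len(pop) for d in range(dim)]
    def dist(ind):
        return sum((a - b) ** 2 for a, b in zip(ind, centroid))
    return sorted(pop, key=dist, reverse=True)[:budget]

pop = [[random.uniform(-5, 5) for _ in range(5)] for _ in range(20)]
f = lambda x: sum(v * v for v in x)
print(fitness_based_subset(pop, f, 5))
print(distribution_based_subset(pop, 5))
```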