*The optimization algorithm developed by researchers at the Siberian Institute of Applied Systems Analysis named after A.N. Antamoshkin took first place in the global competition held as part of the Congress on Evolutionary Computation, which took place this year in Poland at the end of June. Professor E. S. Semenkin, Scientific Supervisor of SibIASA, explains the essence of these competitions and why our scientists consistently achieve such strong results.*

### Seeing the meaning of what is happening in a much deeper and more precise way

**– Evgenii Stanislavovich, in an interview with the corporate newspaper «Gorizont» you called yourself a ‘professional optimist’, referring to the fact that your field of science is optimization. Does research in this field really help you and your colleagues to be optimists?**

– Of course. In that interview, I mentioned that our young scientists took second place in the global competition for optimization algorithms in 2018, and that they had many new achievements ahead of them. And now, three years later, our young scientists Vladimir Stanovov and Shakhnaz Akhmedova have developed the winning algorithm in this annual competition. Furthermore, this year the contest was held in conjunction with two leading world conferences on Evolutionary Computation – the Congress on Evolutionary Computation (CEC 2021) and the Genetic and Evolutionary Computation Conference (GECCO 2021). Progress looks highly optimistic, doesn’t it?

In addition, an understanding of optimization theory makes it possible to be what we call a ‘well-informed optimist’. You often hear in speeches and read in various texts expressions like ‘… we managed to find a more optimal solution…’ or ‘… this solution is the most optimal…’. It is immediately clear to us that such a phrase belongs to an uneducated person trying to look professional. By definition, the optimal solution is the best solution under the given conditions. There cannot, of course, be a ‘more optimal’ or ‘most optimal’ solution. If a new solution looks better than the known ‘optimal’ solution, then either the conditions have changed and that solution is no longer optimal, or (if the conditions have not changed) that solution was never optimal in the first place, and the newly found solution is simply the optimal one. Or a gross error has been made and the new solution is not correct at all.

This theoretical knowledge, well mastered and repeatedly applied in practice, allows you to be an optimist. A well-informed optimist. See the optimal solution and do everything to achieve it. Under the current conditions. The great Swiss mathematician Leonhard Euler, who spent much of his career in Russia, once said that ‘nothing at all takes place in the universe in which some rule of maximum or minimum does not appear’. Mastering the science of optimization allows you to see the meaning of what is happening far more deeply and precisely.
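The logic above can be stated formally. A minimal sketch of the standard definition (the notation here is illustrative, not from the interview):

```latex
% An optimal solution x^* of the problem  min_{x \in X} f(x)
% is, by definition, a feasible point no worse than any other:
\[
  x^{*} \in \arg\min_{x \in X} f(x)
  \quad\Longleftrightarrow\quad
  f(x^{*}) \le f(x) \quad \text{for all } x \in X .
\]
% If some x' \in X satisfies f(x') < f(x^*), then x^* was never optimal
% (or the feasible set X has changed): "more optimal" has no meaning.
```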

### Just perform all your work accurately

**– What is the main idea of the competition, who participates, and how does it work?**

– Single-criterion optimization algorithms are the foundation on which methods for solving more complex optimization problems are built: multi-criteria, constrained and non-stationary problems, as well as problems with algorithmically defined objective functions.

Accordingly, improving such algorithms is very important: it has a significant impact on the development of all areas of optimization, and hence on other important fields, including decision support, the design of complex systems, machine learning and artificial intelligence.

To identify the best algorithmic schemes, the organizers of the competition specifically design test problems that pose the greatest challenges for the algorithms. These test problems are generated by the best professionals in the field of evolutionary optimization, who know all the strengths and weaknesses of the algorithms and understand exactly which properties make a problem exceptionally difficult for them. The organizers also announce the conditions of the competition and the methodology for performance evaluation. All that remains for the authors of the algorithms is to perform all the work accurately according to the methodology and submit the results to the organizers. Examples of graphs of test functions with two variables are shown in the box. Of course, the algorithms do not compete on functions of two variables: they deal with functions of tens and hundreds of variables, whose graphs cannot be drawn.
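The kind of test function described here can be illustrated with a classic multimodal benchmark. Below is a sketch of Rastrigin's function with its global optimum deliberately shifted away from the origin; this is only an illustrative stand-in of my own, not one of the actual competition functions:

```python
import math

def shifted_rastrigin(x, shift):
    """Rastrigin's function with the global minimum moved to `shift`.

    Highly multimodal: a huge number of local minima, and a single
    global minimum of 0 attained only at x == shift.
    """
    z = [xi - si for xi, si in zip(x, shift)]
    return 10 * len(z) + sum(zi * zi - 10 * math.cos(2 * math.pi * zi) for zi in z)

# The shift removes the "optimum at the origin" property discussed above.
shift = [1.5] * 10
print(shifted_rastrigin(shift, shift))        # 0.0 at the shifted optimum
print(shifted_rastrigin([0.0] * 10, shift))   # strictly positive elsewhere
```

Shifting (and, in real benchmark suites, also rotating) the function is exactly what prevents an algorithm from exploiting symmetry or a centred optimum.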

After thoroughly checking the source code of all the methods and testing them independently, the organizers of the competition, from Singapore and India, establish the relative merits of the competing algorithms. The list of participating algorithms is shown in Table 1. Scientific groups from Egypt, India, Slovenia, Poland, the Czech Republic and other countries participated. The algorithm of our scientists is sixth in the list (E-125).

Table 1.

This is not the first time our ‘optimizers’ (V.V. Stanovov and Sh.A. Akhmedova) have participated in such international competitions of leading scientific groups.

The aim of such competitions is to create the most advanced universal algorithmic schemes, which can then be adopted in other fields of science wherever optimal solutions need to be found. In 2018, the previous version of the algorithm, L-SHADE-RSP, took second place in the same competition, behind the work of Chinese researchers which, however, required significantly greater computing resources.

### Raising the level of optimism

**– So the competition started and what happened next?**

– The ‘optimizers’ measured their optimism against one another, under progressively more difficult conditions. At first, some algorithms looked better than others, but as the problems became more complicated, the leaders changed. For instance, the first function shown above is difficult to optimize, but its graph is almost symmetric and the desired global optimum is at the origin. Algorithms that explicitly or implicitly exploit such properties (symmetry, or the optimum not being displaced from the centre) work very effectively in this case. In general, however, an advantage on such functions means little under more complicated conditions. As Vladimir Stanovov ironically observed in the discussion with the organizers, it would not be difficult to ‘improve’ our algorithm in this respect: just make its first step a check of whether the global optimum is at zero, and it will beat all the competitors. Clearly, this already sounds like a joke. The organizers recognized the validity of the point and, as a result, the final evaluation was based on the algorithms’ success on the most complex functions (for example, the second function above). And here the leaders changed. It turned out that not all optimists who thrive under favourable conditions remain as optimistic when the conditions become harder.

As a result, the organizers of the competition established the advantage of the new algorithm NL-SHADE-RSP, developed by Vladimir Vadimovich Stanovov and Shakhnaz Agasuvar kyzy Akhmedova, associate professors of the Department of Higher Mathematics, over the other methods (see Table 2). In particular, NL-SHADE-RSP performed best of all on the most difficult test functions, those with specific properties that significantly complicate the work of the algorithms. Thus, our optimists are not only the most optimistic in the world; they also have properties that raise their level of optimism under the most difficult conditions.

Table 2.

### 15% ahead of the runner-up

**– And what kind of algorithm is so optimistic?**

– It’s a stochastic algorithm, meaning that it is based on randomness. It simulates evolutionary processes; more precisely, it is a differential evolution algorithm, whose scheme was proposed earlier by foreign scientists and refined by our researchers. The list of competition participants shows that our ‘optimizers’ were not the only ones to modify the original algorithm (DE), but ours did it much better: the silver medallist is a full 15% behind. The new algorithm by SibIASA researchers implements several original ideas, including the adaptive use of an archive of previously found solutions, non-linear population size variation, the sorting of crossover probabilities and rank-based selective pressure. The team’s current work also shows that some elements of search algorithms can be designed automatically, without human intervention: in the course of solving an optimization problem, the algorithm redesigns itself for the problem at hand, depending on the experience it acquires. And this is already the automation of informed optimism, and a direct path to the optimal optimization algorithm. Whatever that means.
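As background, the baseline that NL-SHADE-RSP builds on, classic differential evolution, can be sketched in a few lines. The following is a minimal DE/rand/1/bin illustration under my own assumptions (the parameter values, function and variable names are invented for the example); it contains none of the authors' innovations such as the solution archive, non-linear population sizing, crossover-probability sorting or rank-based selective pressure:

```python
import random

def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9, iters=200, seed=1):
    """Minimize f over box `bounds` with the classic DE/rand/1/bin scheme."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            # Mutation: combine three distinct individuals, none equal to i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            v = [pop[a][d] + F * (pop[b][d] - pop[c][d]) for d in range(dim)]
            # Binomial crossover with one guaranteed mutant coordinate.
            jrand = rng.randrange(dim)
            u = [v[d] if (rng.random() < CR or d == jrand) else pop[i][d]
                 for d in range(dim)]
            # Keep the trial point inside the box.
            u = [min(max(ud, lo), hi) for ud, (lo, hi) in zip(u, bounds)]
            # Greedy selection: the trial replaces the parent only if no worse.
            fu = f(u)
            if fu <= fit[i]:
                pop[i], fit[i] = u, fu
    best = min(range(pop_size), key=lambda k: fit[k])
    return pop[best], fit[best]

# Usage: minimize the 5-dimensional sphere function.
sphere = lambda x: sum(v * v for v in x)
x_best, f_best = differential_evolution(sphere, [(-5.0, 5.0)] * 5)
```

On a smooth unimodal function like the sphere this converges close to zero; the competition functions are far harder, which is exactly where the adaptive mechanisms mentioned above matter.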

Our congratulations to the world champions of evolutionary optimism!

The IEEE Congress on Evolutionary Computation (CEC) is one of the largest and most important conferences in Evolutionary Computation (EC). CEC covers most subfields of EC, such as evolutionary robotics, multi-objective optimization, evolvable hardware and evolutionary design. The conference usually attracts several hundred participants, resulting in the publication of hundreds of papers.