In the real world, many optimization problems are dynamic, meaning that elements of their models vary with time. These problems have received increasing attention over time, especially from the viewpoint of metaheuristic methods. In this context, experimentation is a crucial task because of the stochastic nature of both algorithms and problems. Currently, there are several technologies with which methods, problems, and performance measures can be implemented. However, most of them lack certain features that ease the experimentation process, such as statistical analysis of the results and a graphical user interface (GUI) that allows easy management of the experiments. Bearing these limitations in mind, in the present work we present DynOptLab, a software tool for experimental analysis in dynamic environments. DynOptLab has two main components: (1) an object-oriented framework to facilitate the implementation of new proposals and (2) a graphical user interface for experiment management and statistical analysis of the results. With the aim of verifying the benefits of DynOptLab’s main features, a typical case study on experimentation in dynamic environments was carried out.

Several decision-making scenarios can be modeled as optimization problems. Among them, a special class is that of dynamic optimization problems (DOPs), which are characterized by the presence of time-varying elements in the mathematical model (e.g., the objective function and the search space). Because of the complexity involved in these problems, the application of metaheuristic methods has gained increasing interest in the last decade [

In this context, as in other similar fields, experimentation plays an important role given the stochastic nature of metaheuristics and DOPs. In fact, most existing results on experimentation in dynamic environments rely on metaheuristics solving artificial DOPs [

Although there are several technologies that help in the implementation of algorithms, problems, and performance measures, they generally require considerable effort from the researcher regarding the simulation of the experiments and the processing of the results. The latter is frequently carried out through descriptive and inferential statistics. In the context of dynamic environments, we are not aware of any technology that fulfills all these requirements at the same time.

Bearing these limitations in mind, in this work we propose DynOptLab, a free, noncommercial tool for experimental analysis in dynamic environments. DynOptLab is composed of two main elements: (1) an object-oriented framework for implementing algorithms, problems, and performance measures and (2) a graphical user interface (GUI) for managing the computational experiments. In particular, DynOptLab’s GUI also includes a module for the statistical analysis of the results.

In order to better describe our proposal, the rest of the paper is organized as follows. Section

A dynamic optimization problem is formally defined as follows. Being
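The formal definition is truncated above; in the usual formulation (our notation, a sketch of the standard statement rather than the paper's exact one), a DOP for the minimization case can be written as:

```latex
\min_{\vec{x}\,\in\, S(t)} \; f(\vec{x}, t), \qquad t \in T \subseteq \mathbb{N},
```

where $f : S(t) \times T \to \mathbb{R}$ is the time-dependent objective function, $S(t) \subseteq \mathbb{R}^{n}$ is the (possibly time-varying) search space, and the goal is to track the moving optimum $\vec{x}^{*}(t)$ as the environment changes.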

There are different criteria for measuring algorithm performance in dynamic environments. Most of the existing measures are based on the absolute error, in terms of objective function values, between the best solution attained by the algorithm and the current optimum of the problem. For example, [
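As an illustration of such an error-based measure, the well-known offline error (the mean, over all evaluations, of the best absolute error found since the last environment change) can be sketched as follows. The class and method names are our own for illustration, not DynOptLab's actual API:

```java
/**
 * Minimal sketch of the offline error measure: the average, over all
 * recorded evaluations, of the best absolute error attained since the
 * last environment change (assumed definition for illustration).
 */
public class OfflineError {
    private double sum = 0.0;      // accumulated best-so-far errors
    private long evaluations = 0;  // number of recorded evaluations
    private double bestError = Double.POSITIVE_INFINITY;

    /** Record one evaluation: error = |f(best found) - f(current optimum)|. */
    public void record(double fBest, double fOptimum) {
        bestError = Math.min(bestError, Math.abs(fBest - fOptimum));
        sum += bestError;
        evaluations++;
    }

    /** Reset the best-so-far error when the environment changes. */
    public void environmentChanged() {
        bestError = Double.POSITIVE_INFINITY;
    }

    /** Current value of the measure. */
    public double value() {
        return evaluations == 0 ? 0.0 : sum / evaluations;
    }
}
```

A lower offline error indicates that the algorithm stays closer to the moving optimum over the whole run, not only at change points.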

As with performance measures, there are several artificial DOPs in the literature, which are crucial for the study and comparison of algorithms in dynamic environments. In this context, two popular problem generators are the Moving Peaks Benchmark (MPB) [

Once a computational experiment ends, the results, in terms of the performance measures, are used to analyze the algorithm at hand. It is often recommended to process the results statistically, and descriptive statistics are the most common way to do so [

The reader interested in the topic of experimentation in dynamic environments is referred to the works of [

Currently, there are many technologies that researchers can employ for experimentation in dynamic environments. Most of them are software libraries or application frameworks conceived for stationary optimization. In what follows, we review some of these available technologies, which should at least allow for

implementing problems, algorithms, and performance measures in continuous domains,

executing computational experiments,

displaying the experimental results through a graphical user interface (GUI),

performing statistical tests for algorithm comparison.

Table

Comparison among some available technologies for experimentation in dynamic environments.

Technology | Visualization of the results? | Statistical tests?
---|---|---
EvolvingObjects | No | No
EASEA | Yes | No
GUIDE | Yes | No
CILib | No | No
GAUL | No | No
Apache Commons Math | No | Yes
MATLAB Optimization Toolbox | Yes | No

Similarly, CILib (

On the other hand, the GAUL (

In recent years, an important project related to mathematics has been developed by the Apache Software Foundation. This project, named

Another relevant technology in this context is the

Summarizing this section, one can see that there are many alternatives when selecting technologies for experimentation in dynamic environments. However, most of these technologies do not fulfill all the requirements stated at the beginning of this section. Of course, a solution to this issue could be to properly extend some of those frameworks (e.g., EvolvingObjects or CILib). Unfortunately, this generally requires (1) in-depth knowledge of the framework at hand in order to extend it and (2) agreeing with the employed software license. For these reasons, we have developed DynOptLab from scratch, with the aim of adding some extra features, as we shall explain in the next section.

The proposed tool, DynOptLab (Dynamic Optimization Laboratory), was programmed in Java, an efficient, high-level, multiplatform language developed by Sun Microsystems (Oracle Corporation) (

An object-oriented framework (OOF) is a reusable design of a system that describes how the system should be decomposed into a set of interacting objects. Unlike software architectures, an OOF is expressed in a programming language and is based on a specific problem domain [
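The idea of an OOF with extension points ("hot spots") and a fixed experiment loop ("frozen spot") can be illustrated with a minimal sketch. The interface and class names below are hypothetical and chosen for illustration; they are not DynOptLab's actual types:

```java
/** Hypothetical hot spots: the researcher extends these to plug in new components. */
interface Problem {
    double evaluate(double[] x);  // time-dependent objective value
    void change();                // advance the environment by one change
}

interface Algorithm {
    void init(Problem p);
    void iterate();               // one iteration on the current environment
    double bestFitness();
}

interface Measure {
    void update(Algorithm a, Problem p);  // called after each iteration
    double report();                      // final aggregated value
}

/** Frozen spot: the experiment loop is fixed by the framework. */
final class Experiment {
    static double run(Problem p, Algorithm a, Measure m,
                      int changes, int itersPerChange) {
        a.init(p);
        for (int c = 0; c < changes; c++) {
            for (int i = 0; i < itersPerChange; i++) {
                a.iterate();
                m.update(a, p);
            }
            p.change();  // environment changes after a fixed number of iterations
        }
        return m.report();
    }
}
```

In this design, adding a new algorithm, problem, or measure only requires implementing the corresponding interface; the simulation logic itself never changes.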

Class diagram corresponding to the framework of DynOptLab. Classes in bold are the framework hot spots. Some methods and attributes are excluded for a better understanding.

Beyond these technical aspects, the researcher only needs to interact with the interfaces related to algorithms, problems, and measures. The framework also allows the parameters of algorithms, problems, and measures to be set while the application is running. This online assignment of parameters is possible thanks to the library
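Such online parameter assignment is typically achieved through Java reflection: a parameter name read from a configuration file is mapped onto a field of the target object at run time. The following is a minimal sketch under that assumption, not DynOptLab's actual code:

```java
import java.lang.reflect.Field;

/**
 * Sketch of online parameter assignment via Java reflection:
 * a parameter name (e.g., read from an XML file) is resolved to a
 * field of the target object and set at run time.
 */
public class ParameterSetter {
    public static void set(Object target, String name, double value) {
        try {
            Field field = target.getClass().getDeclaredField(name);
            field.setAccessible(true);  // allow access to non-public fields
            field.setDouble(target, value);
        } catch (ReflectiveOperationException e) {
            throw new IllegalArgumentException("No such parameter: " + name, e);
        }
    }
}
```

This is what makes it possible to configure components from external files without recompiling them.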

The GUI of DynOptLab is very simple and intuitive. It was developed on the library SWT (Standard Widget Toolkit) from the Eclipse (

DynOptLab’s main window is composed of five tabs, as is shown in Figure

All the XML files have a simple and similar structure; that is, they contain (1) a short name, (2) a detailed description, (3) the fully qualified name of the implementing class, and (4) the parameter settings, represented as lists of values. In the particular case of performance measures, the XML file also includes the measurement time, the aggregation function, and the report type. The next sections explain these and other specific features of DynOptLab.
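For illustration only, a measure configuration file following the structure described above might look as follows. The element names, class name, and parameter values are hypothetical; the actual schema used by DynOptLab may differ:

```xml
<measure>
  <name>offline-error</name>
  <description>Mean best-so-far error since the last environment change</description>
  <class>org.example.measures.OfflineError</class>
  <parameters>
    <!-- several values per parameter yield several measure instances -->
    <parameter name="epsilon" values="0.01 0.1"/>
  </parameters>
  <measurementTime>everyEvaluation</measurementTime>
  <aggregation>mean</aggregation>
  <reportType>table</reportType>
</measure>
```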

One of the main features of DynOptLab is the management of factors and response variables of the experiments. In our case, the experiment factors are the parameters that define problems and algorithms, while the response variables are the performance measures [

It is worth observing that this feature of setting several values for each parameter allows multiple instances of problems and algorithms to be obtained, so that multifactorial experiments become possible. For instance, if a given problem has two parameters that are set with

Once the problems, algorithms, and measures are selected, the next step is to manage the execution of the experiments. To this end, the user can use the

Additionally, the

To start the simulation, the user has to click the button

The results of the experiments (from the

To better organize the display and analysis, the

Despite the benefits of this descriptive summary, it is usually interesting to compare several algorithms on certain problems. To this end, DynOptLab allows for multiple comparisons based on the loaded experiments from the left zone. At the bottom of this left zone, the button

Similar to the

Regarding the statistical analysis, DynOptLab provides two nonparametric statistical tests that take the data in the comparison table as input. Specifically, the Friedman and Iman-Davenport tests, which are devoted to detecting general differences among all algorithms, have been included. The results of the tests are visualized by the
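For reference, both test statistics can be computed from the per-problem ranks of the algorithms. The following self-contained sketch (our own code, not DynOptLab's implementation) assumes no ties in the rankings and that lower measure values are better:

```java
/**
 * Friedman and Iman-Davenport statistics from a results matrix
 * results[problem][algorithm] (lower values = better), assuming no ties.
 */
public class RankTests {
    /** Friedman chi-square statistic over n problems and k algorithms. */
    public static double friedman(double[][] results) {
        int n = results.length;        // number of problems
        int k = results[0].length;     // number of algorithms
        double[] avgRank = new double[k];
        for (double[] row : results) {
            for (int j = 0; j < k; j++) {
                int rank = 1;          // rank of algorithm j on this problem
                for (int l = 0; l < k; l++) {
                    if (row[l] < row[j]) rank++;
                }
                avgRank[j] += (double) rank / n;
            }
        }
        double sumSq = 0.0;
        for (double r : avgRank) sumSq += r * r;
        return 12.0 * n / (k * (k + 1)) * (sumSq - k * (k + 1) * (k + 1) / 4.0);
    }

    /** Iman-Davenport F statistic, a less conservative variant of Friedman's test. */
    public static double imanDavenport(double[][] results) {
        int n = results.length, k = results[0].length;
        double chi2 = friedman(results);
        return (n - 1) * chi2 / (n * (k - 1) - chi2);
    }
}
```

The resulting statistics are compared against the chi-square distribution with k − 1 degrees of freedom (Friedman) and the F distribution with k − 1 and (k − 1)(n − 1) degrees of freedom (Iman-Davenport) to obtain p values.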

Inside the

To see DynOptLab in action, in this section we use it to handle a typical experimental study in dynamic environments. This case study is illustrated by the figures used throughout the paper, so in what follows we comment only on the details of the figures that are specific to our case study. Essentially, we want to analyze the performance of four algorithms: mQSO [

peakFunction =

vlength =

changeFrequency =

As mentioned before, the presence of several values in a problem (or algorithm) is interpreted by DynOptLab as a combination of factors. Hence, the above parameter setting leads to 18 different problem instances, which, together with the 4 algorithms considered, give 72 experiments (i.e., problem-algorithm pairs to be executed).
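The arithmetic behind this experiment count is a Cartesian-product count: the number of instances is the product of the number of values set for each factor. The exact per-parameter counts are elided in the text, so the factorization of 18 used below (2 × 3 × 3) is only one consistent possibility:

```java
/** Number of instances as the Cartesian-product count of factor levels. */
public class ExperimentCount {
    public static int instances(int... valuesPerParameter) {
        int count = 1;
        for (int v : valuesPerParameter) count *= v;
        return count;
    }
}
```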

For assessing the algorithm performance, we rely on the

To see how DynOptLab can handle this design of experiments, consider first the class diagram of Figure

Example of how to extend DynOptLab’s framework.

DynOptLab’s interfaces related to the configuration of the experiment are shown in Figures

Selection and configuration of the algorithms in DynOptLab.

Selection and configuration of performance measures in DynOptLab.

In this work, we proposed DynOptLab, a free, noncommercial tool for experimental analysis in dynamic environments. This tool provides not only a framework to easily include new problems, algorithms, and performance measures, but also a graphical user interface to efficiently manage experiments and statistically analyze the results.

The main features of DynOptLab were demonstrated through the study of a typical case in the context of experimentation in dynamic environments. In that sense, DynOptLab efficiently handled the considered experimental design.

Despite this progress, we believe that this is only a first step toward a better tool. Our future work will be devoted to including other problems and algorithms, with the aim of obtaining a framework containing the state-of-the-art proposals on the subject.

DynOptLab is currently available at the website of the Models of Decision and Optimization (MODO) Research Group, specifically at the following URL:

The authors declare that there is no conflict of interests regarding the publication of this paper.

P. Novoa-Hernández has the support of a postdoctoral scholarship from the Eureka SD project (Erasmus Mundus Action 2) coordinated by the University of Oldenburg, Germany. C. Cruz Corona and D. A. Pelta acknowledge support from Projects TIN2011-27696-C02-01, Spanish Ministry of Economy and Competitiveness, P11-TIC-8001 from the Andalusian Government (including FEDER funds from the European Union), and GENIL-PYR-2014-9 Project from University of Granada.