
The problem of data fitting is very important in many theoretical and applied fields. In this paper, we consider the problem of optimizing a weighted Bayesian energy functional for data fitting by using global-support approximating curves. By global-support curves we mean curves expressed as a linear combination of basis functions whose support is the whole domain of the problem, as opposed to other common approaches in CAD/CAM and computer graphics driven by piecewise functions (such as B-splines and NURBS) that provide local control of the shape of the curve. Our method applies a powerful nature-inspired metaheuristic algorithm called cuckoo search.

The problem of data fitting is very important in many theoretical and applied fields [

Depending on the nature of these data points, two different approaches can be employed:

There are two key components for a good approximation of data points with curves: a proper choice of the approximating function and a suitable parameter tuning. Due to their good mathematical properties regarding evaluation, continuity, and differentiability (among many others), the use of polynomial functions (especially splines) is a classical choice for the approximation function [

In this paper, we consider the problem of optimizing a weighted Bayesian energy functional for data fitting by using global-support approximating curves. In particular, our goal is to obtain the global-support approximating curve that best fits the data points while keeping the number of free parameters of the model as low as possible. To this aim, we formulate this problem as a minimization problem by using a weighted Bayesian energy functional for global-support curves. This is one of the major contributions of this paper. Our functional comprises two competing terms aimed at minimizing the fitting error between the original and the reconstructed data points while simultaneously minimizing the degrees of freedom of the problem. Furthermore, the functional can be modified and extended to include additional constraints, such as the fairness and smoothness constraints typically required in many industrial operations in computer-aided manufacturing, such as CNC (computer numerically controlled) milling, drilling, and machining [

Unfortunately, the formulation in the previous paragraph leads to a nonlinear continuous optimization problem that cannot be properly addressed by conventional mathematical optimization techniques. To overcome this limitation, in this paper we apply a powerful nature-inspired metaheuristic algorithm called cuckoo search, introduced in 2009 by Yang and Deb to solve optimization problems [

A critical problem when using metaheuristic approaches concerns parameter tuning, which is well known to be time-consuming and problem-dependent. In this regard, a major advantage of cuckoo search with Lévy flights is its simplicity: it requires only two parameters, many fewer than other metaheuristic approaches, so parameter tuning becomes a very simple task. This paper shows that the new approach can be successfully applied to solve our optimization problem. To check its performance, our approach has been applied to five illustrative examples of different types, including open and closed 2D and 3D curves that exhibit challenging features, such as cusps and self-intersections. Our results show that the method performs very well, being able to solve our minimization problem in a remarkably straightforward way.
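To make the idea of two competing terms concrete, the sketch below shows one plausible BIC-flavored combination of a data-fidelity term and a parameter-count penalty. It is purely illustrative: the function name, the weight parameter, and the exact form are assumptions of this sketch, not the functional used in the paper (which is given later in the text).

```python
import math

def weighted_bayesian_energy(sq_errors, n_params, weight=1.0):
    """Illustrative two-term fitness in the spirit of BIC-style model
    selection: a data-fidelity term (log of the mean squared fitting
    error, assumed nonzero) plus a weighted penalty on the number of
    free parameters. The paper's actual functional may differ."""
    N = len(sq_errors)
    fidelity = N * math.log(sum(sq_errors) / N)
    complexity = weight * n_params * math.log(N)
    return fidelity + complexity
```

Lower fitting error decreases the first term, while every extra free parameter increases the second, so minimizing the sum trades accuracy against model complexity.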

The structure of this paper is as follows: in Section

The problem of curve data fitting has been the subject of research for many years. The first approaches in the field were mostly based on numerical procedures [

Unfortunately, the optimization problems given by those energy functionals and their constraints are very difficult and cannot be generally solved by conventional mathematical optimization techniques. On the other hand, some interesting research carried out during the last two decades has shown that the application of artificial intelligence techniques can achieve remarkable results regarding such optimization problems [

In this paper we assume that the solution to our fitting problem is given by a model function of the form $\mathbf{C}(t) = \sum_{j=0}^{n} \mathbf{P}_j \, \phi_j(t)$, where the $\mathbf{P}_j$ are vector coefficients and the $\{\phi_j(t)\}_{j=0,\dots,n}$ are global-support basis (blending) functions. Classical examples include:

the canonical polynomial basis: $\phi_j(t) = t^j$, $j = 0, \dots, n$;

the Bernstein basis: $\phi_j(t) = \binom{n}{j} t^j (1-t)^{n-j}$, $j = 0, \dots, n$.
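Both families are straightforward to evaluate directly: the canonical basis is $t^j$ and the Bernstein basis of degree $n$ is $\binom{n}{j} t^j (1-t)^{n-j}$. The following Python sketch (illustrative; the paper's own implementation is in MATLAB) computes them and checks the well-known partition-of-unity property of the Bernstein basis.

```python
from math import comb

def canonical_basis(n, t):
    """Canonical (monomial) basis: phi_j(t) = t^j, j = 0..n."""
    return [t ** j for j in range(n + 1)]

def bernstein_basis(n, t):
    """Bernstein basis of degree n: B_j(t) = C(n, j) t^j (1 - t)^(n - j)."""
    return [comb(n, j) * t ** j * (1 - t) ** (n - j) for j in range(n + 1)]

# The Bernstein basis functions sum to 1 for any t in [0, 1]:
assert abs(sum(bernstein_basis(5, 0.3)) - 1.0) < 1e-12
```

A fitted curve is then just the linear combination of these basis values with the vector coefficients.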

Let us suppose now that we are given a finite set of data points

This functional

Considering the vectors

In general, the blending functions

Cuckoo search (CS) is a nature-inspired population-based metaheuristic algorithm originally proposed by Yang and Deb in 2009 to solve optimization problems [

This interesting and surprising breeding pattern is the metaphor behind the cuckoo search metaheuristic for solving optimization problems. In the cuckoo search algorithm, the eggs in the nest are interpreted as a pool of candidate solutions of an optimization problem, while the cuckoo egg represents a newly generated solution. The ultimate goal of the method is to use these new (and potentially better) solutions associated with the parasitic cuckoo eggs to replace the current solutions associated with the eggs in the nests. This replacement, carried out iteratively, eventually leads to a very good solution of the problem.

In addition to this representation scheme, the CS algorithm is also based on three idealized rules [

Each cuckoo lays one egg at a time and dumps it in a randomly chosen nest.

The best nests with high-quality eggs (solutions) will be carried over to the next generations.

The number of available host nests is fixed, and a host can discover an alien egg with a probability $p_a \in [0, 1]$.

For simplicity, the third assumption can be approximated by a fraction $p_a$ of the $n$ nests being replaced by new nests (with new random solutions at new locations).

Based on these three rules, the basic steps of the CS algorithm can be summarized as shown in the pseudocode reported in Algorithm

Objective function f(x), x = (x_1, ..., x_d)^T
Generate initial population of n host nests x_i (i = 1, ..., n)
while (t < MaxGeneration) or (stop criterion) do
    Get a cuckoo randomly by Lévy flights and evaluate its fitness F_i
    Choose a nest j among the n available nests randomly
    if F_i > F_j then
        Replace nest j by the new solution
    end if
    Abandon a fraction (p_a) of the worse nests and build new ones
    Keep the best solutions (nests with quality solutions)
    Rank the solutions and find the current best
end while
Postprocess results and visualization

For each iteration, a new solution (cuckoo egg) is generated from a randomly selected one by performing a Lévy flight, a random walk whose step lengths are drawn from a heavy-tailed Lévy distribution.

The CS method then evaluates the fitness of the new solution and compares it with the current one. If the new solution has better fitness, it replaces the current one. In addition, a fraction of the worse nests (according to their fitness) is abandoned and replaced by new solutions in order to increase the exploration of the search space in search of more promising solutions. The rate of replacement is given by the probability $p_a$.
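A common way to draw Lévy-flight step lengths in practice is Mantegna's algorithm. The sketch below assumes the usual Lévy index β = 3/2; it is an illustration of the standard technique, not the authors' implementation.

```python
import math
import random

def levy_step(beta=1.5):
    """One Lévy-distributed step length via Mantegna's algorithm."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta
                  * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0.0, sigma_u)  # numerator: N(0, sigma_u^2)
    v = random.gauss(0.0, 1.0)      # denominator: N(0, 1)
    return u / abs(v) ** (1 / beta)
```

Most steps are small (local refinement), but the heavy tail occasionally produces long jumps, which helps the search escape local optima.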

This algorithm is applied in an iterative fashion until a stopping criterion is met. Common termination criteria are that a solution satisfying a lower threshold value has been found, that a fixed number of generations has been reached, or that successive iterations no longer produce better results.
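To illustrate the overall procedure, here is a minimal, self-contained Python sketch of cuckoo search applied to a generic minimization problem. The authors' implementation is in MATLAB; all parameter values, bounds, and the step-size factor 0.01 below are illustrative defaults, not the settings used in the paper.

```python
import math
import random

def cuckoo_search(f, dim, n_nests=15, p_a=0.25, n_iter=200,
                  lo=-5.0, hi=5.0, seed=42):
    """Minimal cuckoo search (minimization): Lévy-flight moves plus
    abandonment of a fraction p_a of the worst nests each generation."""
    rng = random.Random(seed)
    beta = 1.5  # Lévy index, as in the original CS paper
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta
                  * 2 ** ((beta - 1) / 2))) ** (1 / beta)

    def levy():
        # Mantegna's algorithm for a Lévy-distributed step length.
        u, v = rng.gauss(0.0, sigma_u), rng.gauss(0.0, 1.0)
        return u / abs(v) ** (1 / beta)

    def clip(c):
        return min(hi, max(lo, c))

    nests = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_nests)]
    fit = [f(x) for x in nests]
    best = min(range(n_nests), key=fit.__getitem__)

    for _ in range(n_iter):
        # Get a cuckoo randomly and move it by a Lévy flight.
        i = rng.randrange(n_nests)
        new = [clip(nests[i][d]
                    + 0.01 * levy() * (nests[i][d] - nests[best][d]))
               for d in range(dim)]
        f_new = f(new)
        # Choose a nest at random; replace it if the cuckoo egg is better.
        j = rng.randrange(n_nests)
        if f_new < fit[j]:
            nests[j], fit[j] = new, f_new
        # Abandon a fraction p_a of the worst nests; build new random ones.
        worst_first = sorted(range(n_nests), key=fit.__getitem__, reverse=True)
        for k in worst_first[:int(p_a * n_nests)]:
            nests[k] = [rng.uniform(lo, hi) for _ in range(dim)]
            fit[k] = f(nests[k])
        best = min(range(n_nests), key=fit.__getitem__)
    return nests[best], fit[best]
```

For example, minimizing the sphere function `f(x) = sum(x_d^2)` with this sketch drives the fitness close to zero; in a fitting context, `f` would instead be the energy functional evaluated on the candidate parameter vector.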

We have applied the cuckoo search algorithm discussed in the previous section to our optimization problem described in Section

Regarding the fitness function, it is given by either the weighted Bayesian energy functional in (

A critical issue when working with metaheuristic approaches concerns the choice of suitable parameter values for the method. This issue is of paramount importance, since the proper choice of such values largely determines the performance of the method. Unfortunately, it is also a very difficult task. On the one hand, the field still lacks sufficient theoretical results to answer this question on a general basis; on the other hand, the choice of parameter values is strongly problem-dependent, meaning that good parameter values for a particular problem might be completely useless (or even counterproductive) for another. These facts explain why choosing adequate parameter values is so troublesome and very often a bottleneck in the development and application of metaheuristic techniques.

The previous limitations have traditionally been overcome by following different strategies. Perhaps the most common one is to obtain good parameter values empirically: several runs or executions of the method are carried out for different parameter values, and a statistical analysis is performed to derive the values leading to the best performance. However, this approach is very time-consuming, especially when different parameters influence one another. The problem is aggravated when the metaheuristic depends on many different parameters, as the number of required executions grows exponentially. The cuckoo search method is particularly well suited in this regard because of its simplicity: in contrast to other methods that typically require a large number of parameters, CS only requires two, namely, the population size $n$ and the probability $p_a$.

Some previous works have addressed the issue of parameter tuning for CS, showing that the method is relatively robust to the variation of its parameters. For instance, the authors in [

Regarding the implementation, all computations in this paper have been performed on a 2.6 GHz Intel Core i7 processor with 8 GB of RAM. The source code has been implemented by the authors in MATLAB (version 2012a). We remark that an implementation of the CS method has been described in [

We have applied the CS method described in previous sections to different examples of curve data fitting. To keep the paper at a manageable size, in this section we describe only five of them, corresponding to different families of global-support basis functions and to open and closed 2D and 3D curves. In order to replicate the conditions of real-world applications, we assume that our data are irregularly sampled and subject to noise. Consequently, we consider a nonuniform sampling of data in all our examples. Data points are also perturbed by additive Gaussian white noise of small intensity, given by an SNR (signal-to-noise ratio) of 60, in all reported examples.
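As a concrete illustration of this experimental setup, the sketch below perturbs a set of points with additive Gaussian white noise at a prescribed SNR. The SNR is interpreted here as the linear signal-to-noise power ratio; this is an assumption of the sketch, since the paper does not state whether its value of 60 is a linear ratio or in decibels.

```python
import math
import random

def add_noise(points, snr=60.0, seed=1):
    """Perturb 2D/3D points with additive Gaussian white noise whose
    variance is set so that (signal power) / (noise power) = snr."""
    rng = random.Random(seed)
    flat = [c for p in points for c in p]
    mean = sum(flat) / len(flat)
    power = sum((c - mean) ** 2 for c in flat) / len(flat)  # signal power
    sigma = math.sqrt(power / snr)                          # noise std dev
    return [tuple(c + rng.gauss(0.0, sigma) for c in p) for p in points]
```

With an SNR of 60 the perturbation is small relative to the spread of the data, matching the "small intensity" noise described above.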

The first example corresponds to a set of 100 noisy data points obtained by nonuniform sampling from the Agnesi curve. The curve is obtained by drawing a line

Application of the cuckoo search algorithm to the Agnesi curve: (a) original data points (empty red circles) and reconstructed points (blue plus signs); (b) data points (black plus signs) and fitting curve (solid blue line).
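For reference, the (witch of) Agnesi curve admits the Cartesian equation $y = 8a^3/(x^2 + 4a^2)$. The sketch below generates a nonuniform (randomly spaced) noise-free sampling of it; the parameter values and range are illustrative, not the paper's exact setup.

```python
import math
import random

def agnesi(x, a=1.0):
    """Witch of Agnesi: y = 8 a^3 / (x^2 + 4 a^2), peaking at y = 2a."""
    return 8 * a ** 3 / (x ** 2 + 4 * a ** 2)

def sample_agnesi(n=100, a=1.0, x_range=(-4.0, 4.0), seed=7):
    """Nonuniform sampling: abscissas drawn uniformly at random, sorted."""
    rng = random.Random(seed)
    xs = sorted(rng.uniform(*x_range) for _ in range(n))
    return [(x, agnesi(x, a)) for x in xs]
```

Such a sample, after noise is added, plays the role of the 100 data points fitted in this example.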

The second example corresponds to the Archimedean spiral curve (also known as the arithmetic spiral). This curve is the locus of the locations over time of a point moving away from a fixed point at constant speed along a line that rotates with constant angular velocity. In this example, we consider a set of 100 noisy data points from such a curve, subsequently fitted by using the canonical polynomial basis functions. Our results for this example are depicted in Figure

Application of the cuckoo search algorithm to the Archimedean spiral curve: (a) original data points (empty red circles) and reconstructed points (blue plus signs); (b) data points (black plus signs) and fitting curve (solid blue line).

The third example corresponds to a hypocycloid curve. This curve belongs to a much larger family of curves called roulettes. Roughly speaking, a roulette is a curve generated by tracing the path of a point attached to a curve that rolls upon another, fixed curve without slipping. In principle, these can be any two curves. The particular case of a hypocycloid corresponds to a roulette traced by a point attached to a circle of radius

Application of the cuckoo search algorithm to the hypocycloid curve example: (a) original data points (empty red circles) and reconstructed points (blue plus signs); (b) data points (black plus signs) and fitting curve (solid blue line).
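For reference, a hypocycloid with a fixed circle of radius R and a rolling circle of radius r admits the standard parametrization sketched below. The values R = 5, r = 1 are illustrative (they yield a five-cusped curve) and are not necessarily those used in the paper.

```python
import math

def hypocycloid(t, R=5.0, r=1.0):
    """Point traced by a marked point on a circle of radius r rolling
    inside a fixed circle of radius R:
      x(t) = (R - r) cos t + r cos(((R - r)/r) t)
      y(t) = (R - r) sin t - r sin(((R - r)/r) t)"""
    k = (R - r) / r
    return ((R - r) * math.cos(t) + r * math.cos(k * t),
            (R - r) * math.sin(t) - r * math.sin(k * t))
```

When R/r is an integer the curve closes after one turn of t through [0, 2π], producing the cusps that make this example challenging for fitting.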

The fourth example corresponds to the so-called piriform curve, which can be defined procedurally in a rather complex way. Once again, we consider a set of 100 noisy data points, fitted by using the Bernstein basis functions. Our results are shown in Figure

Application of the cuckoo search algorithm to the piriform curve example: (a) original data points (empty red circles) and reconstructed points (blue plus signs); (b) data points (black plus signs) and fitting curve (solid blue line).

The last example corresponds to a 3D closed curve called the Eight Knot curve. Two images of the curve from different viewpoints are shown in Figure

Two different viewpoints of the 3D Eight Knot curve.

Application of the cuckoo search algorithm to the 3D Eight Knot curve example: (a) original data points (empty red circles) and reconstructed points (blue plus signs); (b) data points (black plus signs) and fitting curve (solid blue line).

This paper addresses the problem of approximating a set of data points by using global-support curves while simultaneously minimizing the degrees of freedom of the model function and satisfying other additional constraints. This problem is formulated in terms of a weighted Bayesian energy functional that encapsulates all these constraints into a single mathematical expression. In this way, the original problem is converted into a continuous nonlinear multivariate optimization problem, which is solved by using a metaheuristic approach. Our method is based on the cuckoo search, a powerful nature-inspired metaheuristic algorithm introduced recently to solve optimization problems. Cuckoo search (especially its variant that uses Lévy flights) has been successfully applied to difficult optimization problems in different fields. However, to the best of our knowledge, this is the first paper applying the cuckoo search methodology in the context of geometric modeling and data fitting.

Our approach based on the cuckoo search method has been tested on five illustrative examples of different types, including open and closed 2D and 3D curves. Some examples also exhibit challenging features, such as cusps and self-intersections. They have been fitted by using two different families of global-support functions (the Bernstein basis functions and the canonical polynomial basis), with satisfactory results in all cases. The experimental results show that the method performs very well, solving our difficult minimization problem in a remarkably straightforward way. We conclude that this new approach can be successfully applied to solve our optimization problem.

A major advantage of this method is its simplicity: cuckoo search requires only two parameters, many fewer than other metaheuristic approaches, so parameter tuning becomes a very simple task. This simplicity is also reflected in the CPU runtimes of our examples. Even though we are dealing with a constrained continuous multivariate nonlinear optimization problem and with curves exhibiting challenging features such as cusps and self-intersections, a typical single execution takes less than 10 seconds of CPU time for all the examples reported in this paper. In addition, the method is simple to understand, easy to implement, and does not require any further pre- or postprocessing.

In spite of these encouraging results, further research is still needed to determine the advantages and limitations of the present method to their full extent. On the other hand, some modifications of the original cuckoo search have been claimed to outperform the initial method on some benchmarks. Our implementation has been designed according to the specifications of the original method, and we have not yet tested any of its subsequent modifications. We are currently interested in exploring these issues as part of our future work, as is the hybridization of this approach with other competitive methods for better performance.

The authors declare that there is no conflict of interests regarding the publication of this paper. Any commercial identity mentioned in this paper is cited solely for scientific purposes.

This research has been kindly supported by the Computer Science National Program of the Spanish Ministry of Economy and Competitiveness, Project Ref. no. TIN2012-30768, Toho University (Funabashi, Japan), and the University of Cantabria (Santander, Spain). The authors are particularly grateful to the Department of Information Science of Toho University for all the facilities given to carry out this work.