

J Appl Biomed. Author manuscript; available in PMC 2010 April 21.
Published in final edited form as:
J Appl Biomed. 2006 June; 4(2): 87–94.
PMCID: PMC2857782

GOSA, a simulated annealing-based program for global optimization of nonlinear problems, also reveals transyears


Transyears in biology have been documented thus far by the extended cosinor approach, including linear-nonlinear rhythmometry. We here confirm the existence of transyears by simulated annealing, a method originally developed for much broader use, described and introduced herein to validate its application to time series. The method is illustrated both on an artificial test case with known components and on biological data. We provide a table comparing results by the two methods and trust that the procedure will serve the budding sciences of chronobiology (the study of mechanisms underlying biological time structure), chronomics (the mapping of time structures in and around us), and chronobioethics, which draws on the foregoing disciplines to add to the concern for illnesses of individuals and to a budding focus on diseases of nations and civilizations.

Keywords: simulated annealing, linear-nonlinear rhythmometry, transyears, cosinor


Fitting multidimensional experimental data to complex multi-parameter nonlinear models presents a host of serious problems, the most important of which is finding a set of starting values for the parameters being sought. This requirement arises with deterministic minimization techniques, which are based on calculating gradients of the target function with respect to the adjusted parameters. In this context, the target function is the sum of squared deviations between the measured points and those calculated on the basis of a selected theoretical model. The objective is to find the global minimum of this error function. The use of deterministic search techniques, such as steepest descent, conjugate gradients or the Newton-Raphson method (Press et al. 1992), is widespread mainly because of their efficiency, i.e., rapid convergence to a minimum. The necessity to provide a starting point at the beginning of the calculation, i.e., concrete values for all model parameters, is, however, a serious obstacle.
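The error function just described is straightforward to write down; the sketch below does so for a multi-component cosine model of the kind fitted in this work. The parameterization (mesor, amplitude, period, acrophase) follows cosinor conventions, but the function names and interface are illustrative, not GOSA's.

```python
import math

def model(t, params):
    # Multi-component cosine model: M + sum_j A_j * cos(2*pi*t/tau_j + phi_j),
    # with mesor M, amplitudes A_j, periods tau_j and acrophases phi_j.
    mesor, components = params
    return mesor + sum(A * math.cos(2 * math.pi * t / tau + phi)
                       for A, tau, phi in components)

def target(params, data):
    # Sum of squared deviations between measured and modeled values --
    # the error function whose global minimum the fit seeks.
    return sum((y - model(t, params)) ** 2 for t, y in data)

# Tiny synthetic series: mesor 5, one cosine of amplitude 2 and period 1.
data = [(i / 10, 5 + 2 * math.cos(2 * math.pi * i / 10)) for i in range(50)]
print(target((5, [(2, 1.0, 0.0)]), data))  # exact parameters -> ~0
```

Any candidate parameter set, however obtained, is scored by this one number; the minimizers discussed below differ only in how they search for the set that makes it smallest.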

In the multi-dimensional parameter space, the landscape of the minimized function may be very complex. Since deterministic algorithms use the sign and magnitude of the target function’s gradients at any given point to determine the direction of the search and the size of the step in that direction, they are bound to follow the path that descends from the starting point to the nearest minimum. There is no guarantee that the minimum reached corresponds to the global minimum of the minimized function. These algorithms are not capable of jumping over a barrier in the function’s profile to find a deeper minimum.

One way to overcome this obstacle is to run the minimization procedure several times, each time with a different starting point, and to retain the best result, corresponding to the smallest value of the target function. By increasing the number of runs with different starting points, we increase our confidence that the best fit will approach the global minimum (within the limits of a predetermined accuracy). We can never be certain, however, that the data set cannot be described more satisfactorily by a different combination of parameters. Moreover, this procedure significantly increases the overall time necessary to accomplish the task, which greatly reduces the appeal of this approach in the multidimensional case.
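A minimal sketch of this multi-start strategy, under stated assumptions: the two-well test function, the crude numerical-gradient descent standing in for a deterministic minimizer such as steepest descent, and all settings are illustrative.

```python
import math
import random

def f(x):
    # Two-well test function: a shallow local minimum near x ~ 1.13 and
    # the global minimum near x ~ -1.30.
    return x**4 - 3*x**2 + x

def local_descent(x, step=1e-3, iters=20000):
    # Crude steepest descent with a numerical derivative; each run only
    # finds the minimum nearest its starting point.
    for _ in range(iters):
        grad = (f(x + 1e-6) - f(x - 1e-6)) / 2e-6
        x -= step * grad
    return x

def multistart(n_starts=20, seed=0):
    # Restart the deterministic search from random points; keep the best.
    rng = random.Random(seed)
    return min((local_descent(rng.uniform(-2, 2)) for _ in range(n_starts)),
               key=f)

x_best = multistart()
print(x_best, f(x_best))  # global minimum near x = -1.30
```

Each restart that lands in the shallow basin wastes a full descent, which is why the cost of this approach grows quickly with dimensionality.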

Unfortunately, when working with high-dimensional experimental data, it is nearly impossible to find initial values by a trial-and-error method. The correct solution to the problem can be found only if the starting point is close enough to the minimum, which means that one must have some a priori knowledge of at least the order of magnitude of the parameters involved. This prerequisite, however, is contrary to the reason for which we do the fitting; we prefer to obtain the results without having to guess anything. Therefore, we decided to develop an alternative to the currently existing software packages: an easy-to-use program, based on simulated annealing, which is a stochastic minimization algorithm devoid of the drawbacks of the deterministic algorithms. It should be noted that currently many efforts are underway to achieve global optimization by deterministic algorithms. At present, this approach requires the use of auxiliary functions, whose form depends on the nature of the investigated problem. This lack of generality prevented us from opting for this solution: our software was designed to deal with a broad class of problems.


Most of the commercially available programs are useful for analyses of one-dimensional cases. In practice, we often deal with multi-dimensional studies, where several independent variables determine the outcome. In such cases a number of curves are obtained, a circumstance which increases the number of unknowns and the time required to find the best fit to the experimental data. Taking into account all these factors, we chose to work with stochastic algorithms, of which the simulated annealing (SA) technique (Vanderbilt and Louie 1984, Bohachevsky et al. 1986, Corana et al. 1987) is the most powerful one. The SA algorithm is based on the concept of attaining the lowest energy states through slow cooling (e.g. annealing of metals) and is currently used in molecular modeling (Nilges et al. 1988). Much of its success is due to random sampling of the parameter space, based on the probabilistic Monte Carlo method (Metropolis et al. 1953). A comprehensive overview of the SA algorithm has been given by Goffe et al. (1994).

The stochastic character of the SA algorithm provides one of its main advantages: it is no longer necessary to make choices concerning the starting point. In fact, the initial set of parameters is generated randomly in order to avoid any bias in the choice of the subsequent search trajectory. On the other hand, since the sampling of the parameter space must be adequate, the time necessary to complete a single SA run is longer than that required by a single run of a deterministic algorithm. Since deterministic programs must be run many times with different initial data, however, the total working time of the two methods becomes comparable, with the advantage going to the stochastic methods because the procedure is fully automated.
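The core of the SA algorithm is compact enough to sketch: a random walk in parameter space with Metropolis acceptance, where uphill moves are accepted with a probability that shrinks as a fictitious temperature is lowered. The two-well test function, geometric cooling schedule, and step size below are illustrative choices, not GOSA's actual internals.

```python
import math
import random

def f(x):
    # Multimodal target: shallow minimum near x ~ 1.13, global near x ~ -1.30.
    return x**4 - 3*x**2 + x

def anneal(seed=0, t0=5.0, cooling=0.999, steps=20000):
    rng = random.Random(seed)
    x = rng.uniform(-2, 2)       # random start: no initial guess required
    fx, temp = f(x), t0
    best, f_best = x, fx
    for _ in range(steps):
        cand = x + rng.gauss(0, 0.1)
        fc = f(cand)
        # Accept every downhill move; accept uphill moves with Boltzmann
        # probability exp(-dE/T), which lets the walk escape local minima
        # while the temperature is still high.
        if fc < fx or rng.random() < math.exp((fx - fc) / temp):
            x, fx = cand, fc
            if fx < f_best:
                best, f_best = x, fx
        temp *= cooling          # slow geometric cooling
    return best

x_best = anneal()
print(x_best, f(x_best))
```

A single annealing run replaces the whole battery of restarts in the previous strategy, at the price of many more function evaluations per run.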

Herein, we illustrate a practical implementation of the SA algorithm, using the GOSA (Global Optimization by Simulated Annealing) software, applied to the nearly 16-year record of systolic blood pressure analyzed chronobiologically in Figures 3a–f in Halberg et al. (2006a).


Both the absence of a calendar year and the presence instead of transyears are corroborated, Tables 1 and 2. The agreement between the two approaches can be seen from the overlapping 95% confidence intervals of the periods of the different components resolved by the two methods. Components resolved by only one of the two approaches are indicated by a ‘+’ on the right. Table 3 further compares the performance of the GOSA software with linear-nonlinear rhythmometry (Halberg 1980, Cornélissen and Halberg 2005, Halberg et al. 2006a, b), originally tested on a simulated data series consisting of two cosine curves with close periods (Rummel et al. 1974), Figure 1.

Fig. 1
Test series originally discussed in Rummel et al. (1974)
Table 1
Results from the GOSA software applied to 16-year record of blood pressure and heart rate of a man
Table 2
Results from linear-nonlinear rhythmometry (extended cosinor) applied to 16-year record of blood pressure and heart rate of a man
Table 3
Comparison of GOSA software and linear-nonlinear rhythmometry applied to test series in Fig. 1
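Why neighboring spectral components are hard to resolve can be seen from a test series of the Rummel et al. (1974) kind: two cosines with close periods beat against each other, so that over a short record they masquerade as a single component of slowly varying amplitude. The specific amplitudes and periods below are illustrative, not those of the original test series.

```python
import math

def two_component(t, a1=1.0, tau1=1.0, a2=1.0, tau2=1.1):
    # Sum of two cosines with close periods, as in a Rummel-style test series.
    return (a1 * math.cos(2 * math.pi * t / tau1)
            + a2 * math.cos(2 * math.pi * t / tau2))

# The two components drift in and out of phase with the beat period
# tau1*tau2/|tau2 - tau1|; only a record spanning several beats lets a
# fit separate them reliably.
beat_period = (1.0 * 1.1) / abs(1.1 - 1.0)
print(beat_period)            # ~11 time units
print(two_component(0.0))     # in phase: amplitudes add -> 2.0
print(two_component(5.5))     # out of phase: amplitudes cancel -> ~0
```

This beating is what makes the nearly 16-year record valuable: it spans many beat cycles of the calendar-year and transyear candidates, so the two fitting approaches can be compared on their ability to pull such components apart.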

The aim of this work was to provide biologists and biochemists with an easy-to-use, reliable program capable of finding global minima of an arbitrary function. The results of test cases reported above indicate that this goal has been achieved. The program, based on the simulated annealing algorithm, finds the global minimum of a specified function in a given range of variability of unknown parameters. The values of the resulting parameters are reported along with their uncertainties, estimated on the basis of the corresponding covariance matrix. Absolute values of summed squared deviations and the number of degrees of freedom for the studied model are reported, which render comparisons between different models possible.
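The covariance-based uncertainty estimate can be illustrated on the simplest case, a one-parameter linear model, where the covariance "matrix" reduces to the scalar s²/Σc² with s² = SSE/(n − p). The model, noise level and data below are made up for illustration and do not reproduce GOSA's internal computation.

```python
import math
import random

# Fit y = A * cos(2*pi*t) by least squares and report A together with its
# standard error derived from the (here scalar) covariance matrix.
rng = random.Random(1)
ts = [i / 100 for i in range(200)]
cs = [math.cos(2 * math.pi * t) for t in ts]
ys = [2.0 * c + rng.gauss(0, 0.1) for c in cs]   # true A = 2, noise sd 0.1

A = sum(y * c for y, c in zip(ys, cs)) / sum(c * c for c in cs)
sse = sum((y - A * c) ** 2 for y, c in zip(ys, cs))
dof = len(ys) - 1                 # n observations minus p = 1 parameter
var_A = (sse / dof) / sum(c * c for c in cs)     # covariance "matrix"
print(A, math.sqrt(var_A))        # A near 2.0, standard error near 0.01
```

Reporting the summed squared deviations and degrees of freedom alongside the parameters, as the program does, is what permits comparisons between competing models fitted to the same series.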

The program has been developed and tested on several different platforms: Unix-based SGI workstations, PCs running Microsoft Windows, and Intel-based Linux clusters. GOSA is capable of solving a wide array of problems from different domains of research. The software is available from Bio-Log. The accompanying documentation contains numerous examples of its use.


GM-13981 (FH), Dr. h.c. mult. Earl Bakken Fund (GC, FH)


  • Bohachevsky IO, Johnson MI, Stein ML. Generalized Simulated Annealing for Function Optimization. Technometrics. 1986;28:209–217.
  • Corana A, Marchesi M, Martini C, Ridella S. Minimizing multimodal functions of continuous variables with the ‘simulated annealing algorithm’ ACM Trans Math Softw. 1987;13:262–280.
  • Cornélissen G, Halberg F. Chronomedicine. In: Armitage P, Colton T, editors. Encyclopedia of Biostatistics. 2. John Wiley & Sons Ltd; Chichester, UK: 2005. pp. 796–812.
  • Goffe WL, Ferrier GD, Rogers J. Global optimization of statistical functions with simulated annealing. J Econom. 1994;60:65–99.
  • Halberg F. Chronobiology: methodological problems. Acta med rom. 1980;18:399–440.
  • Halberg F, Cornélissen G, Katinas G, et al. Chronobiology’s progress Part I, season’s appreciations 2004–2005: time-, frequency-, phase-, variable-, individual-, age- and site-specific chronomics. J Appl Biomed. 2006a;4:1–38.
  • Halberg F, Cornélissen G, Katinas G, et al. Chronobiology’s progress Part II, chronomics for an immediately applicable biomedicine. J Appl Biomed. 2006b;4:73–86.
  • Metropolis N, Rosenbluth AW, Rosenbluth MN, et al. Equation of state calculations by fast computing machines. J Chem Phys. 1953;21:1087–1092.
  • Nilges M, Clore GM, Gronenborn AM. Determination of Three-Dimensional Structures of Proteins From Interproton Distance Data by Dynamical Simulated Annealing From a Random Array of Atoms. FEBS Lett. 1988;239:129–136.
  • Press WH, Flannery B, Teukolsky S, Vetterling WT. Numerical Recipes in Fortran. Cambridge University Press; New York: 1992.
  • Rummel JA, Lee JK, Halberg F. Combined linear-nonlinear chronobiologic windows by least-squares resolve neighboring components in a physiologic rhythm spectrum. In: Ferin M, Halberg F, Richart RM, Vande Wiele R, editors. Biorhythms and Human Reproduction, Int Inst for the Study of Human Reproduction Conf Proc. John Wiley & Sons; New York: 1974. pp. 53–82.
  • Vanderbilt D, Louie SG. A Monte Carlo simulated annealing approach to optimization over continuous variables. J Comp Phys. 1984;56:259–271.