
Download GSA program (in FORTRAN)

Generalized Simulated Annealing

The application of an iterative improvement algorithm presupposes the definition of configurations, a cost function and a generation mechanism, i.e. a simple prescription to generate a transition from one configuration to another.

The GSA method minimizes a cost function $E(\vec{x})$ through stochastic cooling. The procedure consists of comparing the values of the cost function at two random vectors (configurations) $\vec{x}_t$ and $\vec{x}_{t+1}$, the latter obtained using the generalized visiting distribution. In this way, the hypersurface defined by $E(\vec{x})$ is randomly visited, and the algorithm terminates when a configuration is obtained whose cost is no worse than that of any of its neighbors.

Unlike gradient methods, solutions obtained by SA do not depend on the initial configuration and usually have a cost close to the minimum one. Simulated annealing is a technique that has attracted significant attention for a variety of problems, especially those in which the desired global extremum is hidden among many local extrema. The GSA algorithm can then be viewed as a sequence of generalized Metropolis algorithms evaluated at a sequence of decreasing values of the control parameter.

In summary, we describe the principal steps used to build the numerical procedure [5, 8] that maps the global minimum of a function $E(\vec{x})$ depending on a multidimensional continuous parameter $\vec{x}$:

(i) Fix the acceptance and visiting indexes $q_A$ and $q_V$ (we recall that $(q_A, q_V) = (1,1)$ and $(1,2)$ respectively correspond to the Boltzmann and Cauchy machines). Start, at $t = 1$, with an arbitrary configuration $\vec{x}_1$ and a high enough value of the visiting temperature $T_{q_V}(1)$, and cool as follows:

$$ T_{q_V}(t) = T_{q_V}(1)\,\frac{2^{q_V-1}-1}{(1+t)^{q_V-1}-1} , \qquad (1) $$

where $t$ is the discrete time corresponding to the computer iteration.
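As an illustration, a minimal Fortran sketch of this cooling schedule is given below. The function name, the argument names and the double-precision choices are ours, not those of the downloadable GSA program; only the formula of Eq. (1) is taken from the text.

! Hedged sketch of the visiting-temperature schedule of Eq. (1).
function visiting_temperature(t, tqv1, qv) result(tqv)
  implicit none
  integer, intent(in)          :: t     ! discrete time (iteration counter)
  double precision, intent(in) :: tqv1  ! initial visiting temperature T_qV(1)
  double precision, intent(in) :: qv    ! visiting index q_V (taken here as q_V > 1)
  double precision             :: tqv
  ! Eq. (1); for q_V = 2 this reduces to T_qV(1)/t (Cauchy machine)
  tqv = tqv1 * (2.0d0**(qv - 1.0d0) - 1.0d0) &
        / ((1.0d0 + dble(t))**(qv - 1.0d0) - 1.0d0)
end function visiting_temperature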

(ii) Then randomly generate each component $x_{t+1}^{(i)}$ ($i = 1, \dots, D$) of $\vec{x}_{t+1}$ from $\vec{x}_t$ using the visiting probability distribution $g_{q_V}(\Delta x_t)$, where $\Delta \vec{x}_t \equiv \vec{x}_{t+1} - \vec{x}_t$, given by

$$ g_{q_V}(\Delta x_t) = \left(\frac{q_V-1}{\pi}\right)^{D/2} \frac{\Gamma\!\left(\frac{1}{q_V-1}+\frac{D-1}{2}\right)}{\Gamma\!\left(\frac{1}{q_V-1}-\frac{1}{2}\right)} \; \frac{\left[T_{q_V}(t)\right]^{-D/(3-q_V)}}{\left\{1+(q_V-1)\,\dfrac{(\Delta x_t)^2}{\left[T_{q_V}(t)\right]^{2/(3-q_V)}}\right\}^{\frac{1}{q_V-1}+\frac{D-1}{2}}} , \qquad (2) $$

where $\Gamma$ is the gamma function and $D$ is the number of components of $\vec{x}$. This procedure ensures that the system can both escape from any local minimum and explore the entire cost-function hypersurface.
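To make the role of each factor explicit, a hedged Fortran transcription of this density is sketched below; it merely evaluates $g_{q_V}$ for a displacement of magnitude dx and is not the sampling routine of the GSA program (the GAMMA intrinsic requires a Fortran 2008 compiler).

! Hedged sketch: evaluates the generalized visiting distribution of Eq. (2)
! for a D-dimensional displacement of magnitude dx.
function g_qv(dx, tqv, qv, d) result(g)
  implicit none
  double precision, intent(in) :: dx    ! magnitude of the displacement Delta x_t
  double precision, intent(in) :: tqv   ! current visiting temperature T_qV(t)
  double precision, intent(in) :: qv    ! visiting index q_V (assumed 1 < q_V < 3)
  integer, intent(in)          :: d     ! number of components D of the parameter vector
  double precision             :: g, pi, expo
  pi   = 4.0d0 * atan(1.0d0)
  expo = 1.0d0/(qv - 1.0d0) + (dble(d) - 1.0d0)/2.0d0
  g = ((qv - 1.0d0)/pi)**(dble(d)/2.0d0)                          &
      * gamma(expo) / gamma(1.0d0/(qv - 1.0d0) - 0.5d0)           &
      * tqv**(-dble(d)/(3.0d0 - qv))                              &
      / (1.0d0 + (qv - 1.0d0)*dx**2 / tqv**(2.0d0/(3.0d0 - qv)))**expo
end function g_qv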

(iii) Then calculate the cost function $E(\vec{x}_{t+1})$ and apply the following conditions:

If $E(\vec{x}_{t+1}) < E(\vec{x}_t)$, replace $\vec{x}_t$ by $\vec{x}_{t+1}$.

If $E(\vec{x}_{t+1}) \ge E(\vec{x}_t)$, accept $\vec{x}_{t+1}$ with the acceptance probability $P_{q_A}$ defined by

$$ P_{q_A} = \left[1 + (q_A-1)\,\frac{E(\vec{x}_{t+1}) - E(\vec{x}_t)}{T_{q_A}(t)}\right]^{-\frac{1}{q_A-1}} , \qquad (3) $$

with $T_{q_A}(t) = T_{q_V}(t)/t$ [5].
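A corresponding Fortran sketch of this generalized Metropolis rule could read as follows; the guard against a non-positive base is our own defensive choice for indexes $q_A < 1$ and is not prescribed by the text.

! Hedged sketch of the generalized acceptance probability of Eq. (3).
function accept_probability(de, tqa, qa) result(p)
  implicit none
  double precision, intent(in) :: de   ! cost increase E(x_{t+1}) - E(x_t), assumed >= 0
  double precision, intent(in) :: tqa  ! acceptance temperature T_qA(t)
  double precision, intent(in) :: qa   ! acceptance index q_A (q_A /= 1)
  double precision             :: p, base
  base = 1.0d0 + (qa - 1.0d0) * de / tqa
  if (base <= 0.0d0) then
     p = 0.0d0          ! outside the support of the generalized distribution
  else
     p = base**(-1.0d0 / (qa - 1.0d0))
  end if
end function accept_probability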

(iv) Calculate the new temperature $T_{q_V}(t+1)$ using Eq. (1) and go back to step (ii) until the minimum of $E(\vec{x})$ is reached within the desired precision.
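To show how steps (i)-(iv) fit together, a self-contained Fortran sketch of the whole loop is given below for the Cauchy machine ($q_A = q_V = 2$), for which each component of the visiting step can be drawn exactly as a Cauchy variate. The test cost function, the parameter values and all names are illustrative assumptions of ours; the downloadable GSA program treats the general case.

! Hedged, self-contained sketch of the GSA loop for q_A = q_V = 2.
program gsa_sketch
  implicit none
  integer, parameter :: d = 2, nsteps = 5000
  double precision, parameter :: qv = 2.0d0, qa = 2.0d0, tqv1 = 10.0d0
  double precision :: x(d), xnew(d), e, enew, tqv, tqa, p, u, pi
  integer :: t, i

  pi = 4.0d0 * atan(1.0d0)
  call random_seed()
  x = 0.0d0                       ! arbitrary starting configuration x_1
  e = cost(x)

  do t = 1, nsteps
     ! cooling schedule of Eq. (1) (equals T_qV(1)/t for q_V = 2)
     tqv = tqv1 * (2.0d0**(qv - 1.0d0) - 1.0d0) / ((1.0d0 + dble(t))**(qv - 1.0d0) - 1.0d0)
     tqa = tqv / dble(t)          ! acceptance temperature, as taken from ref. [5]

     ! step (ii): visit a new configuration (Cauchy-distributed step per component)
     do i = 1, d
        call random_number(u)
        xnew(i) = x(i) + tqv * tan(pi * (u - 0.5d0))
     end do

     ! step (iii): generalized Metropolis acceptance, Eq. (3)
     enew = cost(xnew)
     if (enew < e) then
        x = xnew; e = enew
     else
        p = (1.0d0 + (qa - 1.0d0) * (enew - e) / tqa)**(-1.0d0/(qa - 1.0d0))
        call random_number(u)
        if (u < p) then
           x = xnew; e = enew
        end if
     end if
  end do

  print '(a, 2f10.4, a, f12.4)', ' x_min =', x, '   E_min =', e

contains

  ! illustrative multi-minimum test cost function (not the gravity misfit of this paper)
  function cost(y) result(c)
    double precision, intent(in) :: y(:)
    double precision :: c
    c = sum((y - 1.0d0)**2) + 2.0d0 * sum(1.0d0 - cos(3.0d0 * (y - 1.0d0)))
  end function cost

end program gsa_sketch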


