On the probability of finding local minima in optimization problems

Document Type

Conference Proceeding

Publication Date

1-1-2006

Abstract

The standard approach to optimization problems is a random search for the global minimum: a neural network relaxes to the nearest local minimum from a randomly chosen initial configuration. This procedure is repeated many times in order to find as deep an energy minimum as possible. However, the questions of how many such random starts are reasonable, and whether the result of the search can be regarded as successful, remain open. In this paper, by analyzing the generalized Hopfield model, we obtain expressions that relate the depth of a local minimum to the size of its basin of attraction. On this basis, we present the probability of finding a local minimum as a function of the minimum's depth. Such a relation can be used in optimization applications: given a series of already found minima, it allows one to estimate the probability of finding a deeper minimum and to decide for or against running the program further. The theory is in good agreement with experimental results. © 2006 IEEE.
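The random-restart procedure the abstract describes can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: it assumes a Hopfield-type energy E(s) = -½ sᵀJs over binary states s ∈ {-1, +1}ⁿ with a symmetric, zero-diagonal coupling matrix J, relaxes each random start by asynchronous energy-descent updates, and keeps the deepest minimum found. All function names and parameters here are hypothetical.

```python
import numpy as np

def hopfield_energy(s, J):
    # E(s) = -1/2 * s^T J s for a symmetric coupling matrix J with zero diagonal
    return -0.5 * s @ J @ s

def relax(s, J):
    """Asynchronous descent: set each neuron to the sign of its local
    field until no update changes the state (a local energy minimum)."""
    s = s.copy()
    changed = True
    while changed:
        changed = False
        for i in range(len(s)):
            h = J[i] @ s                  # local field at neuron i
            new = 1 if h >= 0 else -1     # energy-non-increasing update
            if new != s[i]:
                s[i] = new
                changed = True
    return s

def random_restart_search(J, n_starts, rng):
    """Relax from n_starts random configurations; return the deepest minimum."""
    n = J.shape[0]
    best_s, best_e = None, np.inf
    for _ in range(n_starts):
        s0 = rng.choice([-1, 1], size=n)  # random initial configuration
        s = relax(s0, J)
        e = hopfield_energy(s, J)
        if e < best_e:
            best_s, best_e = s, e
    return best_s, best_e

# Small demonstration on a random symmetric coupling matrix
rng = np.random.default_rng(0)
n = 20
J = rng.standard_normal((n, n))
J = (J + J.T) / 2
np.fill_diagonal(J, 0.0)
s_best, e_best = random_restart_search(J, n_starts=50, rng=rng)
```

The paper's contribution is the stopping rule for such a loop: because deeper minima have larger basins of attraction, the depths of the minima already found let one estimate the probability that a further restart finds a deeper one.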

Publication Title

IEEE International Conference on Neural Networks - Conference Proceedings

First Page Number

3243

Last Page Number

3248

DOI

10.1109/ijcnn.2006.247318

