  • Electronic Resource
    Journal of Scientific Computing 7 (1992), pp. 197-228
    ISSN: 1573-7691
    Keywords: Optimization ; gradient methods ; global convergence
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Notes: Abstract: The idea of hierarchical gradient methods for optimization is considered. It is shown that the proposed approach provides powerful means to cope with some global convergence problems characteristic of the classical gradient methods. Concerning global convergence problems, four topics are addressed: the “detour” effect, the problem of multiscale models, the problem of highly ill-conditioned objective functions, and the problem of local-minima traps related to ambiguous regions of attraction. The great potential of hierarchical gradient algorithms is revealed through a hierarchical Gauss-Newton algorithm for unconstrained nonlinear least-squares problems. The algorithm, while maintaining a superlinear convergence rate like the common conjugate gradient or quasi-Newton methods, requires the evaluation of partial derivatives with respect to only one variable on each iteration. This property enables economized consumption of CPU time in cases where the computer codes for the derivatives are intensive CPU consumers, e.g., when the gradient evaluations of ODE or PDE models are produced by numerical differentiation. The hierarchical Gauss-Newton algorithm is extended to handle interval constraints on the variables, and its effectiveness is demonstrated by computational results.
    Type of Medium: Electronic Resource
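The abstract's central device, a Gauss-Newton iteration that differentiates the residuals with respect to only one variable per step, can be illustrated with a generic coordinate-wise sketch. This is not the paper's hierarchical algorithm; the toy residual function, step count, and finite-difference step size below are all assumptions made for illustration.

```python
import numpy as np

def residual(x):
    # Hypothetical toy nonlinear least-squares residuals (not from the
    # paper); the solution is x = (1, 2), where all residuals vanish.
    return np.array([x[0] - 1.0,
                     x[1] - 2.0,
                     x[0] * x[1] - 2.0])

def coordinate_gauss_newton(x0, residual, iters=200, h=1e-6):
    """One-variable-at-a-time Gauss-Newton sketch: each iteration
    differentiates the residuals with respect to a single coordinate,
    here by numerical differentiation, mirroring the abstract's
    CPU-saving motivation (only one Jacobian column per iteration)."""
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    for k in range(iters):
        i = k % n                        # cycle through the variables
        r = residual(x)
        e = np.zeros(n)
        e[i] = h
        ji = (residual(x + e) - r) / h   # one Jacobian column only
        denom = ji @ ji
        if denom > 0.0:
            x[i] -= (ji @ r) / denom     # Gauss-Newton step in coordinate i
    return x

x = coordinate_gauss_newton([0.5, 0.5], residual)
```

Each iteration solves the one-dimensional least-squares subproblem along coordinate i, so the full Jacobian is never formed; interval constraints of the kind the abstract mentions could be imposed by clipping each coordinate update.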