Estudo Geral
https://estudogeral.sib.uc.pt
The DSpace digital repository system captures, stores, indexes, preserves, and distributes digital research material.
Thu, 12 Dec 2019 11:06:33 GMT

Title: A particle swarm pattern search method for bound constrained nonlinear optimization
URI: http://hdl.handle.net/10316/11372
Authors: Vaz, A. Ismael F.; Vicente, L. N.
Abstract: In this paper we develop, analyze, and test a new algorithm for the
global minimization of a function subject to simple bounds without the use of
derivatives. The underlying algorithm is a pattern search method, more specifically
a coordinate search method, which guarantees convergence to stationary points from
arbitrary starting points.
In the optional search phase of pattern search we apply a particle swarm scheme to
globally explore the possible nonconvexity of the objective function. Our extensive
numerical experiments showed that the resulting algorithm is highly competitive
with other global optimization methods also based on function values.
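The coordinate-search skeleton described above, polling along the coordinate directions and contracting the step size after an unsuccessful poll, can be sketched as follows. This is a minimal illustration under assumed defaults (simple decrease, halving contraction), not the authors' implementation.

```python
import numpy as np

def coordinate_search(f, x0, lb, ub, alpha=1.0, tol=1e-6, max_iter=500):
    """Minimal bound-constrained coordinate search (a pattern search method).

    Polls along +/- e_i; accepts on simple decrease, and halves the step
    size alpha after an unsuccessful poll. Bounds are enforced by clipping
    trial points. Illustrative sketch only.
    """
    x = np.clip(np.asarray(x0, dtype=float), lb, ub)
    fx = f(x)
    n = x.size
    it = 0
    while alpha > tol and it < max_iter:
        it += 1
        improved = False
        for i in range(n):
            for s in (+1.0, -1.0):
                trial = x.copy()
                trial[i] = np.clip(trial[i] + s * alpha, lb[i], ub[i])
                ft = f(trial)
                if ft < fx:          # simple decrease: accept, keep alpha
                    x, fx, improved = trial, ft, True
                    break
            if improved:
                break
        if not improved:             # unsuccessful poll: contract the step
            alpha *= 0.5
    return x, fx
```

Because every poll uses a positive spanning set of directions, the contraction of alpha at unsuccessful iterations is what drives the convergence to stationary points mentioned in the abstract.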
Date: 2006-01-01
URI: http://hdl.handle.net/10316/11372

Title: PSwarm: A hybrid solver for linearly constrained global derivative-free optimization
URI: http://hdl.handle.net/10316/11221
Authors: Vaz, A. Ismael F.; Vicente, L. N.
Abstract: PSwarm was developed originally for the global optimization of functions
without derivatives and where the variables are within upper and lower bounds.
The underlying algorithm used is a pattern search method, more specifically a coordinate
search method, which guarantees convergence to stationary points from
arbitrary starting points. In the (optional) search step of coordinate search, the
algorithm incorporates a particle swarm scheme for dissemination, and thus it can
globally explore the possible nonconvexity of the objective function. Our extensive
numerical experiments showed that the resulting algorithm is highly competitive
with other global optimization methods also based on function values.
PSwarm is extended in this paper to handle general linear constraints. The poll
step now incorporates positive generators for the tangent cone of the approximated
active constraints, including a provision for the degenerate case. The search step
has also been adapted accordingly. In particular, the initial population for particle
swarm used in the search step is computed by first inscribing an ellipsoid of
maximum volume in the feasible set. We have again compared PSwarm to other
solvers (including some designed for global optimization) and the results confirm its
competitiveness in terms of efficiency and robustness.
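The particle swarm scheme used in the search step can be sketched in isolation as follows. This is a generic bound-constrained swarm iteration, not the PSwarm code; the coefficients (w = 0.7, c1 = c2 = 1.5) are common textbook defaults, assumed here for illustration.

```python
import numpy as np

def pso_minimize(f, lb, ub, n_particles=20, iters=150, seed=0):
    """Tiny particle swarm optimizer for bound-constrained global search.

    Each particle is attracted to its personal best and to the swarm's
    global best; positions are clipped to the bounds. Illustrative sketch.
    """
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    pos = rng.uniform(lb, ub, size=(n_particles, lb.size))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.array([f(p) for p in pos])
    gbest = pbest[np.argmin(pbest_f)].copy()
    gbest_f = pbest_f.min()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lb, ub)          # respect the bounds
        fvals = np.array([f(p) for p in pos])
        better = fvals < pbest_f                  # update personal bests
        pbest[better], pbest_f[better] = pos[better], fvals[better]
        if pbest_f.min() < gbest_f:               # update the global best
            gbest_f = pbest_f.min()
            gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, gbest_f
```

In the hybrid solver the swarm supplies globally exploring trial points for the search step, while the poll step retains the convergence guarantees of coordinate search.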
Date: 2008-01-01
URI: http://hdl.handle.net/10316/11221

Title: Implicitly and densely discrete black-box optimization problems
URI: http://hdl.handle.net/10316/11219
Authors: Vicente, L. N.
Abstract: This paper addresses derivative-free optimization problems where the
variables lie implicitly in an unknown discrete closed set. The evaluation of the
objective function follows a projection onto the discrete set, which is assumed dense
rather than sparse. Such a mathematical setting is a rough representation of what
is common in many real-life applications where, despite the continuous nature of
the underlying models, a number of practical issues dictate rounding of values or
projection to nearby feasible figures.
We discuss a definition of minimization for these implicitly discrete problems and
outline a direct search algorithm framework for its solution. The main asymptotic
properties of the algorithm are analyzed and numerically illustrated.
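The setting described above, where the objective is evaluated only after projection onto a dense discrete set, can be mimicked with a simple wrapper. The grid spacing, objective, and helper names below are hypothetical and purely illustrative.

```python
def make_projected_objective(f, project):
    """Wrap a continuous objective so that each evaluation first projects
    the point onto the (implicitly known) discrete set.
    Hypothetical helper illustrating the paper's setting."""
    def f_discrete(x):
        return f(project(x))
    return f_discrete

# Example: the discrete set is the dense grid of multiples of h = 0.1,
# mimicking rounding of values to nearby feasible figures.
def round_to_grid(x, h=0.1):
    return [round(v / h) * h for v in x]

f = lambda x: (x[0] - 0.33) ** 2 + (x[1] + 1.27) ** 2
fd = make_projected_objective(f, round_to_grid)
```

A direct search driven by `fd` only ever sees function values at grid points, which is why the notion of minimizer and the step-size behavior both need the adapted treatment the paper develops.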
Date: 2008-01-01
URI: http://hdl.handle.net/10316/11219

Title: Analysis of direct searches for non-Lipschitzian functions
URI: http://hdl.handle.net/10316/13641
Authors: Vicente, L. N.; Custódio, A. L.
Abstract: It is known that the Clarke generalized directional derivative is nonnegative
along the limit directions generated by directional direct-search methods
at a limit point of certain subsequences of unsuccessful iterates, if the function being
minimized is Lipschitz continuous near the limit point.
In this paper we generalize this result for non-Lipschitzian functions using Rockafellar
generalized directional derivatives (upper subderivatives). We show that
Rockafellar derivatives are also nonnegative along the limit directions of those subsequences
of unsuccessful iterates when the function values converge to the function
value at the limit point. This result is obtained assuming that the function is
directionally Lipschitzian with respect to the limit direction.
It is also possible under appropriate conditions to establish more insightful results
by showing that the sequence of points generated by these methods eventually
approaches the limit point along the locally best branch or step function (when the
number of steps is equal to two).
The results of this paper are presented for constrained optimization and illustrated
numerically.
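For reference, the known Lipschitz-case result mentioned in the first paragraph can be written with the standard definition of the Clarke generalized directional derivative: at a limit point \(x_{*}\) of a subsequence of unsuccessful iterates, along a limit direction \(d\) of the poll directions,

```latex
f^{\circ}(x_{*}; d)
\;=\; \limsup_{y \to x_{*},\; t \downarrow 0}
\frac{f(y + t d) - f(y)}{t}
\;\ge\; 0 .
```

The paper replaces this Clarke derivative by Rockafellar upper subderivatives to cover functions that are not Lipschitz continuous near \(x_{*}\).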
Date: 2009-01-01
URI: http://hdl.handle.net/10316/13641

Title: Bilevel derivative-free optimization and its application to robust optimization
URI: http://hdl.handle.net/10316/13700
Authors: Conn, Andrew R.; Vicente, L. N.
Abstract: We address bilevel programming problems when the derivatives of both
the upper and the lower level objective functions are unavailable.
The core algorithms used for both levels are trust-region interpolation-based
methods, using minimum Frobenius norm quadratic models when the number of
points is smaller than the number of basis components. We take advantage of the
problem structure to derive conditions (related to the global convergence theory of
the underlying trust-region methods, as far as possible) under which the lower level
can be solved inexactly and sample points can be reused for model building. In addition,
we indicate numerically how effective these expedients can be. A number of
other issues are also discussed, from the extension to linearly constrained problems
to the use of surrogate models for the lower level response.
One important application of our work appears in the robust optimization of
simulation-based functions, which may arise due to implementation variables or
uncertain parameters. The robust counterpart of an optimization problem without
derivatives falls in the category of the bilevel problems under consideration here.
We provide numerical illustrations of the application of our algorithmic framework
to such robust optimization examples.
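In this setting, the robust counterpart of minimizing a simulation-based function \(f\) in the decision variables \(x\), under implementation variables or uncertain parameters \(u\) ranging over a set \(U\), is the standard min-max formulation

```latex
\min_{x \in X} \; \max_{u \in U} \; f(x, u),
```

a bilevel problem in which the lower level is the inner maximization over \(u\) and, like the upper level, must be solved without derivatives.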
Date: 2010-01-01
URI: http://hdl.handle.net/10316/13700

Title: Direct multisearch for multiobjective optimization
URI: http://hdl.handle.net/10316/13698
Authors: Custódio, A. L.; Madeira, J. F. A.; Vaz, A. I. F.; Vicente, L. N.
Abstract: In practical applications of optimization it is common to have several
conflicting objective functions to optimize. Frequently, these functions are subject to
noise or can be of black-box type, preventing the use of derivative-based techniques.
We propose a novel multiobjective derivative-free methodology, calling it direct
multisearch (DMS), which does not aggregate any of the objective functions. Our
framework is inspired by the search/poll paradigm of direct-search methods of directional
type and uses the concept of Pareto dominance to maintain a list of nondominated
points (from which the new iterates or poll centers are chosen). The aim
of our method is to generate as many points in the Pareto front as possible from the
polling procedure itself, while keeping the whole framework general enough to accommodate
other disseminating strategies, in particular when using the (here also)
optional search step. DMS generalizes to multiobjective optimization (MOO) all
direct-search methods of directional type.
We prove under the common assumptions used in direct search for single-objective
that at least one limit point of the sequence of iterates generated by
DMS lies in (a stationary form of) the Pareto front. However, extensive computational
experience has shown that our methodology has an impressive capability of
generating the whole Pareto front, even without using a search step.
Two by-products of this paper are (i) the development of a collection of test
problems for MOO and (ii) the extension of performance and data profiles to MOO,
allowing a comparison of several solvers on a large set of test problems, in terms of
their efficiency and robustness to determine Pareto fronts.
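The Pareto-dominance filtering used to maintain the list of nondominated points can be sketched as follows; this is a generic filter for minimization, not the DMS code itself.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every component and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_nondominated(archive, candidate):
    """Insert a candidate into a list of mutually nondominated points.

    The candidate is discarded if some archived point dominates it;
    otherwise it is added and any points it dominates are removed.
    """
    if any(dominates(a, candidate) for a in archive):
        return archive
    return [a for a in archive if not dominates(candidate, a)] + [candidate]
```

In DMS, new iterates and poll centers are drawn from such a list, so the polling itself progressively populates an approximation of the Pareto front.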
Date: 2010-01-01
URI: http://hdl.handle.net/10316/13698

Title: Worst case complexity of direct search
URI: http://hdl.handle.net/10316/13699
Authors: Vicente, L. N.
Abstract: In this paper we prove that direct search of directional type shares
the worst case complexity bound of steepest descent when sufficient decrease is
imposed using a quadratic function of the step size parameter. This result is proved
under smoothness of the objective function and using a framework of the type of
GSS (generating set search). We also discuss the worst case complexity of direct
search when only simple decrease is imposed and when the objective function is
non-smooth.
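The sufficient decrease condition referred to above, imposed through a quadratic function of the step size parameter \(\alpha_k\), can be written as

```latex
f(x_k + \alpha_k d_k) \;\le\; f(x_k) - \rho(\alpha_k),
\qquad \rho(\alpha) = c\,\alpha^{2}, \;\; c > 0,
```

under which, for a smooth objective, the number of iterations needed to drive the gradient norm below a tolerance \(\epsilon\) is of the order \(\epsilon^{-2}\), the same worst-case bound known for steepest descent.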
Date: 2010-01-01
URI: http://hdl.handle.net/10316/13699

Title: Optimizing radial basis functions by D.C. programming and its use in direct search for global derivative-free optimization
URI: http://hdl.handle.net/10316/13642
Authors: Le Thi, Hoai An; Vaz, A. I. F.; Vicente, L. N.
Abstract: In this paper we address the global optimization of functions subject
to bound and linear constraints without using derivatives of the objective function.
We investigate the use of derivative-free models based on radial basis functions
(RBFs) in the search step of direct-search methods of directional type. We also
study the application of algorithms based on difference of convex (d.c.) functions
programming to solve the resulting subproblems which consist of the minimization
of the RBF models subject to simple bounds on the variables. Extensive numerical
results are reported with a test set of bound and linearly constrained problems.
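An RBF model of the kind used in such search steps interpolates the sampled function values through a linear system in the kernel matrix. The sketch below uses a plain Gaussian kernel with an assumed shape parameter; the paper's models (kernel choice, polynomial tail) may differ.

```python
import numpy as np

def rbf_interpolant(X, fvals, eps=1.0):
    """Fit a Gaussian RBF model m(x) = sum_j w_j exp(-eps^2 ||x - x_j||^2)
    through the sample points (X, fvals). Generic illustrative sketch."""
    X = np.asarray(X, float)
    # Pairwise squared distances between the sample points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-(eps ** 2) * d2)          # symmetric kernel matrix
    w = np.linalg.solve(Phi, np.asarray(fvals, float))

    def model(x):
        r2 = ((X - np.asarray(x, float)) ** 2).sum(-1)
        return float(np.exp(-(eps ** 2) * r2) @ w)

    return model
```

Minimizing such a model over the simple bounds yields the nonconvex subproblems that the paper attacks with d.c. programming.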
Date: 2009-01-01
URI: http://hdl.handle.net/10316/13642

Title: Updating the Multipliers Associated with Inequality Constraints in an Augmented Lagrangian Multiplier Method
URI: http://hdl.handle.net/10316/7755
Authors: Avelino, C. P.; Vicente, L. N.
Abstract: This paper contributes to the development of the field of augmented Lagrangian multiplier methods for general nonlinear programming by introducing a new update for the multipliers corresponding to inequality constraints. The update maintains naturally the nonnegativity of the multipliers without the need for a positive-orthant projection, as a result of the verification of the first-order necessary conditions for the minimization of a modified augmented Lagrangian penalty function.
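For context, the classical first-order multiplier update for inequality constraints \(g_i(x) \le 0\) with penalty parameter \(\mu > 0\) keeps the multipliers nonnegative by a positive-orthant projection,

```latex
\lambda_i^{+} \;=\; \max\bigl(0,\; \lambda_i + \mu\, g_i(x)\bigr),
\qquad i = 1, \dots, m,
```

whereas the update proposed in the paper preserves \(\lambda \ge 0\) without this projection, as a consequence of first-order conditions for a modified augmented Lagrangian penalty function.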
Date: 2003-01-01
URI: http://hdl.handle.net/10316/7755

Title: Local Convergence of the Affine-Scaling Interior-Point Algorithm for Nonlinear Programming
URI: http://hdl.handle.net/10316/7756
Authors: Vicente, L. N.
Abstract: This paper addresses the local convergence properties of the affine-scaling interior-point algorithm for nonlinear programming. The analysis of local convergence is developed in terms of parameters that control the interior-point scheme and the size of the residual of the linear system that provides the step direction. The analysis follows the classical theory for quasi-Newton methods and addresses q-linear, q-superlinear, and q-quadratic rates of convergence.
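As a reminder of the notation, the q-rates referred to above are the standard ones: for iterates \(x_k \to x_{*}\),

```latex
\|x_{k+1} - x_{*}\| \le c\,\|x_{k} - x_{*}\| \;\; (c < 1) \quad \text{(q-linear)}, \qquad
\|x_{k+1} - x_{*}\| \le c_k\,\|x_{k} - x_{*}\| \;\; (c_k \to 0) \quad \text{(q-superlinear)}, \qquad
\|x_{k+1} - x_{*}\| \le c\,\|x_{k} - x_{*}\|^{2} \quad \text{(q-quadratic)},
```

with the attained rate governed by how tightly the interior-point parameters and the linear-system residual are controlled.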
Date: 2000-01-01
URI: http://hdl.handle.net/10316/7756

Title: Estimation of Risk-Neutral Density Surfaces
URI: http://hdl.handle.net/10316/13330
Authors: Monteiro, A. M.; Tütüncü, R. H.; Vicente, L. N.
Abstract: Option price data is often used to infer risk-neutral densities for future prices of an underlying asset. Given the prices of a set of options on the same underlying asset with different strikes and maturities, we propose a nonparametric approach for estimating risk-neutral densities associated with several maturities. Our method uses bicubic splines in order to achieve the desired smoothness for the estimation and an optimization model to choose the spline functions that best fit the price data. Semidefinite programming is employed to guarantee the nonnegativity of the densities. We illustrate the process using synthetic option price data generated using log-normal and absolute diffusion processes as well as actual price data for options on the S&P500 index. We also used the risk-neutral densities that we computed to price exotic options and observed that this approach generates prices that closely approximate the market prices of these options.
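The classical relation underlying such nonparametric estimation is the Breeden-Litzenberger formula, which recovers the risk-neutral density \(q_T\) at maturity \(T\) from twice-differentiated call prices \(C(K, T)\) with strike \(K\), given the risk-free rate \(r\):

```latex
q_T(K) \;=\; e^{rT}\, \frac{\partial^{2} C(K, T)}{\partial K^{2}} .
```

Fitting smooth bicubic splines to the price data makes this second derivative well defined, and the semidefinite programming constraints keep it, and hence the estimated densities, nonnegative.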
Date: 2010-01-01
URI: http://hdl.handle.net/10316/13330