Please use this identifier to cite or link to this item: https://hdl.handle.net/10316/45246
Title: Worst case complexity of direct search under convexity
Authors: Dodangeh, Mahdi 
Vicente, Luís Nunes 
Issue Date: 2016
Publisher: Springer Berlin Heidelberg
Project: info:eu-repo/grantAgreement/FCT/COMPETE/132981/PT 
Serial title, monograph or event: Mathematical Programming
Volume: 155
Issue: 1-2
Abstract: In this paper we prove that the broad class of direct-search methods of directional type, based on imposing sufficient decrease to accept new iterates, exhibits the same worst-case complexity bound and global rate as the gradient method for the unconstrained minimization of a convex and smooth function. More precisely, it will be shown that the number of iterations needed to reduce the norm of the gradient of the objective function below a certain threshold is at most proportional to the inverse of the threshold. It will also be shown that the absolute error in the function values decays at a sublinear rate proportional to the inverse of the iteration counter. In addition, we prove that the sequences of absolute errors in the function values and in the iterates converge r-linearly in the strongly convex case.
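A compact restatement of the rates described in the abstract, in commonly used notation (a sketch; the symbols $x_k$, $f_*$, $x_*$, $\epsilon$, $C$, and $\rho$ are assumed here for illustration and are not taken verbatim from the paper): for a convex, smooth objective $f$ with minimum value $f_*$ attained at $x_*$,
\[
  \min_{0 \le j \le k} \|\nabla f(x_j)\| \le \epsilon \quad \text{within } \mathcal{O}(1/\epsilon) \text{ iterations},
  \qquad
  f(x_k) - f_* = \mathcal{O}(1/k),
\]
and, when $f$ is in addition strongly convex, there exist $C > 0$ and $\rho \in (0,1)$ such that
\[
  f(x_k) - f_* \le C\,\rho^{k}, \qquad \|x_k - x_*\| \le C\,\rho^{k}.
\]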
URI: https://hdl.handle.net/10316/45246
DOI: 10.1007/s10107-014-0847-0
Rights: embargoedAccess
Appears in Collections: I&D CMUC - Artigos em Revistas Internacionais

Files in This Item:
cds.pdf (521.97 kB, Adobe PDF)

Citation and usage metrics:
Scopus citations: 18 (checked on Apr 15, 2024)
Web of Science citations: 16 (checked on Apr 2, 2024)
Page views: 613 (checked on Apr 23, 2024)
Downloads: 181 (checked on Apr 23, 2024)



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.