Milan Hladík. An extension of the αBB-type underestimation to linear parametric Hessian matrices. J. Glob. Optim., 64(2):217–231, 2016.
The classical αBB method is a global optimization method whose key step is to determine a convex underestimator of the objective function on an interval domain. In particular, it encloses the range of the Hessian matrix in an interval matrix. To obtain a tighter enclosure of the Hessian matrices, we investigate a linear parametric form enclosure in this paper. One way to obtain this form is by a slope extension of the Hessian entries. Numerical examples indicate that our approach can sometimes significantly reduce the overestimation of the objective function. However, the slope extensions depend strongly on the choice of the center of linearization. We compare several naive choices and also propose a heuristic one, which performs well in the executed examples, but no single choice emerges as a clear global winner.
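As a rough illustration of the "important step" the abstract refers to, here is a minimal sketch of the classical (non-parametric) αBB underestimator: given an interval enclosure [H_lo, H_up] of the Hessian over a box, a Gershgorin-disc bound on the minimum eigenvalue yields a valid α, and the shifted function φ(x) = f(x) + α Σᵢ (xᵢ − xᵢᴸ)(xᵢ − xᵢᵁ) underestimates f on the box. This is one standard construction only; the function names are hypothetical, and the paper's contribution (linear parametric Hessian enclosures via slope extensions) is not implemented here.

```python
import numpy as np

def gershgorin_alpha(H_lo, H_up):
    """Bound the minimum eigenvalue of every symmetric matrix in the
    interval matrix [H_lo, H_up] via Gershgorin discs, then return
    alpha = max(0, -lambda_min / 2)."""
    H_lo, H_up = np.asarray(H_lo, float), np.asarray(H_up, float)
    n = H_lo.shape[0]
    lam = np.inf
    for i in range(n):
        # Worst-case off-diagonal radius over the interval entries.
        radius = sum(max(abs(H_lo[i, j]), abs(H_up[i, j]))
                     for j in range(n) if j != i)
        lam = min(lam, H_lo[i, i] - radius)
    return max(0.0, -0.5 * lam)

def underestimator(f, alpha, x_lo, x_up):
    """phi(x) = f(x) + alpha * sum_i (x_i - xL_i)(x_i - xU_i).
    Each product is nonpositive on the box, so phi <= f there; a large
    enough alpha makes phi convex."""
    x_lo, x_up = np.asarray(x_lo, float), np.asarray(x_up, float)
    return lambda x: f(x) + alpha * np.dot(np.asarray(x, float) - x_lo,
                                           np.asarray(x, float) - x_up)

# Toy example: f(x) = x0*x1 on [0,1]^2 has the constant Hessian
# [[0,1],[1,0]], so its interval enclosure is exact and alpha = 0.5.
f = lambda x: x[0] * x[1]
alpha = gershgorin_alpha([[0, 1], [1, 0]], [[0, 1], [1, 0]])
phi = underestimator(f, alpha, [0, 0], [1, 1])
```

The paper's point is that replacing the fixed interval matrix [H_lo, H_up] with a linear parametric enclosure can tighten λ_min bounds and hence reduce the gap between φ and f.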
@article{Hla2016a,
  author         = "Milan Hlad\'{\i}k",
  title          = "An extension of the $\alpha${BB}-type underestimation to linear parametric {Hessian} matrices",
  webtitle       = "An extension of the αBB-type underestimation to linear parametric {Hessian} matrices",
  journal        = "J. Glob. Optim.",
  fjournal       = "Journal of Global Optimization",
  volume         = "64",
  number         = "2",
  pages          = "217--231",
  year           = "2016",
  doi            = "10.1007/s10898-015-0304-5",
  issn           = "0925-5001",
  url            = "https://doi.org/10.1007/s10898-015-0304-5",
  bib2html_dl_html = "https://link.springer.com/article/10.1007%2Fs10898-015-0304-5",
  bib2html_dl_pdf  = "https://rdcu.be/cnoZG",
  abstract       = "The classical αBB method is a global optimization method whose key step is to determine a convex underestimator of the objective function on an interval domain. In particular, it encloses the range of the Hessian matrix in an interval matrix. To obtain a tighter enclosure of the Hessian matrices, we investigate a linear parametric form enclosure in this paper. One way to obtain this form is by a slope extension of the Hessian entries. Numerical examples indicate that our approach can sometimes significantly reduce the overestimation of the objective function. However, the slope extensions depend strongly on the choice of the center of linearization. We compare several naive choices and also propose a heuristic one, which performs well in the executed examples, but no single choice emerges as a clear global winner.",
  keywords       = "Global optimization; Interval computation; Convex relaxation",
}
Generated by bib2html.pl (written by Patrick Riley) on Wed Oct 23, 2024 08:16:44