- Title:
- Numerical Methods for Non-Smooth Convex Optimization Problems with Functional Constraints / Alkusa Mohammad
- Institution:
- Moscow Institute of Physics and Technology (National Research University)
- Short description:
- Alkousa, Mohammad.
Numerical Methods for Non-Smooth Convex Optimization Problems with Functional Constraints: dissertation ... Candidate of Physical and Mathematical Sciences: 01.01.07 / Mohammad S. Alkousa; [defended at: Moscow Institute of Physics and Technology (National Research University)]. Moscow, 2020. 150 p.: ill.
Table of contents of the dissertation by Alkusa Mohammad, Candidate of Sciences
Each chapter of the thesis presents the results of numerical experiments that illustrate the advantages of the proposed and studied procedures on selected examples.
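A recurring scheme in the thesis is adaptive Mirror Descent for problems with functional constraints, which alternates "productive" steps along a subgradient of the objective f (taken when the constraint g is ε-feasible) with "non-productive" steps along a subgradient of g. The sketch below is a minimal illustration of this switching idea in the Euclidean setting, where the mirror step reduces to a projected subgradient step; it is not the author's exact algorithm, and the helpers f_grad, g, g_grad and project are hypothetical, to be supplied by the caller.

```python
import numpy as np

def adaptive_mirror_descent(f_grad, g, g_grad, project, x0, eps, max_iter=1000):
    """Sketch: min f(x) s.t. g(x) <= 0 over a convex set Q, Euclidean prox."""
    x = x0.astype(float)
    weighted_sum = np.zeros_like(x)  # running sum of h_k * x_k over productive steps
    total_weight = 0.0               # running sum of productive step sizes h_k
    for _ in range(max_iter):
        if g(x) <= eps:
            d = f_grad(x)            # productive step: x is eps-feasible, descend on f
            h = eps / max(d @ d, 1e-16)
            weighted_sum += h * x
            total_weight += h
        else:
            d = g_grad(x)            # non-productive step: reduce the violation of g
            h = eps / max(d @ d, 1e-16)
        x = project(x - h * d)       # Euclidean projection back onto Q
    # methods of this type output a step-size-weighted average of productive iterates
    return weighted_sum / total_weight if total_weight > 0 else x
```

For instance, for f(x) = ||x - a||^2 with one linear constraint g(x) = <c, x> - 1 and Q = R^n, one could pass f_grad = lambda x: 2*(x - a), g_grad = lambda x: c and project = lambda y: y. In the adaptive variants analyzed in the thesis the number of iterations is tied to ε by a stopping criterion; the fixed max_iter above is a simplification.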
Table of Contents
Acknowledgments
Abstract
List of Figures
List of Tables
Introduction
1 Some Basic Tools in Convex Analysis and Optimization
1.1 Convex Analysis Tools
1.1.1 Convex sets
1.1.2 Differentiable convex functions
1.1.3 Non-differentiable convex functions
1.1.4 Lipschitz continuity
1.2 Convex Optimization Tools
1.2.1 Properties of convex optimization problems
1.2.2 Numerical methods for convex optimization problems
1.2.3 Mirror Descent basics
2 Mirror Descent Algorithms for Deterministic and Stochastic Constrained Optimization Problems
2.1 Mirror Descent Algorithms for Deterministic Constrained Optimization Problems
2.1.1 Problem statement and standard Mirror Descent methods
2.1.2 The modification of an adaptive Mirror Descent algorithm to minimize the Lipschitz-continuous objective function
2.1.3 The modification of an adaptive Mirror Descent algorithm to minimize the objective function with Lipschitz gradient
2.1.4 Partial adaptive Mirror Descent algorithm to minimize the objective function with Lipschitz gradient
2.1.5 The estimates of the convergence rate of the proposed Mirror Descent algorithms
2.1.6 Numerical experiments
2.2 Mirror Descent Algorithms for Special Strongly Convex Optimization Problems
2.2.1 Adaptive Mirror Descent algorithm for Lipschitz-continuous strongly convex functions
2.2.2 Adaptive Mirror Descent algorithm for strongly convex functions with Lipschitz gradient
2.2.3 Partial adaptive Mirror Descent algorithm for strongly convex objective function with Lipschitz gradient
2.2.4 Numerical experiments
2.3 Mirror Descent Algorithms for Stochastic Constrained Optimization Problems
2.3.1 Problem statement and related basics
2.3.2 Adaptive stochastic Mirror Descent algorithm for Lipschitz-continuous functions
2.3.3 The modification of an adaptive stochastic Mirror Descent algorithm for Lipschitz-continuous functions
2.3.4 Numerical experiments
2.3.5 Additional experiments: Fermat-Torricelli-Steiner problem
3 Mirror Descent and Constrained Online Optimization Problems
3.1 Deterministic Constrained Online Optimization Problems
3.1.1 Non-adaptive Mirror Descent algorithm for the case of non-negative regret
3.1.2 Adaptive Mirror Descent algorithm for the case of non-negative regret
3.1.3 Numerical experiments
3.2 Stochastic Constrained Online Optimization Problems
3.2.1 Non-adaptive stochastic algorithm for constrained online optimization problems
3.2.2 Adaptive stochastic algorithm for constrained online optimization problems
3.2.3 Numerical experiments
4 Accelerated Methods for Saddle Point Problems
4.1 Accelerated Methods for Composite Non-Bilinear Saddle Point Problem
4.1.1 Problem statement and some related concepts
4.1.2 Some basic lemmas related to the considered problem
4.1.3 Fundamental results and the scheme of accelerated methods for the considered saddle point problem
4.2 On Some Algorithms for Strongly Convex Optimization Problems with One Functional Constraint
4.2.1 Problem statement and related basics
4.2.2 The proposed algorithm and the estimates of the accuracy of the solution
4.2.3 An estimate of the convergence rate for Lipschitz-continuous functions
4.2.4 Numerical experiments
References
List of Publications
List of Figures
1.1 The best-known complexities of the first-order methods for several classes of problems
2.1 The results of Algorithms 1 (AMD-L), 2 (AMD-LG), 3 (Mod.AMD-L) and 4 (Mod.AMD-LG), with m = r = 50, n = 500 and different values of ε
2.2 The results of Algorithms 1 (AMD-L), 2 (AMD-LG), 3 (Mod.AMD-L) and 4 (Mod.AMD-LG), with m = 100, n = 1000, r = 50 and ε =
2.3 The results of Algorithms 1 (AMD-L), 2 (AMD-LG), 3 (Mod.AMD-L) and 4 (Mod.AMD-LG), with m = 100, n = 1000, r = 50 and ε =
2.4 The results of Algorithms 1 (AMD-L), 2 (AMD-LG), 3 (Mod.AMD-L) and 4 (Mod.AMD-LG), for the Fermat-Torricelli-Steiner problem, with m = 50, n = 100, r = 25 and different values of ε
2.5 The results of Algorithms 2 (AMD-LG) and 5 (PAMD-LG), when M_g < 1, with m = 500, r = 100, n = 2000 and different values of ε
2.6 The results of Algorithms 2 (AMD-LG) and 5 (PAMD-LG), when M_g > 1, with m = 500, r = 100, n = 2000 and different values of ε
2.7 The results of Algorithms 2 (AMD-LG) and 5 (PAMD-LG), when M_g < 1, with m = 500, r = 100, n = 2000 and ε = 0.05. Logarithmic scale on both axes in all figures
2.8 The results of Algorithms 2 (AMD-LG) and 5 (PAMD-LG), when M_g > 1, with m = 500, r = 100, n = 2000 and ε = 0.05. Logarithmic scale on both axes in all figures
2.9 The results of Algorithms 1 (AMD-L) and 6 (AMD-SL), with different values of ε
2.10 The results of Algorithms 2 (AMD-LG) and 7 (AMD-SLG), with different values of ε
2.11 The results of Algorithms 9 (ASMD-L) and 10 (Mod.ASMD-L), for the Fermat-Torricelli-Steiner problem, with m = 250, n = 1000, r = 100 and different values of ε
3.1 The results of Algorithms 11 (Non Adap-OMD), 12 (Adap-OMD) and 13 (Mod.Adap-OMD), with m = 300, n = 100 and different values of N
3.2 The results of Algorithms 14 (Non Adap-SOMD) and 15 (Adap-SOMD), for objective function (3.37) with m = 10, n = 20 and different values of N
3.3 The results of Algorithms 14 (Non Adap-SOMD) and 15 (Adap-SOMD), for Example 20, with m = 200, n = 500 and different values of N
3.4 The results of Algorithm 15 (Adap-SOMD), for Example 20, with m = and different values of n and N
List of Tables
2.1 The results of Algorithms 1-4, with constraints (2.35). m = 50 and n =
2.2 The results of Algorithms 1-4, with constraints (2.36). m = 100 and n =
2.3 The results of Algorithms 1-4, with constraints (2.37). m = 50 and n =
2.4 The results of Algorithms 2 and 5, when M_g < 1, with m = 500, r = 100, n =
2.5 The results of Algorithms 2 and 5, when M_g > 1, with m = 500, r = 100, n =
2.6 The results of Algorithms 9 and 10, with m = 50, n = 1500 and different values of r
2.7 The results of Algorithms 9 and 10, with m = 50, n = 100 and different values of r
3.1 The results of Algorithms 11 and 12, with m = 100 and n =
3.2 The results of Algorithm 13 (Mod.Adap-OMD), with m = 100 and n =
3.3 The results of Algorithms 14 and 15, with m = 10, n = 20 and different values of N
4.1 The best-known results regarding the complexity of solving problem (4.6)
4.2 The results of Algorithms 20 and 21, for Example 21, with r = m = and n =
4.3 The results of Algorithms 20 and 21, for Example 22, with m = 50 and n =
4.4 The results of Algorithms 6, 20 and 21, for Example 23, with p = 10, m = 50 and n =
4.5 The results of Algorithms 6, 20 and 21, for Example 24, with r = 50, m = 50 and n =