Convex Optimization with Computational Errors, by Alexander J. Zaslavski (Springer Nature Switzerland AG, 2020)

Title: Convex Optimization with Computational Errors
Author: Alexander J. Zaslavski
Video: http://file.papertrans.cn/238/237847/237847.mp4
Overview: Studies the influence of computational errors in numerical optimization, for minimization problems on unbounded sets, and time zero-sum games with two players. Explains that for every algorithm its iteration consists of several steps, and that the computational errors for different steps are, in general, different.
Series: Springer Optimization and Its Applications
Description: The book is devoted to the study of approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are known as important tools for solving optimization problems. The research presented in the book is the continuation and further development of the author's 2016 book, Numerical Optimization with Computational Errors (Springer, 2016). Both books study algorithms taking into account computational errors, which are always present in practice. The main goal is, for a known computational error, to find out what approximate solution can be obtained and how many iterates one needs for this. The main difference between this new book and the 2016 book is that here the discussion takes into consideration the fact that, for every algorithm, an iteration consists of several steps and that the computational errors for different steps are, in general, different. This fact, which was not taken into account in the previous book, is indeed important in practice. For example, the subgradient projection algorithm consists of two steps: the first is a calculation of a subgradient of the objective function, and the second is a calculation of a projection on the feasible set.
Publication date: 2020
Keywords: convex optimization; mathematical programming; computational error; nonlinear analysis; solving real-wor…
Edition: 1
DOI: https://doi.org/10.1007/978-3-030-37822-6
ISBN (softcover): 978-3-030-37824-0
ISBN (ebook): 978-3-030-37822-6
Series ISSN: 1931-6828
Series E-ISSN: 1931-6836
Copyright: Springer Nature Switzerland AG 2020

Subgradient Projection Algorithm
…of convex–concave functions, under the presence of computational errors. The problem is described by an objective function and a set of feasible points. For this algorithm, each iteration consists of two steps. The first step is a calculation of a subgradient of the objective function, while in the second one we calculate a projection on the feasible set.
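A minimal numerical sketch of this two-step iteration, with a separate bounded error injected into each step. The toy objective f(x) = |x1| + |x2|, the box feasible set [1, 2] x [1, 2], the step size, and the error bounds are all illustrative assumptions, not from the book:

```python
import numpy as np

def unit_vector(rng, n):
    """Random direction used to simulate a norm-bounded computational error."""
    v = rng.standard_normal(n)
    return v / np.linalg.norm(v)

def subgradient_projection(subgrad, project, x0, alpha=0.1, steps=200,
                           delta1=0.01, delta2=0.01, seed=0):
    """Subgradient projection iteration with two error-prone steps.

    Step 1: compute a subgradient, perturbed by a vector of norm <= delta1.
    Step 2: project onto the feasible set, perturbed by norm <= delta2.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = subgrad(x) + delta1 * unit_vector(rng, x.size)              # step 1 + error
        x = project(x - alpha * g) + delta2 * unit_vector(rng, x.size)  # step 2 + error
    return x

# Demo: minimize f(x) = |x1| + |x2| over the box [1, 2] x [1, 2]; minimizer (1, 1).
subgrad = lambda x: np.sign(x)            # a subgradient of the l1 norm
project = lambda x: np.clip(x, 1.0, 2.0)  # exact projection onto the box
x = subgradient_projection(subgrad, project, [2.0, 2.0])
print(x)  # lands near (1, 1), up to the error bounds
```

With both error bounds set to 0.01, the iterates settle in a small neighborhood of the true minimizer, consistent with the book's theme that bounded per-step errors yield an approximate solution.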
Gradient Algorithm with a Smooth Objective Function
…in the presence of computational errors. The problem is described by an objective function and a set of feasible points. For this algorithm, each iteration consists of two steps. The first step is a calculation of a gradient of the objective function, while in the second one we calculate a projection on the feasible set. In each of these two steps there is a computational error.
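The same two-step pattern can be sketched for a smooth objective: an inexact gradient step followed by an inexact projection. The quadratic objective, the unit-ball feasible set, and the error bounds below are illustrative assumptions:

```python
import numpy as np

def projected_gradient(grad, project, x0, alpha=0.5, steps=100,
                       delta1=0.01, delta2=0.01, seed=0):
    """Gradient algorithm whose two steps each carry a norm-bounded error."""
    rng = np.random.default_rng(seed)

    def noise(n):
        v = rng.standard_normal(n)
        return v / np.linalg.norm(v)

    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x) + delta1 * noise(x.size)                 # step 1: inexact gradient
        x = project(x - alpha * g) + delta2 * noise(x.size)  # step 2: inexact projection
    return x

# Demo: f(x) = 0.5 * ||x - b||^2 over the unit ball; the minimizer is b/||b|| = (1, 0).
b = np.array([3.0, 0.0])
grad = lambda x: x - b
project = lambda x: x / max(1.0, np.linalg.norm(x))  # exact projection onto the unit ball
x = projected_gradient(grad, project, [0.0, 0.0])
print(x)  # lands near (1, 0), up to the error bounds
```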
Continuous Subgradient Method
…of convex–concave functions, under the presence of computational errors. The problem is described by an objective function and a set of feasible points. For this algorithm we need a calculation of a subgradient of the objective function and a calculation of a projection on the feasible set. In each of these two steps there is a computational error.
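The continuous method follows a trajectory driven by negative subgradients. A forward-Euler discretization of that trajectory, with a bounded error in each subgradient evaluation and each projection, is a minimal sketch (the discretization itself, the scalar objective f(x) = |x|, the interval constraint, and the error bound are my assumptions for illustration):

```python
import numpy as np

def euler_trajectory(subgrad, project, x0, dt=0.01, T=2.0, delta=0.01, seed=0):
    """Forward-Euler sketch of the continuous trajectory
    x'(t) in -subdifferential f(x(t)), with a bounded error delta added
    to each subgradient evaluation and to each projection step."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(int(T / dt)):
        g = subgrad(x) + delta * rng.uniform(-1, 1, x.size)        # inexact subgradient
        x = project(x - dt * g) + delta * dt * rng.uniform(-1, 1, x.size)
    return x

# Demo: f(x) = |x| on the feasible interval [-1, 1]; the minimizer is 0.
subgrad = lambda x: np.sign(x)
project = lambda x: np.clip(x, -1.0, 1.0)
x = euler_trajectory(subgrad, project, [1.0])
print(x)  # near 0: the trajectory reaches the minimizer around t = 1 and stays there
```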
PDA-Based Method for Convex Optimization
…steps. In each of these two steps there is a computational error. In general, these two computational errors are different. We show that our algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. Moreover, if we know the computational errors for both steps, we find out what approximate solution can be obtained and how many iterates one needs for this.
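The central claim, that the achievable accuracy tracks the per-step error bound, can be seen on a toy problem. This is not the PDA method itself; it is a plain gradient step on f(x) = x^2/2 with a worst-case error of magnitude delta in every gradient evaluation, chosen only to illustrate the error-to-accuracy relationship:

```python
def final_error(delta, alpha=0.5, steps=60):
    """Gradient descent on f(x) = x^2 / 2 (minimizer 0) where each gradient
    evaluation carries a worst-case computational error +delta; returns the
    distance of the final iterate from the minimizer."""
    x = 1.0
    for _ in range(steps):
        g = x + delta       # true gradient is x; it is computed with error +delta
        x = x - alpha * g
    return abs(x)

for delta in (0.1, 0.01, 0.001):
    print(delta, final_error(delta))
# the achievable accuracy shrinks in proportion to the error bound
```

Here the iteration contracts to the fixed point x = -delta, so the final distance from the minimizer equals the error bound: a small positive constant bounding the errors yields a correspondingly good approximate solution.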
A Projected Subgradient Method for Nonsmooth Problems
…this class of problems, where an objective function is assumed to be convex but the set of admissible points is not necessarily convex. Our goal is to obtain an ε-approximate solution in the presence of computational errors, where ε is a given positive number.
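A sketch of a projected subgradient iteration over a nonconvex admissible set. The unit sphere (a nonconvex set), the smooth objective, the step size, and the error bound are illustrative assumptions; the projection simply returns a nearest point on the sphere:

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, alpha=0.1, steps=200,
                          delta=0.001, seed=0):
    """Projected subgradient method; the admissible set may be nonconvex,
    so 'project' returns some nearest admissible point. A bounded error
    delta is added to each subgradient evaluation."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = subgrad(x) + delta * rng.uniform(-1, 1, x.size)  # inexact subgradient
        x = project(x - alpha * g)                           # nearest point on the set
    return x

# Demo: minimize f(x) = ||x - b||^2 over the (nonconvex) unit sphere; solution (1, 0).
b = np.array([2.0, 0.0])
subgrad = lambda x: 2.0 * (x - b)          # f is smooth here, so this is its gradient
project = lambda x: x / np.linalg.norm(x)  # nearest point on the unit sphere
x = projected_subgradient(subgrad, project, [0.0, 1.0])
print(x)  # close to (1, 0): an eps-approximate solution for small errors
```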