Titlebook: Nonlinear Conjugate Gradient Methods for Unconstrained Optimization; Neculai Andrei Book 2020 The Editor(s) (if applicable) and The Author

[復(fù)制鏈接]
查看: 8816|回復(fù): 47
樓主
發(fā)表于 2025-3-21 18:33:28 | 只看該作者 |倒序?yàn)g覽 |閱讀模式
書(shū)目名稱Nonlinear Conjugate Gradient Methods for Unconstrained Optimization
編輯Neculai Andrei
視頻videohttp://file.papertrans.cn/668/667362/667362.mp4
概述An explicit and thorough treatment of the conjugate gradient algorithms for unconstrained optimization properties and convergence.A clear illustration of the numerical performances of the algorithms d
叢書(shū)名稱Springer Optimization and Its Applications
圖書(shū)封面Titlebook: Nonlinear Conjugate Gradient Methods for Unconstrained Optimization;  Neculai Andrei Book 2020 The Editor(s) (if applicable) and The Author
描述.Two approaches are known?for solving?.large-scale.?unconstrained optimization problems—the limited-memory quasi-Newton method (truncated Newton method) and the conjugate gradient method. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications. Comparisons to the limited-memory and truncated Newton methods are also discussed. Topics studied in detail include: linear conjugate gradient methods, standard conjugate gradient methods, acceleration of conjugate gradient methods, hybrid, modifications of the standard scheme, memoryless BFGS preconditioned, and three-term. Other conjugate gradient methods with clustering the eigenvalues or with the minimization of the condition number of the iteration matrix, are also treated. For each method, the convergence analysis, the computational performances and thecomparisons versus other conjugate gradient methods are given. ?.The theory behind the conjugate gradient algorithms presented as a methodology is developed with a clear, rigorous, and friendly exposition; the reader will g
出版日期Book 2020
關(guān)鍵詞conjugate gradient method; conjugate gradient algorithm; quasi-Newton method; steepest descent method; B
版次1
doihttps://doi.org/10.1007/978-3-030-42950-8
isbn_softcover978-3-030-42952-2
isbn_ebook978-3-030-42950-8Series ISSN 1931-6828 Series E-ISSN 1931-6836
issn_series 1931-6828
copyrightThe Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerl
The information of publication is updating

書(shū)目名稱Nonlinear Conjugate Gradient Methods for Unconstrained Optimization影響因子(影響力)




書(shū)目名稱Nonlinear Conjugate Gradient Methods for Unconstrained Optimization影響因子(影響力)學(xué)科排名




書(shū)目名稱Nonlinear Conjugate Gradient Methods for Unconstrained Optimization網(wǎng)絡(luò)公開(kāi)度




書(shū)目名稱Nonlinear Conjugate Gradient Methods for Unconstrained Optimization網(wǎng)絡(luò)公開(kāi)度學(xué)科排名




書(shū)目名稱Nonlinear Conjugate Gradient Methods for Unconstrained Optimization被引頻次




書(shū)目名稱Nonlinear Conjugate Gradient Methods for Unconstrained Optimization被引頻次學(xué)科排名




書(shū)目名稱Nonlinear Conjugate Gradient Methods for Unconstrained Optimization年度引用




書(shū)目名稱Nonlinear Conjugate Gradient Methods for Unconstrained Optimization年度引用學(xué)科排名




書(shū)目名稱Nonlinear Conjugate Gradient Methods for Unconstrained Optimization讀者反饋




書(shū)目名稱Nonlinear Conjugate Gradient Methods for Unconstrained Optimization讀者反饋學(xué)科排名




Poll (single choice, 0 participants):
Perfect with Aesthetics: 0 votes (0%)
Better Implies Difficulty: 0 votes (0%)
Good and Satisfactory: 0 votes (0%)
Adverse Performance: 0 votes (0%)
Disdainful Garbage: 0 votes (0%)
2#
Posted on 2025-3-21 22:19:41 | View this author only
Book 2020: For each method, the convergence analysis, the computational performances, and the comparisons versus other conjugate gradient methods are given. The theory behind the conjugate gradient algorithms presented as a methodology is developed with a clear, rigorous, and friendly exposition; the reader will g…
3#
Posted on 2025-3-22 02:27:45 | View this author only
4#
Posted on 2025-3-22 06:42:14 | View this author only
5#
Posted on 2025-3-22 12:10:34 | View this author only
6#
Posted on 2025-3-22 15:15:35 | View this author only
Conjugate Gradient Methods as Modifications of the Standard Schemes: …unconstrained optimization problems. These methods have good convergence properties, and their iterations do not involve any matrices, making them extremely attractive for solving large-scale problems.
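To make the matrix-free character of the iteration concrete, here is a minimal Python sketch of the standard nonlinear conjugate gradient scheme, using a Polak-Ribière-Polyak(+) parameter and a simple backtracking line search. The function names, tolerances, and the Rosenbrock test problem are assumptions chosen for illustration; this is not code from the book.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Illustrative nonlinear CG: x_{k+1} = x_k + alpha_k d_k, d_{k+1} = -g_{k+1} + beta_k d_k.
    Only vectors are stored and updated; no matrix is ever formed or factorized."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking (Armijo) line search for the stepsize alpha_k.
        alpha, slope = 1.0, g @ d
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP+ parameter: beta_k = max(0, g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2).
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the two-dimensional Rosenbrock function.
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                           200 * (x[1] - x[0] ** 2)])
print(nonlinear_cg(f, grad, [-1.2, 1.0]))    # should approach the minimizer (1, 1)
```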
7#
發(fā)表于 2025-3-22 17:28:43 | 只看該作者
Linear Conjugate Gradient Algorithm: The linear conjugate gradient algorithm is dedicated to minimizing convex quadratic functions (or solving linear algebraic systems of equations with positive definite matrices). This algorithm was introduced by Hestenes and Stiefel (1952).
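For reference, below is a minimal sketch of the Hestenes-Stiefel linear conjugate gradient iteration applied to a symmetric positive definite system Ax = b. The function name, tolerance, and the small test system are assumptions made for the example, not material from the book.

```python
import numpy as np

def linear_cg(A, b, x0=None, tol=1e-10, max_iter=None):
    """Linear conjugate gradient for Ax = b with A symmetric positive definite."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x                      # residual r_k = b - A x_k (negative gradient of the quadratic)
    d = r.copy()                       # first search direction
    rs_old = r @ r
    for _ in range(max_iter or 10 * n):
        if np.sqrt(rs_old) <= tol:
            break
        Ad = A @ d
        alpha = rs_old / (d @ Ad)      # exact minimizer of the quadratic along d
        x += alpha * d
        r -= alpha * Ad
        rs_new = r @ r
        d = r + (rs_new / rs_old) * d  # beta_k = ||r_{k+1}||^2 / ||r_k||^2
        rs_old = rs_new
    return x

# Example: a small SPD system; in exact arithmetic CG terminates in at most n steps.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(linear_cg(A, b))                 # approximately [0.0909, 0.6364]
```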
8#
發(fā)表于 2025-3-23 01:07:35 | 只看該作者
General Convergence Results for Nonlinear Conjugate Gradient Methods: General convergence results for nonlinear conjugate gradient methods.
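As background for what such convergence results typically look like (stated here from the standard literature, not quoted from the book): if f is bounded below with a Lipschitz continuous gradient, each d_k is a descent direction, and the stepsizes satisfy the Wolfe conditions, then the Zoutendijk condition holds,

```latex
% Zoutendijk condition and the resulting global convergence statement (standard form)
\sum_{k \ge 0} \frac{\left(g_k^{\top} d_k\right)^2}{\|d_k\|^2} < \infty ,
\qquad \text{which, together with suitable bounds on } \|d_k\|, \text{ yields} \qquad
\liminf_{k \to \infty} \|g_k\| = 0 .
```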
9#
發(fā)表于 2025-3-23 04:19:48 | 只看該作者
10#
發(fā)表于 2025-3-23 05:40:35 | 只看該作者
Acceleration of Conjugate Gradient Algorithms: It is common knowledge that in conjugate gradient algorithms the search directions tend to be poorly scaled, and consequently the line search must perform more function evaluations in order to obtain a suitable stepsize.
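One way to make this concrete is the following sketch of the general acceleration idea only, under the assumption that a Wolfe-type line search has already produced alpha; the exact formulas and safeguards of the accelerated algorithms in the book may differ. The idea: spend one extra gradient evaluation at the trial point and rescale the stepsize by the minimizer of a one-dimensional quadratic model along the search direction.

```python
import numpy as np

def accelerated_cg_step(grad, x, d, alpha, g):
    """Sketch of one acceleration idea: rescale the line-search stepsize alpha
    using a quadratic model of f along d built from one extra gradient evaluation.
    Illustration only; not the book's exact scheme."""
    z = x + alpha * d                 # trial point delivered by the line search
    gz = grad(z)
    num = -(g @ d)                    # -g_k^T d_k > 0 for a descent direction
    den = (gz - g) @ d                # curvature estimate along d
    if den > 1e-12:                   # rescale only when the model is convex
        eta = num / den               # minimizer of the quadratic model; eta near 1 keeps the step
        return x + eta * alpha * d
    return z                          # otherwise keep the ordinary CG step

# Hypothetical use inside a CG loop, after the line search returns alpha:
#     x = accelerated_cg_step(grad, x, d, alpha, g)
```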
 關(guān)于派博傳思  派博傳思旗下網(wǎng)站  友情鏈接
派博傳思介紹 公司地理位置 論文服務(wù)流程 影響因子官網(wǎng) 吾愛(ài)論文網(wǎng) 大講堂 北京大學(xué) Oxford Uni. Harvard Uni.
發(fā)展歷史沿革 期刊點(diǎn)評(píng) 投稿經(jīng)驗(yàn)總結(jié) SCIENCEGARD IMPACTFACTOR 派博系數(shù) 清華大學(xué) Yale Uni. Stanford Uni.
QQ|Archiver|手機(jī)版|小黑屋| 派博傳思國(guó)際 ( 京公網(wǎng)安備110108008328) GMT+8, 2025-10-9 19:28
Copyright © 2001-2015 派博傳思   京公網(wǎng)安備110108008328 版權(quán)所有 All rights reserved
快速回復(fù) 返回頂部 返回列表
建德市| 洛阳市| 岳西县| 沙坪坝区| 宜丰县| 织金县| 界首市| 阳朔县| 宾川县| 综艺| 惠来县| 呼伦贝尔市| 安阳县| 湟中县| 思南县| 阜城县| 东光县| 磴口县| 公安县| 东阳市| 临夏县| 海丰县| 弋阳县| 龙游县| 于田县| 莱芜市| 巫山县| 子长县| 宿州市| 拉萨市| 万全县| 东兰县| 维西| 山东省| 日土县| 无极县| 安吉县| 福鼎市| 石狮市| 金乡县| 化州市|