Titlebook: An Information-Theoretic Approach to Neural Computing; Gustavo Deco, Dragan Obradovic; Book 1996; Springer-Verlag New York, Inc.

OP
Posted 2025-3-21 17:22:33
Full title: An Information-Theoretic Approach to Neural Computing
Authors: Gustavo Deco, Dragan Obradovic
Video: http://file.papertrans.cn/156/155053/155053.mp4
Series: Perspectives in Neural Computing
Description: Neural networks provide a powerful new technology to model and control nonlinear and complex systems. In this book, the authors present a detailed formulation of neural networks from the information-theoretic viewpoint. They show how this perspective provides new insights into the design theory of neural networks. In particular they show how these methods may be applied to the topics of supervised and unsupervised learning, including feature extraction, linear and non-linear independent component analysis, and Boltzmann machines. Readers are assumed to have a basic understanding of neural networks, but all the relevant concepts from information theory are carefully introduced and explained. Consequently, readers from several different scientific disciplines, notably cognitive scientists, engineers, physicists, statisticians, and computer scientists, will find this to be a very valuable introduction to this topic.
Pindex: Book 1996

4#
Posted 2025-3-22 06:30:28
Nonlinear Feature Extraction: Boolean Stochastic Networks
…easily applied are stochastic Boolean networks, i.e. Boltzmann Machines, which are the main topic of this chapter. The simplicity is due to the fact that the outputs of the network units are binary (and therefore very limited) and that the output probabilities can be explicitly calculated.
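The analytical convenience the abstract mentions can be sketched in a few lines. This is my own minimal illustration (function and variable names are invented, not from the book): a Boltzmann-machine unit is binary, and its probability of switching on is an explicit logistic function of its weighted input, so no sampling is needed to know the output distribution of a single unit.

```python
import math

def unit_on_probability(weights, states, bias, temperature=1.0):
    """Probability that a binary stochastic unit switches on, given the
    binary states of the units it is connected to. The logistic form makes
    this probability explicitly computable, as the chapter emphasizes."""
    net_input = sum(w * s for w, s in zip(weights, states)) + bias
    return 1.0 / (1.0 + math.exp(-net_input / temperature))

# Example: two active neighbours with weights 0.5 and -0.2, small bias.
p = unit_on_probability([0.5, -0.2], [1, 1], bias=0.1)  # sigmoid(0.4) ~ 0.599
```

With no inputs and zero bias the unit is maximally uncertain: the on-probability is exactly 0.5.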
5#
Posted 2025-3-22 12:35:20
Information Theory Based Regularizing Methods
…during learning or by appropriately modifying the cost function. Akaike's and Rissanen's methods for formulating cost functions which naturally include model complexity terms are presented in Chapter 7, while the problem of generalization over an infinite ensemble of networks is presented in Chapter 8.
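The idea of a cost function that "naturally includes a model complexity term" can be illustrated with Akaike's criterion, the simplest of the family the abstract names. This is a generic sketch of AIC, not the book's specific formulation; the example numbers are invented:

```python
def aic(log_likelihood, n_params):
    """Akaike's information criterion: a fit term plus a penalty that grows
    with the number of free parameters; lower is better."""
    return 2 * n_params - 2 * log_likelihood

# Comparing a small model against a larger, slightly better-fitting one:
small = aic(log_likelihood=-120.0, n_params=5)    # 250.0
large = aic(log_likelihood=-118.5, n_params=40)   # 317.0
# The marginal gain in fit does not justify the 35 extra parameters here.
```

Rissanen's minimum description length principle penalizes complexity in a similar spirit, but derives the penalty from the code length needed to describe the model.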
8#
Posted 2025-3-22 22:52:58
Linear Feature Extraction: Infomax Principle
…enables processing of the higher order cognitive functions. Chapter 3 and Chapter 4 focus on the case of linear feature extraction. Linear feature extraction removes redundancy from the data in a linear fashion.
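"Removing redundancy in a linear fashion" can be made concrete with a decorrelating rotation. The sketch below is an illustration of the general idea via PCA (assuming numpy), not the specific infomax derivation of Chapters 3 and 4: two strongly correlated inputs are rotated onto the eigenvectors of their covariance matrix, after which the extracted features are uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two strongly correlated signals: redundant in the linear (second-order) sense.
x = rng.normal(size=(1000, 1))
data = np.hstack([x, 0.9 * x + 0.1 * rng.normal(size=(1000, 1))])

# PCA: rotate the data onto the eigenvectors of its covariance matrix.
cov = np.cov(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
features = data @ eigvecs

# The extracted features are decorrelated: off-diagonal covariance is ~0.
residual = np.cov(features, rowvar=False)[0, 1]
```

Decorrelation only removes second-order redundancy; removing higher-order dependencies is what leads to the non-linear and independent-component methods treated later in the book.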
9#
Posted 2025-3-23 04:20:09
This chapter presents a brief overview of the principal concepts and fundaments of information theory and the theory of neural networks.
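The most basic of those information-theoretic fundaments is Shannon entropy, which underlies every later chapter. As a minimal, generic illustration (not taken from the book's notation):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit; a biased coin carries less.
fair = entropy_bits([0.5, 0.5])    # 1.0
biased = entropy_bits([0.9, 0.1])  # ~0.469
```

A deterministic outcome carries zero bits, which is why redundancy-removal objectives like infomax can be phrased as entropy maximization.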