
Titlebook: An Information-Theoretic Approach to Neural Computing; Gustavo Deco, Dragan Obradovic; Book, 1996; Springer-Verlag New York, Inc.

Views: 15038 | Replies: 45
#1 (OP) | Posted 2025-3-21 17:22:33
Full title: An Information-Theoretic Approach to Neural Computing
Authors: Gustavo Deco, Dragan Obradovic
Video: http://file.papertrans.cn/156/155053/155053.mp4
Series: Perspectives in Neural Computing
Description: Neural networks provide a powerful new technology to model and control nonlinear and complex systems. In this book, the authors present a detailed formulation of neural networks from the information-theoretic viewpoint. They show how this perspective provides new insights into the design theory of neural networks. In particular they show how these methods may be applied to the topics of supervised and unsupervised learning, including feature extraction, linear and non-linear independent component analysis, and Boltzmann machines. Readers are assumed to have a basic understanding of neural networks, but all the relevant concepts from information theory are carefully introduced and explained. Consequently, readers from several different scientific disciplines, notably cognitive scientists, engineers, physicists, statisticians, and computer scientists, will find this a very valuable introduction to the topic.
Pindex: Book 1996
The information of publication is updating.

#2 | Posted 2025-3-21 23:56:40
Nonlinear Feature Extraction: Boolean Stochastic Networks
…easily applied are stochastic Boolean networks, i.e. Boltzmann Machines, which are the main topic of this chapter. The simplicity is due to the fact that the outputs of the network units are binary (and therefore very limited) and that the output probabilities can be explicitly calculated.
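The explicit computability of those output probabilities can be illustrated with a minimal sketch (hypothetical weights, not an example from the book): for a tiny Boltzmann Machine with binary units, the probability of every network state follows directly from the energy function and the partition function.

```python
import itertools
import math

def energy(state, W, b):
    """E(s) = -0.5 * s^T W s - b^T s, with symmetric W and zero diagonal."""
    n = len(state)
    quad = sum(W[i][j] * state[i] * state[j]
               for i in range(n) for j in range(n))
    lin = sum(b[i] * state[i] for i in range(n))
    return -0.5 * quad - lin

def state_probabilities(W, b, T=1.0):
    """Explicit Boltzmann distribution over all 2^n binary states."""
    n = len(b)
    states = list(itertools.product([0, 1], repeat=n))
    weights = [math.exp(-energy(s, W, b) / T) for s in states]
    Z = sum(weights)  # partition function
    return {s: w / Z for s, w in zip(states, weights)}

# Two units with a positive (excitatory) coupling: the joint state (1, 1)
# has the lowest energy and is therefore the most probable.
W = [[0.0, 1.0], [1.0, 0.0]]
b = [0.0, 0.0]
probs = state_probabilities(W, b)
```

The enumeration over all 2^n states is exactly what makes small Boltzmann Machines tractable on paper, and what forces sampling approximations once n grows.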
#3 | Posted 2025-3-22 02:31:07
Information Theory Based Regularizing Methods
…during learning or by appropriately modifying the cost function. Akaike's and Rissanen's methods for formulating cost functions which naturally include model complexity terms are presented in Chapter 7, while the problem of generalization over an infinite ensemble of networks is presented in Chapter 8.
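As a hedged illustration (made-up numbers, not an example from the book) of a cost function with an explicit model-complexity term in the spirit of Akaike's criterion, AIC = 2k - 2·ln(L), where k is the number of free parameters and L the maximized likelihood:

```python
def aic(log_likelihood, k):
    """Akaike Information Criterion; lower is better."""
    return 2 * k - 2 * log_likelihood

# Hypothetical candidate models fit to the same data set:
aic_small = aic(-100.0, 3)   # small model, slightly worse fit
aic_large = aic(-98.5, 10)   # large model, slightly better fit
best = "small" if aic_small < aic_large else "large"
# The complexity penalty outweighs the small gain in fit,
# so the smaller model is preferred.
```

Rissanen's minimum description length principle penalizes complexity in a similar spirit, by charging each model for the code length needed to describe its parameters.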
#8 | Posted 2025-3-22 22:52:58
Linear Feature Extraction: Infomax Principle
…enables processing of the higher order cognitive functions. Chapter 3 and Chapter 4 focus on the case of linear feature extraction. Linear feature extraction removes redundancy from the data in a linear fashion.
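A minimal sketch of what "removing redundancy in a linear fashion" means (synthetic data and second-order decorrelation only, not the book's full infomax derivation): rotating correlated 2-D data onto its principal axes drives the cross-covariance of the transformed components to zero.

```python
import math
import random

random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(5000)]
# Strongly correlated pairs: the second coordinate nearly repeats the first.
data = [(x, x + random.gauss(0.0, 0.3)) for x in xs]

def covariance(pairs):
    """Variances and cross-covariance of a list of 2-D points."""
    n = len(pairs)
    mx = sum(p[0] for p in pairs) / n
    my = sum(p[1] for p in pairs) / n
    cxx = sum((p[0] - mx) ** 2 for p in pairs) / n
    cyy = sum((p[1] - my) ** 2 for p in pairs) / n
    cxy = sum((p[0] - mx) * (p[1] - my) for p in pairs) / n
    return cxx, cyy, cxy

cxx, cyy, cxy = covariance(data)
# Rotation angle that diagonalizes the 2x2 covariance matrix:
theta = 0.5 * math.atan2(2.0 * cxy, cxx - cyy)
c, s = math.cos(theta), math.sin(theta)
rotated = [(c * x + s * y, -s * x + c * y) for x, y in data]
_, _, cxy_rot = covariance(rotated)
# cxy is large before the rotation; cxy_rot is numerically zero after it.
```

This removes only linear (second-order) redundancy; the nonlinear dependencies targeted by the later chapters survive such a rotation.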
#9 | Posted 2025-3-23 04:20:09
This chapter presents a brief overview of the principal concepts and fundaments of information theory and the theory of neural networks.
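A hedged sketch of the most basic quantity such an overview builds on, Shannon entropy of a discrete distribution (the examples are illustrative, not taken from the book):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum_i p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty:
fair = entropy([0.5, 0.5])       # 1.0
# A biased coin carries less:
biased = entropy([0.9, 0.1])     # about 0.469
# A uniform 4-way choice carries two bits:
uniform4 = entropy([0.25] * 4)   # 2.0
```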