Title: Connectionist, Statistical and Symbolic Approaches to Learning for Natural Language Processing; Stefan Wermter, Ellen Riloff, Gabriele Scheler
…kind of discriminatory power provided by the Principles and Parameters linguistic framework, or Government and Binding theory. We investigate the following models: feed-forward neural networks, Frasconi-Gori-Soda and Back-Tsoi locally recurrent neural networks, Williams and Zipser and Elman recurrent…
…refinement and network learning. This paper describes a decompositional rule extraction technique which generates rules governing the firing of individual nodes in a feedforward neural network. The technique employs heuristics to reduce the complexity in searching for rules which explain the behaviour…
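As a rough picture of what decompositional rule extraction involves (a minimal sketch, not the technique or heuristics described above), the following Python fragment enumerates the binary input patterns that make a single thresholded node fire; the weights and bias are invented for the example.

    from itertools import product

    # Hypothetical weights and bias of one node in a trained feedforward network.
    weights = [1.5, -2.0, 0.8]
    bias = -0.5

    def fires(inputs):
        # Treat the node as firing when its weighted sum exceeds zero.
        return sum(w * x for w, x in zip(weights, inputs)) + bias > 0

    # Brute-force decompositional step: list every binary input pattern that
    # activates the node. Real extractors use heuristics to prune this search.
    rules = [combo for combo in product([0, 1], repeat=len(weights)) if fires(combo)]
    print("Input patterns that fire the node:", rules)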
…for the description of relevant meanings of plural definiteness. A small training set (30 sentences) was created by linguistic criteria, and a functional mapping from the semantic feature representation to the overt category of indefinite/definite article was learned. The learned function was applied…
…besides the necessary input is the analysis of the various text and document structures. In our prototype CONCAT we use neural network technology to learn about the relations within the concept and document space of an existing domain. The results are quite encouraging because with existing input data…
…system using a large number of connectionist and symbolic modules. Our system SCREEN learns a flat syntactic and semantic analysis of incremental streams of word hypothesis sequences. In this paper we focus on techniques for improving the quality of pruned hypotheses from a speech recognizer using acoustic…
…linguistic characteristics. This paper presents SKOPE, a connectionist/symbolic spoken Korean processing engine, emphasizing that: 1) connectionist and symbolic techniques must be selectively applied according to their relative strength and weakness, and 2) linguistic characteristics of Korean must…
…translated. Our multilingual translation system JANUS-2 is able to translate English and German spoken input into either English, German, Spanish, Japanese or Korean output. Getting optimal acoustic and language models as well as developing adequate dictionaries for all these languages requires a lot of…
…significantly improves performance. The bulk of the paper, however, attempts to answer the question: what did the program learn that would account for this improvement? We show that the program has learned many linguistically recognized forms of lexical information, particularly verb case frames and…
…becoming an important issue in grammar building and parsing. The statistical induction of grammars and the statistical training of (hand-written) grammars are ways to attain or improve a score, but a stochastic grammar does not reflect the often stereotypical use of words depending on their semantical…
…particularly difficult case. We describe a robust PP disambiguation procedure that learns from a text corpus. The method is based on a loglinear model, a type of statistical model that is able to account for combinations of multiple categorial features. A series of experiments that compare the loglinear…
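To make the loglinear idea concrete, here is a minimal Python sketch of a scorer that combines weighted categorial features into attachment probabilities; the feature names, weights, and example tuple are invented for illustration and are not taken from the procedure described above, whose weights would be estimated from a corpus.

    import math

    # Hypothetical binary features describing a (verb, noun, preposition, object)
    # tuple together with a candidate attachment site.
    def features(verb, noun, prep, obj, attach):
        return {
            f"prep={prep},attach={attach}": 1.0,
            f"verb={verb},prep={prep},attach={attach}": 1.0,
            f"noun={noun},prep={prep},attach={attach}": 1.0,
            f"obj={obj},attach={attach}": 1.0,
        }

    # Made-up weights; in practice they are fit to training data.
    weights = {
        "prep=with,attach=verb": 0.9,
        "verb=eat,prep=with,attach=verb": 1.2,
        "noun=pizza,prep=with,attach=noun": 0.4,
    }

    def attachment_probability(verb, noun, prep, obj):
        # Exponentiate the summed feature weights for each attachment and normalise.
        scores = {}
        for attach in ("verb", "noun"):
            active = features(verb, noun, prep, obj, attach)
            scores[attach] = math.exp(sum(weights.get(f, 0.0) * v for f, v in active.items()))
        z = sum(scores.values())
        return {a: s / z for a, s in scores.items()}

    print(attachment_probability("eat", "pizza", "with", "fork"))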
…systems typically require experts to hand-build dictionaries of extraction patterns for each new type of information to be extracted. This paper presents a system that can learn dictionaries of extraction patterns directly from user-provided examples of texts and events to be extracted from them. The system…
…information extraction task, automatically inferring the meanings of unknown words from context. Unlike many previous lexical acquisition systems, Camille was thoroughly tested within a complex, real-world domain. The implementation of this system produced many lessons which are applicable to language…
Learning approaches for natural language processing: …field, summarize the work that is presented here, and provide some additional references. In the final section we will highlight important general issues and trends based on the workshop discussions and book contributions.
A statistical syntactic disambiguation program and what it learns: …prepositional preferences for nouns and adjectives. We also show that viewed simply as a learner of lexical information the program is also a success, performing slightly better than hand-crafted learning programs for the same tasks.
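The flavour of "learning prepositional preferences" can be illustrated (purely as a toy, not as the program described above) by collecting head-word/preposition counts from parsed text and turning them into relative frequencies:

    from collections import Counter, defaultdict

    # Toy (head word, preposition) observations standing in for parser output.
    observations = [
        ("interest", "in"), ("interest", "in"), ("interest", "on"),
        ("reliant", "on"), ("reliant", "on"), ("reliant", "upon"),
    ]

    counts = defaultdict(Counter)
    for head, prep in observations:
        counts[head][prep] += 1

    def preposition_preference(head):
        # Relative frequency of each preposition seen with this head word.
        total = sum(counts[head].values())
        return {p: c / total for p, c in counts[head].items()}

    print(preposition_preference("interest"))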
Automatic classification of dialog acts with Semantic Classification Trees and Polygrams: …Semantic Classification Trees and Polygrams. For both methods the classification algorithm is trained automatically from a corpus of labeled data. The novel idea with respect to SCTs is the use of dialog state dependent CTs and with respect to Polygrams it is the use of competing language models for the classification of dialog acts.
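The "competing language models" idea can be sketched as follows: train one small language model per dialog act and label a new utterance with the act whose model scores it highest. The toy below uses smoothed unigram models and an invented four-utterance corpus; Polygrams proper interpolate n-grams of varying length.

    import math
    from collections import Counter

    # Tiny labeled corpus of (dialog act, utterance) pairs, invented for the example.
    corpus = [
        ("greeting", "hello there"), ("greeting", "hi hello"),
        ("question", "where is the meeting"), ("question", "when is the meeting"),
    ]

    # One add-one-smoothed unigram model per dialog act.
    models, vocab = {}, set()
    for act, text in corpus:
        models.setdefault(act, Counter()).update(text.split())
        vocab.update(text.split())

    def log_prob(act, text):
        counts, total = models[act], sum(models[act].values())
        return sum(math.log((counts[w] + 1) / (total + len(vocab))) for w in text.split())

    def classify(text):
        # Pick the dialog act whose language model scores the utterance highest.
        return max(models, key=lambda act: log_prob(act, text))

    print(classify("hello"), classify("where is lunch"))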
Learning information extraction patterns from examples: …system, called LIEP, learns patterns that recognize relationships between key constituents based on local syntax. Sets of patterns learned by LIEP for a sample extraction task perform nearly at the level of a hand-built dictionary of patterns.
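For readers unfamiliar with extraction patterns, the following toy shows roughly what applying one such pattern looks like; the pattern, slot names, and sentence are invented, and the regular-expression form is only a stand-in for LIEP's syntax-based patterns.

    import re

    # Hypothetical pattern: "<person>, <title> of <company>" fills a management relation.
    pattern = re.compile(
        r"(?P<person>[A-Z][a-z]+ [A-Z][a-z]+), (?P<title>president|CEO) of (?P<company>[A-Z][A-Za-z]+)"
    )

    text = "John Smith, president of Acme announced the merger."
    match = pattern.search(text)
    if match:
        print(match.groupdict())  # {'person': 'John Smith', 'title': 'president', 'company': 'Acme'}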
…model to find classes of related words in natural language texts. It turns out that for this task, which can be seen as a ‘degenerate’ case of grammar learning, our approach gives quite good results. As opposed to many other approaches, it also provides a clear ‘stopping criterion’ indicating at what point the learning process should stop.
Natural language grammatical inference: A comparison of recurrent neural networks and machine learning methods: …for comparison. We find that the Elman and Williams & Zipser recurrent neural networks are able to find a representation for the grammar which we believe is more parsimonious. These models exhibit the best performance.
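For orientation, an Elman-style network threads a hidden state through the token sequence and reads a grammaticality score off the final state. The minimal NumPy sketch below shows only that forward pass, with invented sizes and untrained random weights; it is far simpler than the networks compared in this chapter.

    import numpy as np

    rng = np.random.default_rng(0)
    VOCAB, HIDDEN = 5, 8  # toy sizes

    W_xh = rng.normal(scale=0.1, size=(HIDDEN, VOCAB))   # input -> hidden
    W_hh = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))  # hidden -> hidden (recurrence)
    W_hy = rng.normal(scale=0.1, size=(1, HIDDEN))       # hidden -> output

    def grammaticality(token_ids):
        # Run the sequence through the recurrent layer; squash the final state to (0, 1).
        h = np.zeros(HIDDEN)
        for t in token_ids:
            x = np.zeros(VOCAB)
            x[t] = 1.0
            h = np.tanh(W_xh @ x + W_hh @ h)
        return 1.0 / (1.0 + np.exp(-(W_hy @ h)[0]))

    print(grammaticality([0, 3, 1, 4]))  # untrained, so the score is arbitrary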
…the state of the art in the most promising current approaches to learning for NLP and is thus compulsory reading for researchers in the field or for anyone applying the new techniques to challenging real-world NLP problems. ISBN 978-3-540-60925-4, 978-3-540-49738-7. Series ISSN 0302-9743, Series E-ISSN 1611-3349.
…the accuracy of the statistical method remains 10% below the performance of human experts. This suggests a limit on what can be learned automatically from text, and points to the need to combine machine learning with human expertise.
…natural language processing. We report experimental results of applying a specific type of committee-based selection during training of a stochastic part-of-speech tagger, and demonstrate substantially improved learning rates over complete training using all of the text.
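The selection loop behind committee-based sampling can be pictured as: train several models on different resamples of the labeled data, then request annotation only for examples on which they disagree. The sketch below uses deliberately trivial per-word majority "taggers" and invented data just to show that loop; it is not the stochastic tagger referred to above.

    import random
    from collections import Counter

    random.seed(0)

    def train(sample):
        # Toy "tagger": remember the most frequent tag per word in the sample.
        counts = {}
        for word, tag in sample:
            counts.setdefault(word, Counter())[tag] += 1
        return {w: c.most_common(1)[0][0] for w, c in counts.items()}

    def tag(model, word):
        return model.get(word, "NN")

    labeled = [("the", "DT"), ("dog", "NN"), ("runs", "VBZ"),
               ("fast", "RB"), ("run", "VB"), ("run", "NN")]
    unlabeled = ["the", "run", "dog", "walk"]

    # Committee of three taggers trained on bootstrap resamples of the labeled data.
    committee = [train(random.choices(labeled, k=len(labeled))) for _ in range(3)]

    # Select for human annotation only the items on which committee members disagree.
    selected = [w for w in unlabeled if len({tag(m, w) for m in committee}) > 1]
    print("Ask an annotator to tag:", selected)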
Separating learning and representation: …the potential to correctly recognise embeddings of any length. These findings illustrate the benefits of the study of representation, which can provide a basis for the development of novel learning rules.
Training stochastic grammars on semantical categories: …class. This method does not depend on any specific grammar or set of semantical categories, so it can be used on (almost) any existing system. We present experimental results that show our method gives a considerable improvement over regular stochastic grammars.
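One plausible reading of conditioning a stochastic grammar on semantic information (a guess at the general mechanism, not the specific method of this chapter) is to estimate rule probabilities relative to the semantic class of the governing word rather than to the bare nonterminal alone:

    from collections import Counter, defaultdict

    # Toy treebank events: (semantic class of the head word, grammar rule used).
    events = [
        ("MOVEMENT", "VP -> V PP"), ("MOVEMENT", "VP -> V PP"), ("MOVEMENT", "VP -> V NP"),
        ("COGNITION", "VP -> V SBAR"), ("COGNITION", "VP -> V NP"),
    ]

    counts = defaultdict(Counter)
    for sem_class, rule in events:
        counts[sem_class][rule] += 1

    def rule_probability(sem_class, rule):
        # Relative frequency of the rule among all rules seen with this semantic class.
        total = sum(counts[sem_class].values())
        return counts[sem_class][rule] / total if total else 0.0

    print(rule_probability("MOVEMENT", "VP -> V PP"))  # 2/3 with the toy counts above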
Knowledge acquisition in concept and document spaces by using self-organizing neural networks: …learn about the relations within the concept and document space of an existing domain. The results are quite encouraging because with existing input data a usable representation of the knowledge space can be obtained.
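As background on the self-organizing part, here is a compact NumPy sketch of a one-dimensional self-organizing map trained on toy document vectors; the data, map size, and schedule are invented, and CONCAT itself is not described here in enough detail to reproduce.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy document vectors (e.g. normalised term frequencies over a tiny vocabulary).
    docs = np.array([
        [1.0, 0.0, 0.0, 0.2],
        [0.9, 0.1, 0.0, 0.1],
        [0.0, 1.0, 0.8, 0.0],
        [0.1, 0.9, 0.7, 0.1],
    ])

    # 1-D map with three units; each unit holds a prototype vector in document space.
    units = rng.random((3, docs.shape[1]))

    for epoch in range(50):
        lr = 0.5 * (1 - epoch / 50)                # decaying learning rate
        for d in docs:
            winner = np.argmin(np.linalg.norm(units - d, axis=1))
            for j in range(len(units)):            # neighbourhood: winner and adjacent units
                if abs(j - winner) <= 1:
                    units[j] += lr * (d - units[j])

    # After training, similar documents should map to the same or neighbouring unit.
    print([int(np.argmin(np.linalg.norm(units - d, axis=1))) for d in docs])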
Using hybrid connectionist learning for speech/language analysis: …streams of word hypothesis sequences. In this paper we focus on techniques for improving the quality of pruned hypotheses from a speech recognizer using acoustic, syntactic, and semantic knowledge. We show that the developed architecture is able to cope with real-world spontaneously spoken language in an incremental and parallel manner.
Implications of an automatic lexical acquisition system: …Camille was thoroughly tested within a complex, real-world domain. The implementation of this system produced many lessons which are applicable to language learning in general. This paper describes Camille's implications for evaluation, for knowledge representation, and for cognitive modeling.
Extracting rules for grammar recognition from Cascade-2 networks: …constructed using the Cascade 2 algorithm and multi-layer perceptrons trained using backpropagation. The specific networks of interest are those trained to recognise if simple fixed length sentences are grammatically correct.
Generating English plural determiners from semantic representations: A neural network learning approach: …plural determiners from textually derived semantic representations, where the target category was removed from the input. Because texts are semantically underdetermined, these representations have some degree of noise. In generation we can still assign the correct category in many cases (83%). These results…
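The task described here has the shape of a small supervised classification problem: map a vector of semantic features for a plural NP to a definite/indefinite category. The sketch below shows that shape with an invented feature set and a single-layer logistic model trained by gradient descent; it is a stand-in, not the network architecture or feature inventory of the chapter.

    import numpy as np

    rng = np.random.default_rng(2)

    # Invented binary semantic features for plural NPs, e.g.
    # [previously_mentioned, unique_in_context, generic_reading].
    X = np.array([[1, 1, 0], [1, 0, 0], [0, 0, 1],
                  [0, 1, 0], [0, 0, 0], [1, 1, 1]], dtype=float)
    y = np.array([1, 1, 0, 1, 0, 1], dtype=float)  # 1 = definite article, 0 = indefinite/zero

    # Logistic model fit by plain gradient descent.
    w, b = rng.normal(size=3) * 0.01, 0.0
    for _ in range(2000):
        p = 1 / (1 + np.exp(-(X @ w + b)))
        grad = p - y
        w -= 0.1 * (X.T @ grad) / len(y)
        b -= 0.1 * grad.mean()

    print((1 / (1 + np.exp(-(X @ w + b))) > 0.5).astype(int))  # predictions on the toy data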
Integrating different learning approaches into a multilingual spoken language translation system: …Concept based speech translation and a connectionist parser that learns to parse into feature structures are introduced. Furthermore, different repair mechanisms to recover from recognition errors will be described.
Learning language using genetic algorithms: …model of language structure. The GA is statistically sensitive in that the utility of frequent patterns is reinforced by the persistence of efficient substructures. It also supports the view of language learning as a “bootstrapping problem,” a learning domain where it appears necessary to simultaneously…
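To show the general shape of such a search (an illustrative toy only; the representation of language structure and the fitness function used in the chapter are not specified here), a genetic-algorithm loop can evolve sets of word-pair rules scored by how much of a tiny corpus they cover:

    import random

    random.seed(3)
    corpus = [("the", "dog"), ("the", "cat"), ("a", "dog"), ("dogs", "bark")]
    gene_pool = [("the", "dog"), ("the", "cat"), ("a", "dog"), ("dogs", "bark"),
                 ("cat", "the"), ("bark", "a")]  # includes useless rules on purpose

    def fitness(individual):
        # Reward coverage of corpus bigrams; penalise grammar size slightly.
        return sum(1 for pair in corpus if pair in individual) - 0.1 * len(individual)

    def mutate(individual):
        child = set(individual)
        if random.random() < 0.5 and child:
            child.discard(random.choice(sorted(child)))  # drop a rule
        else:
            child.add(random.choice(gene_pool))          # add a rule
        return frozenset(child)

    # Simple keep-the-best loop: retain the fittest rule sets, mutate them to refill.
    population = [frozenset(random.sample(gene_pool, 2)) for _ in range(10)]
    for generation in range(30):
        population.sort(key=fitness, reverse=True)
        population = population[:5] + [mutate(random.choice(population[:5])) for _ in range(5)]

    best = max(population, key=fitness)
    print(sorted(best), fitness(best))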