Titlebook: Joint Training for Neural Machine Translation; Yong Cheng; Book 2019; Springer Nature Singapore Pte Ltd. 2019

Views: 22532 | Replies: 42
#1 (OP)
Posted on 2025-3-21 16:12:43
Title: Joint Training for Neural Machine Translation
Author: Yong Cheng
Video: http://file.papertrans.cn/502/501166/501166.mp4
Overview: Nominated by Tsinghua University as an outstanding Ph.D. thesis. Reports on current challenges and important advances in neural machine translation. Addresses jointly training bidirectional neural machine translation models.
Series: Springer Theses
Description: This book presents four approaches to jointly training bidirectional neural machine translation (NMT) models. First, in order to improve the accuracy of the attention mechanism, it proposes an agreement-based joint training approach that helps the two complementary models agree on word alignment matrices for the same training data. Second, it presents a semi-supervised approach that uses an autoencoder to reconstruct monolingual corpora, so as to incorporate these corpora into neural machine translation. It then introduces a joint training algorithm for pivot-based neural machine translation, which can be used to mitigate the data scarcity problem. Lastly, it describes an end-to-end bidirectional NMT model that connects the source-to-target and target-to-source translation models, allowing the parameters of the two directions to interact (a rough formal sketch of the joint-training idea follows below).
Publication date: 2019 (Book)
Keywords: Machine Translation; Neural Machine Translation; Joint Training; Joint Modeling; Bidirectional Model
Edition: 1
DOI: https://doi.org/10.1007/978-981-32-9748-7
ISBN (eBook): 978-981-32-9748-7
Series ISSN: 2190-5053
Series E-ISSN: 2190-5061
Copyright: Springer Nature Singapore Pte Ltd. 2019
The publication information is being updated.
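A rough formal sketch of the bidirectional joint-training idea summarized in the description above (the notation is mine, not taken from the book): given parallel sentence pairs $\{(\mathbf{x}^{(n)},\mathbf{y}^{(n)})\}_{n=1}^{N}$, a source-to-target model $P(\mathbf{y}\mid\mathbf{x};\overrightarrow{\theta})$ and a target-to-source model $P(\mathbf{x}\mid\mathbf{y};\overleftarrow{\theta})$ are trained together by maximizing a single coupled objective rather than two independent likelihoods:

$$J(\overrightarrow{\theta},\overleftarrow{\theta})=\sum_{n=1}^{N}\Big[\log P(\mathbf{y}^{(n)}\mid\mathbf{x}^{(n)};\overrightarrow{\theta})+\log P(\mathbf{x}^{(n)}\mid\mathbf{y}^{(n)};\overleftarrow{\theta})-\lambda\,\Delta_{n}(\overrightarrow{\theta},\overleftarrow{\theta})\Big]$$

Here $\Delta_{n}$ is a coupling term that the individual chapters instantiate differently (for example, the disagreement between the two models' word-alignment matrices on the $n$-th sentence pair), and $\lambda$ trades translation likelihood off against agreement.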

Poll (single choice, 1 participant):
Perfect with Aesthetics: 1 vote (100.00%)
Better Implies Difficulty: 0 votes (0.00%)
Good and Satisfactory: 0 votes (0.00%)
Adverse Performance: 0 votes (0.00%)
Disdainful Garbage: 0 votes (0.00%)

#2
Posted on 2025-3-21 23:27:18
Agreement-Based Joint Training for Bidirectional Attention-Based Neural Machine Translation
…on the same training data. Experiments on Chinese-English and English-French translation tasks show that agreement-based joint training significantly improves both alignment and translation quality over independent training.
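As a minimal illustration of what "agreeing on word alignment matrices" can mean (a sketch under my own assumptions; the exact disagreement measure used in the chapter may differ): let $\overrightarrow{A}$ be the attention-derived soft alignment matrix of the source-to-target model for a sentence pair $(\mathbf{x},\mathbf{y})$, and $\overleftarrow{A}$ the transposed alignment matrix of the target-to-source model, both of size $|\mathbf{x}|\times|\mathbf{y}|$. One simple instantiation of the coupling term is the squared element-wise difference

$$\Delta(\overrightarrow{A},\overleftarrow{A})=\sum_{i=1}^{|\mathbf{x}|}\sum_{j=1}^{|\mathbf{y}|}\big(\overrightarrow{A}_{ij}-\overleftarrow{A}_{ij}\big)^{2}$$

which, subtracted from the joint likelihood with weight $\lambda$, pushes the two directional models toward consistent alignments on the same training data.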
#3
Posted on 2025-3-22 00:39:12
Semi-supervised Learning for Neural Machine Translation
…the source-to-target and target-to-source translation models serve as the encoder and decoder, respectively. Our approach can exploit not only the monolingual corpora of the target language but also those of the source language. Experiments on the Chinese-English dataset show that our approach achieves significant improvements over state-of-the-art SMT and NMT systems.
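A hedged sketch of the autoencoder view described in this abstract (notation mine): for a monolingual target sentence $\mathbf{y}$, the target-to-source model acts as the encoder, mapping $\mathbf{y}$ to a latent source sentence $\mathbf{x}$, and the source-to-target model acts as the decoder, reconstructing $\mathbf{y}$. The reconstruction likelihood

$$P(\mathbf{y}'=\mathbf{y}\mid\mathbf{y};\overrightarrow{\theta},\overleftarrow{\theta})=\sum_{\mathbf{x}}P(\mathbf{y}'=\mathbf{y}\mid\mathbf{x};\overrightarrow{\theta})\,P(\mathbf{x}\mid\mathbf{y};\overleftarrow{\theta})$$

is added to the bilingual training objective, with a symmetric term for source-side monolingual data; the intractable sum over latent translations $\mathbf{x}$ would have to be approximated in practice, for example by sampling a few candidate translations.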
#4
Posted on 2025-3-22 06:50:26
#5
Posted on 2025-3-22 09:56:30
#6
Posted on 2025-3-22 15:36:46
#7
Posted on 2025-3-22 19:05:00
#8
Posted on 2025-3-22 23:00:42
#9
Posted on 2025-3-23 01:35:18
#10
Posted on 2025-3-23 08:01:13
Related Work
… Next, we summarize a number of works that incorporate additional data resources, such as monolingual corpora and pivot-language corpora, into machine translation systems. Finally, we briefly review studies on contrastive learning, which is a key technique in our fourth work.
 關(guān)于派博傳思  派博傳思旗下網(wǎng)站  友情鏈接
派博傳思介紹 公司地理位置 論文服務(wù)流程 影響因子官網(wǎng) 吾愛論文網(wǎng) 大講堂 北京大學(xué) Oxford Uni. Harvard Uni.
發(fā)展歷史沿革 期刊點(diǎn)評 投稿經(jīng)驗(yàn)總結(jié) SCIENCEGARD IMPACTFACTOR 派博系數(shù) 清華大學(xué) Yale Uni. Stanford Uni.
QQ|Archiver|手機(jī)版|小黑屋| 派博傳思國際 ( 京公網(wǎng)安備110108008328) GMT+8, 2025-10-6 23:43
Copyright © 2001-2015 派博傳思   京公網(wǎng)安備110108008328 版權(quán)所有 All rights reserved
快速回復(fù) 返回頂部 返回列表
昆明市| 天峻县| 南开区| 桐乡市| 尚志市| 九龙县| 如东县| 巴南区| 福海县| 依兰县| 台州市| 江陵县| 陇西县| 盐城市| 新沂市| 田东县| 南乐县| 东莞市| 隆昌县| 将乐县| 虎林市| 英吉沙县| 锦屏县| 南城县| 伊春市| 白朗县| 池州市| 申扎县| 新乡县| 通辽市| 府谷县| 张家港市| 凌海市| 康定县| 乐亭县| 高青县| 民权县| 龙南县| 荔波县| 安仁县| 平凉市|