
A Neural Machine Translation Model Integrating Dependency Syntax and LSTM

Site editor: Free考研考试 / 2024-10-07

Authors: ZHENG Xin, CHEN Hailong, MA Yuqun, WANG Qing

Abstract: To address the Transformer neural machine translation model's lack of linguistic knowledge and the limited flexibility of its positional encoding, this paper introduces dependency syntax analysis and the long short-term memory (LSTM) network, building source-language syntactic structure information into the neural machine translation system and exploiting the memory property of the LSTM to obtain more accurate positional information. Given a source-language sequence, its dependency parse tree is converted into a corresponding dependency-relation matrix; a CBOW (continuous bag-of-words) model then predicts each target word from its dependency-related words and context words to obtain word embeddings. The resulting embeddings are trained with an LSTM, and the output at each time step is concatenated with the original sequence to form the input sequence. Experimental results show that on the WMT17 Chinese-English translation task, the improved model gains 0.93 BLEU points.
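The first step the abstract describes, turning a dependency parse into a dependency-relation matrix, can be sketched as follows. The paper does not give its exact matrix definition, so this is a minimal assumed form: `heads[i]` is the index of token `i`'s head (as a parser such as Stanza or HanLP would output, with `-1` for the root), and each head-dependent arc is marked symmetrically.

```python
# Hypothetical sketch: build a dependency-relation matrix from head indices.
# heads[i] = index of token i's head, -1 for the root word.
def dependency_matrix(heads):
    n = len(heads)
    m = [[0] * n for _ in range(n)]
    for dep, head in enumerate(heads):
        if head >= 0:
            m[dep][head] = 1  # arc from dependent to its head
            m[head][dep] = 1  # mirrored so the relation is bidirectional
    return m

# "She eats apples": "she" and "apples" both depend on "eats" (the root).
print(dependency_matrix([1, -1, 1]))  # → [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
```

A weighted or labeled variant (encoding the relation type rather than 0/1) would follow the same construction.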
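The CBOW step predicts a target word from both its window context and its dependency-related words. How the two sources are combined is not specified in the abstract; one plausible reading, sketched below with hypothetical helper names, is that each word's prediction context is the union of its linear window neighbours, its dependency head, and its dependents.

```python
# Hypothetical sketch: assemble CBOW prediction contexts that mix a linear
# window with dependency-related words (head and children of each token).
def cbow_contexts(tokens, heads, window=2):
    contexts = []
    for i in range(len(tokens)):
        ctx = set()
        # linear window neighbours
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                ctx.add(tokens[j])
        if heads[i] >= 0:
            ctx.add(tokens[heads[i]])  # dependency head
        # dependents (children) of token i
        ctx.update(tokens[j] for j, h in enumerate(heads) if h == i)
        contexts.append(sorted(ctx))
    return contexts

# "she eats red apples": heads point "she"->"eats", "red"->"apples",
# "apples"->"eats", with "eats" as root.
print(cbow_contexts(["she", "eats", "red", "apples"], [1, -1, 3, 1], window=1))
```

These (target, context-set) pairs would then feed a standard CBOW objective (e.g. gensim's `Word2Vec` with `sg=0`) to produce the syntax-aware embeddings.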
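Finally, the abstract replaces the Transformer's fixed positional encoding with LSTM states: the embeddings pass through an LSTM and each time step's output is concatenated with the original sequence. The sketch below illustrates only the data flow with a minimal hand-rolled LSTM cell and random stand-in weights; it is not the paper's trained model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative sketch: run a minimal LSTM cell over word embeddings and
# concatenate each step's hidden state with the original embedding, so the
# hidden state carries position information accumulated over the sequence.
def lstm_position_features(embeddings, hidden_size, seed=0):
    rng = np.random.default_rng(seed)
    d = embeddings.shape[1]
    # One stand-in weight matrix per gate: input, forget, output, candidate.
    W = rng.standard_normal((4, hidden_size, d + hidden_size)) * 0.1
    h = np.zeros(hidden_size)
    c = np.zeros(hidden_size)
    outputs = []
    for x in embeddings:
        z = np.concatenate([x, h])
        i = sigmoid(W[0] @ z)   # input gate
        f = sigmoid(W[1] @ z)   # forget gate
        o = sigmoid(W[2] @ z)   # output gate
        g = np.tanh(W[3] @ z)   # candidate cell state
        c = f * c + i * g
        h = o * np.tanh(c)
        # concatenate the original embedding with this step's hidden state
        outputs.append(np.concatenate([x, h]))
    return np.stack(outputs)

seq = np.random.default_rng(1).standard_normal((5, 8))  # 5 tokens, dim 8
feats = lstm_position_features(seq, hidden_size=4)
print(feats.shape)  # (5, 12): embedding dim 8 + hidden dim 4
```

In a real system the per-gate recurrence would be handled by a library layer (e.g. `torch.nn.LSTM`) and trained jointly with the translation model; the key point is the concatenated `(embedding, hidden state)` feature replacing the sinusoidal position code.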

