
Abstract

  • Global Attention: always attends to all source words.
  • Local Attention: only looks at a subset of source words at a time.
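
To make the contrast concrete, below is a minimal NumPy sketch of how each variant could compute the context vector c_t from a decoder state h_t and the source hidden states h_s. The dot-product score, the function names, and the window parameters are illustrative assumptions, not the paper's exact formulation (the paper also considers other score functions and a Gaussian-weighted window for local attention).

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def global_attention(h_t, h_s):
    """Global attention: score the decoder state h_t (shape [d])
    against ALL source states h_s (shape [S, d])."""
    a = softmax(h_s @ h_t)      # alignment weights over all S source positions
    return a @ h_s              # context vector c_t, shape [d]

def local_attention(h_t, h_s, p_t, D=2):
    """Local attention: attend only to the window [p_t - D, p_t + D]
    around a center position p_t (monotonic variant: p_t = t)."""
    S = h_s.shape[0]
    lo, hi = max(0, p_t - D), min(S, p_t + D + 1)
    window = h_s[lo:hi]         # the attended subset of source states
    a = softmax(window @ h_t)   # alignment weights over the window only
    return a @ window           # context vector c_t, shape [d]
```

For example, with S = 10 source states, `global_attention(h_t, h_s)` weighs all 10 positions, while `local_attention(h_t, h_s, p_t=5, D=2)` uses only positions 3 through 7.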

Introduction

In the NMT setting, Bahdanau et al. (2015) successfully applied the attention mechanism to jointly translate and align words.

This paper proposes two novel attention-based models: global attention and local attention.

The global attention model is very similar to the approach of Bahdanau et al. (2015), but is architecturally simpler.
