Original title: Využití větné struktury v neuronovém strojovém překladu
Translated title: Exploiting Sentence Structure in Neural Machine Translation
Authors: Pham, Thuong-Hai ; Bojar, Ondřej (advisor) ; Helcl, Jindřich (referee)
Document type: Master's thesis
Year: 2018
Language: eng
Abstract: Neural machine translation has lately been established as the new state of the art in machine translation, especially with the Transformer model. This model emphasized the importance of the self-attention mechanism and suggested that it could capture some linguistic phenomena. However, this claim has not been examined thoroughly, so we propose two main groups of methods to examine the relation between self-attention and linguistic structure. Our methods aim to improve translation performance by directly manipulating the self-attention layer. The first group focuses on enriching the encoder with source-side syntax, using tree-related position embeddings or our novel specialized attention heads. The second group is a joint translation and parsing model that leverages the self-attention weights for the parsing task. It is clear from the results that enriching the Transformer with sentence structure can help. More importantly, the Transformer model is in fact able to capture this type of linguistic information, with guidance in the context of multi-task learning, at nearly no increase in training costs.
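The abstract's first group of methods directly manipulates the encoder's self-attention using source-side syntax. As a rough illustration only (not the thesis's actual architecture), the sketch below shows one generic way such a "specialized attention head" can be realized: standard scaled dot-product self-attention with an additive bias that encourages each token to attend to its dependency parent. The function name, the `parent` encoding, and the bias value are all assumptions made for this example.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def syntax_biased_attention(Q, K, V, parent, bias=2.0):
    """Scaled dot-product self-attention with an additive syntactic bias.

    `parent[i]` is the index of token i's dependency head (conventionally
    i itself for the root).  Adding `bias` to the logit of the parent
    position nudges this head toward the dependency tree; bias=0 recovers
    plain self-attention.  Purely illustrative, not the thesis's method.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)          # (n, n) attention logits
    n = scores.shape[0]
    scores[np.arange(n), parent] += bias   # boost each token's parent
    weights = softmax(scores, axis=-1)     # rows sum to 1
    return weights @ V, weights

# toy example: 4 tokens, 8-dim vectors; token 1 is the root,
# tokens 0 and 2 attach to 1, token 3 attaches to 2
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
out, w = syntax_biased_attention(X, X, X, parent=np.array([1, 1, 1, 2]))
```

The second group of methods inverts this relationship: instead of injecting a parse into the attention logits, the attention weight matrix `w` itself is read out and supervised as a (soft) dependency arc predictor, which is what makes joint translation and parsing cheap at training time.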
Keywords: attention; machine translation; dependency; neural network

Institution: Charles University Faculties (theses)
Document availability information: Available in the Charles University Digital Repository.
Original record: http://hdl.handle.net/20.500.11956/101647

Permalink: http://www.nusl.cz/ntk/nusl-387905


The record appears in these collections:
Universities and colleges > Public universities > Charles University > Charles University Faculties (theses)
Academic theses (ETDs) > Master’s theses
 Record created 2018-11-15, last modified 2022-03-04


No fulltext