
[PDF] Top 20 Incremental Decoding and Training Methods for Simultaneous Translation in Neural Machine Translation

10,000 documents matching "Incremental Decoding and Training Methods for Simultaneous Translation in Neural Machine Translation" were found on our website. Below are the top 20 most relevant.

Incremental Decoding and Training Methods for Simultaneous Translation in Neural Machine Translation

... the neural MT architecture to operate in an online fashion where i) the encoder and the attention are updated dynamically as new input words are added, through a READ operation, and ii) the decoder generates ...
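
The READ/WRITE loop sketched in this excerpt can be illustrated with a toy agent; in the hedged sketch below, a fixed wait-k rule stands in for the learned READ policy, and `encode_prefix`/`write_token` are hypothetical placeholders rather than the paper's model.

```python
# Minimal sketch of a READ/WRITE loop for simultaneous translation.
# encode_prefix and write_token are toy placeholders, and a wait-k rule
# stands in for a learned READ/WRITE policy.

def encode_prefix(src_prefix):
    # Placeholder "encoder": a real system would (incrementally) re-encode
    # the source prefix and refresh the attention over it.
    return list(src_prefix)

def write_token(states, tgt_prefix):
    # Placeholder "decoder" step: emit a dummy token conditioned on the
    # current source states and the target prefix.
    return f"tgt{len(tgt_prefix)}" if len(tgt_prefix) < len(states) else "</s>"

def simultaneous_translate(source_tokens, k=2):
    """Interleave READ (consume one source token) and WRITE (emit one
    target token), waiting for k source tokens before the first WRITE."""
    src_prefix, target, read_idx = [], [], 0
    states = encode_prefix(src_prefix)
    while True:
        must_read = read_idx < len(source_tokens) and (
            len(src_prefix) < k or len(target) > len(src_prefix) - k
        )
        if must_read:                                  # READ action
            src_prefix.append(source_tokens[read_idx])
            read_idx += 1
            states = encode_prefix(src_prefix)         # encoder/attention updated
        else:                                          # WRITE action
            token = write_token(states, target)
            if token == "</s>":
                break
            target.append(token)
    return target

print(simultaneous_translate("wir sehen uns morgen wieder".split(), k=2))
```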

Dynamic Oracle for Neural Machine Translation in Decoding Phase

... two methods which use dynamic oracle to solve the issue mentioned ... during training time the truth has been completely altered, our method will always provide the model with the best suitable token in ...
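
As a rough picture of feeding the model "the best suitable token" rather than the unaltered ground truth, the hedged sketch below mixes model predictions into the training prefix and, once the prefix has diverged, targets the reference token judged most suitable for that prefix; the suitability rule and the sampling rate are assumptions for illustration, not the paper's exact dynamic oracle.

```python
import random

# Hedged sketch of dynamic-oracle-style target selection during training:
# instead of always pairing position t with the gold token, the target is
# the reference token deemed "best suitable" for the (possibly diverged)
# prefix, and the fed-in prefix may contain the model's own predictions.
# The suitability rule below (first reference token not yet produced) is
# an illustrative assumption.

def best_suitable_token(reference, produced_prefix):
    remaining = [tok for tok in reference if tok not in produced_prefix]
    return remaining[0] if remaining else "</s>"

def make_training_pairs(reference, model_predictions, sample_rate=0.25):
    """Yield (input_prefix, target_token) pairs mixing gold and predicted tokens."""
    prefix, pairs = [], []
    for t, gold in enumerate(reference):
        on_track = prefix == reference[:t]
        target = gold if on_track else best_suitable_token(reference, prefix)
        pairs.append((list(prefix), target))
        # Scheduled-sampling-style choice of what to actually feed next.
        use_model = t < len(model_predictions) and random.random() < sample_rate
        prefix.append(model_predictions[t] if use_model else target)
    return pairs

reference = "the cat sat on the mat".split()
predictions = "a cat sat in the mat".split()
for prefix, target in make_training_pairs(reference, predictions):
    print(prefix, "->", target)
```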

Training Neural Machine Translation to Apply Terminology Constraints

... two decoding layers, shared source and target embeddings, and use the Sockeye toolkit (Hieber et ... two methods we propose, train-by-appending and train-by-replace with the constrained decoding ...
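
The two training schemes named here, train-by-appending and train-by-replace, can be pictured as simple augmentation of the source side with terminology entries; the `<trans>` markers and helper names below are illustrative assumptions, not necessarily the exact factors used in the paper.

```python
# Hedged sketch of terminology-constrained training data construction.
# Given a source sentence and a terminology entry (src_term -> tgt_term),
# "train-by-appending" inserts the target-side term after the source term,
# while "train-by-replace" substitutes it in place.

def train_by_appending(source, src_term, tgt_term):
    return source.replace(src_term, f"{src_term} <trans> {tgt_term} </trans>")

def train_by_replace(source, src_term, tgt_term):
    return source.replace(src_term, f"<trans> {tgt_term} </trans>")

source = "der Patient erhielt eine hohe Dosis"
print(train_by_appending(source, "Dosis", "dose"))
print(train_by_replace(source, "Dosis", "dose"))
```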

Improving Neural Machine Translation through Phrase-based Forced Decoding

... the training set to improve the training process of phrase-based SMT and prune the phrase-based rule ... forced decoding, but they used a high penalty for all insertions and ... forced ...

Incremental Segmentation and Decoding Strategies for Simultaneous Translation

... statistical machine translation. Minimum error rate training (MERT) was performed on the development set (dev2010) to optimize the feature weights of the log-linear model used in ... During ...
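
For context, the log-linear model whose feature weights MERT tunes is conventionally written as follows; this is the standard formulation rather than anything specific to the system in this excerpt.

```latex
% Log-linear translation model with feature functions h_m and weights
% \lambda_m; MERT chooses the weights that minimize a corpus-level error
% metric of the decoder output on the development set.
P(e \mid f) = \frac{\exp\big(\sum_{m=1}^{M} \lambda_m h_m(e, f)\big)}
                   {\sum_{e'} \exp\big(\sum_{m=1}^{M} \lambda_m h_m(e', f)\big)},
\qquad
\hat{\lambda} = \arg\min_{\lambda} \mathrm{Err}\big(\hat{e}(f; \lambda),\, e_{\mathrm{ref}}\big)
```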

From Bilingual to Multilingual Neural Machine Translation by Incremental Training

... On the other hand, architectures that share parameters between all languages (Johnson et al., 2017) by using a single encoder and decoder trained to be able to translate from and to any of the languages of the system. ...

Efficient Incremental Decoding for Tree-to-String Translation

... Most efforts in statistical machine translation so far are variants of either phrase-based or syntax-based models. From a theoretical point of view, phrase-based models are neither expressive nor ...

Applicability and Challenges of Using Machine Translation in Translator Training

... decade, translation as well as translator training have experienced a significant ... web-based translation resources, such as Google ... the translation didactic process and training is ...

Tibetan-Chinese Neural Machine Translation based on Syllable Segmentation

... the translation model, which breaks the limitation that the traditional encoder-decoder structure, such as the Seq2Seq model, relies on a fixed-length vector in the process of ... In machine translation ...

Fast Decoding and Optimal Decoding for Machine Translation

... A*) decoding algorithm is a kind of best-first search which was first introduced in the domain of speech recognition (Jelinek, ... stack decoding algorithm ...
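
The best-first (stack) decoding referred to here can be pictured generically as a priority queue over partial hypotheses; in the sketch below the expansion options, costs, and heuristic are toy placeholders, not the paper's model or its admissible estimate.

```python
import heapq

# Generic best-first ("stack") decoding sketch: always expand the
# lowest-priority (best) partial hypothesis, where priority is the cost
# so far plus a heuristic estimate of the remaining cost (A*-style).

def expand(hypothesis, source_len):
    """Toy expansion: two candidate continuations with placeholder costs."""
    pos = len(hypothesis)
    if pos >= source_len:
        return []
    return [(f"word{pos}a", 1.0), (f"word{pos}b", 1.5)]

def best_first_decode(source_len, heuristic=lambda hyp, n: 0.0):
    frontier = [(heuristic((), source_len), 0.0, ())]  # (priority, cost, hypothesis)
    while frontier:
        _, cost, hyp = heapq.heappop(frontier)
        if len(hyp) == source_len:            # first complete hypothesis popped
            return list(hyp), cost            # is optimal if the heuristic is admissible
        for token, step_cost in expand(hyp, source_len):
            new_cost = cost + step_cost
            new_hyp = hyp + (token,)
            priority = new_cost + heuristic(new_hyp, source_len)
            heapq.heappush(frontier, (priority, new_cost, new_hyp))
    return None

print(best_first_decode(source_len=3))
```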

Pre-Translation for Neural Machine Translation

... One main drawback of this approach is that the whole source sentence has to be stored in a fixed-size context vector. To overcome this problem, Bahdanau et al. (2014) introduced the soft attention mechanism. Instead of ...
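
For reference, the soft attention mechanism mentioned in this excerpt computes, at each decoder step i, a context vector as a weighted sum of the encoder states h_j; this is the standard formulation.

```latex
% Soft attention: alignment scores from the previous decoder state and
% each encoder state, softmax-normalized weights, and the context vector.
e_{ij} = a(s_{i-1}, h_j), \qquad
\alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k} \exp(e_{ik})}, \qquad
c_i = \sum_{j} \alpha_{ij}\, h_j
```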

Minimum Risk Training for Neural Machine Translation

... Although NMT models have achieved results on par with or better than conventional SMT, they still suffer from a major drawback: the models are optimized to maximize the likelihood of training data instead of ...
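
The alternative this paper pursues, minimum risk training, is usually written as the expected loss of candidate translations under the model distribution, with Δ a sentence-level loss such as negative smoothed BLEU:

```latex
% Minimum risk training objective over training pairs (x^{(s)}, y^{(s)}),
% where \mathcal{Y}(x^{(s)}) is the candidate space for source x^{(s)}.
\mathcal{R}(\theta)
  = \sum_{s=1}^{S} \mathbb{E}_{y \sim P(y \mid x^{(s)}; \theta)}\big[\Delta(y, y^{(s)})\big]
  = \sum_{s=1}^{S} \sum_{y \in \mathcal{Y}(x^{(s)})} P(y \mid x^{(s)}; \theta)\, \Delta(y, y^{(s)})
```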

RegMT System for Machine Translation, System Combination, and Evaluation

... Philipp Koehn, Hieu Hoang, Alexandra Birch, Chris Callison-Burch, Marcello Federico, Nicola Bertoldi, Brooke Cowan, Wade Shen, Christine Moran, Richard Zens, Chris Dyer, Ondrej Bojar, Alexandra Constantin, and Evan ...

Faster Decoding for Subword Level Phrase-based SMT between Related Languages

... 2: Translation accuracy and Relative decoding time for orthographic syllable level translation using different decoding methods and ... Relative decoding time is indicated as a ...

Decoding Algorithm in Statistical Machine Translation

... Table 2: Examples of Correct, Okay, and Incorrect Translations: for each translation, the first line is an input German sentence, the second line is the human-made target translation for ...

Low-Resource Corpus Filtering Using Multilingual Sentence Embeddings

... In this paper, we describe our submission to the WMT19 low-resource parallel corpus filtering shared task. Our main approach is based on the LASER toolkit (Language-Agnostic SEntence Representations), which uses an ...
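
Filtering with multilingual sentence embeddings is often implemented by scoring each sentence pair with the similarity of its two embeddings and keeping pairs above a threshold; in the hedged sketch below, `embed` is a trivial stand-in for a real multilingual encoder such as LASER, the threshold is arbitrary, and any margin-based scoring the submission uses is not modeled.

```python
import math

# Hedged sketch of embedding-based parallel corpus filtering: keep a
# sentence pair if the cosine similarity of its sentence embeddings
# exceeds a threshold. embed() is a toy bag-of-characters stand-in so
# that the example runs on its own.

def embed(sentence):
    vec = [0.0] * 26
    for ch in sentence.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def filter_pairs(pairs, threshold=0.5):
    return [(s, t) for s, t in pairs if cosine(embed(s), embed(t)) >= threshold]

pairs = [("das Haus ist klein", "the house is small"),
         ("guten Morgen", "a completely unrelated sentence")]
print(filter_pairs(pairs))
```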

Bidirectional Decoding for Statistical Machine Translation

... The translation process is treated as a noisy channel model, like those used in speech recognition in which there exists e transcribed as f, and a translation is to infer the best e from f in terms of ... a ...
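
The noisy-channel decision rule referred to in this excerpt is conventionally written via Bayes' rule:

```latex
% Noisy-channel model: the best translation e* combines a language model
% P(e) with a reverse translation model P(f | e); P(f) is constant in e.
e^{*} = \arg\max_{e} P(e \mid f)
      = \arg\max_{e} \frac{P(e)\, P(f \mid e)}{P(f)}
      = \arg\max_{e} P(e)\, P(f \mid e)
```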

Variational Decoding for Statistical Machine Translation

... Already at (14), we explicitly ruled out translations y having no derivation at all in the hypergraph. However, suppose the hypergraph were very large (thanks to a large or smoothed translation model and weak ...

Equalizing Gender Bias in Neural Machine Translation with Word Embeddings Techniques

... proper translation has to be derived from ... the translation system is gender-biased, the context is disregarded, while if the system is neutral, the translation is correct (since it has the ...

Trainable Greedy Decoding for Neural Machine Translation

... two decoding objectives for two ... evaluating translation systems) and log-likelihood (the most widely used learning criterion for neural machine ... greedy decoding and beam ... greedy ...
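
To make the greedy-versus-beam contrast concrete, the sketch below decodes from a toy next-token distribution; `next_token_probs` is a placeholder for a real decoder step, and the trainable component the paper adds on top of greedy decoding is not modeled here.

```python
import math

# Greedy vs. beam decoding over a toy next-token distribution.
# next_token_probs() is a placeholder for a real NMT decoder step.

VOCAB = {"a": 0.5, "b": 0.3, "</s>": 0.2}

def next_token_probs(prefix):
    return VOCAB  # toy distribution, independent of the prefix

def greedy_decode(max_len=5):
    prefix = []
    for _ in range(max_len):
        probs = next_token_probs(prefix)
        token = max(probs, key=probs.get)   # always take the single best token
        if token == "</s>":
            break
        prefix.append(token)
    return prefix

def beam_decode(beam_size=2, max_len=5):
    beams = [([], 0.0)]                     # (prefix, log-probability)
    for _ in range(max_len):
        candidates = []
        for prefix, score in beams:
            if prefix and prefix[-1] == "</s>":
                candidates.append((prefix, score))
                continue
            for token, p in next_token_probs(prefix).items():
                candidates.append((prefix + [token], score + math.log(p)))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_size]
    return beams[0][0]

print(greedy_decode(), beam_decode())
```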
