Institutional Digital Repository
Shreenivas Deshpande Library, IIT (BHU), Varanasi

NLPRL System for Very Low Resource Supervised Machine Translation

dc.contributor.author: Baruah R.; Mundotiya R.K.; Kumar A.; Singh A.K.
dc.date.accessioned: 2025-05-23T11:27:16Z
dc.description.abstract: This paper describes the results of the system that we used for the WMT20 very low resource (VLR) supervised MT shared task. For our experiments, we use a byte-level version of BPE, which requires a base vocabulary of size 256 only. BPE-based models are a kind of sub-word model. Such models address the Out-of-Vocabulary (OOV) word problem by performing word segmentation so that segments correspond to morphological units. They are also reported to work across different languages, especially similar ones, due to their sub-word nature. Based on the cased BLEU score, our NLPRL systems ranked ninth in the HSB-to-GER and tenth in the GER-to-HSB translation scenario. © 2020 Association for Computational Linguistics
dc.identifier.doi: DOI not available
dc.identifier.uri: http://172.23.0.11:4000/handle/123456789/11211
dc.relation.ispartofseries: 5th Conference on Machine Translation, WMT 2020 - Proceedings
dc.title: NLPRL System for Very Low Resource Supervised Machine Translation
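The abstract mentions byte-level BPE, whose base vocabulary is just the 256 possible byte values, so no input character can ever be out of vocabulary. A minimal sketch of the idea follows; it is not the authors' implementation, and the function name `byte_bpe_merges` is hypothetical:

```python
from collections import Counter

def byte_bpe_merges(text, num_merges):
    """Toy byte-level BPE: the base vocabulary is the 256 byte values,
    so every character of any UTF-8 text is representable from the start."""
    # Represent the corpus as a sequence of byte-value tokens (0..255).
    seq = list(text.encode("utf-8"))
    merges = []
    next_id = 256  # newly merged tokens get ids above the byte range
    for _ in range(num_merges):
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        best, _count = pairs.most_common(1)[0]
        merges.append(best)
        # Replace every occurrence of the most frequent pair with a new token.
        merged, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == best:
                merged.append(next_id)
                i += 2
            else:
                merged.append(seq[i])
                i += 1
        seq = merged
        next_id += 1
    return seq, merges
```

For example, on the text `"aaab"` a single merge combines the most frequent byte pair (97, 97), i.e. `"aa"`, into a new token with id 256.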
