Techniques Comparison for Natural Language Processing

Iosifova, Olena and Iosifov, Ievgen and Rolik, Oleksandr and Sokolov, Volodymyr Y. (2020) Techniques Comparison for Natural Language Processing. MoMLeT&DS, 2631(I), pp. 57-67. ISSN 1613-0073

Full text: O_Iosifova_I_Iosifov_O_Rolik_V_Sokolov_MoMLeT_2631.pdf (573kB)

Abstract

Recent improvements in deep learning open many possibilities for solving Natural Language Processing downstream tasks. Such tasks include machine translation, speech recognition, information retrieval, sentiment analysis, summarization, question answering, multilingual dialogue systems, and many more. Language models are one of the most important components in solving each of these tasks. This paper is devoted to the research and analysis of the most widely adopted techniques and designs for building and training language models that show state-of-the-art results. The paper surveys the techniques and components applied in the creation of language models and their parts, paying attention to neural networks, embedding mechanisms, bidirectionality, encoder-decoder architecture, attention and self-attention, as well as parallelization through the transformer. As a result, the most promising techniques involve pre-training and fine-tuning of a language model, an attention-based neural network as part of the model design, and a complex ensemble of multidimensional embeddings to build deep context understanding. The latest architectures based on these approaches require substantial computational power for training language models, and reducing this cost is a direction for further improvement. An algorithm for choosing the right model for a relevant business task is provided, considering current challenges and available architectures.
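The abstract highlights attention and self-attention as core components of modern language model design. As a purely illustrative aside (not code from the paper), the following minimal Python/NumPy sketch shows scaled dot-product self-attention; the function name, matrix shapes, and random toy inputs are assumptions for demonstration.

import numpy as np

def self_attention(X, W_q, W_k, W_v):
    # X: (seq_len, d_model) token embeddings; W_q/W_k/W_v: (d_model, d_k) learned projections
    Q, K, V = X @ W_q, X @ W_k, X @ W_v               # project tokens to queries, keys, values
    scores = Q @ K.T / np.sqrt(K.shape[-1])           # pairwise token similarity, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over each row
    return weights @ V                                # context-aware representation per token

# Toy usage: 4 tokens, 8-dimensional embeddings (all values are hypothetical)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, W_q, W_k, W_v).shape)         # -> (4, 8)

Because every token attends to every other token in a single matrix multiplication, this computation parallelizes well, which is the property the transformer architecture exploits.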

Item Type: Article
Additional Information: DOI: 10/d238; EID: 2-s2.0-85088881294
Uncontrolled Keywords: Natural Language Processing; NLP; Language Model; Embedding; Recurrent Neural Network; RNN; Gated Recurrent Unit; GRU; Long Short-Term Memory; LSTM; Encoder; Decoder; Attention; Transformer; Transfer Learning; Deep Learning; Neural Network
Subjects: Archived subjects of Borys Grinchenko Kyiv University > Articles in scientometric databases > Scopus
Archived subjects of Borys Grinchenko Kyiv University > Articles in scientometric databases > Web of Science
Divisions: Archived divisions of Borys Grinchenko Kyiv University > Faculty of Information Technology and Mathematics > Professor Volodymyr Buriachok Department of Information and Cyber Security
Depositing User: Volodymyr Sokolov
Date Deposited: 13 Aug 2020 09:54
Last Modified: 09 Aug 2021 12:33
URI: https://elibrary.kubg.edu.ua/id/eprint/31628
