Please use this identifier to cite or link to this item: http://hdl.handle.net/10553/117901
Title: Informer: an efficient transformer architecture using convolutional layers
Authors: Estupiñán Ojeda, Cristian David 
Guerra-Artal, Cayetano 
Hernández Tejera, Mario 
UNESCO Classification: 120304 Artificial intelligence
Keywords: Convolutional Layers
Deep Learning
Neural Machine Translation
Transformer
Issue Date: 2022
Journal: Lecture Notes in Computer Science 
Conference: 13th International Conference on Agents and Artificial Intelligence, ICAART 2021 
Abstract: The use of Transformer-based architectures has grown in recent years, reaching State-of-the-Art (SOTA) levels in numerous Natural Language Processing (NLP) tasks. Despite its advantages, however, this architecture has drawbacks, such as the large number of parameters it uses, which can make it expensive for research teams with limited resources. New variants have emerged that aim to improve the efficiency of the Transformer architecture by addressing different aspects of it. In this paper we focus on the development of a new architecture that reduces the memory consumption of the Transformer while achieving SOTA results on two different datasets [14] for the Neural Machine Translation (NMT) task.
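Note: the record above only names the general idea (convolutional layers inside a Transformer to cut memory and parameters); the paper's actual design is not described on this page. The following is a minimal, hypothetical PyTorch sketch of one common way to realize that idea, replacing the position-wise feed-forward sublayer with a depthwise-separable 1D convolution. All names, kernel sizes, and placement choices here are assumptions for illustration, not the authors' method.

import torch
import torch.nn as nn

class ConvTransformerBlock(nn.Module):
    """Hypothetical encoder block: self-attention + convolutional sublayer.

    Sketch only; the Informer paper's concrete architecture may differ.
    """
    def __init__(self, d_model: int = 512, n_heads: int = 8, kernel_size: int = 3):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Depthwise-separable convolution in place of the usual two dense
        # layers (d_model -> 4*d_model -> d_model), which is where much of
        # a Transformer block's parameter count lives.
        self.conv = nn.Sequential(
            nn.Conv1d(d_model, d_model, kernel_size,
                      padding=kernel_size // 2, groups=d_model),  # depthwise
            nn.Conv1d(d_model, d_model, 1),                       # pointwise
            nn.ReLU(),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        attn_out, _ = self.attn(x, x, x)
        x = self.norm1(x + attn_out)
        # Conv1d expects (batch, channels, seq_len), so transpose around it.
        conv_out = self.conv(x.transpose(1, 2)).transpose(1, 2)
        return self.norm2(x + conv_out)

The depthwise-separable split keeps the convolutional sublayer at roughly O(d_model * kernel_size + d_model^2) parameters, versus O(8 * d_model^2) for the standard feed-forward network, which is one plausible route to the memory savings the abstract claims.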
URI: http://hdl.handle.net/10553/117901
ISBN: 978-3-031-10160-1
ISSN: 0302-9743
DOI: 10.1007/978-3-031-10161-8_11
Source: Rocha, A.P., Steels, L., van den Herik, J. (eds) Agents and Artificial Intelligence. ICAART 2021. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) [ISSN 0302-9743], v. 13251 LNAI, p. 208-217, (January 2022)
Appears in Collections: Conference proceedings
Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.