Persistent identifier to cite or link this item:
http://hdl.handle.net/10553/117901
Title: Informer: an efficient transformer architecture using convolutional layers
Authors: Estupiñán Ojeda, Cristian David; Guerra-Artal, Cayetano; Hernández Tejera, Mario
UNESCO classification: 120304 Artificial intelligence
Keywords: Convolutional Layers; Deep Learning; Neural Machine Translation; Transformer
Publication date: 2022
Serial publication: Lecture Notes in Computer Science
Conference: 13th International Conference on Agents and Artificial Intelligence, ICAART 2021
Abstract: The use of Transformer-based architectures has expanded in recent years, reaching State-of-the-Art (SOTA) performance in numerous Natural Language Processing (NLP) tasks. However, despite its advantages, the architecture has drawbacks, such as the large number of parameters it uses, which can make it expensive for research teams with limited resources. New variants have emerged that aim to improve the efficiency of the Transformer architecture by addressing different aspects of it. In this paper we focus on the development of a new architecture that reduces the memory consumption of the Transformer while achieving SOTA results on two different datasets [14] for the Neural Machine Translation (NMT) task.
URI: http://hdl.handle.net/10553/117901
ISBN: 978-3-031-10160-1
ISSN: 0302-9743
DOI: 10.1007/978-3-031-10161-8_11
Source: Rocha, A.P., Steels, L., van den Herik, J. (eds) Agents and Artificial Intelligence. ICAART 2021. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) [ISSN 0302-9743], v. 13251 LNAI, p. 208-217, (January 2022)
Collection: Conference proceedings
Items in ULPGC accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.
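Note: the record above does not describe the internals of the Informer architecture. As a loosely hedged illustration of the general idea named in the title, combining convolutional layers with a Transformer block to limit parameter count, the following PyTorch sketch replaces the position-wise feed-forward sublayer with 1D convolutions. Every module name and hyperparameter here is an assumption for illustration only, not the authors' design.

# Hypothetical sketch: a Transformer encoder block whose position-wise
# feed-forward sublayer is replaced by 1D convolutions. This is a generic
# illustration of "convolutional layers inside a Transformer", NOT the
# Informer design described in the paper (the record gives no details).
import torch
import torch.nn as nn


class ConvFeedForwardBlock(nn.Module):
    def __init__(self, d_model=512, n_heads=8, d_hidden=1024, kernel_size=3, dropout=0.1):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, dropout=dropout, batch_first=True)
        # Two 1D convolutions over the sequence dimension stand in for the
        # usual pair of position-wise Linear layers; a kernel_size > 1 also
        # mixes neighbouring positions.
        self.conv = nn.Sequential(
            nn.Conv1d(d_model, d_hidden, kernel_size, padding=kernel_size // 2),
            nn.ReLU(),
            nn.Conv1d(d_hidden, d_model, kernel_size, padding=kernel_size // 2),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x):  # x: (batch, seq_len, d_model)
        attn_out, _ = self.attn(x, x, x, need_weights=False)
        x = self.norm1(x + self.dropout(attn_out))
        # Conv1d expects (batch, channels, seq_len), so transpose around it.
        conv_out = self.conv(x.transpose(1, 2)).transpose(1, 2)
        return self.norm2(x + self.dropout(conv_out))


if __name__ == "__main__":
    block = ConvFeedForwardBlock()
    tokens = torch.randn(2, 16, 512)   # (batch, seq_len, d_model)
    print(block(tokens).shape)         # torch.Size([2, 16, 512])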