Please use this identifier to cite or link to this item: http://hdl.handle.net/10553/70371
Title: Automatic domain-specific learning: towards a methodology for ontology enrichment
Authors: Ureña Gómez-Moreno, Pedro
Mestre-Mestre, Eva M.
UNESCO Classification: 570107 Language and literature
550510 Philology
Keywords: Ontology learning
FunGramKB
Corpus
Terminology
Biology
Issue Date: 2017
Journal: LFE. Revista de Lenguas para Fines Específicos 
Abstract: At the current rate of technological development, in a world where enormous amounts of data are constantly created and the Internet serves as the primary means of information exchange, there is a need for tools that help process, analyze and use that information. However, while the growth of information offers many opportunities for social and scientific advance, it has also highlighted the difficulty of extracting meaningful patterns from massive data. Ontologies have been claimed to play a major role in the processing of large-scale data, since they serve as universal models of knowledge representation, and they are being studied as a possible solution to this problem. This paper presents a method for the automatic expansion of ontologies based on the exploitation of corpus and terminological data. The proposed "ontology enrichment method" (OEM) consists of a sequence of tasks aimed at automatically classifying an input keyword under its corresponding node within a target ontology. Results prove that the method can be successfully applied to the automatic classification of specialized units into a reference ontology.
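Illustration (not part of the original record): the abstract describes classifying an input keyword under its corresponding node in a target ontology. The minimal Python sketch below assumes a simple bag-of-words cosine-similarity approach between a term's corpus contexts and hand-picked node signatures; the node labels, signature words and toy corpus are illustrative assumptions and do not reproduce the OEM tasks or FunGramKB data.

    from collections import Counter
    from math import sqrt

    # Hypothetical ontology nodes with hand-picked "signature" context words.
    # Labels follow FunGramKB-style concept notation but are invented for this example.
    NODE_SIGNATURES = {
        "+CELL_00": ["membrane", "nucleus", "organism", "tissue", "protein"],
        "+DISEASE_00": ["infection", "symptom", "virus", "patient", "treatment"],
    }

    def cosine(a: Counter, b: Counter) -> float:
        """Cosine similarity between two bag-of-words frequency vectors."""
        common = set(a) & set(b)
        num = sum(a[w] * b[w] for w in common)
        den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
        return num / den if den else 0.0

    def classify(term_contexts: list[str]) -> str:
        """Return the node whose signature best matches the term's corpus contexts."""
        term_vec = Counter(w for ctx in term_contexts for w in ctx.lower().split())
        scores = {node: cosine(term_vec, Counter(sig)) for node, sig in NODE_SIGNATURES.items()}
        return max(scores, key=scores.get)

    # Toy corpus contexts for the candidate keyword "mitochondrion".
    contexts = [
        "the organelle is bounded by a double membrane",
        "it supplies the cell and surrounding tissue with energy",
    ]
    print(classify(contexts))  # expected: +CELL_00

In a realistic setting the signatures would be derived from corpus and terminological data rather than listed by hand, and the candidate term would be attached as a subordinate of the winning node in the target ontology.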
URI: http://hdl.handle.net/10553/70371
ISSN: 1133-1127
DOI: 10.20420/rlfe.2017.173
Source: LFE. Revista de lenguas para fines específicos [eISSN 2340-8561], v. 23 (2), p. 63-85
Appears in Collections: Artículos
File: Adobe PDF (476,9 kB)

Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.