Please use this identifier to cite or link to this item:
http://hdl.handle.net/10553/121618
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Calvo, Manuel G. | en_US |
dc.contributor.author | Avero, P. | en_US |
dc.contributor.author | Fernández Martín, Andrés | en_US |
dc.contributor.author | Recio, Guillermo | en_US |
dc.date.accessioned | 2023-03-28T12:53:46Z | - |
dc.date.available | 2023-03-28T12:53:46Z | - |
dc.date.issued | 2016 | en_US |
dc.identifier.issn | 1528-3542 | en_US |
dc.identifier.uri | http://hdl.handle.net/10553/121618 | - |
dc.description.abstract | We investigated the minimum expressive intensity that is required to recognize (above chance) static and dynamic facial expressions of happiness, sadness, anger, disgust, fear, and surprise. To this end, we varied the degree of intensity of emotional expressions unfolding from a neutral face, by means of graphics morphing software. The resulting face stimuli (photographs and short videos) were presented in an expression categorization task for 1 s each, and measures of sensitivity or discrimination (A') were collected to establish thresholds. A number of physical, perceptual, categorical, and affective controls were performed. All six basic emotions were reliably recognized above chance level from low intensities, although recognition thresholds varied for different expressions: 20% of intensity, for happiness; 40%, for sadness, surprise, anger, and disgust; and 50%, for fear. The advantage of happy faces may be due to their greater physical change in facial features (as shown by automated facial expression measurement), also at low levels of intensity, relative to neutral faces. Recognition thresholds and the pattern of confusions across expressions were, nevertheless, equivalent for dynamic and static expressions, although dynamic expressions were recognized more accurately and faster. | en_US |
dc.language | eng | en_US |
dc.relation.ispartof | Emotion | en_US |
dc.source | Emotion [ISSN 1528-3542], v. 16 (8), p. 1186–1200, (2016) | en_US |
dc.subject | 610604 Análisis experimental de la conducta | en_US |
dc.subject.other | Dynamic | en_US |
dc.subject.other | Emotion | en_US |
dc.subject.other | Facial expression | en_US |
dc.subject.other | Intensity | en_US |
dc.subject.other | Recognition thresholds | en_US |
dc.title | Recognition thresholds for static and dynamic emotional faces | en_US |
dc.type | info:eu-repo/semantics/article | en_US |
dc.type | Article | en_US |
dc.identifier.doi | 10.1037/emo0000192 | en_US |
dc.identifier.pmid | 27359222 | - |
dc.identifier.scopus | 2-s2.0-84976500583 | - |
dc.identifier.isi | WOS:000389306300010 | - |
dc.description.lastpage | 1200 | en_US |
dc.identifier.issue | 8 | - |
dc.description.firstpage | 1186 | en_US |
dc.relation.volume | 16 | en_US |
dc.investigacion | Ciencias Sociales y Jurídicas | en_US |
dc.type2 | Artículo | en_US |
dc.identifier.external | 48805022 | - |
dc.utils.revision | Sí | en_US |
dc.identifier.ulpgc | No | en_US |
dc.contributor.buulpgc | BU-ECO | en_US |
dc.description.sjr | 2,397 | |
dc.description.jcr | 3,251 | |
dc.description.sjrq | Q1 | |
dc.description.jcrq | Q1 | |
dc.description.ssci | SSCI | |
dc.description.erihplus | ERIH PLUS | |
item.grantfulltext | open | - |
item.fulltext | Con texto completo | - |
crisitem.author.dept | GIR IUCES: Dirección de Marketing, RSC y empresa familiar | - |
crisitem.author.dept | IU de Cibernética, Empresa y Sociedad (IUCES) | - |
crisitem.author.dept | Departamento de Economía y Dirección de Empresas | - |
crisitem.author.orcid | 0000-0002-7638-7489 | - |
crisitem.author.parentorg | IU de Cibernética, Empresa y Sociedad (IUCES) | - |
crisitem.author.fullName | Fernández Martín, Andrés | - |
Appears in Collections: | Artículos |
Scopus™ Citations | 75 | checked on Nov 17, 2024 |
Web of Science™ Citations | 74 | checked on Nov 17, 2024 |
Page view(s) | 71 | checked on May 18, 2024 |
Download(s) | 327 | checked on May 18, 2024 |
Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.