Files in this item
GesturalOrigins : a bottom-up framework for establishing systematic gesture data across ape species
Item metadata
dc.contributor.author | Grund, Charlotte Vicki Christina | |
dc.contributor.author | Badihi, Gal | |
dc.contributor.author | Graham, Kirsty Emma | |
dc.contributor.author | Safryghin, Alexandra | |
dc.contributor.author | Hobaiter, Cat | |
dc.date.accessioned | 2023-03-16T11:30:08Z | |
dc.date.available | 2023-03-16T11:30:08Z | |
dc.date.issued | 2023-03-15 | |
dc.identifier | 283484104 | |
dc.identifier | b1f51630-90b5-48ed-9629-e53536de4b02 | |
dc.identifier | 85149922129 | |
dc.identifier.citation | Grund, C. V. C., Badihi, G., Graham, K. E., Safryghin, A. & Hobaiter, C. 2023, 'GesturalOrigins: a bottom-up framework for establishing systematic gesture data across ape species', Behavior Research Methods, vol. 56, pp. 986–1001. https://doi.org/10.3758/s13428-023-02082-9 | en |
dc.identifier.issn | 1554-351X | |
dc.identifier.other | ORCID: /0000-0002-7422-7676/work/131122741 | |
dc.identifier.other | ORCID: /0000-0002-3893-0524/work/131122898 | |
dc.identifier.uri | https://hdl.handle.net/10023/27196 | |
dc.description | Funding: This research received funding from the European Union’s 8th Framework Programme, Horizon 2020, under grant agreement no 802719. | en |
dc.description.abstract | Current methodologies present significant hurdles to understanding patterns in the gestural communication of individuals, populations, and species. To address this issue, we present a bottom-up data collection framework for the study of gesture: GesturalOrigins. By “bottom-up”, we mean that we minimise a priori structural choices, allowing researchers to define larger concepts (such as ‘gesture types’, ‘response latencies’, or ‘gesture sequences’) flexibly once coding is complete. Data can easily be re-organised to provide replication of, and comparison with, a wide range of datasets in published and planned analyses. We present packages, templates, and instructions for the complete data collection and coding process. We illustrate the flexibility that our methodological tool offers with worked examples of (great ape) gestural communication, demonstrating differences in the duration of action phases across distinct gesture action types and showing how species variation in the latency to respond to gestural requests may be revealed or masked by methodological choices. While GesturalOrigins is built from an ape-centred perspective, the basic framework can be adapted across a range of species and potentially to other communication systems. By making our gesture coding methods transparent and open access, we hope to enable a more direct comparison of findings across research groups, improve collaborations, and advance the field to tackle some of the long-standing questions in comparative gesture research. | |
dc.format.extent | 16 | |
dc.format.extent | 1147893 | |
dc.language.iso | eng | |
dc.relation.ispartof | Behavior Research Methods | en |
dc.subject | Video coding | en |
dc.subject | Gesture action phases | en |
dc.subject | GesturalOrigins | en |
dc.subject | Visual communication | en |
dc.subject | Language evolution | en |
dc.subject | BF Psychology | en |
dc.subject | DAS | en |
dc.subject.lcc | BF | en |
dc.title | GesturalOrigins : a bottom-up framework for establishing systematic gesture data across ape species | en |
dc.type | Journal article | en |
dc.contributor.sponsor | European Research Council | en |
dc.contributor.institution | University of St Andrews. School of Psychology and Neuroscience | en |
dc.contributor.institution | University of St Andrews. Institute of Behavioural and Neural Sciences | en |
dc.contributor.institution | University of St Andrews. Centre for Social Learning & Cognitive Evolution | en |
dc.identifier.doi | https://doi.org/10.3758/s13428-023-02082-9 | |
dc.description.status | Peer reviewed | en |
dc.identifier.grantnumber | 802719 | en |