Item metadata

dc.contributor.author: O’Grady, Stefan
dc.date.accessioned: 2024-02-02T12:30:08Z
dc.date.available: 2024-02-02T12:30:08Z
dc.date.issued: 2024-01-31
dc.identifier: 298637439
dc.identifier: 891c280f-14d4-4a84-9fa8-ac442fa12645
dc.identifier: 85184220360
dc.identifier.citation: O’Grady, S 2024, 'Investigating the role of response format in computer-based lecture comprehension tasks', International Journal of Listening, vol. Latest Articles. https://doi.org/10.1080/10904018.2024.2312272
dc.identifier.issn: 1090-4018
dc.identifier.other: ORCID: /0000-0003-3810-713X/work/152318612
dc.identifier.uri: https://hdl.handle.net/10023/29142
dc.description.abstract: Language assessment is increasingly computer-mediated. This development presents opportunities with new task formats and, equally, a need for renewed scrutiny of established conventions. Recent recommendations to increase integrated skills assessment in lecture comprehension tests are premised on empirical research that demonstrates enhanced construct coverage over conventional selected response formats such as multiple-choice. However, the comparison between response formats is underexplored in computer-mediated assessment and does not consider the test item presentation methods that this technology affords. To this end, the present study investigates performance in a computer-mediated lecture comprehension task by examining test taker accounts of task completion involving multiple-choice questions without question preview and integrated response formats. Findings demonstrate overlap between the formats in terms of several core processes but also point to important differences regarding the prioritization of aspects of the lecture, memory, and test anxiety. In many respects, participant comments indicate the multiple-choice format measured a more comprehensive construct than the integrated format. The research will be relevant to individuals with interests in computer-mediated assessment and specifically with a responsibility for developing and validating lecture comprehension assessments.
dc.format.extent: 16
dc.format.extent: 839596
dc.language.iso: eng
dc.relation.ispartof: International Journal of Listening
dc.subject: LB Theory and practice of education
dc.subject.lcc: LB
dc.title: Investigating the role of response format in computer-based lecture comprehension tasks
dc.type: Journal article
dc.contributor.institution: University of St Andrews. International Education Institute
dc.identifier.doi: https://doi.org/10.1080/10904018.2024.2312272
dc.description.status: Peer reviewed
