Investigation of accurate, fast, robust, neural gesture recognition involving a sequential approach
Item metadata
| Field | Value | Language |
| --- | --- | --- |
| dc.contributor.advisor | Weir, Michael | |
| dc.contributor.author | Almuallem, Zahida | |
| dc.coverage.spatial | 467 | en_US |
| dc.date.accessioned | 2023-08-18T14:10:41Z | |
| dc.date.available | 2023-08-18T14:10:41Z | |
| dc.date.issued | 2023-11-28 | |
| dc.identifier.uri | https://hdl.handle.net/10023/28201 | |
| dc.description.abstract | Abstract redacted | en_US |
| dc.description.sponsorship | "I would like to show my gratitude to my employer, the Department of Computer Science at King Saud University, for permitting me to study leave and granting a scholarship to pursue my PhD study. Also, I would like to thank them for their support and understanding, especially during the Covid-19 pandemic. This work was not possible without their support as this work was fully supported by King Saud University (college of computer and information science)."--Acknowledgements | en |
| dc.language.iso | en | en_US |
| dc.relation | Local minima Benchmark Problems and two letters dataset (thesis data). Almuallem, Z., University of St Andrews, 14 Aug 2025. DOI: https://doi.org/10.17630/b604b5ae-ec6d-4b0a-b050-a4bbf80b948b | en |
| dc.relation.uri | https://doi.org/10.17630/b604b5ae-ec6d-4b0a-b050-a4bbf80b948b | |
| dc.title | Investigation of accurate, fast, robust, neural gesture recognition involving a sequential approach | en_US |
| dc.type | Thesis | en_US |
| dc.contributor.sponsor | King Saud University | en_US |
| dc.type.qualificationlevel | Doctoral | en_US |
| dc.type.qualificationname | PhD Doctor of Philosophy | en_US |
| dc.publisher.institution | The University of St Andrews | en_US |
| dc.rights.embargodate | 2025-08-14 | |
| dc.rights.embargoreason | Thesis restricted in accordance with University regulations. Restricted until 14 August 2025 | en |
| dc.identifier.doi | https://doi.org/10.17630/sta/583 | |