|Institution:||Manchester Metropolitan University|
|Full text PDF:||http://hdl.handle.net/2173/326218|
This thesis describes motion capture methods, with an application to the real-time recording of extreme human movement. A wireless, gyroscope-based sensor system is used to record and evaluate misalignments in the ankle position of ballet dancers in a performance environment. Anatomical alignment has been shown to contribute to dance-related injuries, and the results of this work show that subtle variations in joint rotation can be measured clearly. The workflow has been developed to extract performance-analysis data for fault detection, in order to support augmented-feedback methods for injury prevention and improved performance. Infra-red depth-sensing technology, commonly used in garment design, has been used to produce a representation of a scanned human subject, and a workflow has been established to animate this character avatar using motion capture data. Presenting a visually acceptable representation of an overall performance, in addition to the numerical evaluation of specific joint orientations, provides a significant contribution to knowledge.
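The abstract refers to measuring subtle variations in joint rotation from wireless gyroscopic sensors. One common approach to this (a minimal illustrative sketch, not necessarily the thesis's actual pipeline; the sample rate and values below are hypothetical) is to numerically integrate the sensor's angular-velocity readings to obtain a cumulative rotation angle:

```python
def integrate_gyro(omega_deg_s, dt):
    """Cumulative rotation angle (degrees) from angular-velocity samples
    (deg/s) taken at a fixed interval dt (s), via the trapezoidal rule."""
    angle = 0.0
    angles = [angle]
    for w0, w1 in zip(omega_deg_s, omega_deg_s[1:]):
        # Trapezoidal step: average of adjacent samples times the interval.
        angle += 0.5 * (w0 + w1) * dt
        angles.append(angle)
    return angles

# Hypothetical 100 Hz stream: constant 10 deg/s for 0.05 s.
samples = [10.0] * 6
print(integrate_gyro(samples, 0.01)[-1])  # 0.5 degrees of rotation
```

In practice such open-loop integration drifts over time, so real systems typically fuse gyroscope data with accelerometer or magnetometer readings; this sketch only illustrates the basic principle of recovering orientation change from rate measurements.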