Generative co-design & non-planar additive manufacture of aesthetic prostheses

Zhou, Feng (2023) Generative co-design & non-planar additive manufacture of aesthetic prostheses. PhD thesis, University of Nottingham.

Available under Licence Creative Commons Attribution.


Traditionally, the prosthesis has been treated as a medicalised device and designed primarily for function. The aesthetics of prostheses have been a secondary concern, and even when appearance was considered, prostheses were often made to mimic human limbs so as to hide disability. Recently, however, a new trend of aesthetic prostheses has emerged: designs that solicit attention, express the personal style and self-identity of the individual with limb loss or absence, and emphasise their individuality and uniqueness rather than incompleteness. This approach has been demonstrated to significantly benefit users' psychological well-being.

Such aesthetic prostheses must, however, be unique to each individual, demanding a degree of personalisation in both design and manufacture that exceeds the capabilities of conventional techniques. In response, I establish techniques for the generative co-design and non-planar additive manufacture of personalised aesthetic prostheses. This follows an interdisciplinary approach that weaves together techniques from human-computer interaction (HCI), prosthesis and disability research, dance and motion capture, and additive manufacture.

My proposed generative co-design strategy combines the advantages of generative design, which enables the efficient exploration of many designs, with those of collaborative design, which involves users in the design process so as to embody a deep expression of individual identity within the designed prostheses. The strategy enables the direct personalisation of aesthetic prostheses through a personally expressive skill, in this case dancing, without the involvement of professional designers.

The strategy is embodied in three algorithms that collectively address the whole design flow from conceptual design to final design for manufacture. Mogrow is a generative co-design algorithm driven by motion capture technology, so that dancing can generate personalised aesthetic seeds: archetypal designs that might be applied to various products. Leg sculpting is a generative design algorithm that applies an aesthetic seed to a specific product, a prosthesis cover personalised to fit the user's unique body features. A final algorithm optimises the design produced by leg sculpting so that it can be manufactured without printing supports, significantly improving the efficiency of additive manufacture without compromising aesthetic detail.
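The three-stage flow above (motion capture to seed, seed to sculpted cover, cover to support-free geometry) can be illustrated with a toy sketch. Everything here is hypothetical: the real Mogrow and leg-sculpting algorithms are not specified in this abstract, so the function names, the radial-modulation scheme, and the 45-degree overhang rule are illustrative assumptions only.

```python
import math

def mogrow(motion_frames):
    # Hypothetical seed extraction: reduce a motion-capture trace
    # to normalised per-frame displacement magnitudes.
    disps = [math.dist(a, b) for a, b in zip(motion_frames, motion_frames[1:])]
    peak = max(disps) or 1.0
    return [d / peak for d in disps]

def leg_sculpt(seed, base_radii):
    # Hypothetical sculpting: modulate the radii of a scanned limb
    # profile by the seed, so the cover reflects the wearer's dance.
    n = len(base_radii)
    return [r * (1.0 + 0.2 * seed[int(i * len(seed) / n) % len(seed)])
            for i, r in enumerate(base_radii)]

def limit_overhang(radii, layer_height, max_angle_deg=45.0):
    # Hypothetical support-free pass: clamp layer-to-layer radius
    # growth so every wall stays within a printable overhang angle.
    max_step = layer_height * math.tan(math.radians(max_angle_deg))
    out = [radii[0]]
    for r in radii[1:]:
        out.append(min(r, out[-1] + max_step))
    return out
```

In this sketch the aesthetic detail survives wherever it already satisfies the overhang constraint, mirroring the claim that support-free optimisation need not compromise the design.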

While additive manufacturing can significantly improve the efficiency of customisation, aesthetic prostheses require greater morphological freedom to open up a broader space for aesthetic consideration, which potentially conflicts with requirements on mechanical strength and weight. The final contribution of this thesis is therefore to establish a non-planar additive manufacturing platform based on a six degrees of freedom (6DOF) robotic arm that accommodates trade-offs between the visual aesthetics, form, weight, and mechanical properties of aesthetic prostheses.
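As an illustration of what non-planar deposition adds, the hypothetical sketch below samples one curved layer on a cylindrical cover and tilts the nozzle to stay normal to the local surface, a pose a 6DOF arm can reach but a planar 3-axis printer cannot. The geometry and function names are assumptions for illustration, not the platform described in the thesis.

```python
import math

def nonplanar_layer(radius, height_fn, n_points=90):
    # One curved deposition layer on a cylinder of the given radius.
    # height_fn maps circumferential angle (rad) to layer height, so
    # the layer undulates rather than lying in a single z-plane.
    path = []
    for i in range(n_points):
        theta = 2 * math.pi * i / n_points
        z = height_fn(theta)
        x, y = radius * math.cos(theta), radius * math.sin(theta)
        # Finite-difference slope of the layer along the circumference,
        # used to tilt the nozzle normal to the deposited surface.
        dz = (height_fn(theta + 1e-4) - z) / 1e-4
        tilt = math.atan2(dz, radius)
        path.append((x, y, z, tilt))
    return path
```

Each waypoint carries both a position and a tool tilt; on a planar printer the tilt would be fixed at zero and the undulating layer could not be deposited.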

This research engaged disabled dancers as research collaborators. Three workshops were conducted in which they interacted with the algorithms and discussed the results.

Item Type: Thesis (University of Nottingham only) (PhD)
Supervisors: Benford, Steve
Ashcroft, Ian
Keywords: prosthesis, prostheses, design, hci, generative co-design
Subjects: Q Science > QA Mathematics > QA 75 Electronic computers. Computer science
R Medicine > RD Surgery
Faculties/Schools: UK Campuses > Faculty of Science > School of Computer Science
Item ID: 73519
Depositing User: ZHOU, FENG
Date Deposited: 26 Jul 2023 04:40
Last Modified: 26 Jul 2023 04:40
