Measuring Emotional Response to Sensory Attributes: Context effects

Nijman, Marit (2019) Measuring Emotional Response to Sensory Attributes: Context effects. PhD thesis, University of Nottingham.



The context in which products are consumed can impact consumer liking and choice behaviour, but little is currently known about how context might impact emotional responses. This research aimed to address this gap by investigating the impact of context on emotional response to the sensory attributes of beer.

Consumers’ self-reported emotional responses to Ale and Lager beer were compared between a sensory laboratory, a bar, and an evoked context (imagined with the help of pictures and sound recordings), and consumers were clustered based on their product liking in each context. Results showed stronger product differentiation on emotional response in the bar context, and variation between consumers in the degree to which their liking and emotional response were impacted by context. For context-sensitive consumers, evoking a bar context in a sensory lab produced emotional responses more similar to those observed in the actual bar.

Controlled test environments offer greater experimental control than real-life consumption environments, but it is not known how stable emotional responses are across these different test situations. The stability of emotional response was evaluated over the course of consuming a lager beer, from a single sip to a full glass (284 ml), in both a bar and a central location test (CLT). Single-sip responses were found to be less stable than emotions reported after a glass of beer, but test-retest reliability was comparable between the bar and CLT contexts.

Smartphones can facilitate the collection of emotional response data in real-life environments. However, the smaller screen and finger-based input might lead to different emotion scores on line scales than when scores are given with a mouse on a desktop computer monitor. Comparing consumers’ self-reported emotional responses to beer collected with smartphones against responses collected with a computer revealed stronger product differentiation when a computer was used. Data collection with smartphones might therefore lead to underestimated differences in emotional response.

Virtual Reality (VR) can be used to simulate a relevant consumption environment within a controlled test environment. To date, research on how simulating context with VR might impact emotional responses is limited. An innovative VR methodology was developed in which a 360° video of a bar was combined with virtual modelling and tracking of objects, enabling participants to consume beer and complete emotion questionnaires independently while immersed in VR. Consumers reported higher levels of engagement in this VR context than when pictures and sound recordings were used to evoke the same bar context. However, the excitement associated with the novelty of VR technology appeared to affect the measurement of emotional responses to the beers, leading to higher scores on positive emotion categories. This effect was shown to diminish during a second exposure to the VR context. Interviews identified key elements for creating a convincing virtual consumption environment.

This research delivered valuable information for the field of sensory consumer science regarding strategies to optimise the consumption context for measuring consumers’ emotional responses to products.

Item Type: Thesis (University of Nottingham only) (PhD)
Supervisors: Ford, Rebecca
Yang, Qian
Keywords: Sensory consumer research, context, emotional response, beer, virtual reality
Subjects: B Philosophy. Psychology. Religion > BF Psychology
T Technology > TP Chemical technology > TP 368 Food processing and manufacture
Faculties/Schools: UK Campuses > Faculty of Science > School of Biosciences
Item ID: 59306
Depositing User: Nijman, Marit
Date Deposited: 26 Oct 2019 04:40
Last Modified: 13 Dec 2021 04:30
