

Summary

This article describes a set of methods for measuring the suppressive effect of sniffing alcoholic beverages on the wasabi-elicited stinging sensation.

Abstract

Commercial wasabi pastes used in food preparation contain a homologous series of chemosensory isothiocyanates (ITCs) that elicit an irritating sensation upon consumption. The impact of sniffing dietary alcoholic beverages on the sensation of wasabi spiciness has never been studied: most sensory evaluation studies examine individual foods and beverages separately, so research on the olfactory effect of sniffing liquor while consuming wasabi is lacking. Here, a methodology is developed that combines an animal behavioral study with a convolutional neural network to analyze the facial expressions of mice as they simultaneously sniff liquor and consume wasabi. The results demonstrate that the trained and validated deep learning model assigns 29% of the images depicting co-treatment of wasabi and alcohol to the wasabi-negative, liquor-positive class, without any prior filtering of the training materials. Statistical analysis of mouse grimace scale scores obtained from the selected video frame images reveals a significant difference (P < 0.01) between the presence and absence of liquor. This finding suggests that dietary alcoholic beverages may diminish wasabi-elicited reactions in mice. The combined methodology holds potential for screening individual ITC compounds and for sensory analyses of spirit components. However, further study is required to elucidate the mechanism underlying the alcohol-induced suppression of wasabi pungency.

Introduction

Wasabia japonica, commonly known as wasabi, has gained recognition in food preparation1,2. The intense sensory experience it elicits upon consumption, characterized by tearing up, sneezing, or coughing, is well-known. This distinctive pungency can be attributed to a homologous series of chemosensory isothiocyanates (ITCs): volatile organosulfur phytochemicals that can be categorized into ω-alkenyl and ω-methylthioalkyl isothiocyanates3. Among these compounds, allyl isothiocyanate (AITC) is the most predominant natural ITC found in plants of the Cruciferae family, such as horseradish and mustard4. Commercial wasabi pastes are commonly prepared from horseradish, making AITC a chemical marker used for quality control of these commercial products5.

Pairing dietary alcoholic beverages with wasabi-infused dishes can be considered an example of cultural disposition6. Subjectively, the spiciness of the wasabi and the heat of the spirit may complement each other, enhancing the overall culinary experience. Animal qualitative behavioral assessment (QBA) is a comprehensive whole-animal methodological approach that quantifies behavioral changes in subjects in response to short-term or long-term external stimuli7. It encompasses pain tests, motor tests, learning and memory tests, and emotion tests specifically designed for rodent models8. However, studies investigating the synergistic sensory evaluation of gustation together with olfaction remain scarce in the literature9,10. Most studies on chemesthetic sensation are confined to examining individual food and beverage consumption separately11. Consequently, there is a dearth of research on the taste-smell interaction involving the act of sniffing liquor while consuming wasabi.

As the wasabi-induced stinging sensation is believed to be a form of nociception12, animal behavioral assessments are well-suited for evaluating nociceptive sensory responses in rodents8,13,14. A method for assessing nociception in mice, mouse grimace scale (MGS) scoring, was developed by Langford et al.15,16. This behavioral study method is a pain-related assessment approach that relies on analyzing the facial expressions of the experimental mice. The experimental setup is straightforward, involving a transparent cage and 2 cameras for video recording. By incorporating advanced technologies17,18,19 for automatic data capture, quantitative and qualitative behavioral measures can be obtained, enhancing animal welfare during behavioral monitoring20. Consequently, the MGS has the potential to be applied in studying the effects of various external stimuli on animals in an uninterrupted and ad libitum manner. However, the scoring process involves selecting only a few (fewer than 10) video frame images for evaluation by panelists, prior training of the panelists is necessary, and scoring a large number of sample images is labor-intensive. To overcome this time-consuming challenge, several studies have employed machine learning techniques for predicting the MGS score21,22. Yet the MGS is a continuous measure; a multi-class classification model is better suited to a categorical question, such as whether images of mice simultaneously ingesting wasabi and sniffing liquor resemble those of normal mice.

In this study, a methodology for investigating the taste-smell interaction in mice is proposed. It combines animal behavioral studies with a convolutional neural network (CNN) to analyze the facial expressions of the mouse subjects. Two mice were observed in triplicate under normal behavioral conditions, during wasabi-induced nociception, and while sniffing liquor in a specifically designed cage. The facial expressions of the mice were video-recorded, and the generated frame images were used to optimize the architecture of a deep learning (DL) model. The model was then validated using an independent image dataset and deployed to classify the images acquired from the experimental group. To determine the extent of wasabi pungency suppression when the mice simultaneously sniffed liquor during wasabi consumption, the insights provided by artificial intelligence were corroborated through cross-validation with another data analysis method, MGS scoring16.

Protocol

In this study, two 7-week-old ICR male mice weighing 17-25 g were used for the animal behavioral assessment. All housing and experimental procedures were approved by the Hong Kong Baptist University Committee on the Use of Human and Animal Subjects in Teaching and Research. The animal room was maintained at a temperature of 25 °C and a humidity of 40%-70% on a 12-h light-dark cycle.

1. Cage design

  1. Prepare acrylonitrile butadiene styrene bricks in 3 different dimensions for cage construction: 8 mm x 8 mm x 2 mm, 16 mm x 16 mm x 6 mm, and 32 mm x 16 mm x 6 mm.
  2. Prepare an acrylonitrile butadiene styrene plate (312 mm x 147 mm x 2 mm) as the cage base.
  3. Prepare a 239 mm x 107 mm non-transparent acrylic plate with a thickness of 2 mm to be used as the bottom plate.
  4. Prepare a 239 mm x 107 mm transparent acrylic plate with a thickness of 5 mm to be used as the top plate.
  5. Prepare a 107 mm x 50 mm transparent acrylic plate with a thickness of 7 mm to be used as the terminal plate.
  6. Construct 2 opaque side walls by stacking bricks to a height of 54 mm.
  7. Embed the acrylic plates into the acrylonitrile butadiene styrene-based cage, as illustrated in Figure 1A.
  8. Prepare a chow chamber constructed from five 90 mm x 50 mm transparent acrylic plates with a thickness of 2 mm, as illustrated in Figure 1B. Among the 5 plates, use 2 for the sides, 1 for the top, 1 for the bottom, and 1 for the terminal.
  9. Prepare a 60 mm x 50 mm transparent acrylic plate with a thickness of 2 mm as a chow introduction plate, and place it in the chow chamber.

2. Animal behavioral assessment

  1. House the two 7-week-old ICR male littermates together in a regular cage.
  2. Provide the two littermates with free access to food pellets and tap water during a 1-week adaptation period.
  3. After 1 week, present the two littermates with a bottle of aqueous ethanol (~40% v/v).
    NOTE: Make sure that they are only allowed to sniff or inhale the provided aqueous ethanol on an ad libitum basis while drinking is restricted.
  4. Conduct behavioral experiments using the 9-10-week-old mouse model and the transparent cubicle cage that is depicted in Figure 1A.
  5. Disassemble all the acrylic plates and acrylonitrile butadiene styrene plates and clean them thoroughly. Start by rinsing them with ultrapure water at least 3 times and then dry them using paper towels. Next, spray them with 75% ethanol, followed by cleaning them with lens paper. Finally, allow them to air dry for at least 15 min.
  6. Weigh the mice and record their body weights before each replication of the behavioral experiment.
  7. Freshly prepare a mixture of wasabi and peanut butter by weighing 1 g of commercial wasabi and 4.5 g of peanut butter. Mix them in a zip-lock plastic bag.
    NOTE: Due to the volatility of the isothiocyanates in wasabi, it is important to store the commercial wasabi in a -20 °C freezer.
  8. Weigh out and place two 0.5 g portions of either peanut butter or the wasabi-peanut butter mixture on the chow introduction plate, as illustrated in Figure 1B, C.
  9. Place the prepared chow introduction plate in the chow chamber, as illustrated in Figure 1B, C, to give the 2 mice ad libitum access to food during each video recording session.
  10. Fill the groove underneath with 30 mL of liquid, either pure water or liquor (~42% v/v ethanol), to facilitate concurrent inhalation, as indicated in Figure 1B, C.
  11. Begin recording using the cameras on 2 smartphones placed on the phone stands at each terminal.
    NOTE: The specifications of the videos are as follows: frame width, 1920; frame height, 1080; data rate, 20745 kbps; frame rate, 30.00 frames per second (fps).
  12. Carefully place the 2 trained littermates into the designed animal behavioral study platform from the top and promptly secure the cage with the top plate.
    NOTE: Make sure this step is completed within 15 s.
  13. Record each video for 2-3 min.
    NOTE: Make sure the entire duration of the experiment, from the preparation of the peanut butter-wasabi mixture to the completion of the video recording, is limited to 5 min.
  14. Repeat the whole experiment 3 times.
    NOTE: Make sure each replicate of the experiment is separated by at least 6 h.
  15. Mimic different scenarios.
    NOTE: For example, in this work, a pair of mice was used across 4 groups, each mimicking a different scenario with the experimental setting described above: Scenario A, the background study; Scenario B, the positive control study; Scenario C, the wasabi-alcohol taste-smell interaction study; and Scenario D, the negative control study. Table 1 summarizes these scenarios.

3. Image recognition

Similar to many studies on image processing23,24,25, a classification model was obtained by training a CNN. The script for the DL operations was written in Python v.3.10 in Jupyter Notebook (anaconda3) and is available in the following GitHub repository: git@github.com:TommyNHL/imageRecognitionJove.git. To construct and train the CNN, open-source libraries were used, including numpy v.1.21.5, seaborn v.0.11.2, matplotlib v.3.5.2, cv2 v.4.6.0, sklearn v.1.0.2, tensorflow v.2.11.0, and keras v.2.11.0. These libraries provided the tools needed to develop and train the CNN for image recognition.

  1. Export a series of video frame images from the collected video clips to generate a data set for model training by using the provided Jupyter Notebook, named Step1_ExtractingAndSavingVideoFrameImages.ipynb.
  2. Only select the images with at least 1 mouse consuming the provided paste. Examples of the selected images are provided in Supplementary Figures 1-7.
  3. Perform data augmentation by horizontally flipping the generated images by implementing the script provided in the Jupyter Notebook, named Step2_DataAugmentation.ipynb.
  4. Reserve the image data from the second replicate for external, independent CNN model validation. Use the images from the first and third replicates for internal model training and testing.
  5. Preprocess the image data used in CNN modeling by running the script in the Jupyter Notebook named Step3_CNNmodeling_TrainTest.ipynb, which performs image resizing, grayscale conversion, and image signal normalization.
  6. Randomly split the training materials into internal training and testing datasets in an 8:2 ratio.
  7. Initialize the CNN architecture. Set the number of CNN outputs to the number of scenarios to be examined.
    NOTE: For example, in this study, the neural network was designed to classify 3 classes. Make sure the script that handles data imbalance via class weights is compiled.
  8. Find the hyperparameter combination that yields the minimal loss on the internal test samples.
  9. Adopt the optimal hyperparameter combination for constructing the CNN architecture.
  10. Open the provided Jupyter Notebooks Step4_CNNmodel_ExternalValOriginal.ipynb and Step5_CNNmodel_ExternalValFlipped.ipynb. Validate the attained model using the independent (original and flipped) images from the second replicate of the animal behavioral experiment.
  11. Deploy the attained and validated model for classifying the video frame images generated from the experimental group using Jupyter Notebook Step6_CNNmodel_Application.ipynb.
    NOTE: For example, it is scenario C in this work.
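The model construction, 8:2 random split, and class-weight handling of steps 5-9 can be sketched as below. The layer sizes, 96 x 96 grayscale input, and placeholder data are illustrative assumptions, not the architecture selected by the hyperparameter search in the repository.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers
from sklearn.model_selection import train_test_split
from sklearn.utils.class_weight import compute_class_weight

# Hypothetical settings: 3 scenario classes, images resized to 96 x 96 grayscale.
NUM_CLASSES, IMG_SIZE = 3, 96


def build_cnn(num_classes=NUM_CLASSES):
    """A small multi-class CNN of the kind used for scenario classification."""
    model = keras.Sequential([
        layers.Input(shape=(IMG_SIZE, IMG_SIZE, 1)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


# Placeholder data standing in for the preprocessed frame images and labels.
X = np.random.rand(50, IMG_SIZE, IMG_SIZE, 1).astype("float32")
y = np.repeat([0, 1, 2], [20, 20, 10])  # deliberately imbalanced classes

# Random 8:2 split of the training material (step 6).
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Class weights to handle data imbalance (step 7 note).
weights = compute_class_weight("balanced", classes=np.unique(y_train), y=y_train)
class_weight = dict(enumerate(weights))

model = build_cnn()
model.fit(X_train, y_train, epochs=1, class_weight=class_weight, verbose=0)
```

The "balanced" class weights up-weight the minority class during training, so a scenario that contributed fewer usable frames is not simply ignored by the loss.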

4. Manual mouse grimace scale scoring

NOTE: To validate the insights provided by the CNN model prediction, another method previously developed and validated by Langford et al. was applied16. This method scores the MGS based on 5 mouse facial action units (AUs): orbital tightening, nose bulge, cheek bulge, ear position, and whisker change. Each AU is assigned a score of 0, 1, or 2, indicating the absence, moderate presence, or obvious presence of the AU, respectively. This scoring system allows each AU to be quantified and scaled to assess the level of nociception or discomfort experienced by the mice.

  1. Capture 3 video frame images of the littermates ingesting the paste for each video clip. Ensure that each frame is separated by at least 3 seconds.
  2. Blind code and randomly reorder images from different classes of scenarios in sequence by using the provided template file named "shuffleSlides.pptm" (Supplementary File 1) and running the embedded Macro code.
  3. Invite at least 10 panelists to score the sample images.
  4. Train the panelists to score image samples using the MGS. Provide the panelists with training materials that include the original article regarding MGS and its manual15,16.
  5. Calculate the MGS score of each animal subject in a captured frame by averaging the score points of the corresponding 5 facial AUs. Present the MGS score as the mean ± standard error of the mean (SEM).
  6. Determine whether statistically significant differences exist among different classes of scenarios by one-way analysis of variance (ANOVA) with Bonferroni's multiple comparison post-hoc test.
    NOTE: A value of P < 0.05 is considered statistically significant.
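The MGS averaging and the statistical comparison of steps 5-6 can be sketched as follows. The scores are invented for illustration, and Bonferroni-corrected pairwise t-tests stand in for the Bonferroni post-hoc test as performed in GraphPad Prism.

```python
import numpy as np
from scipy import stats


def mgs_score(au_scores):
    """MGS score of one animal in one frame: mean of its 5 AU scores (each 0-2)."""
    assert len(au_scores) == 5
    return float(np.mean(au_scores))


# Hypothetical panel data: frame-level MGS scores per scenario class.
scenario_scores = {
    "A_background":    [0.2, 0.4, 0.2, 0.0, 0.4],
    "B_wasabi":        [1.4, 1.6, 1.2, 1.8, 1.4],
    "C_wasabi_liquor": [0.8, 1.0, 0.6, 1.2, 0.8],
}

# Mean +/- SEM per scenario (step 5).
for name, s in scenario_scores.items():
    print(f"{name}: {np.mean(s):.2f} +/- {stats.sem(s):.2f}")

# One-way ANOVA across scenarios (step 6)...
f_stat, p_val = stats.f_oneway(*scenario_scores.values())

# ...followed by Bonferroni-corrected pairwise t-tests as the post-hoc step.
groups = list(scenario_scores.items())
n_pairs = len(groups) * (len(groups) - 1) // 2
for i in range(len(groups)):
    for j in range(i + 1, len(groups)):
        t, p = stats.ttest_ind(groups[i][1], groups[j][1])
        p_adj = min(p * n_pairs, 1.0)  # Bonferroni correction
        print(f"{groups[i][0]} vs {groups[j][0]}: adjusted P = {p_adj:.4f}")
```

With clearly separated group means like the ones above, the ANOVA P value falls below 0.05, and the post-hoc comparisons indicate which scenario pairs differ.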

Results

The main objective of this study is to establish a robust framework for investigating the taste-smell interaction in mice. This framework incorporates the use of artificial intelligence and QBA to develop a predictive classification model. Additionally, the insights obtained from DL are cross-validated with a quantitative MGS assessment for an internal independent analysis. The primary application of this methodology is to examine the extent of suppression of the wasabi-invoked nociception when mice sniff dietary alcohol...

Discussion

The proposed method for studying taste-smell interaction in this work is based on the original method of behavioral coding for facial expression of pain in mice, which was developed by Langford et al.16. Several recently published articles have introduced CNN for automatic mouse face tracking and subsequent MGS scoring21,26,27,28. Applying CNNs offers an advantage over t...

Disclosures

The authors declare that there are no conflicts of interest.

Acknowledgements

Z. Cai would like to acknowledge the financial support from the Kwok Chung Bo Fun Charitable Fund for the establishment of the Kwok Yat Wai Endowed Chair of Environmental and Biological Analysis.

Materials

Name | Company | Catalog Number / Comments
Absolute ethanol (EtOH) | VWR Chemicals BDH | CAS# 64-17-5
Acrylonitrile butadiene styrene bricks | Jiahuifeng Flagship Store | https://shop.paizi10.com/jiahuifeng/chanpin.html
Acrylonitrile butadiene styrene plates | Jiahuifeng Flagship Store | https://shop.paizi10.com/jiahuifeng/chanpin.html
Allyl isothiocyanate (AITC) | Sigma-Aldrich | CAS# 57-06-7
Anhydrous dimethyl sulfoxide | Sigma-Aldrich | CAS# 67-68-5
Chinese spirit | Yanghe Qingci | https://www.chinayanghe.com/article/45551.html
Commercial wasabi | S&B FOODS INC. | https://www.sbfoods-worldwide.com
Formic acid (FA) | VWR Chemicals BDH | CAS# 64-18-6
GraphPad Prism 5 | GraphPad | https://www.graphpad.com
HPLC-grade acetonitrile (ACN) | VWR Chemicals BDH | CAS# 75-05-8
HPLC-grade methanol (MeOH) | VWR Chemicals BDH | CAS# 67-56-1
Microsoft Excel 2016 | Microsoft | https://www.microsoft.com
Microsoft PowerPoint 2016 | Microsoft | https://www.microsoft.com
Milli-Q water system | Millipore | https://www.merckmillipore.com
Mouse: ICR | Laboratory Animal Services Centre (The Chinese University of Hong Kong, Hong Kong, China) | N/A
Peanut butter | Skippy | https://www.peanutbutter.com/peanut-butter/creamy
Python v.3.10 | Python Software Foundation | https://www.python.org
Transparent acrylic plates | Taobao Store | https://item.taobao.com/item.htm?_u=32l3b7k63381&id=609965457970&spm=a1z09.2.0.0.77572e8dFPMEHU

References

  1. Isshiki, K., Tokuoka, K., Mori, R., Chiba, S. Preliminary examination of allyl isothiocyanate vapor for food preservation. Biosci Biotechnol Biochem. 56 (9), 1476-1477 (1992).
  2. Li, X., Wen, Z., Bohnert, H. J., Schuler, M. A., Kushad, M. M. Myrosinase in horseradish (Armoracia rusticana) root: Isolation of a full-length cDNA and its heterologous expression in Spodoptera frugiperda insect cells. Plant Sci. 172 (6), 1095-1102 (2007).
  3. Depree, J. A., Howard, T. M., Savage, G. P. Flavour and pharmaceutical properties of the volatile sulphur compounds of Wasabi (Wasabia japonica). Food Res Int. 31 (5), 329-337 (1998).
  4. Hu, S. Q., Wei, W. Study on extraction of wasabi plant material bio-activity substances and anti-cancer activities. Adv Mat Res. 690-693, 1395-1399 (2013).
  5. Lee, H.-K., Kim, D.-H., Kim, Y.-S. Quality characteristics and allyl isothiocyanate contents of commercial wasabi paste products. J Food Hyg Saf. 31 (6), 426-431 (2016).
  6. Bacon, T. Wine, wasabi and weight loss: Examining taste in food writing. Food Cult Soc. 17 (2), 225-243 (2014).
  7. Fleming, P. A., et al. The contribution of qualitative behavioural assessment to appraisal of livestock welfare. Anim Prod Sci. 56, 1569-1578 (2016).
  8. Shi, X., et al. Behavioral assessment of sensory, motor, emotion, and cognition in rodent models of intracerebral hemorrhage. Front Neurol. 12, 667511 (2021).
  9. Stevenson, R. J., Prescott, J., Boakes, R. A. Confusing tastes and smells: How odours can influence the perception of sweet and sour tastes. Chem Senses. 24 (6), 627-635 (1999).
  10. Pfeiffer, J. C., Hollowood, T. A., Hort, J., Taylor, A. J. Temporal synchrony and integration of sub-threshold taste and smell signals. Chem Senses. 30 (7), 539-545 (2005).
  11. Simons, C. T., Klein, A. H., Carstens, E. Chemogenic subqualities of mouthfeel. Chem Senses. 44 (5), 281-288 (2019).
  12. Andrade, E. L., Luiz, A. P., Ferreira, J., Calixto, J. B. Pronociceptive response elicited by TRPA1 receptor activation in mice. Neuroscience. 152 (2), 511-520 (2008).
  13. Palazzo, E., Marabese, I., Gargano, F., Guida, F., Belardo, C., Maione, S. Methods for evaluating sensory, affective and cognitive disorders in neuropathic rodents. Curr Neuropharmacol. 19 (6), 736-746 (2020).
  14. Topley, M., Crotty, A. M., Boyle, A., Peller, J., Kawaja, M., Hendry, J. M. Evaluation of motor and sensory neuron populations in a mouse median nerve injury model. J Neurosci Methods. 396, 109937 (2023).
  15. Langford, D. J., et al. Mouse Grimace Scale (MGS): The Manual. (2015).
  16. Langford, D. J., et al. Coding of facial expressions of pain in the laboratory mouse. Nat Methods. 7 (6), 447-449 (2010).
  17. Liu, H., Fang, S., Zhang, Z., Li, D., Lin, K., Wang, J. MFDNet: Collaborative poses perception and matrix Fisher distribution for head pose estimation. IEEE Trans Multimedia. 24, 2449-2460 (2022).
  18. Liu, T., Wang, J., Yang, B., Wang, X. NGDNet: Nonuniform Gaussian-label distribution learning for infrared head pose estimation and on-task behavior understanding in the classroom. Neurocomputing. 436, 210-220 (2021).
  19. Liu, T., Liu, H., Yang, B., Zhang, Z. LDCNet: Limb direction cues-aware network for flexible human pose estimation in industrial behavioral biometrics systems. IEEE Trans Industr Inform. 20 (6), 8068-8078 (2023).
  20. Grant, E. P., et al. What can the quantitative and qualitative behavioural assessment of videos of sheep moving through an autonomous data capture system tell us about welfare. Appl Anim Behav Sci. 208, 31-39 (2018).
  21. Vidal, A., Jha, S., Hassler, S., Price, T., Busso, C. Face detection and grimace scale prediction of white furred mice. Mach Learn Appl. 8, 100312 (2022).
  22. Zylka, M. J., et al. Development and validation of Painface, A software platform that simplifies and standardizes mouse grimace analyses. J Pain. 24 (4), 35-36 (2023).
  23. Liu, H., Zhang, C., Deng, Y., Liu, T., Zhang, Z., Li, Y. F. Orientation cues-aware facial relationship representation for head pose estimation via Transformer. IEEE Trans Image Process. 32, 6289-6302 (2023).
  24. Liu, H., Liu, T., Chen, Y., Zhang, Z., Li, Y. F. EHPE: Skeleton cues-based Gaussian coordinate encoding for efficient human pose estimation. IEEE Trans Multimedia. (2022).
  25. Liu, H., et al. TransIFC: Invariant cues-aware feature concentration learning for efficient fine-grained bird image classification. IEEE Trans Multimedia. (2023).
  26. Akkaya, I. B., Halici, U. Mouse face tracking using convolutional neural networks. IET Comput Vis. 12 (2), 153-161 (2018).
  27. Andresen, N., et al. Towards a fully automated surveillance of well-being status in laboratory mice using deep learning: Starting with facial expression analysis. PLoS One. 15 (4), e0228059 (2020).
  28. Ernst, L., et al. Improvement of the mouse grimace scale set-up for implementing a semi-automated Mouse Grimace Scale scoring (Part 1). Lab Anim. 54 (1), 83-91 (2020).
  29. Tuttle, A. H., et al. A deep neural network to assess spontaneous pain from mouse facial expressions. Mol Pain. 14, 1744806918763658 (2018).
  30. Lencioni, G. C., de Sousa, R. V., de Souza Sardinha, E. J., Corrêa, R. R., Zanella, A. J. Pain assessment in horses using automatic facial expression recognition through deep learning-based modeling. PLoS One. 16 (10), e0258672 (2021).
  31. Steagall, P. V., Monteiro, B. P., Marangoni, S., Moussa, M., Sautié, M. Fully automated deep learning models with smartphone applicability for prediction of pain using the Feline Grimace Scale. Sci Rep. 13, 21584 (2023).


Copyright © 2025 MyJoVE Corporation. All rights reserved