Recent literature shows that dogs process human faces similarly to humans. They are able to discriminate familiar human faces using the global visual information of both the face and the head (Huber, Racca, Scaf, Virányi, & Range, 2013), scanning all the facial features systematically (e.g. eyes, nose and mouth; Somppi et al., 2016) and relying on configural elaboration (Pitteri, Mongillo, Carnier, Marinelli, & Huber, 2014). Moreover, dogs, like humans, focus their attention mainly on the eye region, showing face-identification impairments when it is masked (Pitteri et al., 2014; Somppi et al., 2016). Interestingly, their gaze pattern across the informative regions of faces varies according to the emotion expressed. Dogs tend to look more at the forehead region of positive emotional expressions and at the mouth and eyes of negative facial expressions (Barber, Randi, Müller, & Huber, 2016), but they avert their gaze from angry eyes (Somppi et al., 2016). The attentional bias shown toward the informative regions of human emotional faces therefore suggests that dogs use facial cues to encode human emotions. Furthermore, in exploring human faces (but not conspecific ones), dogs, like humans, rely more on information contained in their left visual field (Barber et al., 2016; Guo, Meints, Hall, Hall, & Mills, 2009; Ley & Bryden, 1979; Racca, Guo, Meints, & Mills, 2012). Although broadly symmetric, the two sides of the human face differ in emotional expressivity. Previous studies employing mirrored chimeric pictures (i.e. composite pictures made up of the normal and mirror-reversed hemiface images, obtained by splitting the face down the midline) and 3-D rotated pictures of faces reported that people perceive the left hemiface as displaying stronger emotions than the right one (Lindell, 2013; Nicholls, Ellis, Clement, & Yoshino, 2004), especially for negative emotions (Borod, Haywood, & Koff, 1997; Nicholls et al., 2004; Ulrich, 1993).
Considering that the muscles of the left side of the face are mainly controlled by the contralateral hemisphere, such a difference in displayed emotional intensity suggests a dominant role of the right hemisphere in expressing emotions (Dimberg & Petterson, 2000). Moreover, in humans, the right hemisphere also has a crucial role in the processing of emotions, since individuals with right-hemisphere lesions showed impairments in their ability to recognize others' emotions (Bowers, Bauer, Coslett, & Heilman, 1985). A right-hemispheric asymmetry in processing human faces has also been found in dogs, which showed a left gaze bias in attending to neutral human faces (Barber et al., 2016; Guo et al., 2009; Racca et al., 2012). Nevertheless, the results on dogs' looking bias for emotional faces are inconsistent. Whilst a left gaze bias was shown in response to all human faces regardless of the emotion expressed (Barber et al., 2016), Racca et al. (2012) observed this preference only for neutral and negative emotions, but not for positive ones. Thus, the possibility that such a preference depends on the valence of the emotion conveyed, and subsequently perceived, cannot be excluded. Furthermore, it remains unclear whether dogs understand the emotional message conveyed by human facial expressions and which significance and valence they attribute to it.
Participants were randomly divided into two groups according to the gender of the presented human faces, so that each subject was presented with only female or only male pictures. The test consisted of two weekly trials in which a maximum of two different emotional-face dyads were shown to each dog until the full set of stimuli was completed (i.e. each subject was presented with all seven emotional faces).
The right-right (R-R) or left-left (L-L) hemiface chimeric pictures of the same emotion were randomly assigned to each trial (and counterbalanced across the whole sample), as was the order of the emotional faces displayed.
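As described above, a mirrored chimeric picture is built by splitting a face image down the midline and pairing one hemiface with its own mirror image. The following is a minimal sketch of that construction, assuming the face is available as a NumPy pixel array; `make_chimeric` is a hypothetical helper for illustration, not code from the study.

```python
import numpy as np

def make_chimeric(face, side="left"):
    """Build a mirrored chimeric face from one hemiface.

    face: 2-D (grayscale) or 3-D (color) pixel array; the vertical
          midline splits it into left and right hemifaces.
    side: "left" -> L-L composite, "right" -> R-R composite.
    """
    h, w = face.shape[:2]
    half = w // 2
    if side == "left":
        hemi = face[:, :half]                      # left hemiface
        # left hemiface + its mirror image
        return np.concatenate([hemi, hemi[:, ::-1]], axis=1)
    hemi = face[:, w - half:]                      # right hemiface
    # mirror image + right hemiface
    return np.concatenate([hemi[:, ::-1], hemi], axis=1)

# Toy example: a 4x4 "face" whose left half is 1s and right half is 2s
face = np.array([[1, 1, 2, 2]] * 4)
ll = make_chimeric(face, "left")   # L-L composite: all 1s
rr = make_chimeric(face, "right")  # R-R composite: all 2s
```

The mirror flip is the `[:, ::-1]` column reversal; concatenating the hemiface with its flipped copy yields a perfectly symmetric composite of the same width as the original (for even-width images).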
Once in the testing room, the owner led the dog to the bowl on a loose leash, helped it take a central position in the testing apparatus, and waited until the dog started to feed. Then the owner let the dog off the leash and positioned himself 2.5 m behind it. During the test, the owner had to maintain this position, looking straight at the wall in front of him and avoiding any interaction with the dog. Ten seconds after the owner took position, the first emotional face was displayed. Visual stimuli appeared simultaneously on the two screens, where they remained for 4 seconds. The chimeric pictures of the different emotions were presented in the middle of the screen. The interstimulus interval was at least 7 seconds, but if a subject did not resume feeding within this time, the following stimulus presentation was postponed. The maximum time allowed to resume feeding was 5 minutes. Visual stimuli were presented as a PowerPoint slideshow in which the first, the last, and the in-between slides were homogeneous black. All seven emotional-face dyads were displayed only once per dog, since a high level of habituation to the stimuli was registered during the pilot test.
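The presentation schedule described above (4-second display, at least a 7-second inter-stimulus interval, and the next stimulus postponed until the dog resumes feeding, with a 5-minute cap) can be sketched as a simple control loop. This is purely illustrative; the study used a PowerPoint slideshow, and `show`, `blank`, and `dog_is_feeding` are assumed callbacks, not part of the study's software.

```python
import time

def run_trial(stimuli, show, blank, dog_is_feeding,
              stimulus_s=4, min_isi_s=7, max_wait_s=300):
    """Present each stimulus for `stimulus_s` seconds, wait at least
    `min_isi_s` seconds, then postpone the next stimulus until the dog
    resumes feeding, up to `max_wait_s` seconds (5 minutes)."""
    for picture in stimuli:
        show(picture)            # appears on both screens simultaneously
        time.sleep(stimulus_s)
        blank()                  # homogeneous black slide
        time.sleep(min_isi_s)    # minimum inter-stimulus interval
        waited = 0.0
        while not dog_is_feeding() and waited < max_wait_s:
            time.sleep(0.5)      # poll until feeding resumes or cap hit
            waited += 0.5
```

Postponing inside the loop, rather than using a fixed interval, reproduces the rule that the inter-stimulus interval is a minimum of 7 seconds but stretches until the dog returns to the bowl.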
The heart rate response to the stimuli presentation was evaluated following the procedures and the analysis previously described in Siniscalchi et al. (2016) and Siniscalchi et al. (2018). The PC-Vetgard+tm Multiparameter wireless system, to which dogs had previously been accustomed, was used to continuously record cardiac activity during the test. The heart rate response was analysed from the appearance of the pictures for at least the following 10 seconds or until the dog resumed feeding (the maximum time allowed was 5 minutes). For the analysis, a heart rate curve was obtained during a pre-test in order to calculate the basal heart rate average (HR baseline). The highest (HV) and lowest (LV) heart rate values registered during the test were scored. Moreover, the area delimited by the HR curve and the baseline was computed for each dog and each visual stimulus using Microsoft Excel. The area under the curve (above baseline and under the curve; AUC) and the area above the curve (under baseline and above the curve; AAC) were calculated as numbers of pixels using Adobe Photoshop. HR changes for each dog during presentations of the different emotional faces were then analysed by comparing the different area values with the corresponding baseline.
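In the study these areas were measured in pixels from plotted curves using Adobe Photoshop. Numerically, the same two quantities amount to integrating the heart-rate curve above and below the baseline. A minimal sketch, assuming evenly spaced HR samples and a simple rectangle-rule summation; `hr_areas` is a hypothetical helper, not the study's actual workflow:

```python
def hr_areas(hr_samples, baseline, dt=1.0):
    """AUC: area where the HR curve sits above baseline.
       AAC: area where the HR curve sits below baseline.
    hr_samples: heart-rate values sampled every `dt` seconds.
    """
    auc = sum(max(hr - baseline, 0.0) for hr in hr_samples) * dt
    aac = sum(max(baseline - hr, 0.0) for hr in hr_samples) * dt
    return auc, aac

# Toy example with baseline 90 bpm:
# excursions above baseline: 5 + 20 = 25; below: 10 + 20 = 30
auc, aac = hr_areas([80, 95, 110, 70], baseline=90, dt=1.0)
```

AUC thus grows with time spent above the basal average (arousal), while AAC grows with time spent below it, which matches how the two measures are compared against baseline in the results.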
Results for the cardiac activity are shown in Fig. 6. A statistically significant main effect of the different emotional faces was observed on the overall time during which heart-rate values were higher than the basal average, AUC: F(6, 107) = 49.117, p < .001. Post hoc comparisons were significant for fear vs. happiness (p = .002) and fear vs. neutral (p = .004); in addition, the overall time during which heart-rate values were higher than the basal average was greater for surprise than for disgust (p = .004) and neutral (p = .043). Similarly to the behavioural results, the GLMM analysis for left-left and right-right human chimeric faces revealed that the composite pictures made up of the left hemiface elicited significantly higher AUC values than the composite pictures made up of the right hemiface (L-L pictures: M = 6,809,123.945, SEM = 178,468.906; R-R pictures: M = 5,933,745.620, SEM = 178,471.283), F(1, 107) = 12.878, p = .001. No effects of human face gender, F(1, 107) = 0.012, p = .913, or sex, F(1, 107) = 0.873, p = .352, on AUC values were revealed.
No statistically significant effects were observed on AAC values: emotion category, F(6, 107) = 0.578, p = .747; human face gender, F(1, 107) = 0.016, p = .899; sex, F(1, 107) = 0.018, p = .893; and visual stimulus chimeras, F(1, 107) = 0.627, p = .238.
Overall, our results on the arousal dimension supported the prevalent activation of the right hemisphere in the analysis of angry, fearful, and happy human faces, since tested subjects exhibited a longer latency to resume feeding and higher stress levels in response to these emotional stimuli than to the others over the course of the experiment.
Finally, as for the separate analysis of mirrored chimeric faces (composite pictures made up of the left-left and right-right hemifaces), our results showed that dogs displayed a stronger behavioural response and higher cardiac activity in response to left-left pictures than to right-right ones. Thus, it can be concluded that dogs and humans show similarities in processing human emotional faces, since it has been reported that people perceive left-hemiface composite pictures as displaying stronger emotions than right-hemiface ones (Lindell, 2013; Nicholls et al., 2004). Moreover, this finding is consistent with the general hypothesis of the main involvement of the right hemisphere in expressing high-arousal emotions (Dimberg & Petterson, 2000).
The second major contribution was his focus primarily on the face, although he also gave some attention to vocalizations, tears, and posture. To date, facial expression has been found to be the richest source of information about emotions. The voice has yet to be shown to be a source of as many discrete emotional states as the face, although it is harder to fabricate or regulate than facial expressions.
The fourth insight was that emotions are not unique to humans, but are found in many other species. His examples in Expression range from bees and roosters to dogs, cats, horses, and other primates. For much of the last century that view was considered an example of bad science, of anthropomorphism. Underlying that belief was a reification of language and verbal self-report. If we cannot examine a species' report of its experience, how can we know whether emotion is occurring? That stance would require us to regard infants as not having emotions prior to acquiring speech! Words are used to describe or reflect upon our emotional experience, but words are representations of emotion, not the sine qua non of emotion.