Cracking the Code of Your Pet’s Facial Expressions

Scientists are beginning to accurately read animal facial expressions and understand what they communicate.

Facial expressions project our internal emotions to the outside world. Reading other people’s faces comes naturally and automatically to most of us. Without your best friend saying a word, you know — by seeing the little wrinkles around her eyes, her rounded, raised cheeks and upturned lip corners — that she got that promotion she wanted.

What if we could just as easily read the faces of other living beings? Will there come a day when we can hold up a smartphone to our cat and know how he’s feeling?

Researchers are developing coding systems that enable them to objectively read animal facial expressions rather than inferring or guessing at their meaning. A coding system precisely describes how different facial features change when an animal feels a particular emotion, such as squinting an eye or pursing lips. By looking at photographs and scoring how much each of these features, or “action units,” changes, we can determine how strongly an emotion is felt.

So far, only pain coding systems (grimace scales) have been scientifically developed for non-primate animals. Despite their different anatomy, mice, rats, rabbits, horses and sheep (including lambs) all pull a similar pain-face: they tighten their eyes, bulge or flatten their cheeks, change the position of their ears and tense their mouths.
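
To make this concrete, here is a minimal sketch of how a grimace-scale observation could be scored in software. The four action units follow the pain-face features just described, and the 0 to 2 scoring range mirrors the one used in published rodent grimace scales; the function itself and the equal weighting of units are illustrative assumptions, not any published tool.

```python
# Illustrative sketch of grimace-scale scoring; the action units follow
# the pain-face features described in the text, and each is scored
# 0 (not present), 1 (moderate) or 2 (obvious), as in published rodent
# grimace scales. The equal weighting is an assumption for illustration.

PAIN_ACTION_UNITS = [
    "orbital_tightening",  # eyes squeezed or squinted
    "cheek_bulge",         # cheeks bulged or flattened
    "ear_position",        # ears pulled back or rotated
    "mouth_tension",       # lips pursed or mouth tensed
]

def grimace_score(au_scores: dict[str, int]) -> float:
    """Average the per-action-unit scores (each 0 to 2) into one pain index."""
    for unit in PAIN_ACTION_UNITS:
        if au_scores.get(unit) not in (0, 1, 2):
            raise ValueError(f"{unit} must be scored 0, 1, or 2")
    return sum(au_scores[u] for u in PAIN_ACTION_UNITS) / len(PAIN_ACTION_UNITS)

# A photograph scored by a trained observer:
photo = {"orbital_tightening": 2, "cheek_bulge": 1,
         "ear_position": 1, "mouth_tension": 0}
print(grimace_score(photo))  # 1.0, a moderate pain expression
```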

The push to develop grimace scales has largely come from our desire and ethical duty to assess and improve the welfare of animals used in labs or for food products.

Ideally, we want a way to accurately and reliably know how an animal is feeling by simply looking at them, rather than by drawing blood for tests or monitoring heart rates. By knowing their emotional states, we can make changes to help reduce pain, boredom, or fear and, ideally, foster curiosity or joy.

Animals, particularly social ones, may have evolved facial expressions for the same reason we did — to communicate with one another or, in the case of dogs, with us.

For prey animals in particular, subtle cues that other members of the group (but not predators) can pick up on are useful for safety. A pain cue, for example, may trigger help or comfort from other group members, or serve as a warning to stay away from the source of pain.

If we can decipher grimacing, we should also, theoretically, be able to understand facial expressions for other emotions such as joy or sadness. We would also likely want to comprehend facial expressions for the animals closest to our hearts: our pets.

One day, pet owners, farmhands, or veterinarians could hold up a smartphone to a dog, sheep, or cat and have an app tell them the specific emotion the animal is showing.

However, getting to an automated emotion-identification system requires many steps. The first is to define emotions in a testable, non-species-specific way.

The second is to gather descriptive baseline data about emotional expression in a controlled, experimental environment. One way to do this might be to put animals in situations that will elicit a particular emotion and see how their physiology, brain patterns, behavior, and faces change. Any changes would need to occur reliably enough that we could call them a facial expression.
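
As a toy illustration of what “reliably enough” could mean in practice, the sketch below assumes each animal’s action unit has been scored in both a neutral and an emotion-eliciting condition, and asks whether most animals shift in the same direction. The 80 percent threshold is an arbitrary stand-in, not an established criterion.

```python
# Toy reliability check: an action unit counts as a candidate facial
# expression if most animals shift it in the same direction when the
# emotion is elicited. The 0.8 threshold is an arbitrary illustration.

def is_reliable_expression(neutral: list[float], elicited: list[float],
                           threshold: float = 0.8) -> bool:
    """True if most animals shift the action unit in the same direction."""
    changes = [e - n for n, e in zip(neutral, elicited)]
    increased = sum(1 for c in changes if c > 0)
    decreased = sum(1 for c in changes if c < 0)
    return max(increased, decreased) / len(changes) >= threshold

# Ear-position scores for six animals before and during a startle stimulus:
neutral_scores  = [0.2, 0.1, 0.3, 0.2, 0.1, 0.2]
elicited_scores = [0.8, 0.7, 0.9, 0.1, 0.6, 0.7]
print(is_reliable_expression(neutral_scores, elicited_scores))  # True (5 of 6 increased)
```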

We already have some hints to go on: Depressed horses close their eyes, even when not resting. Fearful cows lay their ears flat on their heads and open their eyes wide. Joyful rats have pinker ears that point more forward and outward.

Once we have gathered this data, we would then need to turn that scientific information into an automated, technological system. The system would have to be able to extract the key facial action units from an image and calculate how those features differ from a neutral baseline expression.
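
A simplified sketch of that deviation calculation is below. It assumes an upstream computer-vision step has already turned an image into named facial measurements (eye aperture, ear angle and so on); that extraction is the hard part, and it is omitted here.

```python
# Sketch of the deviation-from-baseline step, assuming an upstream
# detector has already converted an image into named facial measurements.
# The feature names and values are hypothetical.

import math

def deviation_from_baseline(features: dict[str, float],
                            baseline: dict[str, float]) -> float:
    """Euclidean distance between the current face and its neutral baseline."""
    return math.sqrt(sum((features[k] - baseline[k]) ** 2 for k in baseline))

neutral = {"eye_aperture": 1.0, "ear_angle": 0.0, "mouth_width": 1.0}
current = {"eye_aperture": 0.6, "ear_angle": 0.4, "mouth_width": 1.0}
print(round(deviation_from_baseline(current, neutral), 3))  # 0.566
```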

The system would also need to handle individual differences in facial features, as well as subtle differences in how individuals express emotion. Feature extraction and calculation also become difficult, or fail outright, when a face is poorly lit, at an angle, or partially covered.
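
One plausible way to absorb individual differences, sketched below as an assumption rather than a description of any existing system, is to express each measurement as a z-score against that individual’s own neutral recordings, so that a naturally narrow-eyed animal is not misread as squinting.

```python
# Hypothetical per-individual normalization: z-score each measurement
# against that animal's own neutral recordings, so individual facial
# anatomy is not mistaken for an expression.

from statistics import mean, stdev

def normalize_to_individual(value: float, own_neutral_samples: list[float]) -> float:
    """Z-score a measurement against this animal's own neutral baseline."""
    mu = mean(own_neutral_samples)
    sigma = stdev(own_neutral_samples)
    return (value - mu) / sigma

# Two cats with different resting eye apertures:
wide_eyed_cat_baseline   = [1.00, 1.05, 0.95, 1.02]
narrow_eyed_cat_baseline = [0.62, 0.58, 0.60, 0.61]

# The same raw aperture of 0.6 is a strong squint for one cat, neutral for the other:
print(normalize_to_individual(0.6, wide_eyed_cat_baseline))    # strongly negative
print(normalize_to_individual(0.6, narrow_eyed_cat_baseline))  # near zero
```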

While we are making progress in automated human facial-expression identification, we are still a long way off when it comes to animals. A more realistic short-term goal would be to better understand which emotions nonhuman animals express and how. The answers could be staring us right in the face.

Mirjam Guesgen is a postdoctoral fellow in animal welfare at the University of Alberta. This article was originally published on The Conversation. Read the original article.
