How dogs see the world revealed

25 September, 2022

Ever wondered what your dog is thinking when it gazes at the TV, seemingly fascinated by the news at ten?

Scientists have discovered that your pooch probably isn’t focused on Huw Edwards specifically, but more on what the people on screen are doing.

Dogs in a study at Emory University in Georgia, USA, had their brains scanned in an MRI machine while they watched a half-hour video of stimulating content.

This included clips of dogs running around, humans interacting with each other, vehicles passing by, and a cat in a house.

Data from the MRI was fed into an artificial intelligence (AI) called Ivis, which correlated brain activity with whether an action or object was shown on screen.

Results showed that dogs are far more visually attuned to actions in their environment than to who or what is performing those actions.

Neuroscientist Erin Phillips said: ‘While our work is based on just two dogs, it offers proof of concept that these methods work on canines.

‘I hope this paper helps pave the way for other researchers to apply these methods on dogs, as well as on other species, so we can get more data and bigger insights into how the minds of different animals work.’

Dogs possess only two types of cone cells in their eyes and can only perceive the colours blue and yellow.

This is vastly different to humans, who have three types of cone cells and can perceive the full visible colour spectrum.

However, canines also have a higher density of motion-sensitive vision receptors than us.

Scientists believe dogs could visually perceive the world differently to humans in these ways because they need to be more aware of threats in their environment.

It could also be because they are more reliant on their other senses as, while humans are very visually oriented, dogs’ olfactory senses are much more powerful.

The researchers at the Canine Cognitive Neuroscience Lab wanted to discover if there were any other differences between how canine and human minds reconstruct what they see.

They recruited Bhubo, a four-year-old male Boxer-mix, and Daisy, an 11-year-old female Boston terrier-mix, to participate in a study.

Both pooches had been trained to enter and lie inside an fMRI machine completely unrestrained, so were able to have their brains scanned while awake and alert.

‘They didn’t even need treats!’ said Ms Phillips.

For the study, Daisy and Bhubo were each shown specially designed movies in three 30-minute sessions, for a total of 90 minutes, while relaxing in the fMRI machine.

The movies contained video clips that the researchers thought a dog might find interesting enough to watch for an extended period.

Dog’s perspective

They were filmed by the researchers using a gimbal - a pivoting camera support - and a selfie stick, allowing them to shoot footage from a ‘dog’s perspective’.

The clips showed dogs running around and humans interacting with dogs - petting them, giving them treats, or waving a toy towards the camera itself.

Other activities included vehicles passing by, humans hugging or eating, a deer crossing a path, a cat in a house, and dogs walking on leashes.

As the dogs watched their movies, fMRI scans of their brains captured their neural activity.

Ms Phillips said: ‘It was amusing because it’s serious science, and a lot of time and effort went into it, but it came down to these dogs watching videos of other dogs and humans acting kind of silly.’

For comparison, two humans were also shown the videos while lying in the fMRI machine and undergoing a scan.

Next, the video data was segmented by timestamps, and each clip was labelled with classifiers identifying what was being shown on screen at the time.

The classifiers included objects, such as dogs, humans, vehicles, or other animals, and actions, such as sniffing, eating, or playing.

This information, as well as the dog and human MRI data, was fed into the neural network Ivis, and the results were published this week in the Journal of Visualized Experiments.

Ivis had been trained to map the brain activity to the two types of classifier, and it was able to do so for both with 99 per cent accuracy using the human data.

However, it was only successful in finding correlations with the action-based classifiers for the canine data, and it did this with between 75 and 88 per cent accuracy.
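To illustrate the kind of analysis described here - mapping per-timepoint brain data to on-screen labels and measuring decoding accuracy - the sketch below trains a simple decoder on placeholder data. It uses an ordinary logistic-regression classifier rather than the Ivis network the researchers used, and every variable name and number is a hypothetical stand-in, not taken from the study.

```python
# Minimal illustrative sketch (not the study's actual pipeline): decode
# action labels from per-timepoint brain data with a simple classifier.
# All names, sizes, and the random data are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_timepoints = 1800          # e.g. fMRI volumes collected across the sessions
n_voxels = 500               # flattened brain features per volume
brain_data = rng.normal(size=(n_timepoints, n_voxels))

# Action labels for each timepoint, derived from the video timestamps
# (0 = sniffing, 1 = eating, 2 = playing) - placeholder values here.
action_labels = rng.integers(0, 3, size=n_timepoints)

# Train and evaluate a decoder; the study used the Ivis neural network,
# but any classifier illustrates the brain-activity-to-label mapping.
decoder = LogisticRegression(max_iter=1000)
scores = cross_val_score(decoder, brain_data, action_labels, cv=5)
print(f"Mean decoding accuracy: {scores.mean():.2f}")
```

On real data, accuracy well above chance for the action labels, as reported for the dogs, would indicate that the brain activity carries information about what is happening on screen.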

This suggests that dogs’ brains prioritise what is going on in front of them over who or what is involved - a stark difference to how the human brain works.

‘We humans are very object oriented,’ said corresponding author Professor Gregory Berns.

‘There are 10 times as many nouns as there are verbs in the English language because we have a particular obsession with naming objects.

‘Dogs appear to be less concerned with who or what they are seeing and more concerned with the action itself.’

Actions first

He added: ‘It makes perfect sense that dogs’ brains are going to be highly attuned to actions first and foremost.

‘Animals have to be very concerned with things happening in their environment to avoid being eaten or to monitor animals they might want to hunt.

‘Action and movement are paramount.’

In future, the researchers want to map brain activity to olfactory input, as dogs have a much larger proportion of their brain devoted to processing olfactory information.

They also wish to conduct more detailed research into the visual perception of dogs, and potentially other animals.

Prof. Berns said: ‘We showed that we can monitor the activity in a dog’s brain while it is watching a video and, to at least a limited degree, reconstruct what it is looking at. The fact that we are able to do that is remarkable.’

- Daily Mail