Dec. 3 (UPI) — Researchers in Australia and Britain have developed a new software framework that allows humans to see the world as animals do.
To accurately model animal behavior, scientists need to understand how different species process their surroundings, but figuring out exactly how different animals see the world has proven difficult.
The new framework, described Tuesday in the journal Methods in Ecology and Evolution, processes digital images and strips away the colors and details that can’t be seen by a specific animal species.
“The framework took four years to develop — many thousands of lines of code, combined with behavioral experiments to work out various parameters,” behavioral ecologist Jolyon Troscianko, a research fellow at the University of Exeter, told UPI. “Digital images are used to capture the colors and patterns, then the framework makes use of known limitations and features of animal vision. For example the parts of the spectrum the animals are sensitive to, and the details each animal can see from a given distance.”
Scientists have previously struggled to combine color and pattern information into a single animal-vision framework. But Troscianko and his research partners at the University of Queensland consolidated decades of animal vision research into a comprehensive analytical framework called the Quantitative Color Pattern Analysis framework, or QCPA for short.
“The QCPA uses information on the physiology and perceptual abilities of animal viewers, just as researchers have for decades,” said lead researcher Cedric van den Berg, a doctoral student at the University of Queensland. “The former is often obtained using histological, or invasive, methods, while the latter can also be obtained using behavioral experiments.”
The framework can process all kinds of digital photos, whether snapped with a smartphone or an expensive and powerful digital camera. The QCPA can also interpret all kinds of habitats, including underwater surroundings. After converting the digital photo into the colors animals process, as well as removing the details that animals can’t see, the images are processed by several more algorithms.
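The two steps described above — projecting image colors onto an animal's receptor channels, then removing spatial detail the animal cannot resolve — can be sketched in Python. This is a minimal illustration, not the QCPA's actual implementation; the sensitivity matrix, the acuity value, and the use of a simple Gaussian blur as an acuity model are all assumptions for demonstration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def to_animal_channels(rgb, sensitivity):
    """Project an RGB image onto hypothetical animal photoreceptor
    channels via a linear transform (rows = receptor types)."""
    h, w, _ = rgb.shape
    flat = rgb.reshape(-1, 3) @ sensitivity.T
    return flat.reshape(h, w, sensitivity.shape[0])

def acuity_blur(img, viewing_distance_m, acuity_cpd, px_per_m):
    """Discard spatial detail finer than the viewer can resolve:
    blur with a Gaussian scaled to the width of the finest cycle
    the animal can see at the given viewing distance."""
    # Width (in meters) of one resolvable cycle at that distance
    cycle_m = 2 * viewing_distance_m * np.tan(np.radians(1 / acuity_cpd) / 2)
    sigma_px = (cycle_m * px_per_m) / 2  # rough choice: half a cycle
    return gaussian_filter(img, sigma=(sigma_px, sigma_px, 0))

# Hypothetical dichromat: short- and long-wavelength receptors only
sensitivity = np.array([[0.0, 0.3, 0.7],   # short-wave channel
                        [0.6, 0.4, 0.0]])  # long-wave channel
rgb = np.random.rand(64, 64, 3)            # stand-in for a photo
animal_view = acuity_blur(to_animal_channels(rgb, sensitivity),
                          viewing_distance_m=2.0, acuity_cpd=4.0,
                          px_per_m=500.0)
print(animal_view.shape)  # (64, 64, 2): two receptor channels remain
```

The real framework calibrates these parameters from behavioral and physiological data for each species, rather than assuming them as this sketch does.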
“There are tools for reconstructing the sharp edges in images following acuity control, and ‘agglomerative hierarchical clustering’ algorithms which use animal vision to break the scene down into a manageable number of distinct colors,” Troscianko said.
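The clustering step Troscianko describes can be illustrated with a toy example: pixels, expressed in receptor-channel coordinates, are merged bottom-up until only a few distinct color groups remain. This sketch uses SciPy's generic Ward-linkage clustering with Euclidean distance; the QCPA itself uses perceptually grounded color distances, so treat the method and data here as stand-ins.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Toy "animal view": two receptor channels, three underlying
# color patches plus a little sensor noise
patches = np.array([[0.1, 0.8], [0.5, 0.5], [0.9, 0.1]])
pixels = patches[rng.integers(0, 3, size=400)] \
         + rng.normal(0, 0.02, (400, 2))

# Agglomerative hierarchical clustering: repeatedly merge the two
# closest pixel groups, building a tree, then cut the tree so a
# manageable number of distinct colors remains.
tree = linkage(pixels, method="ward")
labels = fcluster(tree, t=3, criterion="maxclust")
print(len(np.unique(labels)))  # 3 distinct colors recovered
```

Cutting the tree at different heights lets researchers ask how many colors a scene contains *as seen by the animal*, rather than as rendered by a camera.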
The complex series of processing steps performed by the framework can even account for an animal’s ultraviolet vision capabilities.
Van den Berg and Troscianko claim the framework will have a variety of applications.
“One example is identifying how an animal’s camouflage works so that we can manage our land to protect certain species,” Troscianko said. “For example, lapwings — which nest on the ground — are in dramatic decline, and human land use changes may have made their nests more vulnerable to predators. We will be using these tools to identify the types of visual background which offer lapwings the best protection from predators.”
The framework could help biologists better understand a variety of animal behaviors, including mating systems, distance-dependent signaling and mimicry.
Because the framework operates as a software plugin, it's accessible to anyone with a camera and computer — or a smartphone. Thanks to the QCPA, exploring animal vision no longer requires specialized imaging equipment, making animal vision-related research more accessible.
“The framework can be used for essentially any scientific question that requires the description of visual information as perceived by an animal, to design pet-friendly objects, or to satisfy a high-school student's curiosity about the vision of the school's hamster,” van den Berg said. “The applications are truly diverse and we have only just started to see what people are using these tools for.”