Spotting Elephants from Space Using Deep Learning

In search of new data technologies that can help wildlife conservation, University of Oxford researchers used remote sensing data and a deep learning Convolutional Neural Network to detect and count elephants in South Africa.

By Julian Smith | March 15, 2021

It’s important to be able to track populations of threatened animal species in the wild. But doing this in person can be expensive and time-consuming when individuals are spread across thousands of square miles, especially in places where they blend into their environment.

Wildlife researchers have begun to use high-resolution satellite imagery to count species such as whales, penguins, polar bears, and flamingos. This technique is much more efficient than on-the-ground methods and makes it possible to survey across national borders without red tape. For example, the WorldView-3 satellite can cover up to 260,000 square miles every 24 hours, with a resolution of 1 foot.

It still takes trained observers to pick out animals from the digital images, though. The results can be inconsistent and prone to error, especially when the background isn’t as uniform as snow or water.

Recently, a team from the University of Oxford used a combination of remote sensing data and a deep learning Convolutional Neural Network to detect and count elephants in South Africa, with surprisingly accurate results.

Convolutional Neural Networks are multi-layered artificial neural networks that can be trained, among other things, to find patterns in massive datasets. They have been used in medical imaging and facial verification, but ecologists have just started to put them to work finding wildlife such as albatrosses, whales, and seals in satellite images.

The Oxford project was the first to try detecting animals in such a complex environment. The team chose elephants roaming the savannah and woodlands of South Africa’s Addo Elephant National Park, the third-largest park in the country. The choice wasn’t made just because elephants are the world’s largest terrestrial mammals, said study author Isla Duporge of Oxford’s Department of Zoology.

“Elephant populations have steadily declined over the last few decades, and having accurate numbers is key to understanding the reasons for the decline,” she said.

Elephants may be huge, but their shape and color as seen from above are constantly changing as they forage, play, sleep and cover themselves in mud. Combined with the highly varied landscape, which changes from region to region and season to season, this can make it difficult to spot them even from manned aircraft, the way surveys have typically been done.

The Oxford team used data from the WorldView-3 and WorldView-4 satellites, which provide the highest-resolution satellite imagery currently available. Using 11 images captured between 2014 and 2019, they created a customized training dataset for the TensorFlow Object Detection API developed by Google.
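
The article doesn’t show the team’s code, but a custom training dataset for the TensorFlow Object Detection API is typically packaged as TFRecord files pairing image tiles with normalized bounding boxes. The sketch below illustrates that packaging step under those assumptions; the file names, tile size, and box coordinates are placeholders, not the study’s data.

```python
# Sketch: packaging one annotated satellite image tile as a TFRecord example
# in the format the TensorFlow Object Detection API expects. Paths, image
# size, and box coordinates are illustrative placeholders.
import tensorflow as tf

def make_example(image_path, width, height, boxes, label_id=1, label_text=b"elephant"):
    """boxes: list of (xmin, ymin, xmax, ymax) in pixel coordinates."""
    with tf.io.gfile.GFile(image_path, "rb") as f:
        encoded_image = f.read()

    feature = {
        "image/encoded": tf.train.Feature(bytes_list=tf.train.BytesList(value=[encoded_image])),
        "image/format": tf.train.Feature(bytes_list=tf.train.BytesList(value=[b"png"])),
        "image/height": tf.train.Feature(int64_list=tf.train.Int64List(value=[height])),
        "image/width": tf.train.Feature(int64_list=tf.train.Int64List(value=[width])),
        # Bounding boxes are stored normalized to the [0, 1] range.
        "image/object/bbox/xmin": tf.train.Feature(float_list=tf.train.FloatList(value=[b[0] / width for b in boxes])),
        "image/object/bbox/ymin": tf.train.Feature(float_list=tf.train.FloatList(value=[b[1] / height for b in boxes])),
        "image/object/bbox/xmax": tf.train.Feature(float_list=tf.train.FloatList(value=[b[2] / width for b in boxes])),
        "image/object/bbox/ymax": tf.train.Feature(float_list=tf.train.FloatList(value=[b[3] / height for b in boxes])),
        "image/object/class/text": tf.train.Feature(bytes_list=tf.train.BytesList(value=[label_text] * len(boxes))),
        "image/object/class/label": tf.train.Feature(int64_list=tf.train.Int64List(value=[label_id] * len(boxes))),
    }
    return tf.train.Example(features=tf.train.Features(feature=feature))

# Write a small training shard (placeholder annotation for one tile).
with tf.io.TFRecordWriter("addo_train.tfrecord") as writer:
    example = make_example("tile_001.png", 512, 512, boxes=[(120, 80, 140, 102)])
    writer.write(example.SerializeToString())
```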

“A CNN is trained by examples,” said co-author Olga Isupova of the University of Bath’s Department of Computer Science. 

“We showed it lots of images and said that this is an elephant and this is not an elephant. We tune its internal parameters to make as few mistakes as possible when it tries to guess.” 

Through these repeated tweaks, they helped the algorithm teach itself what an elephant looks like from above.
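
The detector the team trained is far more sophisticated than this, but the process Isupova describes, showing labeled examples and nudging parameters to reduce mistakes, is the standard supervised-learning loop. A toy sketch of that idea with a tiny image classifier, not the study’s model, where random patches stand in for real labeled imagery:

```python
# Toy illustration of "training by examples": a tiny CNN learns to separate
# elephant from non-elephant image patches. Not the study's architecture.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # elephant vs. not
])

# "Make as few mistakes as possible": minimize the gap between the network's
# guesses and the human labels by adjusting its weights with gradient descent.
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Placeholder data standing in for labeled satellite image patches.
patches = tf.random.uniform((32, 64, 64, 3))
labels = tf.cast(tf.random.uniform((32, 1)) > 0.5, tf.float32)

model.fit(patches, labels, epochs=3, verbose=0)
```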

When the CNN’s guesses were compared with those of 51 volunteer annotators, the team found its accuracy was comparable to that of the humans. People were better at finding elephants against simple backgrounds, but the CNN was more accurate in places where the background was more complex, such as thickets. Its results were also more consistent.

“I had quite high expectations, and the CNN even exceeded those,” Duporge said.

When the team then ran the algorithm on lower-resolution images of elephants in Kenya’s Maasai Mara, without any further training data, the results were just as accurate. This is encouraging, Isupova says, since it shows CNNs can potentially be used to spot wildlife in places where they haven’t been specifically trained.
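
The article doesn’t detail the inference step, but with a TensorFlow Object Detection API model, reusing a trained detector on new imagery generally means loading the exported SavedModel and running it tile by tile. A minimal sketch, with placeholder paths and an arbitrary confidence threshold:

```python
# Sketch: running an already-trained detector on new imagery with no further
# training, as the team did for the Maasai Mara images. The model path, tile
# file, and score threshold are placeholders.
import numpy as np
import tensorflow as tf

detect_fn = tf.saved_model.load("exported_model/saved_model")

image = tf.io.decode_image(tf.io.read_file("maasai_mara_tile.png"), channels=3)
input_tensor = tf.expand_dims(image, axis=0)  # add batch dimension

detections = detect_fn(input_tensor)
scores = detections["detection_scores"][0].numpy()
boxes = detections["detection_boxes"][0].numpy()

# Count detections above a confidence threshold as candidate elephants.
threshold = 0.5
elephant_count = int(np.sum(scores >= threshold))
print(f"Candidate elephants in tile: {elephant_count}")
```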

It can be a challenge to find the computing power to process large image datasets. And the images themselves aren’t cheap: archived WorldView-3 images cost about $45 per square mile, and ordering new images costs $71 per square mile, prices that add up quickly across the vast regions where many threatened species live.
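
For a rough sense of scale, here is a back-of-the-envelope calculation using those quoted prices; the 1,000-square-mile survey area is purely illustrative.

```python
# Back-of-the-envelope imagery cost for a hypothetical 1,000-square-mile
# survey, using the per-square-mile prices quoted above.
ARCHIVED_PER_SQ_MILE = 45   # USD, archived WorldView-3 imagery
TASKED_PER_SQ_MILE = 71     # USD, newly ordered imagery

survey_area_sq_miles = 1_000  # illustrative survey footprint

print(f"Archived imagery: ${ARCHIVED_PER_SQ_MILE * survey_area_sq_miles:,}")  # $45,000
print(f"New imagery:      ${TASKED_PER_SQ_MILE * survey_area_sq_miles:,}")    # $71,000
```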

But researchers can use existing images to find groups of animals, as well as signs of their presence such as guano stains and the mounds left by burrowing animals. And six new satellites with the same resolution as WorldView-3 are scheduled to launch in 2021, offering many more choices of survey targets.

The team is already in touch with African conservation groups, the U.S. Army Research Office, the Smithsonian, and other organizations, Duporge says. A study on wildebeest is in the works, and other species like rhinos, oryx, and livestock could be good candidates.

Julian Smith is a contributing writer.
