Sounding the alarm in time to save endangered species

© Wikimedia Commons/Adrian Michel

Biodiversity loss is accelerating. To identify species in urgent need of protection, scientists from Fribourg want to combine AI with data collection and engagement by citizen scientists.

A few years – or sometimes just a few weeks – can be all it takes for a plant or animal to acquire "endangered species" status. When a new road is built through a forest, for instance, the chainsaws come out and a rare amphibian species may be decimated as a result. Made more frequent and more intense by climate change, extreme weather events such as droughts and forest fires can devastate an entire population of animals or plants in less than a single season.

Biodiversity crises often occur more abruptly than one might expect – and they are happening at an increasing rate. To counter this, an alarm must be sounded as early as possible. That's the aim of Daniele Silvestro, an SNSF grant holder. In the journal Plants, People, Planet, this scientist from the University of Fribourg outlines an approach that combines artificial intelligence, aerial images and assistance from citizen scientists. He sees this as the best way to make the right decisions – and to do so faster.

Mobile phones and citizen science

The researcher is developing an AI system that can integrate several different types of environmental information – databases, images, surveys. He plans to optimise it to analyse satellite and aerial images. Deforestation, reforestation, changes in vegetation cover, new penguin colonies in the Antarctic, recently built infrastructure – aerial views of all these phenomena reveal vast quantities of information. "Thanks to artificial intelligence, we can analyse millions of images within a short time," Daniele Silvestro explains. "The human eye could do the same thing, but the fast pace of machine learning takes us to another level. It's really a sort of live survey of the planet."
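To give a flavour of what such image analysis involves, here is a deliberately simplified sketch – not the Fribourg team's actual pipeline, which relies on trained deep networks – that flags possible vegetation loss by comparing the share of "green" pixels between two aerial snapshots of the same area:

```python
# Toy illustration of change detection between two aerial snapshots.
# Each "image" is just a list of (r, g, b) pixels; real systems use
# trained neural networks on millions of satellite tiles.

def green_fraction(image):
    """Fraction of pixels whose green channel dominates red and blue."""
    green = sum(1 for (r, g, b) in image if g > r and g > b)
    return green / len(image)

def flag_vegetation_loss(before, after, threshold=0.10):
    """Return True if green cover dropped by more than `threshold`."""
    return green_fraction(before) - green_fraction(after) > threshold

# Toy 4-pixel scenes: mostly forest before, mostly bare ground after
before = [(40, 120, 30), (35, 110, 25), (50, 130, 40), (200, 180, 150)]
after  = [(40, 120, 30), (190, 170, 140), (180, 160, 130), (200, 180, 150)]

print(flag_vegetation_loss(before, after))  # True: green cover fell 0.75 -> 0.25
```

The point of the sketch is the workflow, not the method: repeated snapshots of the same place, compared automatically, turn raw imagery into an alarm signal.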

To complete the system, Daniele Silvestro proposes including a citizen science component. His vision: volunteers use their mobile phones to provide pictures taken on the ground – on wasteland, in forests or in marshland. An app could be used to automatically identify the species present – e.g. hundreds of tree species in a small sample of tropical forest. Such information, especially if hidden beneath a thick canopy or in the soil, is impossible to capture from the heights at which drones and satellites operate. "Mobile phones open up huge potential that has remained largely untapped," says Daniele Silvestro. "In most places where you find people you also find mobile phones, and almost all of them are equipped with cameras and GPS for precise localisation."

Simulating a catastrophe

Thanks to all the data received, AI's role would expand beyond mere monitoring. It could anticipate problems, identify high-risk areas and even propose strategies for avoiding ecological catastrophes. To achieve this, the team from Fribourg has adapted an engine of the kind often used for games such as chess and Go. Daniele Silvestro explains: "We literally get our AI to play a game, but instead of neutralising an adversary on the chessboard it learns strategies for predicting and preventing biodiversity loss."
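The game analogy can be made concrete with a toy "conservation game" – hypothetical numbers and rules, not the actual CAPTAIN model, which learns its policy through reinforcement learning. Here a simple greedy rule stands in for the learned strategy: each turn the policy protects one area, then an adversarial threat wipes out the richest unprotected area.

```python
# Toy conservation game in the spirit of a game-playing AI (illustrative
# only): protect areas against an adversary, maximise surviving species.

def play(species_per_area, policy, turns=3):
    """Run the game and return how many species survive."""
    areas = dict(enumerate(species_per_area))   # area id -> species count
    protected = set()
    for _ in range(turns):
        protected.add(policy(areas, protected))      # policy's move
        at_risk = [a for a in areas if a not in protected]
        if at_risk:
            areas.pop(max(at_risk, key=areas.get))   # threat hits richest area
    return sum(areas.values())

def greedy(areas, protected):
    """Protect the richest area that is still unprotected."""
    return max((a for a in areas if a not in protected), key=areas.get)

def first_come(areas, protected):
    """Baseline: protect areas in arbitrary (index) order."""
    return min(a for a in areas if a not in protected)

richness = [50, 5, 30, 2, 8, 1]  # made-up species counts per area
print(play(richness, greedy), play(richness, first_come))  # 60 57
```

Even in this crude setting, the strategy that reasons about where losses would hurt most saves more species than acting in arbitrary order – which is exactly the kind of advantage a trained policy is meant to deliver at realistic scale.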

In his study, the researcher shows that images taken from above, combined with data gathered on the ground, could enable AI to reclassify species' extinction risk in real time. That would be a big plus, because when species slide into the danger zone there is no time to lose. When Australia was ravaged by forest fires in 2020, for example, wild koala populations were decimated within just a few weeks, leading to the species being reclassified as endangered. Daniele Silvestro is now focusing mainly on developing his open-source engine, known as CAPTAIN (Conservation Area Prioritisation Through Artificial INtelligence). He is in discussions with several institutions and companies with a view to giving his vision of an early warning system more concrete form.
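What "live reclassification" could look like in its simplest form is a rule that maps an observed population decline onto a threat category. The sketch below loosely follows the decline thresholds of IUCN Red List criterion A2 (a reduction of at least 80% suggests Critically Endangered, 50% Endangered, 30% Vulnerable, over ten years or three generations); a real assessment weighs many more criteria, and this function is purely illustrative.

```python
# Hedged sketch: map an estimated population decline to a Red List
# category, loosely following IUCN criterion A2 thresholds. A genuine
# assessment considers range size, population structure and more.

def red_list_category(decline_fraction):
    """Return a threat category for a decline between 0.0 and 1.0."""
    if decline_fraction >= 0.80:
        return "Critically Endangered"
    if decline_fraction >= 0.50:
        return "Endangered"
    if decline_fraction >= 0.30:
        return "Vulnerable"
    return "Not threatened (on this criterion alone)"

print(red_list_category(0.60))  # Endangered
```

An AI pipeline feeding continuously updated decline estimates into rules of this kind is what would let the alarm sound in weeks rather than years.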