Reports suggest that artificial intelligence can help protect nature and biodiversity, and thus help end the extinction of species. What initiatives show that this is possible?
The United Nations’ latest biodiversity report warns that one million species are on the brink of extinction, including 1,000 wild mammal species and 450 species of sharks and rays. Losing this much wildlife will have a huge impact on humans, as one in five people depends on these species for food and income.
To help reverse the crisis, the UN’s Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services urges governments and NGOs to secure land rights for Indigenous peoples, who have been “proactively” protecting wild species with “local and regional monitoring networks.”
According to the TCFD (Task Force on Climate-Related Financial Disclosures), “more than half of the world’s economic output, representing an economic value of USD 44 trillion, is highly or moderately dependent on nature.” The extinction of 83% of wild mammals and 50% of flora constitutes a significant risk to financial and business stability.
Deep-learning neural networks, an advanced form of artificial intelligence that improves its performance without being explicitly programmed, are revolutionizing the analysis of sound data from at-risk ecosystems.
This acoustic monitoring, which complements and sometimes replaces visual monitoring, is conservation for the 21st century. Restricted access, of course, remains a persistent problem with AI, both in code-sharing and in the use of the technology itself. Governments and NGOs need to work harder to make this tool available to quick-action teams; otherwise its promise will go unfulfilled.
Notably, deep-learning neural networks do not require constant human monitoring. Because this advanced AI can analyze sound data within seconds, it opens a window of time for critical intervention, particularly in places affected by illegal logging and mining. The technology can hear the sound of chainsaws or drills and issue alerts to action-ready teams on the ground. Indigenous communities that have access to this technology and can respond immediately could be an important part of this new equation.
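To give a concrete sense of the detection step described above, here is a minimal sketch in Python using only NumPy. The spectrogram computation is standard; the `chainsaw_score` heuristic (low-frequency band energy) is a stand-in for the trained deep-learning classifier these projects actually deploy, and the band indices and threshold are invented for illustration.

```python
import numpy as np

def spectrogram(audio, frame_len=512, hop=256):
    """Split audio into overlapping windowed frames and take FFT magnitudes."""
    window = np.hanning(frame_len)
    frames = [audio[i:i + frame_len] * window
              for i in range(0, len(audio) - frame_len + 1, hop)]
    return np.array([np.abs(np.fft.rfft(f)) for f in frames])

def chainsaw_score(spec, low_bin=10, high_bin=60):
    """Placeholder heuristic standing in for a trained neural network:
    engine noise concentrates energy in a narrow low-frequency band,
    so we compare band energy against the full-spectrum average."""
    band = spec[:, low_bin:high_bin].mean()
    total = spec.mean() + 1e-9  # avoid division by zero on silence
    return float(band / total)

def maybe_alert(audio, threshold=2.0):
    """Return True when the clip looks suspicious enough to alert a team."""
    return chainsaw_score(spectrogram(audio)) > threshold
```

A steady machine-like tone triggers the alert, while broadband jungle-style noise, whose energy is spread evenly across the spectrum, does not.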
One of the first AI-based Indigenous conservation projects, undertaken by Cornell University, was co-developed with the Coral Gardeners of Mo’orea, French Polynesia. Founded in 2017, this Indigenous group cultivates heat-resistant “super corals” and plants them on damaged parts of the reef. Cornell provides the software to detect the sounds of the many organisms making their home there and, working also with the University of Hawaii, integrates them into ReefOS, a network of sensors and cameras collecting visual and acoustic data 24 hours a day. The AI-mediated soundscape tells on-site respondents whether the reefs are beginning to sound like healthy, stable reef systems, or whether additional restoration efforts are required.
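The internals of ReefOS are not public, but the judgment it supports, "does this site sound like a healthy reef?", can be sketched as a comparison against a healthy baseline. The activity index below (average frame-to-frame change in spectral energy) is a hypothetical stand-in for whatever metric the real system uses; the tolerance value is likewise invented.

```python
import numpy as np

def acoustic_activity_index(audio, frame_len=1024):
    """A rough soundscape-health proxy: average frame-to-frame change in
    spectral energy. Lively reefs (snapping shrimp, fish calls) fluctuate
    far more than quiet, degraded ones."""
    n = len(audio) // frame_len
    frames = audio[:n * frame_len].reshape(n, frame_len)
    energy = np.abs(np.fft.rfft(frames, axis=1)).sum(axis=1)
    return float(np.abs(np.diff(energy)).mean())

def needs_restoration(audio, healthy_baseline, tolerance=0.5):
    """Flag a site whose acoustic activity falls well below a reference
    recorded at a known-healthy reef."""
    return acoustic_activity_index(audio) < tolerance * healthy_baseline
```

In use, the baseline would come from recordings of a stable reference reef, and sites scoring far below it would be queued for additional restoration.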
Such Indigenous-AI conservation works effectively in other ecosystems as well. In March 2014, the Tembé tribe in northern Brazil reached out to the San Francisco nonprofit Rainforest Connection to build a low-budget alert system to keep tabs on deforestation. Rainforest Connection uses the open-source machine-learning framework TensorFlow and recycled cell phones to detect the sounds of chainsaws and logging trucks amid the din of the Amazon. Text alerts go out instantly to the Tembé vigil team when Google’s cloud computing detects the revving of a chainsaw.
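The "text alerts go out instantly" step hides a practical detail: one chainsaw produces many overlapping detections, and a naive system would flood the vigil team's phones. Here is a minimal Python sketch of how raw model detections might be turned into deduplicated alerts; the `Detection` structure, confidence threshold, and cooldown period are all assumptions for illustration, not Rainforest Connection's actual design.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "chainsaw" or "logging_truck"
    confidence: float  # classifier confidence in [0, 1]
    timestamp: float   # seconds since some epoch

def build_alerts(detections, min_confidence=0.8, cooldown=300.0):
    """Turn raw detections into text alerts, dropping low-confidence hits
    and suppressing repeats of the same label within `cooldown` seconds."""
    last_sent = {}
    alerts = []
    for d in sorted(detections, key=lambda d: d.timestamp):
        if d.confidence < min_confidence:
            continue
        prev = last_sent.get(d.label)
        if prev is not None and d.timestamp - prev < cooldown:
            continue  # same threat already reported recently
        last_sent[d.label] = d.timestamp
        alerts.append(f"ALERT: {d.label} detected "
                      f"(confidence {d.confidence:.0%})")
    return alerts
```

The returned strings would then be handed to an SMS gateway; the debouncing keeps a single logging event from generating dozens of messages.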