by Staff Writers
Plymouth UK (SPX) May 13, 2019
Artificial intelligence (AI) could help scientists shed new light on the variety of species living on the ocean floor, according to new research led by the University of Plymouth.

With increasing threats facing the marine environment, scientists urgently need more information about what inhabits the seabed in order to inform conservation and biodiversity management. Autonomous underwater vehicles (AUVs) mounted with the latest cameras can now collect vast amounts of imagery, but a bottleneck remains because humans still have to process it.

In a new study published in Marine Ecology Progress Series, marine scientists and robotics experts tested how effectively a computer vision (CV) system could fill that role. They showed it is on average around 80% accurate at identifying various animals in images of the seabed, and can be up to 93% accurate for specific species if enough data is used to train the algorithm. This, the scientists say, demonstrates that CV could soon be routinely employed to study marine animals and plants, leading to a major increase in data availability for conservation research and biodiversity management.

PhD student Nils Piechaud, lead author on the study, said: "Autonomous vehicles are a vital tool for surveying large areas of the seabed deeper than 60m (the depth most divers can reach). But we are currently not able to manually analyse more than a fraction of that data. This research shows AI is a promising tool, but our AI classifier would still be wrong one out of five times if it was used to identify animals in our images.

"This makes it an important step forward in dealing with the huge amounts of data being generated from the ocean floor, and shows it can help speed up analysis when used for detecting some species. But we are not at the point of considering it a suitable complete replacement for humans at this stage."

The study was conducted as part of Deep Links, a research project funded by the Natural Environment Research Council and led by the University of Plymouth in collaboration with Oxford University, the British Geological Survey and the Joint Nature Conservation Committee.

One of the UK's national AUVs - Autosub6000, deployed in May 2016 - collected more than 150,000 images in a single dive from around 1,200m beneath the ocean surface on the north-east side of Rockall Bank, in the North East Atlantic. Around 1,200 of these images were manually analysed, containing 40,000 individuals of 110 different kinds of animals (morphospecies), most of them seen only a handful of times.

Researchers then used Google's TensorFlow, an open-source machine learning library, to teach a pre-trained convolutional neural network (CNN) to identify individuals of the various deep-sea morphospecies found in the AUV images. They then assessed how the CNN performed when trained with different numbers of example images per animal, and with different numbers of morphospecies to choose from.

The accuracy of manual annotation by humans ranges from 50 to 95%, but the process is slow and even specialists can be inconsistent across time and between research teams. The automated method reached around 80% accuracy, approaching the performance of humans while offering a clear advantage in speed and consistency. This is particularly true for the morphospecies the algorithms handle best: for example, the model correctly identified one animal (a type of xenophyophore) 93% of the time.
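The paper does not publish its training code, but the general workflow it describes, fine-tuning a pre-trained CNN with TensorFlow on labelled crops of seabed animals, can be sketched roughly as below. The backbone choice (ResNet50), directory layout, image size and hyperparameters here are illustrative assumptions, not details from the study.

```python
# Rough sketch (not the authors' pipeline) of transfer learning with
# TensorFlow/Keras: a pre-trained CNN is given a new classification head
# for deep-sea morphospecies. Paths, backbone and settings are assumptions.
import tensorflow as tf

IMG_SIZE = (224, 224)
NUM_CLASSES = 110  # morphospecies found in the manual annotation

# Assumed layout: one sub-folder of example image crops per morphospecies.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "morphospecies/train", image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "morphospecies/val", image_size=IMG_SIZE, batch_size=32)

# Backbone pre-trained on ImageNet; its weights are frozen so that only the
# new morphospecies head is learned from the (comparatively small) AUV dataset.
backbone = tf.keras.applications.ResNet50(
    weights="imagenet", include_top=False, pooling="avg",
    input_shape=IMG_SIZE + (3,))
backbone.trainable = False

inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = tf.keras.applications.resnet50.preprocess_input(inputs)
x = backbone(x, training=False)
x = tf.keras.layers.Dropout(0.2)(x)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
```

Varying the number of example images per class and the number of candidate morphospecies, as the study did, would simply mean re-running such a script on differently sub-sampled training folders.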
While the study does not advocate replacing manual annotation, it does demonstrate that marine biologists could apply AI to specific tasks, provided they carefully assess the reliability of its predictions. This would greatly enhance scientists' capacity to analyse their data.

The researchers say that combining specialist ecological knowledge with high-tech AUVs' capacity to survey large areas of the seabed and the fast data-processing capacity of AI could greatly speed up deep-ocean exploration, and with it our wider understanding of marine ecosystems.

Dr Kerry Howell, Associate Professor in Marine Ecology and Principal Investigator for the Deep Links project, added: "Most of our planet is deep sea, a vast area in which we have equally large knowledge gaps. With increasing pressures on the marine environment including climate change, it is imperative that we understand our oceans and the habitats and species found within them. In the age of robotic and autonomous vehicles, big data, and global open research, the development of AI tools with the potential to help speed up our acquisition of knowledge is an exciting and much needed advance."
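One way to "carefully assess the reliability" of such a classifier, as described above, is to measure its accuracy separately for each morphospecies and delegate only the well-identified classes (such as the xenophyophore mentioned earlier) to the machine. The sketch below is an illustrative example of that check; it assumes the `model` and `val_ds` objects from the previous sketch.

```python
# Illustrative per-class reliability check, assuming `model` and `val_ds`
# from the previous sketch. Classes with low accuracy would still be
# annotated manually; high-accuracy classes could be left to the classifier.
import numpy as np

class_names = val_ds.class_names
correct = np.zeros(len(class_names))
total = np.zeros(len(class_names))

for images, labels in val_ds:
    preds = np.argmax(model.predict(images, verbose=0), axis=1)
    for y_true, y_pred in zip(labels.numpy(), preds):
        total[y_true] += 1
        correct[y_true] += int(y_true == y_pred)

# Report each morphospecies, best-identified first.
for name, c, t in sorted(zip(class_names, correct, total),
                         key=lambda r: -(r[1] / max(r[2], 1))):
    if t:
        print(f"{name}: {c / t:.0%} accuracy on {int(t)} validation images")
```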