AI Helps Create One of the Largest 3D Maps of the Seafloor
News Aug 31, 2018 | Original Press Release from the Schmidt Ocean Institute
AUV AE2000f is the expedition’s scout – it swims at about 20 km/h and collects preliminary images of the seafloor to determine where the other robots should focus on future dives. Credit: Schmidt Ocean Institute
A recent expedition led by Dr. Blair Thornton, who holds associate professorships at both the University of Southampton and the Institute of Industrial Science at the University of Tokyo, demonstrated how autonomous robotics and artificial intelligence at sea can dramatically accelerate the exploration and study of hard-to-reach deep-sea ecosystems, such as intermittently active methane seeps. Thanks to rapid, high-throughput data analysis at sea, biological hotspots in the Hydrate Ridge region off the coast of Oregon could be identified quickly enough to survey and sample them within days of the autonomous underwater vehicle (AUV) imaging survey. The team aboard research vessel Falkor used a form of artificial intelligence, unsupervised clustering, to analyze AUV-acquired seafloor images and identify target areas for more detailed photogrammetric AUV surveys and focused interactive hotspot sampling with ROV SuBastian.
This project demonstrated how modern data science can greatly increase the efficiency of conventional research at sea and improve the productivity of interactive seafloor exploration compared with the all-too-familiar “stumbling in the dark” mode. “Developing totally new operational workflows is risky, however, it is very relevant for applications such as seafloor monitoring, ecosystem survey and planning the installation and decommissioning of seafloor infrastructure,” said Thornton.
The idea behind this Adaptive Robotics mission was not to overturn how things are done at sea, but simply to remove bottlenecks in the flow of information and data processing using computational methods and artificial intelligence. The algorithms used can rapidly produce simple summaries of observations and inform subsequent deployment plans. This way, scientists can respond to dynamic changes in the environment and target the areas that will yield the biggest operational, scientific, or environmental management gains.
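The release does not detail the clustering pipeline itself, but as a rough illustration of the general technique, unsupervised clustering of per-image feature vectors, a minimal sketch might look like the following. The feature descriptor, input data, and cluster count here are illustrative assumptions, not details of the expedition’s actual software:

```python
# Illustrative sketch only: unsupervised clustering of seafloor-image
# features to group images by habitat type. The toy descriptor, file
# handling, and cluster count are assumptions for illustration.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def image_features(image: np.ndarray) -> np.ndarray:
    """Toy descriptor: per-channel means and standard deviations.
    A real pipeline would use learned or engineered visual features."""
    return np.concatenate([image.mean(axis=(0, 1)), image.std(axis=(0, 1))])

def cluster_survey(images, n_clusters=8, seed=0):
    """images: list of (H, W, 3) arrays from the AUV survey (assumed)."""
    X = np.stack([image_features(im) for im in images])
    X = StandardScaler().fit_transform(X)   # normalize feature scales
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=seed).fit_predict(X)
    return labels                           # one habitat-type label per image
```

Images falling into rare, spatially coherent clusters would then be natural candidates for the focused follow-up surveys described below.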
More than 1.3 million seafloor images were collected and algorithmically analyzed to find biological hotspots and precisely target them for interactive sampling and observation. The initial wide-area seafloor imagery was acquired with the underwater vehicle AE2000f using high-altitude 3D visual mapping cameras at sites between 680 and 780 meters deep. The international team deployed multiple AUVs developed by the University of Tokyo, equipped with 3D visual mapping technology developed jointly by the University of Sydney, the University of Southampton, the University of Tokyo, and the Kyushu Institute of Technology.
Converting the initial wide-area survey imagery into three-dimensional seafloor maps and habitat-type summaries on board Falkor allowed the researchers to plan subsequent robotic deployments for higher-resolution visual imaging, environmental and chemical surveying, and physical sampling in the areas of greatest interest, particularly the ephemeral hotspots of biological activity that intermittently form around transitory methane seeps. Nineteen AUV deployments and fifteen ROV dives were completed during the expedition, including several multi-vehicle operations.
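As a speculative illustration of how per-image habitat labels might feed such deployment planning, the sketch below ranks geotagged, cluster-labeled images by habitat rarity to propose revisit targets. The record fields and the rarity-first selection rule are invented for the example and are not from the expedition’s workflow:

```python
# Illustrative sketch: pick follow-up dive targets from geotagged,
# cluster-labeled survey images. All field names and the selection
# rule (rarest habitat clusters first) are assumptions.
from collections import Counter

def plan_targets(records, n_targets=5):
    """records: list of dicts like {'lat': ..., 'lon': ..., 'label': ...}."""
    counts = Counter(r['label'] for r in records)
    # Rare habitat types are often the interesting ones; rank ascending.
    ranked = sorted(records, key=lambda r: counts[r['label']])
    return ranked[:n_targets]   # candidate waypoints for ROV/AUV revisits
```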
Thanks to rapid data processing, a photogrammetric map of one of the best-studied gas hydrate deposits was completed. It is believed to be the largest 3D color reconstruction of the seafloor by area in the world, measuring more than 118,000 square meters (11.8 hectares) and covering a region of approximately 500 x 350 meters. While the average resolution of the maps is 6 mm, the areas of most interest were mapped at a resolution an order of magnitude finer, which would not have been possible without the ability to intelligently target sites of interest with high-resolution imaging surveys and to process the large volumes of acquired data within hours of acquisition at sea.
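A quick back-of-the-envelope check, using only the figures quoted above, conveys the scale of such a reconstruction:

```python
# Approximate scale check based only on the figures quoted above.
area_m2 = 118_000            # mapped area: 11.8 hectares
gsd_m = 0.006                # average resolution: 6 mm per pixel
pixels = area_m2 / gsd_m**2
print(f"~{pixels / 1e9:.1f} billion pixels")   # ~3.3 billion pixels
```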
This was one of the areas surveyed by AE2000f, shown in rough proportion to RV Falkor (83 m / 272 ft, visible in the upper left corner) – a very considerable portion of seafloor to image and process during a single expedition. Credit: SOI
Normally, maps like this take several months to process and are completed only after an expedition ends, by which point the science team is no longer at the site and the habitats may have already evolved or expired. Instead, the research team was able to compose the 3D maps on board Falkor within days of the images being acquired. The composite map was used during the expedition to plan operations, including the recovery of seafloor instruments, and was invaluable for revisiting specific sites, such as active bubble plumes, making the entire operation more efficient.
“It is quite amazing to see such large areas of the seafloor mapped visually, especially only days after the raw data was collected. It is not just the size of the map, but also the way we were able to use it to inform our decisions while still on site. This makes a real difference as the technology makes it possible to visualize wide areas at very high resolution, and also easily identify and target areas where we should collect data. This has not previously been possible,” said Thornton.
This article has been republished from materials provided by Schmidt Ocean Institute. Note: material may have been edited for length and content. For further information, please contact the cited source.
Find out more: https://schmidtocean.org/cruise/adaptive-robotics-at-barkley-canyon-and-hydrate-ridge/