Adaptive Robotics at Barkley Canyon and Hydrate Ridge

Five Senses

Aug. 09 2018

Marcel Proust’s quote, “Not in seeking new landscapes, but in having new eyes,” resonates with the work of the Adaptive Robotics team. The experts aim to advance the way humanity observes the oceans and push the experience further, tapping into the potential of all five senses.

Today, scientists and researchers around the world have a general idea of the geological structures on the seafloor, because the entire planet has been mapped using satellites. However, when it comes to generating the deep understanding that could ultimately guide policy and industry, the five-kilometer resolution provided by Earth-orbiting platforms does not offer enough information to draw meaningful conclusions. This becomes abundantly clear in the following comparison between an image of London acquired by Google Earth (which averages a resolution of fifteen meters) and a picture of the same area at a five-kilometer resolution – the average resolution of our seabed maps.

A comparison of an image of London acquired by Google Earth and a picture of the same area at a five-kilometer resolution.

Clarity improves when the resolution is increased to a hundred meters, the level of detail at which we have mapped 10% of our oceans. But as soon as there is a need to zoom in, features become blurry again.

Clarity improves when the image is acquired at a hundred-meter resolution, but zooming in is still very difficult.

Features are much clearer when a ten-meter resolution is used. Currently, only 0.05% of our oceans are mapped at that level of detail.

Images are much clearer when a ten-meter resolution is used, yet we have only mapped 0.05% of our oceans at that level of detail.
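To get a feel for why fine-scale coverage is so scarce, the following back-of-the-envelope sketch (written in Python purely for illustration, using an approximate total ocean area of 361 million square kilometers) counts how many grid cells a complete seafloor map would need at each of the resolutions mentioned above.

```python
# Back-of-the-envelope look at why fine-resolution seafloor maps are so rare:
# the number of grid cells needed grows with the square of the resolution gain.
# The ocean area used here (~361 million km^2) is an approximate figure.

OCEAN_AREA_KM2 = 361_000_000

def cells_needed(resolution_m: float) -> float:
    """Number of square grid cells required to map the ocean at a given resolution."""
    cell_area_km2 = (resolution_m / 1000.0) ** 2
    return OCEAN_AREA_KM2 / cell_area_km2

for res in (5000, 100, 10):
    print(f"{res:>5} m resolution -> {cells_needed(res):.2e} cells")

# Output:
#  5000 m resolution -> 1.44e+07 cells
#   100 m resolution -> 3.61e+10 cells
#    10 m resolution -> 3.61e+12 cells
```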

Kilometers of seawater lie between even the most powerful of lenses and the underwater objects experts need to observe. Technology has not evolved to a point where it can circumvent physics, which is why the only way to get sharper images today is to get closer. The engineers on Falkor are working hard to give researchers new eyes, or at least to improve their eyesight underwater.

Full Immersion
But they have no intention of stopping there. By relying on specific sensors – the true interface between the robots and the environment – the team is effectively taking advantage of all five senses.

Sight is a very important tool to explore underwater environments, but through the use of different sensors, the team on board is taking advantage of all five senses.

The sense of touch comes from temperature and pressure sensors. The robots generate signals as functions of the pressure and temperature they experience while diving, helping the experts paint a more exact picture of the surveyed environments.
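As a simple illustration of how a pressure reading translates into depth, here is a minimal Python sketch – not the vehicles' actual software – using the basic hydrostatic relation depth ≈ pressure / (density × gravity); operational systems rely on more precise seawater formulas.

```python
# Rough illustration only: convert an absolute pressure reading to an
# approximate depth, assuming a constant seawater density. Operational
# systems use more precise seawater equations of state.

SEAWATER_DENSITY = 1025.0   # kg/m^3, typical average value
GRAVITY = 9.81              # m/s^2
SURFACE_PRESSURE = 101_325  # Pa, standard atmospheric pressure

def depth_from_pressure(pressure_pa: float) -> float:
    """Estimate depth in meters from absolute pressure in pascals."""
    return (pressure_pa - SURFACE_PRESSURE) / (SEAWATER_DENSITY * GRAVITY)

# A reading of roughly 10.1 MPa corresponds to about 1,000 m of water overhead.
print(round(depth_from_pressure(10_100_000)))  # ~994
```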

Hearing relies on sonars, which are fundamental to the operation of the vehicles. By emitting and receiving sound waves, the robots are able to detect objects around them and navigate at the desired depth and distance from the seafloor, while avoiding collisions.
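The idea behind those altitude measurements boils down to a time-of-flight calculation. The short Python sketch below is purely illustrative and assumes a nominal sound speed of 1,500 meters per second in seawater: the sonar pings, times the echo, and halves the round-trip distance.

```python
# Simplified illustration of how an echo sounder estimates the distance to
# the seafloor: range = (speed of sound * round-trip time) / 2.
# Sound speed in seawater varies with temperature, salinity, and pressure;
# 1500 m/s is only a nominal value.

SOUND_SPEED = 1500.0  # m/s, nominal speed of sound in seawater

def altitude_from_echo(round_trip_seconds: float) -> float:
    """Estimate altitude above the seafloor from a ping's round-trip time."""
    return SOUND_SPEED * round_trip_seconds / 2.0

# An echo returning after 0.08 s puts the seafloor about 60 m below the vehicle.
print(altitude_from_echo(0.08))  # 60.0
```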

“Taste” and “smell” input come from chemical sensors. The chemistry of hydrate fields sustains some of the richest known ecosystems on the seafloor, which is why sensors on every platform will “smell” the water in search of different chemical signatures. ROV SuBastian will carry an advanced laser spectrometer to test pore water, the fluids contained inside the sediments on the seafloor, and even taste the sediments themselves.

The logistical and financial effort required to send AUVs down to the depths of the ocean is extremely high, so capitalizing on every single deployment is fundamental. Engineers strive to improve the quality of the images we are able to acquire, but beyond simple visual observations, researchers are covering every angle to provide an in-depth description of the visited sites. This way, they will also enable the algorithms they are developing to produce more objective and useful insights based on more detailed information. Proust would be proud – they are not just developing new eyes in this case, but also a new nose, tongue, and hand!

Researchers Tetsu Koike and Kazunori Nagano work on the camera payload onboard AE200f, which is advancing our eyesight when exploring the ocean floor. SOI / Monika Naranjo Gonzalez
