Coordinated Robotics Part 2: ʻAuʻAu Channel

Backstage

Feb. 01, 2018

“The first problem we had is that the vehicle wasn’t communicating with the ship’s USBL system, so we weren’t able to keep track of where the Iver was underwater,” says Nick Goumas. “That made the mission very high risk. So we decided we were confident running a simple compass calibration mission, in which the vehicle only goes a couple of meters below the surface and runs a few legs: north to south, east to west, and then at the 45-degree angles between those. It collects data and processes it to figure out if the compass chip inside the vehicle has some sort of intrinsic deviation that is giving it an error.”
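
The article does not spell out the math, but a standard way to model that intrinsic deviation is the classic five-coefficient compass formula, fitted from exactly the kind of legs Nick describes. Below is a minimal sketch under that assumption; the headings are invented, and a real run would compare the compass against GPS course over ground on each leg.

```python
import numpy as np

# Assumed model: dev(H) = A + B*sin(H) + C*cos(H) + D*sin(2H) + E*cos(2H),
# the traditional compass deviation formula. All data is illustrative only.
true_hdg = np.radians([0, 45, 90, 135, 180, 225, 270, 315])       # the legs
compass_hdg = np.radians([1.8, 46.9, 91.5, 135.7, 179.2, 223.6, 268.4, 314.1])

dev = compass_hdg - true_hdg
dev = np.arctan2(np.sin(dev), np.cos(dev))        # wrap to [-pi, pi]

# Least-squares fit of the five deviation coefficients.
X = np.column_stack([np.ones_like(true_hdg),
                     np.sin(true_hdg), np.cos(true_hdg),
                     np.sin(2 * true_hdg), np.cos(2 * true_hdg)])
A, B, C, D, E = np.degrees(np.linalg.lstsq(X, dev, rcond=None)[0])
print(f"A={A:+.2f}  B={B:+.2f}  C={C:+.2f}  D={D:+.2f}  E={E:+.2f}  (deg)")
```

With the coefficients in hand, the vehicle's software can subtract the predicted deviation from each raw compass reading.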

The AUV Iver sails close to Falkor. The team is perfecting the performance of its cameras. They expect it to be able to run in a more automatic fashion and flesh out procedures to ensure that it functions smoothly in the future. (Gideon Billings)

This is not the final goal of the actual work the Iver vehicle will carry out – it is simply one of the many steps that must take place to make sure it performs correctly. Many people are unaware of all the oceanographic work taking place in the world right now, and those who are aware often just get a view of thoroughly tested and proven tools, such as a CTD package. The effort behind the development of scientific tools is painstaking, often slow, and certainly the product of stubbornness and passion. It often takes place away from the spotlight, backstage.

“We finally figured out what the problem with the USBL was, and today we’ll start doing some more real-looking missions,” explains Nick, senior research engineer at the Deep Robot Optical Perception (DROP) lab at the University of Michigan. “We will start closer to the surface and work our way down, take a look at the data and make sure that it is looking right – that the Iver is not rolling or pitching too hard underwater, or coming up too fast. We’ll do that all day, and tomorrow we’ll probably start trying to image the bottom and see what’s there.”
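
That "looking right" check can be as simple as scanning a dive's attitude log against a few limits. A minimal sketch, with invented thresholds and stand-in telemetry rather than the Iver's actual log format:

```python
import numpy as np

# Stand-in telemetry: one sample per second over a ten-minute dive.
t = np.arange(0.0, 600.0, 1.0)                        # seconds
rng = np.random.default_rng(0)
roll = rng.normal(0, 3, t.size)                       # degrees
pitch = rng.normal(-2, 3, t.size)                     # degrees
depth = 20 - 0.01 * t + rng.normal(0, 0.05, t.size)   # meters

MAX_TILT_DEG = 15.0     # "rolling or pitching too hard" (assumed limit)
MAX_ASCENT_MPS = 0.5    # "coming up too fast" (assumed limit)

ascent = -np.gradient(depth, t)                       # positive when rising
bad = ((np.abs(roll) > MAX_TILT_DEG)
       | (np.abs(pitch) > MAX_TILT_DEG)
       | (ascent > MAX_ASCENT_MPS))
print(f"{bad.sum()} of {t.size} samples exceed limits")
```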

Nick Goumas and Gideon Billings keep track of the AUV’s performance during a test dive in the ʻAuʻAu Channel. (SOI / Monika Naranjo Gonzalez)
Two Iver AUVs are taking part in the Coordinated Robotics expedition. The University of Sydney and the University of Michigan tackled the same objectives in different ways. (SOI / Monika Naranjo Gonzalez)

Iver
Two Iver Autonomous Underwater Vehicles (AUVs) are taking part in the Coordinated Robotics expedition. Once an Iver vehicle is purchased, its owners can customize it as they see fit. This explains why, although both Ivers on board can conduct the same kinds of missions and acquire the same types of data, their interiors – and their software – are engineered completely differently. The University of Sydney and the University of Michigan tackled the same objectives in different ways.

The Michigan team added two camera sections and two light sections in the center of the vehicle, which enables it to capture stereo images. Using an algorithm, they can take two images and stitch them together, operating in a similar way to how our eyes deal with depth perception. “When you overlap them, you see the differences. Those differences tell you how far the pixel is from the camera,” Nick explains. “And with that information, you can build a three-dimensional map of the bottom. When you go back and forth, you acquire a battery of images and you get an actual three-dimensional map of an area: that is what our research is based on.”
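
The geometry behind "how far the pixel is from the camera" is the pinhole stereo relation Z = f·B/d: depth equals focal length times camera baseline divided by pixel disparity. A minimal sketch with made-up calibration numbers – the DROP lab's actual values are not given in the article:

```python
import numpy as np

# Assumed calibration, for illustration only.
FOCAL_PX = 1400.0     # focal length expressed in pixels
BASELINE_M = 0.25     # separation between the two cameras, meters

def depth_from_disparity(disparity_px: np.ndarray) -> np.ndarray:
    """Pinhole stereo: Z = f * B / d. A bigger shift between the two
    images means the point is closer to the cameras."""
    return np.where(disparity_px > 0,
                    FOCAL_PX * BASELINE_M / np.maximum(disparity_px, 1e-9),
                    np.inf)

# A seafloor feature shifted 35 px between the left and right frames:
print(depth_from_disparity(np.array([35.0])))   # -> [10.] meters
```

Doing this for every matched pixel, frame after frame along the track lines, is what accumulates into the three-dimensional map of an area that Nick describes.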

The University of Michigan team installed two cameras in the AUV, which enables it to capture stereo images. Using an algorithm, they can stitch them together, similar to how our eyes deal with depth perception. (Gideon Billings)

Tenacity
It is one level of challenge to buy an Iver, design its systems, and install them in the vehicle. It is another to make sure all of them work in unison in the ever-changing conditions of the ocean. “There are challenges both in hardware and software,” says Gideon Billings, PhD pre-candidate in Robotics at the University of Michigan’s DROP lab. “Challenges with the hardware and mission planning relate to making sure that we stay far enough off the bottom so we don’t hit anything, but at the same time run close enough to the bottom so that we can get sharp images. We need to make sure that our strobes are bright enough to capture good pictures, and also that we can get a high enough frame rate so that there’s not too much blur in the image.”
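
Those competing constraints can be sanity-checked with back-of-the-envelope numbers before a dive. The sketch below uses assumed values, not the Iver's real settings: at a given altitude and speed, how far does the seafloor smear across the sensor during one strobe-lit exposure?

```python
# All numbers are illustrative assumptions, not the Iver's configuration.
ALTITUDE_M = 3.0       # height above the seafloor
SPEED_MPS = 1.5        # vehicle forward speed
EXPOSURE_S = 1 / 500   # shutter time per strobe-lit frame
FOCAL_PX = 1400.0      # camera focal length in pixels

gsd_m = ALTITUDE_M / FOCAL_PX              # meters of seafloor per pixel
blur_px = SPEED_MPS * EXPOSURE_S / gsd_m   # pixels smeared per exposure
print(f"GSD: {gsd_m * 1000:.1f} mm/px, motion blur: {blur_px:.1f} px")
# -> GSD: 2.1 mm/px, motion blur: 1.4 px  (acceptably sharp)
```

Fly higher and the image gets coarser but tolerates more speed; fly lower and the strobes need less power but the blur budget shrinks. That is exactly the trade-off the team spends these test days tuning.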

Beyond getting the cameras and strobes to work properly, the team must get the vehicle to behave as intended. “We’ll spend a few days on the operational side, getting it to behave exactly how we want. Once that’s done, we’ll move our focus to the cameras, which are just as finicky as anything else, so that might be a couple of days of sending it down, getting it close to the bottom, taking pictures and bringing them up. They might be all washed out by the strobes, or have bad image blur if you were going too fast, so there’ll be a whole other cycle,” adds Nick. Tenacity is fundamental when trialing new oceanographic tools, especially if they must perform in an autonomous way.

The Iver is ready to go, splashing around as its propeller gets going. (SOI / Monika Naranjo Gonzalez)

USBL
The team knows there is always the chance that something can go wrong on a mission. That can be the result of many different factors: anything from a simple malfunction of the vehicle to getting caught in a fishing net. The Iver does not have instruments to scan the space in front of it, so if there is an object in its path, it could collide and break, causing it to flood and drop to the bottom of the ocean. The team needs a system that allows them to check its location constantly, or at least lets the vehicle communicate its status. That is where the Ultra Short Baseline (USBL) system comes into play – it uses sound waves to send data packets back and forth between the vehicle and the ship. When the Iver is in the ocean and the crew needs to know where it is, they use the USBL to send out a pulse. The Iver hears it and pings back. As the reply arrives, it hits the different receiver elements on the USBL head in a particular order. Using the tiny time delays between the moments each element detects the signal, and applying trigonometry, the team can determine the angle the reply is coming from – and with it the vehicle’s position, distance, and bearing.
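
In two dimensions, that trigonometry reduces to a plane-wave model: a reply arriving from angle θ reaches two elements spaced d apart with a time offset Δt = d·sin(θ)/c. A toy sketch with invented spacing and timings (a real USBL head has more elements and resolves the angle in three dimensions):

```python
import math

SOUND_SPEED = 1500.0      # m/s, nominal speed of sound in seawater
ELEMENT_SPACING = 0.05    # m between two receiver elements (assumed)

def bearing_from_delay(dt_s: float) -> float:
    """Plane-wave arrival: dt = spacing * sin(bearing) / c."""
    return math.degrees(math.asin(SOUND_SPEED * dt_s / ELEMENT_SPACING))

def range_from_round_trip(t_rt_s: float) -> float:
    """Pulse out, ping back: range = c * round_trip_time / 2."""
    return SOUND_SPEED * t_rt_s / 2

print(f"{bearing_from_delay(1.2e-5):.1f} deg off the array axis")  # ~21.1
print(f"{range_from_round_trip(0.40):.0f} m slant range")          # 300
```

Bearing plus range fixes the Iver's position relative to the ship, which is why losing the USBL link made the first missions so risky.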

On top of that, the USBL can send small packets of data. Distance and bearing are the main pieces of information, but each acoustic message also has leftover slots, so it can carry additional data, such as the vehicle’s speed.
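
As a purely hypothetical illustration of those leftover slots – the article does not describe the real message format – a few spare bytes in a fixed-layout packet are enough for extra status fields:

```python
import struct

# Invented packet layout: message type, depth (decimeters), speed (cm/s),
# battery (percent). Little-endian, 6 bytes total. Not the real USBL format.
msg_type, depth_dm, speed_cms, battery_pct = 0x01, 1234, 150, 87
reply = struct.pack("<BhhB", msg_type, depth_dm, speed_cms, battery_pct)
print(len(reply), "bytes:", reply.hex())   # -> 6 bytes: 01d204960057

fields = struct.unpack("<BhhB", reply)     # shipside decoding
print(fields)                              # -> (1, 1234, 150, 87)
```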

Once an Iver vehicle is purchased, its owners can customize it as they see fit. (George Wakeham)

Progress
Trials go on, day after day. Progress is made, but patience is mandatory. Will it ever be just a plug-and-play system? “It will definitely get easier,” answers Nick. “It will never be as easy as just going somewhere and throwing it in on the first day. But with the cameras, once we get them running properly, we shouldn’t have to touch those ever again. Getting more cruises under our belt to get things running in a more automatic fashion and fleshing out procedures will ensure that things go smoothly.”

Gideon Billings and Nick Goumas, from the University of Michigan, get ready to deploy the Iver. (SOI / Monika Naranjo Gonzalez)
