Coordinated Robotics

Tools and Teamwork: Introducing Squidle

SOI/ Logan Mock Bunting
Mar. 15, 2015

Autonomous Underwater Vehicles (AUVs) like Sirius, along with gliders, are powerful oceanographic tools that help scientists obtain seafloor images and other types of data. On the current Coordinated Robotics cruise, Dr. Oscar Pizarro and his team are doing just that.

Michael Utley helps lower the Slocum Glider into a zodiac for ballasting calibration. SOI/ Logan Mock Bunting

A picture is worth a thousand data points

A Lagrangian Float at work, drifting above a reef and capturing images for scientists to study and label. SOI/ Logan Mock Bunting

Underwater imaging is a collaborative process used for many purposes, including tracking changes in geography and biodiversity. AUVs take photos, and scientists then identify and label the content in the pictures. Once the images are labeled, the computers are told what the marked imagery represents. This “teaches” the computer basic visual facts along the lines of: “this pattern of pixels is a fish,” “this shape is a coral,” or “this texture and color is sand, while this object that looks different is kelp.” The computer processes that “learn” these lessons are called algorithms.
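To make that “teaching” step concrete, here is a minimal sketch in Python of how labeled image patches might train a classifier. It assumes scikit-learn and NumPy; the class names, random stand-in patches, and simple color features are illustrative assumptions, not the actual pipeline used on the cruise.

```python
# Minimal sketch: labeled image patches become training examples for a
# classifier that can then guess the label of patches it has never seen.
# The random "patches" below are stand-ins for real AUV imagery.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def simple_features(patch):
    """Summarize an H x W x 3 patch by its per-channel color means and variances."""
    return np.concatenate([patch.mean(axis=(0, 1)), patch.var(axis=(0, 1))])

# Stand-in training data: in practice these would be patches cut from AUV
# photos, with labels supplied by scientists or citizen scientists.
classes = ["coral", "sand", "kelp"]
patches = [rng.random((32, 32, 3)) + i for i, c in enumerate(classes) for _ in range(20)]
labels = [c for c in classes for _ in range(20)]

X = np.array([simple_features(p) for p in patches])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, labels)

# The trained model can now label a new, unlabeled patch.
new_patch = rng.random((32, 32, 3)) + 1   # resembles the "sand" stand-ins
print(clf.predict([simple_features(new_patch)])[0])
```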

Scientists on Falkor’s deck react to information and readings sent to their laptops from the AUV’s sensors. SOI/ Logan Mock Bunting

The teamwork between the scientists and machines is producing massive amounts of imagery. In fact, a single underwater robot can collect tens of thousands of pictures in one day. With all those photos, it would be impossible for scientists to look at every single image and label every aspect within it. Instead, they look at small subsets of imagery, developing algorithms from a limited number of pictures.

The more pictures that have labeled content, the better and faster the algorithms learn. Just like in school, the more information you can study, the better you will do on the test. Dr. Ariell Friedman, from the Marine Robotics group at the Australian Centre for Field Robotics, University of Sydney, has developed a tool that allows the public to help with the image algorithms. The web tool that Dr. Friedman created is called “Squidle,” and it makes this process fast, fun, and useful. He advocates heavily for citizen science. “The idea behind the Citizen Science endeavor is to engage the general public, people who want to help advance science and who are interested in seeing images from places that not many people in the world ever get to see.”

Citizen science’s contribution

By looking through these images and labeling them, the public can assist scientists in a meaningful way. Experts can use the citizen science labels to target areas for specific pursuits and annotation. The labels can also be used to extrapolate classification results beyond the areas that the experts have examined. Citizen science participation can also be used to train the computer algorithms that do the classifications.
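As an illustration of how citizen tags could feed into that training, here is a minimal sketch of combining multiple tags for the same image point by majority vote. The voting scheme, point IDs, and tags are assumptions for the example; the article does not describe how Squidle itself aggregates labels.

```python
# Minimal sketch: turn noisy citizen tags into consensus training labels
# by taking a majority vote among everyone who tagged the same image point.
# Point IDs and tags are hypothetical.
from collections import Counter

# Each entry: (image_point_id, tag submitted by one citizen scientist)
citizen_tags = [
    ("img042_pt1", "coral"), ("img042_pt1", "coral"), ("img042_pt1", "sand"),
    ("img042_pt2", "kelp"),  ("img042_pt2", "kelp"),
    ("img107_pt1", "sand"),  ("img107_pt1", "sand"), ("img107_pt1", "sand"),
]

votes = {}
for point_id, tag in citizen_tags:
    votes.setdefault(point_id, Counter())[tag] += 1

# Keep the winning tag for each point; these become training labels like
# the ones used in the classifier sketch earlier in this post.
consensus = {point_id: counts.most_common(1)[0][0] for point_id, counts in votes.items()}
print(consensus)  # {'img042_pt1': 'coral', 'img042_pt2': 'kelp', 'img107_pt1': 'sand'}
```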


This is where everyone can help. By participating in the image tagging on the Squidle site, anyone can help improve the robots that assist the scientists. Every individual has the opportunity to advance marine science, increase the efficiency of algorithmic learning, and advance human/robot interaction. Please visit the Squidle site and tag a few photos.

