Capturing coral reefs on camera without artificial lights often turns out to be disappointing due to the overall bluish hue of the images. While casual photographers can get around this with a few minutes of photo editing, the color distortion limits researchers from applying computer vision and machine learning techniques to study these reefs. To address this problem, engineer and oceanographer Derya Akkaynak has built an algorithm called Sea-thru.
As the name hints, the algorithm aims to remove the color distortions introduced by the water and recover the actual colors of coral reefs. Akkaynak and Tali Treibitz presented a paper on the method at the IEEE Conference on Computer Vision and Pattern Recognition this June.
“The Sea-thru method first calculates backscatter using the darkest pixels in the image and their known range information. Then, it uses an estimate of the spatially varying illuminant to obtain the range-dependent attenuation coefficient,” states Akkaynak in the research paper.
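The quoted steps follow the standard underwater image formation model: the raw color at each pixel is the true color attenuated with range, plus range-dependent backscatter. As a rough illustration only (not Akkaynak's actual implementation), here is a toy Python sketch that inverts that model assuming known, constant per-channel coefficients — the names `b_inf`, `beta_b`, and `beta_d` are illustrative placeholders, and the real Sea-thru estimates spatially varying coefficients from the image itself:

```python
import numpy as np

def sea_thru_sketch(image, depth, b_inf, beta_b, beta_d):
    """Toy color restoration under a simple underwater image
    formation model: raw = true * exp(-beta_d * z) + backscatter.

    image  : (H, W, 3) float array, raw colors in [0, 1]
    depth  : (H, W) float array, camera-to-scene range in meters
    b_inf  : (3,) veiling-light color at infinite range (assumed known)
    beta_b : (3,) backscatter coefficient per channel (assumed known)
    beta_d : (3,) attenuation coefficient per channel (assumed known)
    """
    z = depth[..., None]  # broadcast range over the color channels
    # Step 1: estimate and remove backscatter (veiling light)
    backscatter = b_inf * (1.0 - np.exp(-beta_b * z))
    direct = np.clip(image - backscatter, 0.0, 1.0)
    # Step 2: undo the range-dependent attenuation
    restored = direct * np.exp(beta_d * z)
    return np.clip(restored, 0.0, 1.0)
```

On a synthetic image generated with the same forward model, this inversion recovers the original colors exactly; on real photos, the hard part — which is Sea-thru's contribution — is estimating those coefficients from the image and its range map.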
However, the algorithm needs distance information to work. Currently, this is obtained by capturing several images of the same scene from different angles, from which the algorithm estimates the distance to each point. A report from Scientific American notes that scientists already include distance information in image datasets by using a technique called photogrammetry.
“What I like about this approach is that it’s really about obtaining true colors. Getting true color could really help us get a lot more worth out of our current data sets,” says Pim Bongaerts, a coral biologist at the California Academy of Sciences.
So, what are your thoughts on Sea-thru? Do you think it is a step in the right direction for exploring and analyzing coral reefs? Let us know in the comments.