{"id":2451,"date":"2020-01-20T12:18:05","date_gmt":"2020-01-20T12:18:05","guid":{"rendered":"https:\/\/blogs.mathworks.com\/headlines\/?p=2451"},"modified":"2020-01-20T12:18:05","modified_gmt":"2020-01-20T12:18:05","slug":"computer-vision-algorithm-removes-the-water-from-underwater-images","status":"publish","type":"post","link":"https:\/\/blogs.mathworks.com\/headlines\/2020\/01\/20\/computer-vision-algorithm-removes-the-water-from-underwater-images\/","title":{"rendered":"Computer vision algorithm removes the water from underwater images"},"content":{"rendered":"<p>Underwater photography is hard to get right. Special filters, artificial lights, and top-of-the-line underwater cameras can help, but there\u2019s still a lot of water between the camera and the object in the photo. We\u2019ve become accustomed to the blue-green tint of underwater photography.<\/p>\n<p>&nbsp;<\/p>\n<p><div id=\"attachment_2455\" style=\"width: 510px\" class=\"wp-caption alignnone\"><a href=\"https:\/\/blogs.mathworks.com\/headlines\/2020\/01\/20\/computer-vision-algorithm-removes-the-water-from-underwater-images\/8682-before\/\" rel=\"attachment wp-att-2455\"><img aria-describedby=\"caption-attachment-2455\" decoding=\"async\" loading=\"lazy\" class=\"wp-image-2455\" src=\"https:\/\/blogs.mathworks.com\/headlines\/files\/2020\/01\/8682-before.jpeg\" alt=\"\" width=\"500\" height=\"333\" \/><\/a><p id=\"caption-attachment-2455\" class=\"wp-caption-text\">Before: A coral reef in the Red Sea, Israel. Image Credit Matan Yuval, Marine Imaging Lab, University of Haifa<\/p><\/div><\/p>\n<p>&nbsp;<\/p>\n<p>How would the ocean look without water? \u00a0What are the true colors of a coral reef? 
Thanks to a new computer vision algorithm called <a href=\"https:\/\/www.deryaakkaynak.com\/sea-thru\" target=\"_blank\" rel=\"noopener noreferrer\">Sea-thru,<\/a> we can see what an underwater scene would look like if it were photographed through air instead of water.<\/p>\n<p>&nbsp;<\/p>\n<p><div id=\"attachment_2459\" style=\"width: 510px\" class=\"wp-caption alignnone\"><a href=\"https:\/\/blogs.mathworks.com\/headlines\/2020\/01\/20\/computer-vision-algorithm-removes-the-water-from-underwater-images\/8682-after\/\" rel=\"attachment wp-att-2459\"><img aria-describedby=\"caption-attachment-2459\" decoding=\"async\" loading=\"lazy\" class=\"wp-image-2459 \" src=\"https:\/\/blogs.mathworks.com\/headlines\/files\/2020\/01\/8682-after.jpg\" alt=\"\" width=\"500\" height=\"333\" \/><\/a><p id=\"caption-attachment-2459\" class=\"wp-caption-text\">After: The same image as above, after processing with the Sea-thru algorithm. Image Credit: Matan Yuval, Marine Imaging Lab, University of Haifa<\/p><\/div><\/p>\n<p>&nbsp;<\/p>\n<p>Sea-thru removes the visual distortion that occurs as light travels through water, producing a color-accurate image. Sea-thru was developed by Dr. Derya Akkaynak, an engineer and oceanographer, and Dr. Tali Treibitz, an electrical engineer. The results are not only stunning but also physically accurate.<\/p>\n<p>According to <em><a href=\"https:\/\/www.scientificamerican.com\/article\/sea-thru-brings-clarity-to-underwater-photos1\/\" target=\"_blank\" rel=\"noopener noreferrer\">Scientific American<\/a><\/em>, \u201cSea-thru&#8217;s image analysis factors in the physics of light absorption and scattering in the atmosphere, compared with that in the ocean, where the particles that light interacts with are much larger. 
Then the algorithm effectively reverses image distortion from water pixel by pixel, restoring lost colors.\u201d<\/p>\n<p>&nbsp;<\/p>\n<p><div style=\"width: 546px\" class=\"wp-caption alignnone\"><a href=\"https:\/\/inteng-storage.s3.amazonaws.com\/img\/iea\/4N61a78b6J\/sizes\/untitled-design-51_resize_md.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"\" src=\"https:\/\/inteng-storage.s3.amazonaws.com\/img\/iea\/4N61a78b6J\/sizes\/untitled-design-51_resize_md.png\" width=\"536\" height=\"302\" \/><\/a><p class=\"wp-caption-text\">Image Credit: Tom Shlesinger, Institute of Global Ecology, Florida Tech<\/p><\/div><\/p>\n<p>&nbsp;<\/p>\n<p>Akkaynak developed <a href=\"http:\/\/openaccess.thecvf.com\/content_CVPR_2019\/papers\/Akkaynak_Sea-Thru_A_Method_for_Removing_Water_From_Underwater_Images_CVPR_2019_paper.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Sea-thru<\/a> as a post-doctoral fellow at the <a href=\"http:\/\/csms.haifa.ac.il\/profiles\/tTreibitz\/index.html\" target=\"_blank\" rel=\"noopener noreferrer\">Marine Imaging Lab<\/a>, run by Dr. Tali Treibitz, at the University of Haifa. Sea-thru is licensed to SeaErra Ltd.<\/p>\n<p>The reason for the blue-green tint in underwater photos is the way light travels through deep water, where blue and violet wavelengths are absorbed least. The more water the light travels through, the less red, orange, and yellow light reaches the object. In coastal waters, blue and green light is absorbed faster, leaving more red light and causing the dominant brown hue.<\/p>\n<p>There\u2019s also an issue of small particles in the water, which create backscatter or haze. 
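<\/p>
<p>The wavelength-dependent absorption described above follows the Beer-Lambert law. The sketch below is in Python rather than the MATLAB the researchers used, and the attenuation coefficients are illustrative, order-of-magnitude assumptions, not measured values; it shows why red light all but vanishes after a few meters of water while blue survives:<\/p>

```python
import math

# Illustrative diffuse attenuation coefficients for open ocean water,
# in inverse meters. These are assumed, order-of-magnitude values:
# water absorbs red light far more strongly than blue.
BETA = {"red": 0.35, "green": 0.07, "blue": 0.02}

def transmitted_fraction(channel, path_length_m):
    # Beer-Lambert law: fraction of light surviving a water path.
    return math.exp(-BETA[channel] * path_length_m)

for channel in ("red", "green", "blue"):
    print(channel, round(transmitted_fraction(channel, 10.0), 3))
```

<p>With these assumed coefficients, only about 3% of red light survives a 10-meter path, versus roughly 82% of blue light, which is exactly the imbalance that produces the familiar blue-green cast.<\/p>
<p>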
The further the photographer is from the object in the photo, the hazier the object becomes, much like looking at an object through a light fog.<\/p>\n<h2>How Sea-thru works<\/h2>\n<p><a href=\"https:\/\/www.deryaakkaynak.com\/research\" target=\"_blank\" rel=\"noopener noreferrer\">Sea-thru<\/a> works on raw images or videos taken with natural lighting, removing the need for expensive, difficult-to-set-up artificial underwater lighting. It also helps in reconstructing images of objects that are farther from the light source, where underwater strobes don&#8217;t reach.<\/p>\n<p>Sea-thru is a physics-based color-reconstruction algorithm designed for underwater RGB-D images, where <em>D<\/em> stands for the <em>distance<\/em> from the camera to the object. The algorithm requires the distance from the camera to each pixel in the scene, as almost all parameters governing the loss of color and contrast depend on distance in some non-linear way.<\/p>\n<p>For her research, Akkaynak created a distance (range) map by capturing multiple images of the same scene. She captured the images with a single camera from slightly different angles so that they all overlap. Included in the images is a color chart placed next to the object of interest. 
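<\/p>
<p>The distance dependence just described is usually written as an image formation model. The sketch below implements the forward model from the Sea-thru paper in Python; the variable names and broadcasting layout are my own, and in the published method the attenuation coefficients themselves also vary with range, which this simplified version ignores:<\/p>

```python
import numpy as np

def underwater_image(J, z, beta_D, beta_B, B_inf):
    # Forward model from the Sea-thru paper:
    #   I_c = J_c * exp(-beta_D_c * z) + B_inf_c * (1 - exp(-beta_B_c * z))
    # J      : true scene color per pixel, shape (..., 3)
    # z      : camera-to-pixel range in meters, shape (...)
    # beta_D : direct-signal attenuation coefficients, shape (3,)
    # beta_B : backscatter coefficients, shape (3,)
    # B_inf  : veiling light (water color at infinite range), shape (3,)
    z = np.asarray(z, dtype=float)[..., None]   # broadcast range over channels
    direct = np.asarray(J) * np.exp(-np.asarray(beta_D) * z)
    backscatter = np.asarray(B_inf) * (1.0 - np.exp(-np.asarray(beta_B) * z))
    return direct + backscatter
```

<p>At zero range the observed color equals the true color; as range grows, every pixel converges to the veiling color of the water, the haze the article describes. Sea-thru estimates these parameters from the image and the range map and then inverts the model.<\/p>
<p>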
Then, using commercial photogrammetry software (Agisoft Metashape Professional), she built a <a href=\"https:\/\/sketchfab.com\/Marine_Imaging_Lab\/collections\/sea-thru-before-after\" target=\"_blank\" rel=\"noopener noreferrer\">3D reconstruction<\/a> of each scene from these images and exported the distance maps once the reconstruction was complete.<\/p>\n<p>&nbsp;<\/p>\n<p><div id=\"attachment_2465\" style=\"width: 759px\" class=\"wp-caption alignnone\"><a href=\"https:\/\/blogs.mathworks.com\/headlines\/2020\/01\/20\/computer-vision-algorithm-removes-the-water-from-underwater-images\/sea-thru-method-maps\/\" rel=\"attachment wp-att-2465\"><img aria-describedby=\"caption-attachment-2465\" decoding=\"async\" loading=\"lazy\" class=\"wp-image-2465 \" src=\"https:\/\/blogs.mathworks.com\/headlines\/files\/2020\/01\/Sea-thru-method-maps.png\" alt=\"\" width=\"749\" height=\"200\" \/><\/a><p id=\"caption-attachment-2465\" class=\"wp-caption-text\">The darkest pixels used in the backscatter estimation are shown in red. Full research poster <a href=\"https:\/\/www.deryaakkaynak.com\/research\" target=\"_blank\" rel=\"noopener noreferrer\">here<\/a>. Image credit: Akkaynak et al.<\/p><\/div><\/p>\n<p>&nbsp;<\/p>\n<p>The range maps are grayscale .<em>tiff<\/em> files, which are then read into <a href=\"https:\/\/www.mathworks.com\/products\/matlab.html\" target=\"_blank\" rel=\"noopener noreferrer\">MATLAB<\/a> and combined with the RAW images. The images are processed according to this <a href=\"https:\/\/rcsumner.net\/raw_guide\/RAWguide.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">guide<\/a>. MATLAB is used to estimate backscatter by searching the image for very dark or shadowed pixels. In the image above, the darkest pixels used in backscatter estimation are shown in red. 
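<\/p>
<p>A much-simplified stand-in for this dark-pixel backscatter step, plus the subtraction and inversion that follow, might look like the sketch below. It is Python rather than MATLAB, the function and variable names are hypothetical, and the published method actually fits the full image formation model to the dark pixels and estimates range-dependent coefficients, which this version does not:<\/p>

```python
import numpy as np

def estimate_backscatter(img, depth, n_bins=10, pct=1.0):
    # Partition the scene into range bins; within each bin, treat the
    # darkest pct percent of pixels (shadowed, near-zero true signal)
    # as pure backscatter for that range.
    edges = np.linspace(depth.min(), depth.max(), n_bins + 1)
    B = np.zeros_like(img, dtype=float)
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (depth >= lo) & (depth <= hi)
        if not mask.any():
            continue
        pixels = img[mask].reshape(-1, 3)
        order = np.argsort(pixels.sum(axis=1))       # darkest first
        n_dark = max(1, int(len(pixels) * pct / 100.0))
        B[mask] = pixels[order[:n_dark]].mean(axis=0)
    return B

def recover(img, depth, beta_D):
    # Subtract the backscatter, then undo per-channel attenuation.
    direct = np.clip(img - estimate_backscatter(img, depth), 0.0, None)
    return direct * np.exp(beta_D * depth[..., None])
```

<p>The inversion in the last line is the exponential attenuation model run in reverse: each channel is amplified by how much of it the water along that pixel\u2019s path would have absorbed.<\/p>
<p>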
The backscatter in the images is subtracted with MATLAB.<\/p>\n<p>In the next step, Sea-thru computes the relevant color-attenuation parameters using only the image pixels and the depth map D. The attenuation of each wavelength of light is calculated with MATLAB, and with this information Sea-thru inverts the image to reveal its true colors.<\/p>\n<p>&nbsp;<\/p>\n<p><div style=\"width: 647px\" class=\"wp-caption alignnone\"><a href=\"https:\/\/image.businessinsider.com\/5dd69e19fd9db23c606b0e92?width=1300&amp;format=jpeg&amp;auto=webp\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"\" src=\"https:\/\/image.businessinsider.com\/5dd69e19fd9db23c606b0e92?width=1300&amp;format=jpeg&amp;auto=webp\" alt=\"\" width=\"637\" height=\"319\" \/><\/a><p class=\"wp-caption-text\">A coral reef in the Red Sea, Israel. Image Credit: Matan Yuval, Marine Imaging Lab, University of Haifa<\/p><\/div><\/p>\n<p>&nbsp;<\/p>\n<p>Sea-thru works on videos as well as still images. See a side-by-side video from Lake Tanganyika, Zambia, below:<\/p>\n<p><iframe loading=\"lazy\" title=\"Sea-thru on video: Lake Tanganyika 2\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/oSrBMX8e6yo?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe><\/p>\n<p>Video credit: Alex Jordan<\/p>\n<h2>Accurate Images Help Researchers<\/h2>\n<p>Climate researchers and oceanographers study underwater ecosystems such as coral reefs to better understand both their current state and how these systems change over time. These studies often rely on imaging and data technology to document and understand the impact of climate change on corals and other marine systems. Sea-thru can help these efforts by providing a color-true representation of the image data. 
Accurate colors in images will make it easier for machine learning tools to accurately identify species in the images, for example.<\/p>\n<p>The algorithm differs from applications such as Photoshop, with which users can artificially enhance underwater images by uniformly pumping up reds or yellows.<\/p>\n<p>\u201cWhat I like about this approach is that it&#8217;s really about obtaining true colors,\u201d Pim Bongaerts, a coral biologist at the California Academy of Sciences, told <em>Scientific American<\/em>. \u201cGetting true color could really help us get a lot more worth out of our current data sets.\u201d<\/p>\n<p>&nbsp;<\/p>\n<p><div style=\"width: 710px\" class=\"wp-caption alignnone\"><a href=\"https:\/\/image.businessinsider.com\/5dd6a6f2fd9db24508465a66?width=700&amp;format=jpeg&amp;auto=webp\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large\" src=\"https:\/\/image.businessinsider.com\/5dd6a6f2fd9db24508465a66?width=700&amp;format=jpeg&amp;auto=webp\" width=\"700\" height=\"350\" \/><\/a><p class=\"wp-caption-text\">A coral reef in the Red Sea, Israel. Image Credit: Matan Yuval, Marine Imaging Lab, University of Haifa<\/p><\/div><\/p>\n<p>&nbsp;<\/p>\n<p>\u201cThere are a lot of challenges associated with working underwater that put us well behind what researchers can do above water and on land,\u201d says Nicole Pedersen, a researcher on the 100 Island Challenge, a project at the University of California, San Diego. For this project, scientists take up to 7,000 pictures per 100 square meters to assemble 3-D models of reefs.<\/p>\n<p>\u201cProgress has been hindered by a lack of computer tools for processing these images,\u201d Pedersen told <em>Scientific American<\/em>. 
\u201cSea-thru is a step in the right direction.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"<div class=\"overview-image\"><!-- Featured Image From URL plugin --> <img decoding=\"async\" src=\"https:\/\/image.businessinsider.com\/5dd69e19fd9db23c606b0e92?width=1300&#038;format=jpeg&#038;auto=webp\" alt=\"\" style=\"\"><\/div>\n<p>Underwater photography is hard to get right. Special filters, artificial lights, and top-of-the-line underwater cameras can help, but there\u2019s still a lot of water between the camera and the object in&#8230; <a class=\"read-more\" href=\"https:\/\/blogs.mathworks.com\/headlines\/2020\/01\/20\/computer-vision-algorithm-removes-the-water-from-underwater-images\/\">read more >><\/a><\/p>\n","protected":false},"author":138,"featured_media":-1,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[1],"tags":[],"_links":{"self":[{"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/posts\/2451"}],"collection":[{"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/users\/138"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/comments?post=2451"}],"version-history":[{"count":7,"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/posts\/2451\/revisions"}],"predecessor-version":[{"id":2473,"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/posts\/2451\/revisions\/2473"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/"}],"wp:attachment":[{"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/media?parent=2451"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.mathworks.com\/headline
s\/wp-json\/wp\/v2\/categories?post=2451"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/tags?post=2451"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}