Volume 45, Issue 7 (Jul. 2023)
Citation: Cui Bin’ge, Yang Guang, Fang Xi, et al. Red tide detection using GF-1 WFV image based on deep learning method[J]. Haiyang Xuebao, 2023, 45(7): 147−157. doi: 10.12284/hyxb2023070

Red tide detection using GF-1 WFV image based on deep learning method

doi: 10.12284/hyxb2023070
  • Received Date: 2022-01-30
  • Revised Date: 2022-12-05
  • Available Online: 2022-12-26
  • Publish Date: 2023-07-01
  • Red tide is a major marine ecological disaster in China, and effective monitoring of its occurrence and spatial distribution is of great significance for prevention and control. Traditional red tide monitoring relies mainly on ocean color satellites, whose low spatial resolution leaves blind spots for the frequent small-scale red tides. GF-1 WFV remote sensing images, with their high spatial resolution and wide swath, can be used to monitor small-scale red tides, but methods developed for ocean color satellites cannot be applied directly to GF-1 WFV data, which have low spectral resolution and only a few bands; moreover, red tide patches vary widely in both shape and scale, making the information hard to extract. This paper therefore proposes a scale-adaptive red tide detection network (SARTNet) for GF-1 WFV images. The network adopts a two-layer backbone structure to integrate the shape and detail features of red tide, and introduces an attention mechanism to model the correlation between red tide features at different scales, improving detection of complexly distributed red tides. Experimental results show that SARTNet outperforms existing methods, with an F1 score above 0.89, and is less affected by environmental factors, with few missed or misclassified red tide pixels at different scales.
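The SARTNet architecture itself is not reproduced on this page, so the code below is only a minimal sketch of the idea the abstract describes, under stated assumptions: the "two-layer backbone" is interpreted as a dual-branch encoder (a full-resolution detail branch plus a downsampled context branch), and the attention mechanism is approximated by a squeeze-and-excitation-style channel attention applied to the concatenated features. All names (DualBranchRedTideNet, ChannelAttention), channel widths, and layer counts are hypothetical illustrations, not the published implementation.

```python
# Illustrative sketch only: not the paper's SARTNet implementation.
# Shows a dual-branch backbone whose multi-scale features are fused with
# channel attention before per-pixel classification of 4-band GF-1 WFV patches.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch, stride=1):
    """3x3 convolution + batch norm + ReLU."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style reweighting of fused feature channels."""

    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))            # global average pool -> channel weights
        return x * w.unsqueeze(-1).unsqueeze(-1)   # channel-wise reweighting


class DualBranchRedTideNet(nn.Module):
    """Toy two-branch segmentation network for 4-band GF-1 WFV patches."""

    def __init__(self, in_ch=4, num_classes=2):
        super().__init__()
        # Detail branch: shallow, keeps full spatial resolution.
        self.detail = nn.Sequential(conv_block(in_ch, 32), conv_block(32, 64))
        # Context branch: downsamples to capture larger-scale shape information.
        self.context = nn.Sequential(
            conv_block(in_ch, 32, stride=2),
            conv_block(32, 64, stride=2),
            conv_block(64, 64),
        )
        self.attention = ChannelAttention(128)
        self.classifier = nn.Conv2d(128, num_classes, 1)

    def forward(self, x):
        d = self.detail(x)
        c = self.context(x)
        c = F.interpolate(c, size=d.shape[2:], mode="bilinear", align_corners=False)
        fused = self.attention(torch.cat([d, c], dim=1))  # attention-weighted fusion
        return self.classifier(fused)                      # per-pixel red tide logits


if __name__ == "__main__":
    net = DualBranchRedTideNet()
    logits = net(torch.randn(1, 4, 256, 256))  # one 4-band 256x256 patch
    print(logits.shape)  # torch.Size([1, 2, 256, 256])
```

Running the script prints per-pixel logits for the red tide / non red tide classes on a single 4-band patch; in this sketch, the channel-attention step is where the correlation between the two branches' scales is modeled.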