Wave-tracking in the surf zone using coastal video imagery with deep neural networks

Jinah Kim, Jaeil Kim, Taekyung Kim, Dong Huh, Sofia Caires

Research output: Contribution to journal › Article › peer-review

18 Scopus citations

Abstract

In this paper, we propose a series of procedures for coastal wave-tracking using coastal video imagery with deep neural networks. The framework consists of three stages: video enhancement, hydrodynamic scene separation, and wave-tracking. First, a generative adversarial network, trained on paired raindrop-distorted and clean videos, is applied to remove image distortions caused by raindrops and to restore the background information of coastal waves. Next, a hydrodynamic scene of propagating wave information is separated from the surrounding environmental information in the enhanced coastal video imagery using a deep autoencoder network. Finally, propagating waves are tracked by registering consecutive images in the quality-enhanced, scene-separated coastal video imagery using a spatial transformer network. The instantaneous wave speed of each individual wave crest and breaker in the video domain is successfully estimated by learning the behavior of transformed and propagating waves in the surf zone with deep neural networks. Since the framework enables the acquisition of spatio-temporal information of the surf zone through the characterization of wave breakers, including wave run-up, we expect the proposed deep-neural-network framework to lead to an improved understanding of nearshore wave dynamics.
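The final stage above, registering consecutive frames to recover per-frame wave displacement and hence instantaneous wave speed, can be illustrated with a classical baseline. The sketch below is not the paper's spatial transformer network; it uses plain phase correlation on synthetic frames, and the pixel size and frame rate in the speed conversion are hypothetical values, not from the paper.

```python
import numpy as np

def estimate_wave_shift(frame_a, frame_b):
    """Estimate the per-frame pixel displacement of a propagating wave
    pattern between two consecutive video frames via phase correlation.
    This is a classical stand-in for the learned registration the paper
    performs with a spatial transformer network."""
    # Normalized cross-power spectrum of the two frames.
    cross = np.fft.fft2(frame_b) * np.conj(np.fft.fft2(frame_a))
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft2(cross).real
    # The correlation peak marks the translation; unwrap to signed offsets.
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > frame_a.shape[0] // 2:
        dy -= frame_a.shape[0]
    if dx > frame_a.shape[1] // 2:
        dx -= frame_a.shape[1]
    return dy, dx

# Synthetic surf-zone frames: shore-parallel wave crests plus a little
# broadband texture so the registration is well posed, advected 3 px
# shoreward between frames.
h, w = 64, 64
rng = np.random.default_rng(42)
x = np.arange(w)[None, :]
frame_a = np.sin(2 * np.pi * x / 16) * np.ones((h, 1))
frame_a += 0.05 * rng.standard_normal((h, w))
frame_b = np.roll(frame_a, 3, axis=1)

dy, dx = estimate_wave_shift(frame_a, frame_b)  # -> (0, 3)
# Hypothetical camera calibration: 0.1 m per pixel, 5 frames per second.
speed_mps = dx * 0.1 * 5  # instantaneous wave speed in m/s
```

In the actual framework this translation estimate would be produced per wave crest by the trained network after the enhancement and scene-separation stages; the conversion from pixel displacement to metric speed still requires a geo-rectified camera calibration.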

Original language: English
Article number: 304
Journal: Atmosphere
Volume: 11
Issue number: 3
DOIs
State: Published - 1 Mar 2020

Keywords

  • Coastal video imagery
  • Coastal wave-tracking
  • Deep neural networks
  • Hydrodynamic scene separation
  • Image registration
  • Video enhancement
