Reconstructing Sea Level Variability at the Ieodo Ocean Research Station (1993–2023) Using Artificial Intelligence, Machine Learning, and Reanalysis Integration
Abstract. This study presents a comprehensive approach for reconstructing a high-quality, continuous monthly sea level time series at the Ieodo Ocean Research Station (IORS) from 1993 to 2023 using artificial intelligence (AI) and machine learning (ML) models. After applying quality control to the in-situ KIOST data, including an inverse barometer (IB) correction, 3σ outlier filtering, and a 75% data coverage threshold, we validated trends against nearby PSMSL tide gauges and four ocean reanalysis datasets (CMEMS, GLORYS, ORAS5, HYCOM). The trend analysis showed a higher rate of sea level rise in the in-situ data (4.94 mm/yr, Oct 2003–Dec 2023) than in the satellite- and model-based estimates (e.g., CMEMS: 3.53 mm/yr, Jan 1993–Dec 2023), suggesting locally enhanced sea level rise in the East China Sea. Initial gap-filling used statistical models such as harmonic regression and regression-based climatology; a blended approach combining climatology and trend components achieved the best accuracy among these methods (RMSE ~0.056 m, R² = 0.688). We then implemented a range of AI/ML models within an Iterative Imputer framework. Ensemble models (e.g., XGBoost) fit the observed record after October 2003 almost perfectly but generalized poorly to the earlier, unobserved period. Deep learning models such as LSTM and GRU effectively captured seasonal and nonlinear patterns post-2003, with LSTM achieving RMSE = 0.023 m and R² = 0.95. The time series models Prophet and SARIMA-SIN successfully reconstructed the full 1993–2023 series, with SARIMA-SIN estimating the highest trend (5.61 mm/yr). Multiple linear regression on the reanalysis data served as a baseline, but the AI/ML models outperformed it in both accuracy and generalization. This study provides a reproducible, interpretable, and physically consistent framework for reconstructing sea level variability in semi-enclosed coastal seas.
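To make the workflow summarized above concrete, the following minimal Python sketch (not the authors' code) illustrates two of the described steps under simple assumptions: 3σ outlier filtering with a 75% monthly coverage threshold, and gap-filling of the IORS series with reanalysis predictors via scikit-learn's IterativeImputer. The inverse barometer correction is assumed to have been applied beforehand; the column name `iors_sla` and the use of HistGradientBoostingRegressor (in place of XGBoost) are illustrative choices only.

```python
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import HistGradientBoostingRegressor


def quality_control(hourly: pd.Series, coverage: float = 0.75, sigma: float = 3.0) -> pd.Series:
    """Reduce an hourly, IB-corrected sea level series to QC'd monthly means.

    Values beyond `sigma` standard deviations of the mean are discarded,
    and a monthly mean is retained only if at least `coverage` of that
    month's hourly samples survive the filter.
    """
    mu, sd = hourly.mean(), hourly.std()
    clean = hourly.where((hourly - mu).abs() <= sigma * sd)
    monthly = clean.resample("MS").mean()
    frac = clean.resample("MS").count() / hourly.resample("MS").size()
    return monthly.where(frac >= coverage)


def fill_gaps(df: pd.DataFrame, target: str = "iors_sla") -> pd.Series:
    """Impute gaps in the target column using the other (reanalysis) columns
    as predictors inside scikit-learn's IterativeImputer."""
    imputer = IterativeImputer(
        estimator=HistGradientBoostingRegressor(),  # an XGBoost regressor could be substituted
        max_iter=10,
        random_state=0,
    )
    filled = pd.DataFrame(imputer.fit_transform(df), index=df.index, columns=df.columns)
    return filled[target]
```

In this setup, `df` would hold the QC'd monthly IORS series alongside co-located monthly values from CMEMS, GLORYS, ORAS5, and HYCOM, so that each iteration of the imputer regresses the gappy in-situ column on the complete reanalysis columns.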