Small Geodetic Datasets and Deep Networks: Attention-Based Residual LSTM Autoencoder Stacking for Geodetic Time Series
Metadata only
Date
2022
Type
Conference Paper
ETH Bibliography
yes
Abstract
When only a limited amount of data is available, deep learning models often do not generalize well. We propose a novel deep learning architecture to deal with this problem and achieve high prediction accuracy. To this end, we combine four different concepts: greedy layer-wise pretraining, attention via performers, residual connections, and LSTM autoencoder stacking. We present the application of the method in geodetic data science, for the prediction of length-of-day and GNSS station position time series, two of the most important problems in the field of geodesy. In these particular cases, where only relatively short time series are available, we achieve state-of-the-art performance compared to other statistical and machine learning methods.
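The greedy layer-wise pretraining and stacking idea from the abstract can be illustrated with a minimal sketch. As a simplifying assumption, plain linear autoencoders stand in for the paper's LSTM autoencoders, and the attention and residual components are omitted; each layer is trained on the codes produced by the previously trained layer.

```python
import numpy as np

def train_linear_autoencoder(X, hidden_dim, lr=0.01, epochs=200, seed=0):
    """Fit a single linear autoencoder (H = X W1, X_hat = H W2) by gradient descent."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(scale=0.1, size=(d, hidden_dim))  # encoder weights
    W2 = rng.normal(scale=0.1, size=(hidden_dim, d))  # decoder weights
    for _ in range(epochs):
        H = X @ W1
        err = H @ W2 - X                 # reconstruction error
        grad_W2 = H.T @ err / n          # gradient of 0.5*MSE w.r.t. W2
        grad_W1 = X.T @ (err @ W2.T) / n # gradient of 0.5*MSE w.r.t. W1
        W1 -= lr * grad_W1
        W2 -= lr * grad_W2
    return W1, W2

def greedy_pretrain(X, hidden_dims):
    """Greedy layer-wise pretraining: each autoencoder is trained on the
    codes of the layer below, then its encoder is frozen and stacked."""
    encoders, codes = [], X
    for h in hidden_dims:
        W1, _ = train_linear_autoencoder(codes, h)
        encoders.append(W1)
        codes = codes @ W1               # codes become input to the next layer
    return encoders, codes

# Hypothetical toy data: 100 windows of an 8-sample time series.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 8))
encoders, codes = greedy_pretrain(X, hidden_dims=[6, 4])
```

In the paper's setting, each stacked stage would be an LSTM autoencoder with attention and residual connections rather than a linear map; the sketch only shows the layer-by-layer training schedule that makes deep stacks trainable on small datasets.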
Publication status
published
Book title
Machine Learning, Optimization, and Data Science
Journal / series
Lecture Notes in Computer Science
Publisher
Springer
Subject
Deep learning; Residual learning; Greedy layer-wise pretraining; Attention; Geodetic time series
Organisational unit
09707 - Soja, Benedikt / Soja, Benedikt
Notes
Conference lecture held on October 8, 2021.