ISSN 2096-4498

   CN 44-1745/U


Tunnel Construction ›› 2024, Vol. 44 ›› Issue (8): 1643-1651. DOI: 10.3973/j.issn.2096-4498.2024.08.011


Prediction of Shield Attitude Using Deep Residual Long Short-Term Memory Model

ZHOU Kangmin1, 2, 3, CHENG Kang4, ZENG Shaoxiang2, 3, DING Zhi1, *, YU Song5, FENG Zhiguo5   

  (1. Key Laboratory of Safe Construction and Intelligent Maintenance for Urban Shield Tunnels of Zhejiang Province, Hangzhou 310015, Zhejiang, China; 2. College of Civil Engineering, Zhejiang University of Technology, Hangzhou 310014, Zhejiang, China; 3. Engineering Research Center of Ministry of Education for Renewable Energy Infrastructure Construction Technology, Hangzhou 310014, Zhejiang, China; 4. China Railway 11th Group Co., Ltd., Wuhan 430061, Hubei, China; 5. China Railway Major Bridge Reconnaissance & Design Institute Co., Ltd., Wuhan 430050, Hubei, China)
  • Online: 2024-08-20    Published: 2024-09-13

Abstract: Traditional machine learning models often fall short in accurately predicting shield attitude because their performance is limited as network depth increases. To address this issue, a shield attitude prediction method based on a deep residual long short-term memory (LSTM) model is proposed. The method integrates residual connections into the LSTM framework to counter network degradation and enhance the model's ability to learn long-term dependencies in shield tunneling time-series data. In addition, a Bayesian optimization algorithm is employed to fine-tune the hyperparameters of the prediction model. Validation on a real-world shield tunneling project in Zhejiang demonstrates that the deep residual LSTM model outperforms conventional LSTM and artificial neural network models. Taking the prediction of shield tail horizontal deviation as an example, the deep residual LSTM model achieves a coefficient of determination (R²) of 0.90 and a mean absolute error (MAE) of 0.76 mm. In comparison, the LSTM model yields an R² of 0.64 and an MAE of 1.08 mm, and the artificial neural network model an R² of 0.68 and an MAE of 1.93 mm. Furthermore, compared with the LSTM model, the deep residual LSTM model can effectively exploit more network layers (from 5 to 8), demonstrating the significant role of residual connections in preventing network degradation and improving feature learning.
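As an illustration of the architecture described in the abstract, the sketch below shows one possible way to stack LSTM layers with identity shortcuts in PyTorch. The class names, layer count, hidden size, and single-output head are illustrative assumptions only, not the authors' implementation; the paper's Bayesian hyperparameter tuning and data preprocessing are not reproduced here.

# Minimal sketch of a residual LSTM stack (PyTorch); names and sizes are hypothetical.
import torch
import torch.nn as nn

class ResidualLSTMBlock(nn.Module):
    """One LSTM layer whose output is added to its input (identity shortcut)."""
    def __init__(self, hidden_size: int):
        super().__init__()
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)

    def forward(self, x):
        out, _ = self.lstm(x)   # (batch, seq_len, hidden_size)
        return out + x          # residual connection mitigates network degradation

class DeepResidualLSTM(nn.Module):
    """Stack of residual LSTM blocks with a linear head predicting one
    attitude component (e.g., shield tail horizontal deviation) at the next step."""
    def __init__(self, n_features: int, hidden_size: int = 64, n_blocks: int = 8):
        super().__init__()
        self.input_proj = nn.Linear(n_features, hidden_size)
        self.blocks = nn.ModuleList(ResidualLSTMBlock(hidden_size) for _ in range(n_blocks))
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                 # x: (batch, seq_len, n_features)
        h = self.input_proj(x)
        for block in self.blocks:
            h = block(h)
        return self.head(h[:, -1, :])     # predict from the last time step

Because each block adds its input back to its output, gradients can bypass individual LSTM layers, which is the property the paper credits for allowing deeper stacks (8 layers rather than 5) without degradation.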

Key words: shield tunnel, long short-term memory, residual connection, machine learning, Bayesian optimization, attitude prediction