Deep Learning-Based Channel Estimation in OFDM Systems for Time-Varying Rayleigh Fading Channels

Document Type : Research Paper

Authors

1 Electrical Department

2 Electrical and Computer Engineering, Malek Ashtar University, Tehran, Iran

Abstract

For Orthogonal Frequency Division Multiplexing (OFDM) systems operating in environments with high mobility and non-stationary channel characteristics, channel estimation is a very challenging task. To handle this issue, a deep learning (DL)-based channel estimation and data extraction algorithm is proposed. The purpose of this paper is to analyze the DL-based OFDM data extraction algorithm in time-varying Rayleigh fading channels; the model is also examined in time-invariant environments. The proposed long short-term memory with projection layer (LSTMP) model not only exploits the features of channel variation from previous channel estimates, but also extracts additional features from the pilots and received signals. Furthermore, the LSTMP builds on the LSTM estimate to further improve channel estimation performance, reducing complexity while increasing accuracy. The LSTMP is first trained offline with simulated data and then tracks the dynamic channel online. Simulation results show that the proposed LSTMP model adapts to the characteristics of time-varying channels more effectively than conventional algorithms. Additionally, the trade-off between accuracy and complexity is discussed and compared with those of the Convolutional Neural Network (CNN) and LSTM models.
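The abstract describes an LSTM with a recurrent projection layer (LSTMP) that is trained offline on simulated fading data and then used online for data extraction. As a minimal illustrative sketch only, and not the authors' exact architecture, the snippet below builds such an estimator in PyTorch, whose nn.LSTM exposes the projection via the proj_size argument; the subcarrier count, hidden size, frame length, and bit-level readout are all assumptions for illustration.

```python
# Hypothetical LSTMP-based OFDM data-extraction model (a sketch, not the paper's exact design).
import torch
import torch.nn as nn

class LSTMPEstimator(nn.Module):
    def __init__(self, n_subcarriers=64, hidden_size=256, proj_size=64, n_bits=128):
        super().__init__()
        # Input per OFDM symbol: real and imaginary parts of the received
        # frequency-domain samples (pilots + data) stacked together.
        self.lstmp = nn.LSTM(
            input_size=2 * n_subcarriers,
            hidden_size=hidden_size,
            proj_size=proj_size,   # recurrent projection layer (the "P" in LSTMP)
            batch_first=True,
        )
        # Map the projected hidden state of the last OFDM symbol to soft bit estimates.
        self.readout = nn.Linear(proj_size, n_bits)

    def forward(self, rx):                  # rx: (batch, n_ofdm_symbols, 2 * n_subcarriers)
        out, _ = self.lstmp(rx)             # out: (batch, n_ofdm_symbols, proj_size)
        return torch.sigmoid(self.readout(out[:, -1]))

# Assumed offline-training step on simulated Rayleigh-fading data (random placeholders here).
model = LSTMPEstimator()
rx = torch.randn(32, 8, 128)                # 32 frames, 8 OFDM symbols, 64 complex subcarriers
bits = torch.randint(0, 2, (32, 128)).float()
loss = nn.functional.binary_cross_entropy(model(rx), bits)
loss.backward()
```

Projecting the hidden state down to proj_size is what keeps the recurrent weight matrices small, which is consistent with the abstract's claim that the projection layer reduces complexity relative to a plain LSTM of the same hidden size.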

Keywords