Abstract:
Link adaptation, multiuser resource scheduling, and adaptive MIMO precoding are
implemented in the Long Term Evolution (LTE) downlink to improve spectral
efficiency and make effective use of the available radio resources. These
processes require the transmitter to have accurate knowledge of the channel
state information (CSI), which is typically provided via feedback from the
receiver. Due to processing and feedback delays, the CSI available at the
transmitter is outdated, which degrades performance and reduces the overall
system capacity.
Channel prediction is an important technique for mitigating the degradation
that arises from this inevitable feedback delay. Minimum mean square error
(MMSE) based algorithms have been shown to perform well in channel estimation
and prediction. However, this superior performance comes at the cost of high
computational complexity, due to the required matrix inversion as well as the
large size of the channel matrix.
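To make the source of this cost concrete, a generic linear MMSE predictor of a
channel vector h from an observation vector y can be written as

    \hat{h} = R_{hy} R_{yy}^{-1} y, \quad R_{hy} = E[h y^H], \quad R_{yy} = E[y y^H],

where inverting the N × N correlation matrix R_{yy} requires on the order of
N^3 operations. One standard way to approximate such an inversion iteratively
is a truncated Neumann series,

    R_{yy}^{-1} \approx \sum_{k=0}^{K} (I - D^{-1} R_{yy})^k D^{-1},

with D a cheaply invertible part of R_{yy} (for example, its diagonal); the
truncated sum converges to the exact inverse when the spectral radius of
I - D^{-1} R_{yy} is less than one. This is a textbook sketch for illustration
only; the exact system model, and whether the iterative approximation used in
this work takes this Neumann form, are assumptions rather than statements of
the thesis's method.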
In this thesis, the problem of channel aging on the LTE downlink is discussed.
After a review of the LTE architecture and its MIMO-OFDM radio interface, an
overview of transmission over wireless fading channels is presented. A system
model for block fading channels is then introduced and an MMSE channel prediction
method is derived. A reduced-complexity approximate MMSE (AMMSE) channel
prediction algorithm is then proposed to alleviate the high computational
complexity inherent in the MMSE method. The complexity reduction is achieved
by reducing the size of the channel matrix and by approximating the matrix
inversion iteratively. Evaluation of the proposed approximate MMSE algorithm