Orthogonal frequency division multiplexing (OFDM) is one of the most promising techniques for future high-data-rate wireless communication systems. In an OFDM system, demodulating the data bits on each subcarrier requires knowledge of the reference phase and amplitude of the constellation on that subcarrier, so accurate channel estimation is essential. Conventional discrete Fourier transform (DFT)-based channel estimation improves performance by zero-substituting the noise-only part of the time-domain channel response. Compared with linear minimum mean square error (LMMSE) estimation, it requires only knowledge of the maximum channel delay time, and the mean square error (MSE) floor [5] does not occur at high SNR. Despite these advantages, the performance of DFT-based estimation is very sensitive to the maximum channel delay time. Consequently, conventional DFT-based estimation fixes the zero-substitution size to the largest channel delay expected in the application scenario, to prevent the MSE floor caused by discarding part of the channel impulse response.
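To make the conventional scheme concrete, the following is a minimal sketch (the function name, subcarrier count, and channel taps are illustrative assumptions, not taken from the paper): take the frequency-domain least-squares estimate, transform it to the time domain, zero every tap at or beyond the assumed maximum delay, and transform back.

```python
import numpy as np

def dft_channel_estimate(H_ls, max_delay):
    """Conventional DFT-based refinement of an LS channel estimate.

    H_ls      : frequency-domain least-squares estimate, one value per subcarrier
    max_delay : assumed maximum channel delay in samples; taps at or beyond
                this index are treated as noise only and zero-substituted.
    """
    h = np.fft.ifft(H_ls)      # time-domain channel impulse response
    h[max_delay:] = 0          # zero-substitute the noise-only part
    return np.fft.fft(h)       # back to the frequency domain

# Toy example: a 2-tap channel observed on 64 subcarriers with additive noise.
rng = np.random.default_rng(0)
N = 64
h_true = np.zeros(N, dtype=complex)
h_true[0], h_true[3] = 1.0, 0.5
H_true = np.fft.fft(h_true)
H_ls = H_true + 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

H_dft = dft_channel_estimate(H_ls, max_delay=8)
mse_ls = np.mean(np.abs(H_ls - H_true) ** 2)
mse_dft = np.mean(np.abs(H_dft - H_true) ** 2)
# Only 8 of the 64 noisy time-domain taps are kept, so the noise power
# in the refined estimate drops accordingly and mse_dft < mse_ls.
```

Note how the choice of `max_delay` embodies the weakness described above: if it is set smaller than the true channel delay, real taps are discarded and an MSE floor appears; if it is fixed at the worst-case delay, extra noise taps are retained in shorter channels.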
In this paper, we propose an improved DFT-based channel estimation method for OFDM. The proposed estimator improves performance across varying channel environments by identifying the significant channel taps. The algorithm does not require any information about the channel statistics, since it selects the significant taps in the time domain based on a threshold; we propose the threshold that minimizes the MSE of the estimate, and this threshold is confirmed by computer simulation.
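The significant-tap idea can be sketched as follows (the threshold rule used here, a multiple of the average tap power, is an illustrative placeholder and not the MSE-optimal threshold derived in the paper): instead of zeroing all taps beyond a fixed maximum delay, keep only the time-domain taps whose power exceeds the threshold.

```python
import numpy as np

def threshold_channel_estimate(H_ls, threshold):
    """DFT-based estimation with threshold-based significant-tap selection.

    Keeps only the time-domain taps whose power exceeds `threshold`;
    no prior knowledge of the maximum channel delay is required.
    """
    h = np.fft.ifft(H_ls)
    h[np.abs(h) ** 2 < threshold] = 0   # discard insignificant (noise) taps
    return np.fft.fft(h)

# Toy example with a late tap at index 20 that a short fixed
# zero-substitution window would wrongly discard.
rng = np.random.default_rng(1)
N = 64
h_true = np.zeros(N, dtype=complex)
h_true[0], h_true[5], h_true[20] = 1.0, 0.6, 0.3
H_true = np.fft.fft(h_true)
H_ls = H_true + 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

# Illustrative threshold (assumption): three times the mean tap power.
h_ls = np.fft.ifft(H_ls)
thr = 3 * np.mean(np.abs(h_ls) ** 2)
H_est = threshold_channel_estimate(H_ls, thr)
```

Because the selection is data-driven, the late tap at index 20 survives while the pure-noise taps are zeroed, which is the behavior that lets the proposed method avoid the MSE floor without knowing the maximum channel delay.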
In the computer simulation, we first set up the application scenario and define the worst-case channel, which has the longest maximum channel delay, and the best-case channel, which has the shortest maximum channel delay. Simulation results demonstrate that the proposed algorithm improves the MSE by about 5 dB in the simulation scenario and that the MSE floor does not occur in any channel environment.