Many telecommunication systems permit only a short delay in the terminal, because a longer delay disturbs the conversation. Noise reduction processing must therefore itself introduce only a short delay. The well-known spectral subtraction method has been modified to meet this requirement and to use a variance-reduced gain function. The modification is accomplished by viewing spectral subtraction as an FFT-based filter design method that designs a time-varying filter for each block of data. The design, however, yields only the amplitude function, so a phase must be imposed to obtain a causal filter. Since spectral subtraction provides no phase information, a minimum phase is imposed on the filter. The designed filter is then transformed to the time domain, and the actual noise reduction filtering is performed in the time domain with only a short delay. The proposed method reduces the noise by 10 dB with a maximum processing delay of 7 samples.
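The two central steps described above — a floored spectral-subtraction amplitude gain and the imposition of minimum phase so the filter can be applied causally in the time domain — can be sketched as follows. This is a minimal numpy illustration, not the paper's exact design: the function names, the gain-flooring rule, and the homomorphic (real-cepstrum) minimum-phase construction are assumptions, since the abstract does not specify these details.

```python
import numpy as np

def spectral_subtraction_gain(noisy_mag, noise_mag, floor=0.1):
    """Per-bin amplitude gain from power spectral subtraction.

    Flooring the gain limits its variance (reduces "musical noise");
    the floor value 0.1 is an illustrative choice.
    """
    snr_term = 1.0 - (noise_mag / np.maximum(noisy_mag, 1e-12)) ** 2
    return np.sqrt(np.maximum(snr_term, floor ** 2))

def minimum_phase_fir(mag, n_taps):
    """Impose minimum phase on a magnitude response |H(k)| (full, even
    FFT grid) via the real cepstrum, and return a short FIR filter."""
    n = len(mag)
    # Real cepstrum of the log-magnitude
    cep = np.fft.ifft(np.log(np.maximum(mag, 1e-12))).real
    # Fold the even cepstrum onto causal quefrencies -> minimum phase
    fold = np.zeros(n)
    fold[0] = cep[0]
    fold[1:n // 2] = 2.0 * cep[1:n // 2]
    fold[n // 2] = cep[n // 2]
    # Back to the frequency domain, then to a time-domain impulse response
    h = np.fft.ifft(np.exp(np.fft.fft(fold))).real
    return h[:n_taps]
```

Per block of data, the gain would be evaluated from the noisy and estimated noise magnitude spectra, converted to a short minimum-phase FIR filter, and convolved with the signal in the time domain (e.g. with `np.convolve`). Because a minimum-phase filter concentrates its energy at the earliest taps, the processing delay is governed by the short filter rather than by the FFT block length.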