Typically, a signal to be digitized is passed through an analog anti-aliasing filter just before the analog-to-digital converter (ADC) to remove the components that the digitizing process would mirror into the frequency band of interest (aliases). At high sampling rates, the frequencies that alias into the useful band are far away, so this anti-aliasing filter can be simple and inexpensive. However, the resulting large number of samples requires considerable processing power unless the data are decimated early in the processing chain.

Decimation itself also introduces aliases, so the bandwidth must be limited further. A digital filter can remove the disturbing frequency components that the analog anti-aliasing filter did not. The best response is achieved with linear-phase filters. We discuss different definitions of best alias rejection and present algorithms to design the corresponding optimum filters. We also investigate the best implementations, including the case where decimation is combined with quadrature conversion and digital mixing. Special filter types (comb, multi-rate, and cascaded half-band filters) are examined, along with the effects of forcing some coefficients to zero.
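To make the idea concrete, the following is a minimal sketch (not from the paper; all rates, tone frequencies, and filter parameters are illustrative assumptions) of decimation preceded by a linear-phase FIR anti-aliasing filter, using SciPy's windowed-sinc design:

```python
import numpy as np
from scipy import signal

fs = 1_000_000          # assumed oversampled input rate, Hz
M = 8                   # assumed decimation factor
fs_out = fs // M        # output rate after decimation (125 kHz)

# Linear-phase FIR low-pass whose passband lies below the new Nyquist
# frequency (fs_out / 2), so components that decimation would fold
# into the useful band are suppressed before downsampling.
taps = signal.firwin(numtaps=129, cutoff=0.4 * fs_out, fs=fs)

# Test signal: a 10 kHz in-band tone plus a 120 kHz tone that,
# without filtering, would alias to 5 kHz after decimation by 8.
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 10e3 * t) + np.sin(2 * np.pi * 120e3 * t)

y = np.convolve(x, taps, mode="same")   # digital anti-alias filtering
x_dec = y[::M]                          # keep every M-th sample
```

In a practical implementation one would compute only every M-th output sample of the convolution (a polyphase structure) instead of filtering at the full rate and discarding samples, which is precisely where the special filter structures discussed above pay off.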