In wireless communication, there may exist multiple possible paths from the sender to the receiver for the signal to travel through, due to reflection, scattering, diffraction, etc. Imagine two people calling each other in a mountainous place: one will probably hear multiple echoes of the other's voice. In wireless communication the cause of multipath effects is the same, except with electromagnetic waves instead of sound waves. The signal picked up by the receiver contains a contribution from each path, and adding all these contributions (phase-shifted and delayed versions of the transmitted signal with different amplitudes) results in distortion. Such distortion is usually called *fading*, and such channels are called *fading channels*, as the distortion usually manifests itself as a received signal that becomes very weak from time to time, much like shortwave radio. Multipath/fading effects are very detrimental to reliable digital communication unless countered with diversity techniques, although recent advances, such as BLAST, suggest that multipath effects can also be exploited to increase channel capacity in certain cases, by using multiple transmitting/receiving antennas.
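The "sum of phase-shifted, delayed, scaled copies" picture can be sketched numerically. The path gains, delays, and sample rate below are made-up illustrative values, not taken from any real channel measurement:

```python
import numpy as np

# Transmitted baseband signal: a rectangular pulse of duration T.
fs = 1e6                      # sample rate: 1 MHz (illustrative)
T = 100e-6                    # symbol duration: 100 microseconds
t = np.arange(0, 3 * T, 1 / fs)
s = ((t >= 0) & (t < T)).astype(float)

# Hypothetical multipath profile: (complex gain, delay in seconds).
paths = [(1.0 + 0.0j,              0.0),
         (0.6 * np.exp(1j * 2.1),  5e-6),
         (0.3 * np.exp(-1j * 0.7), 12e-6)]

# Received signal = sum of phase-shifted, delayed, scaled copies.
r = np.zeros_like(t, dtype=complex)
for gain, delay in paths:
    shift = int(round(delay * fs))
    delayed = np.roll(s, shift)
    delayed[:shift] = 0.0     # zero out samples wrapped from the end
    r += gain * delayed
```

Plotting `np.abs(r)` against `np.abs(s)` would show the smeared edges and amplitude change that the sum of paths produces.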

The maximum difference in delay among all paths is called the *delay spread*, denoted T_{m}. As electromagnetic waves travel at the speed of light, the delay spread can usually be calculated by dividing the difference in length between the longest and the shortest path by the speed of light. If T_{m} is much smaller than the symbol duration T (a symbol contains one bit for binary modulation schemes, or M bits for 2^{M}-ary modulation schemes; T depends on the transmission rate, but is often on the order of 10~100 microseconds), then the shape of the transmitted signal is not much distorted by the delay spread (imagine a rectangular pulse of length T: after adding up all the delayed multipath components, the edges of the pulse become smoother, but the basic shape remains if T_{m}≪T); only its amplitude and phase are affected. We call this *flat fading*. Conversely, if T_{m} is comparable to or larger than the symbol duration T, then the (envelope) shape of the transmitted signal changes significantly, causing inter-symbol interference, and equalization is then required for good receiver performance. Such a case is called *frequency-selective fading*, for the bandwidth of the transmitted signal is relatively large (roughly equal to 1/T), and the amount of fading is not the same across the whole frequency band, distorting the envelope shape. The concept of *coherence bandwidth*, f_{c}=1/T_{m}, is also sometimes used, roughly denoting the minimum difference between two frequencies that fade independently.
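The delay-spread calculation and the flat vs. frequency-selective rule of thumb can be sketched as follows; the path lengths, the 0.1 threshold, and the symbol durations are illustrative assumptions, not values from the text:

```python
c = 3e8  # speed of light, m/s

def delay_spread(path_lengths_m):
    """Delay spread T_m: (longest path - shortest path) / speed of light."""
    return (max(path_lengths_m) - min(path_lengths_m)) / c

def fading_type(T_m, T_symbol, ratio=0.1):
    """Rule of thumb: flat fading if T_m is much smaller than T."""
    return "flat" if T_m < ratio * T_symbol else "frequency-selective"

# Small-scale example: paths of 50-80 m differ by 30 m -> T_m = 100 ns.
T_m = delay_spread([50.0, 62.0, 80.0])
print(fading_type(T_m, 10e-6))   # prints "flat" for a 10 us symbol
print(1.0 / T_m)                 # coherence bandwidth f_c = 1/T_m, in Hz
```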

In the case of flat fading, there are still often many paths from the sender to the receiver; they just have roughly the same length, and hence the same delay relative to the symbol duration. For example, if the receiver is indoors, it is likely that all paths go the same way from the sender to the room, but each of them bounces off the walls in a different way before finally reaching the receiver. The delay differences between these paths are usually very slight, since the room is small (remember that light travels about 300 meters per microsecond!), so the fading is flat; however, the length differences between the paths may well be larger than half the wavelength, which is on the order of tens of centimeters for commonly used systems today. So when the paths reach the receiver, their carrier phases are usually wildly different, causing their amplitudes to add up or cancel, essentially at random.

One common case is when the number of paths with similar delays is large and no path is much stronger than the others. One can then invoke the central limit theorem to show that the total complex gain (the sum of the complex gains of the individual paths) is roughly a complex Gaussian variable with zero mean, which means that the amplitude gain of the received signal, relative to the transmitted signal, follows a Rayleigh distribution, and the phase shift follows a uniform distribution on [0,2π] (a completely random phase). This is called *Rayleigh fading*. Another common case is that one path is particularly strong, while the others are roughly equal in strength (for example, the strong path may be the line-of-sight path, while the others may be the result of scattering that diminishes their strength). In this case the total complex gain is a complex Gaussian variable with a non-zero mean, so the amplitude of the received signal follows a Rice distribution, and we call this *Rician fading*.
As the line-of-sight path weakens relative to the other paths in Rician fading, the probability that the total amplitude gain is low grows, and receiver performance worsens, until in the limit the channel becomes a Rayleigh-fading one. The above has been stated for flat fading; for frequency-selective fading, one can analyze the set of paths whose delays are near a given value, and the same analysis applies.
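The central-limit argument can be checked numerically: summing many equal-strength paths with uniformly random phases gives a total complex gain that is approximately complex Gaussian, so its magnitude is approximately Rayleigh-distributed. The path and trial counts below are arbitrary choices for the simulation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_trials = 50, 100_000

# Many equal-strength paths, each with a uniformly random carrier phase.
phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_trials, n_paths))
# Total complex gain per trial, normalized so that E[|g|^2] = 1.
g = np.exp(1j * phases).sum(axis=1) / np.sqrt(n_paths)

# |g| should be close to Rayleigh: mean amplitude sqrt(pi)/2 ~ 0.886.
amp = np.abs(g)
print(amp.mean())

# Adding a dominant fixed (line-of-sight) component gives Rician fading:
g_rician = 1.0 + g
```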

As the wavelength is short (tens of centimeters), the phase shift of each path changes significantly when the sender, the receiver, or any of the scatterers in between moves by more than half a wavelength, causing a dramatic change in the amplitude and phase of the received signal. Therefore, if either the sender or the receiver is in motion, the characteristics of the channel (for flat-fading channels, the amplitude gain and the phase shift) will be time-variant. This time variation is measured by the *coherence time* t_{c}, which equals the inverse of the Doppler frequency f_{D}=vf/c, where v is the velocity of the sender relative to the receiver, f is the frequency of the carrier, and c is the speed of light. If t_{c}≫T, the channel is essentially invariant over a symbol duration, and it is called a *slow-fading* channel. Otherwise, it is called a *fast-fading* channel. Fast-fading channels change so rapidly that many signal-processing algorithms, such as adaptive equalization, no longer work; fast fading is therefore usually avoided in practice, by reducing the symbol duration (increasing the symbol rate) or limiting the speed of the sender/receiver.
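The Doppler and coherence-time formulas above can be sketched in a few lines; the carrier frequency, the speed, and the factor-of-10 slow/fast threshold are illustrative assumptions:

```python
c = 3e8  # speed of light, m/s

def coherence_time(v_mps, carrier_hz):
    """t_c = 1 / f_D, with Doppler frequency f_D = v * f / c."""
    f_D = v_mps * carrier_hz / c
    return 1.0 / f_D

def fading_speed(t_c, T_symbol, ratio=10.0):
    """Slow fading if the channel is roughly constant over a symbol."""
    return "slow" if t_c > ratio * T_symbol else "fast"

# A 900 MHz carrier and a receiver moving at 30 m/s (~108 km/h):
t_c = coherence_time(30.0, 900e6)   # f_D = 90 Hz, so t_c ~ 11 ms
print(fading_speed(t_c, 100e-6))    # prints "slow" for a 100 us symbol
```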