In telecommunications, guard intervals are used to ensure that distinct transmissions do not overlap and interfere with one another. These transmissions may belong to different users (as in TDMA) or to the same user (as in OFDM).

The purpose of the guard interval is to introduce immunity to propagation delays, echoes and reflections, to which digital data is normally very sensitive.

Use in digital communications systems

In OFDM, each symbol is preceded by a guard interval. As long as echoes fall within this interval, they do not affect the receiver's ability to correctly decode the actual data, because the receiver only interprets the samples outside the guard interval.
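The following is a minimal numerical sketch of this mechanism, not taken from the article: it assumes the guard interval is filled with a cyclic prefix (a copy of the symbol's tail, a common implementation), and shows that an echo shorter than the guard only disturbs the samples the receiver discards, so the useful samples can be recovered exactly after a per-subcarrier equalization.

```python
# Illustrative sketch: an echo shorter than the guard interval is absorbed.
# Assumes a cyclic-prefix guard; all parameter values are arbitrary examples.
import numpy as np

rng = np.random.default_rng(0)

N_FFT = 64        # samples per useful OFDM symbol (illustrative)
GUARD = 16        # guard interval of N_FFT/4, like DVB-T's 1/4 option
ECHO_DELAY = 10   # echo arrives 10 samples late: shorter than the guard

# One OFDM symbol in the time domain.
symbol = rng.standard_normal(N_FFT) + 1j * rng.standard_normal(N_FFT)

# Transmit: prepend the guard interval (cyclic prefix = last GUARD samples).
tx = np.concatenate([symbol[-GUARD:], symbol])

# Channel: direct path plus an attenuated echo delayed by ECHO_DELAY samples.
echo = np.concatenate([np.zeros(ECHO_DELAY, dtype=complex), tx[:-ECHO_DELAY]])
rx = tx + 0.5 * echo

# Receive: discard the guard interval, keep only the useful samples.
rx_useful = rx[GUARD:]

# Because the echo delay fits inside the guard, the kept samples equal the
# circular convolution of the symbol with the channel, so a per-subcarrier
# division by the channel response recovers the symbol exactly.
h = np.zeros(N_FFT, dtype=complex)
h[0], h[ECHO_DELAY] = 1.0, 0.5
H = np.fft.fft(h)
recovered = np.fft.ifft(np.fft.fft(rx_useful) / H)

print("echo within guard:", ECHO_DELAY <= GUARD)          # True
print("symbol recovered:", np.allclose(recovered, symbol)) # True
```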

In TDMA, each user's timeslot ends with a guard interval. Thus, the guard interval protects against data loss within the same timeslot, and protects the following user's timeslot from interference caused by propagation delay. It is a common misconception that TDMA timeslots begin with the guard interval, as with OFDM. However, in specifications for TDMA systems such as GSM, the guard period is defined as being at the end of the timeslot.
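A toy timing check, not drawn from any specific standard (the slot and guard durations below are hypothetical), illustrates why placing the guard period at the end of the slot protects the following user: a burst delayed by propagation must still finish before its own slot ends.

```python
# Hypothetical TDMA slot timing; values are illustrative, not from a standard.
SLOT_US = 577.0                  # assumed timeslot length in microseconds
GUARD_US = 30.0                  # assumed guard period at the end of the slot
BURST_US = SLOT_US - GUARD_US    # user data occupies the rest of the slot

def burst_stays_in_slot(propagation_delay_us: float) -> bool:
    """True if a burst delayed by the given propagation delay still ends
    within its own timeslot, i.e. the delay is absorbed by the guard."""
    return propagation_delay_us + BURST_US <= SLOT_US

for delay in (0.0, 15.0, 30.0, 45.0):
    ok = burst_stays_in_slot(delay)
    print(f"delay {delay:5.1f} us -> {'OK' if ok else 'spills into next slot'}")
```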

Longer guard periods allow more distant echoes to be tolerated but reduce channel efficiency. For example, in DVB-T, guard intervals are available as 1/32, 1/16, 1/8 or 1/4 of a symbol period. The shortest interval (1/32) provides the lowest protection and the highest data rate; the longest interval (1/4) provides the highest protection but the lowest data rate. Ideally, the guard interval is set to just above the delay spread of the channel.
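The trade-off can be made concrete with a quick calculation. The sketch below treats each DVB-T fraction as a fraction of the useful symbol duration and uses an illustrative useful-symbol length of 896 μs (the DVB-T 8K mode in an 8 MHz channel); the exact figure is an assumption for the example, not stated in this article.

```python
# For each DVB-T guard fraction: how much echo delay it absorbs and what
# share of airtime still carries data.  896 us is an illustrative useful
# symbol duration (DVB-T 8K mode, 8 MHz channel).
USEFUL_SYMBOL_US = 896.0

for fraction in (1/32, 1/16, 1/8, 1/4):
    guard_us = fraction * USEFUL_SYMBOL_US
    efficiency = USEFUL_SYMBOL_US / (USEFUL_SYMBOL_US + guard_us)
    print(f"guard 1/{int(1/fraction):2d}: absorbs echoes up to {guard_us:6.1f} us, "
          f"efficiency {efficiency:.1%}")
```

With the 1/32 guard only about 3% of airtime is spent on protection, while the 1/4 guard sacrifices 20% of airtime in exchange for tolerating echoes eight times as distant.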

802.11 guard interval

The standard symbol guard interval used in IEEE 802.11 OFDM is 0.8 μs. To increase data rate, 802.11n added optional support for a 0.4 μs guard interval. This provides an 11% increase in data rate. To increase coverage area, IEEE 802.11ax (Wi-Fi 6) provides optional support for 0.8 μs, 1.6 μs, and 3.2 μs guard intervals.[citation needed]
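The roughly 11% figure follows from the 802.11 OFDM symbol timing: the same number of bits is carried per symbol, so the data rate scales inversely with the total symbol duration (the 3.2 μs useful-symbol duration used below is the standard 802.11a/g/n value).

```python
# Where the ~11% short-guard-interval gain comes from.
USEFUL_US = 3.2                      # useful OFDM symbol in 802.11a/g/n

long_gi_symbol = USEFUL_US + 0.8     # 4.0 us per symbol with the 0.8 us guard
short_gi_symbol = USEFUL_US + 0.4    # 3.6 us per symbol with the 0.4 us guard

gain = long_gi_symbol / short_gi_symbol - 1.0
print(f"data-rate increase with short GI: {gain:.1%}")   # ~11.1%
```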

The shorter guard interval results in a higher packet error rate when the delay spread of the channel exceeds the guard interval or if timing synchronization between the transmitter and receiver is not precise. A scheme could be developed to work out whether a short guard interval would be of benefit to a particular link. To reduce complexity, manufacturers typically only implement a short guard interval as a final rate adaptation step when the device is running at its highest data rate.[1]
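A hypothetical decision sketch of such a scheme is shown below; the function and threshold names are invented for illustration and are not from the cited reference. It enables the short guard only when the estimated delay spread plus timing error fits inside it, and, mirroring the typical implementation described above, only once the link is already at its highest data rate.

```python
# Hypothetical guard-interval selection (names and logic are illustrative).
SHORT_GI_US = 0.4
LONG_GI_US = 0.8

def choose_guard_interval(delay_spread_us: float,
                          timing_error_us: float,
                          at_highest_rate: bool) -> float:
    """Pick a guard interval for the next transmission (sketch only)."""
    required = delay_spread_us + timing_error_us
    if at_highest_rate and required <= SHORT_GI_US:
        return SHORT_GI_US
    return LONG_GI_US

print(choose_guard_interval(0.2, 0.1, at_highest_rate=True))    # 0.4
print(choose_guard_interval(0.2, 0.1, at_highest_rate=False))   # 0.8
print(choose_guard_interval(0.6, 0.1, at_highest_rate=True))    # 0.8
```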

References

  1. Perahia and Stacey, Next Generation Wireless LANs, Cambridge University Press, 2008.