What is Jitter?

Jitter is a generic digital-data term, defined here in the context of digital audio-visual content. In electronics and telecommunications, jitter is the unwanted variation of one or more characteristics of a periodic signal, which may appear in qualities such as the interval between successive pulses, or the amplitude, frequency, or phase of successive cycles. Jitter can affect a number of signal qualities (e.g. amplitude, phase, pulse width, or pulse position) and can be quantified in the same terms as any time-varying signal (e.g. RMS or peak-to-peak displacement). Like other time-varying signals, jitter can also be expressed in terms of spectral density (frequency content).

For jitter that varies regularly with time, the jitter period is the interval between two times of maximum effect (or between two times of minimum effect) of a jitter characteristic. The jitter frequency, the more commonly quoted figure, is its inverse.
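As a rough illustration of quantifying jitter in RMS and peak-to-peak terms, the sketch below measures period jitter from a list of pulse timestamps: it takes the intervals between successive pulses, compares each to a nominal period, and reports the peak-to-peak and RMS deviation. The function name, the nominal period, and the sample timestamps are all illustrative assumptions, not part of the glossary definition.

```python
import math

def jitter_metrics(timestamps, nominal_period):
    """Estimate period jitter of a pulse train from edge timestamps.

    Returns (peak_to_peak, rms) deviation of the pulse intervals
    from the nominal period, in the same units as the timestamps.
    """
    # Intervals between successive pulses
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    # Deviation of each interval from the ideal (nominal) period
    deviations = [t - nominal_period for t in intervals]
    # Peak-to-peak displacement: span between worst early and worst late pulse
    peak_to_peak = max(deviations) - min(deviations)
    # RMS displacement: root of the mean squared deviation
    rms = math.sqrt(sum(d * d for d in deviations) / len(deviations))
    return peak_to_peak, rms

# Hypothetical example: a 1 ms nominal period with small timing errors (seconds)
ts = [0.0, 0.0010, 0.0021, 0.0029, 0.0040]
pp, rms = jitter_metrics(ts, 0.001)
```

The same deviations could instead be Fourier-transformed to view the jitter's spectral density, as the definition above notes.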


Reference: Federal Agencies Digital Guidelines Initiative – Glossary