Jitter Speed Test

What can be the effect of excessive jitter?

 

The rate at which packets stream out of the de-jitter buffer is known as the "channel rate". The rate at which the buffer receives data is known as the "fill rate". If the buffer is too small and the channel rate exceeds the fill rate, the buffer will eventually underflow, resulting in a stalled packet stream. If the fill rate exceeds the channel rate, the buffer will eventually overflow, resulting in packet loss. However, if the buffer is made too large, the network element introduces excessive latency.
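The underflow/overflow behaviour above can be sketched as a simple simulation. This is a minimal illustration, not a real network element: the rates, buffer capacity, and tick-based model are all hypothetical.

```python
# Minimal sketch (hypothetical rates and capacity): a de-jitter buffer
# drained at a fixed channel rate while being filled at a varying fill rate.
def simulate_buffer(fill_rates, channel_rate, capacity):
    """Return a per-tick event list, flagging underflow/overflow."""
    level, events = 0, []
    for fill in fill_rates:
        level += fill - channel_rate
        if level < 0:
            events.append("underflow")  # drain outpaced arrivals: stream stalls
            level = 0
        elif level > capacity:
            events.append("overflow")   # arrivals outpaced drain: packets dropped
            level = capacity
        else:
            events.append("ok")
    return events

# Steady arrivals keep up; a lull underflows; a burst overflows a small buffer.
print(simulate_buffer([5, 5, 0, 0, 15, 15], channel_rate=5, capacity=10))
# → ['ok', 'ok', 'underflow', 'underflow', 'ok', 'overflow']
```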

 

How would you measure IP packet jitter?

 

Jitter is measured by plotting packet inter-arrival times against time.

 

This is useful for identifying variations in jitter over time; however, it is also useful to plot the distribution of inter-arrival intervals against frequency of occurrence as a histogram. If the jitter is large enough that packets arrive outside the window of the de-jitter buffer, those out-of-range packets are dropped. Being able to identify outliers helps to establish whether the network's jitter performance is either likely to be, or already is, the cause of packet loss.
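The histogram view described above can be sketched in a few lines. The arrival timestamps and the 1 ms bin width below are hypothetical values chosen only to make the outlier visible.

```python
from collections import Counter

def interarrival_histogram(arrival_times_us, bin_us=1000):
    """Bin consecutive inter-arrival gaps (microseconds) into a histogram."""
    gaps = [b - a for a, b in zip(arrival_times_us, arrival_times_us[1:])]
    return Counter((gap // bin_us) * bin_us for gap in gaps)

# Hypothetical arrivals: mostly ~1 ms apart, with one 5 ms outlier gap.
arrivals = [0, 1000, 2100, 3000, 8000, 9000]
print(interarrival_histogram(arrivals))
# → Counter({1000: 3, 0: 1, 5000: 1})
```

The lone count in the 5000 µs bin is exactly the kind of outlier that signals a packet at risk of falling outside the de-jitter buffer window.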

 

A series of packets with long inter-arrival intervals will inevitably be followed by a corresponding burst of packets with short inter-arrival intervals. It is this burst of traffic that can cause buffer overflow conditions and lost packets. This happens when the fill rate exceeds the channel rate for a period longer than the remaining buffer headroom, expressed in microseconds.

 

How would you establish the de-jitter buffer size? 

 

To establish the necessary de-jitter buffer size, an alternative type of jitter measurement known as Delay Factor (DF) is used. This is a temporal measurement that indicates the buffer duration needed to de-jitter the traffic.

 

In IP video networks, the media payload is transported over RTP (Real-time Transport Protocol). One type of DF measurement takes advantage of the RTP header's timestamp field, which reflects the sampling instant of the RTP data. This is known as Time-Stamped Delay Factor, or TS-DF (as defined by EBU Tech 3337).
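For reference, the 32-bit RTP timestamp sits at bytes 4 to 7 of the fixed 12-byte RTP header (RFC 3550). A minimal sketch of reading it, using a hypothetical hand-built packet:

```python
import struct

def rtp_timestamp(packet: bytes) -> int:
    """Read the 32-bit timestamp from a fixed 12-byte RTP header."""
    # Header layout (RFC 3550): V/P/X/CC byte, M/PT byte,
    # 16-bit sequence number, 32-bit timestamp, 32-bit SSRC.
    _vpxcc, _mpt, _seq, ts, _ssrc = struct.unpack("!BBHII", packet[:12])
    return ts

# Hypothetical packet: version 2, payload type 96, seq 1,
# timestamp 90000, SSRC 0x1234, followed by a dummy payload.
pkt = struct.pack("!BBHII", 0x80, 96, 1, 90000, 0x1234) + b"payload"
print(rtp_timestamp(pkt))  # → 90000
```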

 

The TS-DF measurement is based on the relative transit time, which is the difference between a packet's RTP timestamp and the receiver's clock at the time of arrival, measured in microseconds. The measurement period is 1 second; the first packet of each measurement period is considered to have no jitter and is used as the reference packet.

 

For each subsequent packet, the relative transit time between that packet and the reference packet is calculated. At the end of the measurement period, the maximum and minimum values are extracted and the Time-Stamped Delay Factor is calculated as:

 

TS-DF = D(Max) - D(Min) 
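The calculation above can be sketched directly. The sample list below is hypothetical; both timestamps and arrival times are assumed to already be converted to microseconds on a common scale, with the first sample acting as the zero-jitter reference described above.

```python
def ts_df(samples):
    """samples: list of (rtp_timestamp_us, arrival_time_us) pairs within one
    1-second measurement period; the first pair is the reference packet."""
    ref_ts, ref_arr = samples[0]
    # Relative transit time of each packet versus the reference packet:
    # how much later (or earlier) it arrived than its timestamp predicts.
    d = [(arr - ref_arr) - (ts - ref_ts) for ts, arr in samples]
    return max(d) - min(d)  # TS-DF = D(Max) - D(Min)

# Hypothetical samples: one packet 20 µs late, one 10 µs early.
samples = [(0, 0), (1000, 1020), (2000, 1990), (3000, 3005)]
print(ts_df(samples))  # → 30
```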

 

The maximum value of TS-DF over a given period indicates the de-jitter buffer size required during that period for a receiving device at that network node.


Perform Jitter Speed Test.