Packet streams observed in today's networks show a high degree of autocorrelation, as successive packets tend to exhibit similar delay and loss characteristics. Network emulators used for performance evaluations therefore have to support the generation of such autocorrelated streams. Typically, this is done using first-order autoregressive processes. In this paper, we show that a common implementation of this approach leads to problems when emulating correlated packet loss and can therefore produce erroneous results and conclusions, both when evaluating application quality objectively and when assessing QoE in subjective studies. Furthermore, typical network emulator implementations can turn even low packet jitter into an unexpected QoE killer.
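For illustration, the following is a minimal Python sketch of how a first-order autoregressive (AR(1)) correlated-loss generator is often realized in network emulators; the function name, parameters, and structure are assumptions for this sketch and are not taken from the paper or any particular emulator, although the mixing scheme is similar in spirit to the correlation options found in common tools.

```python
import random


def ar1_loss_stream(n_packets, loss_prob, correlation, seed=None):
    """Yield per-packet loss decisions from an AR(1)-style process.

    Hypothetical sketch: each decision variable mixes fresh uniform
    randomness with the previous value, weighted by `correlation`,
    so consecutive loss decisions become statistically dependent.
    """
    rng = random.Random(seed)
    last = rng.random()  # state of the first-order autoregressive process
    for _ in range(n_packets):
        # New decision variable: correlated mixture of old state and new draw.
        last = correlation * last + (1.0 - correlation) * rng.random()
        yield last < loss_prob  # True means "drop this packet"


if __name__ == "__main__":
    # Example: 5% target loss with strong correlation between packets.
    losses = list(ar1_loss_stream(100_000, loss_prob=0.05,
                                  correlation=0.9, seed=1))
    print("realized loss rate:", sum(losses) / len(losses))
```

Note that mixing uniform variates this way concentrates the decision variable around 0.5, so the realized loss rate in this sketch falls far below the configured `loss_prob` for strong correlation; distortions of this kind are one example of how a naive AR(1) loss implementation can bias quality evaluations.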