In my experiment for LTE downlink latency measurement, I sent packets at a rate of 10 packets per second (one packet every 100 milliseconds) towards my UE under test. I observed that almost all packets had an air-interface latency of around 8 milliseconds. I suppose this is normal, but to confirm: is there any document I can read that shows what happens when a packet arrives at the eNodeB? Is this 8 milliseconds the minimum delay that every packet will experience?
8 ms seems a little high for a minimum - I would guess a little over 5 ms. However, if your measurement point is in the UE, then 8 ms makes sense. It takes 1 ms for the eNB to transmit the packet, another 3 ms before the UE can respond, and 1 ms for the ACK to reach the eNB, for a total of 5 ms - add in the eNB's processing time for the ACK, and that's why I say a little over 5 ms. However, the UE doesn't know that its ACK was successfully received until 3 ms later, for a total of 8 ms. (If the ACK is not received by the eNB, it will retransmit the packet and the UE must respond again.)
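The arithmetic above can be sketched as a small script - a minimal illustration of the FDD LTE downlink HARQ timeline as described (1 ms TTI, ACK/NACK due 4 ms after the downlink transmission). The function and variable names are my own for illustration, not from any spec or tool:

```python
TTI_MS = 1  # one transmission time interval is 1 ms in LTE

def dl_harq_timeline():
    """Return (eNB-side, UE-side) delay in ms for one error-free HARQ round."""
    dl_tx = 1 * TTI_MS    # eNB transmits the transport block
    ue_proc = 3           # UE decodes; its ACK/NACK is due 4 ms after the DL TTI starts
    ack_tx = 1 * TTI_MS   # UE transmits the ACK
    enb_view = dl_tx + ue_proc + ack_tx  # eNB has the ACK after ~5 ms
    ue_view = enb_view + 3               # UE only learns the ACK landed ~3 ms later
    return enb_view, ue_view

print(dl_harq_timeline())  # (5, 8) - the "little over 5 ms" vs. 8 ms views
```

This is why a measurement point inside the UE naturally reports ~8 ms even when the eNB-side minimum is closer to 5 ms.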
In general, the latency of a specific packet is a combination of a number of delays. The packet arrives on the S1-U interface; the eNB processes it and places it in a queue (queuing delay); the first HARQ transmission takes 1 ms; 4 ms later the UE responds with an ACK or NACK, which takes 1 ms to transmit; the eNB processes the response; then possible HARQ retransmissions add further delay, and possible RLC retransmissions add even more. If you are measuring 8 ms per packet, then I'd guess there is no other traffic in the cell and the RF conditions are great, so there are never any retransmissions. The size of the packet may also have an effect - e.g. if it's very large, the eNB may not be able to send it all in one TTI (i.e. 1 ms). How busy the cell is will affect the queuing delay.
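Putting those components together, here is a rough back-of-the-envelope calculator - the per-retransmission costs are assumptions of mine (each HARQ retransmission costing roughly one 8 ms HARQ round trip, and an RLC retransmission roughly two), not figures from the spec:

```python
def dl_latency_ms(queuing=0.0, harq_retx=0, rlc_retx=0, ttis=1):
    """Rough downlink latency estimate from the components discussed above."""
    enb_proc = 1.0             # assumed S1-U reception + eNB processing/scheduling time
    air = ttis * 1.0           # 1 ms per TTI; a very large packet may need several TTIs
    ue_response = 4.0          # UE's ACK/NACK arrives ~4 ms after the DL transmission
    harq = harq_retx * 8.0     # assumption: each HARQ retx adds ~one 8 ms HARQ RTT
    rlc = rlc_retx * 16.0      # assumption: an RLC retx costs ~two HARQ RTTs
    return enb_proc + queuing + air + ue_response + harq + rlc

# Idle cell, good RF, small packet - close to the minimum:
print(dl_latency_ms())                # 6.0
# One HARQ retransmission pushes it well past 8 ms:
print(dl_latency_ms(harq_retx=1))     # 14.0
```

The point is less the exact numbers than the structure: the fixed ~5-6 ms floor is small, and it is queuing and retransmissions that dominate once the cell is loaded or RF conditions degrade.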
BTW: 3GPP TS 36.321 is the LTE MAC spec where the HARQ transmission details are described. It's a tough read, so I'd suggest an internet search instead.