Charlie joined Award Solutions in 2011, bringing his expertise in CDMA2000 1xRTT/1xEV-DO, IEEE 802.16e WiMAX, and LTE. He specializes in wireless telecommunications systems, focusing on existing and future wireless technologies, and has over 11 years of RF design and Electromagnetic Interference/Compatibility experience in the defense industry, plus 18 years of experience in the wireless telecom industry.

Prior to Award Solutions, Charlie served as Wireless Chief Technology Officer for Huawei Technologies USA, where he was responsible for strategic planning and network solutions for the US market. Charlie guided the growth of the CTO Office organization from its inception, and was strongly influential in building the R&D, Product Management, and Product Marketing organizations during the 2007 through 2010 timeframe. He and his team generated the yearly strategic plan for Huawei Technologies NA, covering the Wireless, Optical, and Broadband Access product portfolios. In addition, Charlie drove ecosystem partnering with chipset and terminal vendors for the company.

Prior to that, Charlie was employed at Nortel Networks for eight years in a number of positions with successively increasing levels of responsibility. He served as the Senior Product Line Manager for CDMA Performance, driving the performance software portfolio covering improvements to access failure/dropped call performance, mobility management, voice quality, capacity and coverage performance, and a multiyear, phased program of operational measurements supporting the 1xRTT and 1xEV-DO data rollout. Later, he served as the Senior Product Line Manager for CDMA Evolution, developing near- and medium-term plans for the CDMA Access product, including 1xEV-DO Rev A, 1xEV-DV, and IP-enabled RAN planning. He also provided product direction input to the Nortel 3GPP2 standards team for inclusion in standards activities. Charlie was later promoted to Director of Wireless Technology and Product Strategy, reporting to the Chief Technology Officer for Wireless, where he was responsible for devising longer-term wireless technology strategies. He developed the IEEE 802.16e/WiMAX product strategy, which included chipset and packet core platform analysis, go-to-market planning, and recommendations for company investments in next-generation access technologies (WiMAX, Flarion, 802.20).

Before Nortel, Charlie worked at two operators: as a Senior Product Manager at GTE, where he was one of four product managers responsible for the nationwide rollout of ADSL in 1998, and before that as RF Design Manager for Sprint PCS in the Dallas/Fort Worth and Austin, Texas markets, where CDMA service was first rolled out in 1996.

Currently, Charlie is a Senior Consultant at Award Solutions, focusing on UMTS/HSPA/HSPA+ and LTE, including both theoretical and hands-on courses. Charlie holds a Bachelor's degree in Physics from the University of Maryland.
As the principal Standards Development Organization (SDO) responsible for developing the LTE specifications, 3GPP is tasked with providing a framework within which infrastructure vendors and device vendors can reliably build products to those specifications that successfully interoperate with each other. However, there is a dilemma that any standards body must deal with: how deeply and completely must specifications be written? In my past experience at infrastructure vendors, working with R&D teams and the standards people within R&D, there was always a belief that standards should not be overly defined, so as to leave room for vendor differentiation. Thus there is a balancing act that must take place: the specifications should be written tightly enough to enable interoperability, but not so tightly as to preclude vendors from building in their own functionalities and enhancements to add value above and beyond simply "building to the Standard."
In light of this, the topic of this blog (to be delivered in two parts) is the possibility that an important metric may be under-defined in 3GPP. There is significant anecdotal evidence from the LTE industry of an issue with device vendors measuring and reporting a metric called Reference Signal Received Quality (RSRQ) in a consistent fashion. It may be that differing interpretations of the 3GPP definition of RSRQ are leading to significantly different implementations by device vendors, such that the use of RSRQ as a mobility trigger is being called into question, at least at this relatively early stage of LTE development.
Let’s examine the definition of the metric. Per 3GPP TS 36.214 (E-UTRA; Physical Layer – Measurements), the following definition is given:
Reference Signal Received Quality (RSRQ) is defined as the ratio N×RSRP/(E-UTRA carrier RSSI), where N is the number of RB’s of the E-UTRA carrier RSSI measurement bandwidth. The measurements in the numerator and denominator shall be made over the same set of resource blocks.
E-UTRA Carrier Received Signal Strength Indicator (RSSI), comprises the linear average of the total received power (in [W]) observed only in OFDM symbols containing reference symbols for antenna port 0, in the measurement bandwidth, over N number of resource blocks by the UE from all sources, including co-channel serving and non-serving cells, adjacent channel interference, thermal noise etc.
The reference point for the RSRQ shall be the antenna connector of the UE.
If receiver diversity is in use by the UE, the reported value shall not be lower than the corresponding RSRQ of any of the individual diversity branches.
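To make the arithmetic concrete, here is a minimal Python sketch of the ratio as defined, evaluated in linear units and reported in dB. The function name and the example values (an RSRP of -80 dBm, an RSSI of -55 dBm, and N = 50 resource blocks for a 10 MHz carrier) are my own illustrative assumptions, not numbers from the spec:

```python
import math

def rsrq_db(rsrp_dbm, rssi_dbm, n_rb):
    """RSRQ = N x RSRP / RSSI per the TS 36.214 definition, computed
    in linear milliwatts and returned in dB."""
    rsrp_mw = 10 ** (rsrp_dbm / 10.0)
    rssi_mw = 10 ** (rssi_dbm / 10.0)
    return 10 * math.log10(n_rb * rsrp_mw / rssi_mw)

# Illustrative values: RSRP = -80 dBm, RSSI = -55 dBm, N = 50 RBs (10 MHz)
print(round(rsrq_db(-80, -55, 50), 1))  # -8.0 dB
```

In dB terms this is simply 10·log10(N) + RSRP(dBm) - RSSI(dBm), which is why the factor of N matters so much to the reported value.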
At first glance the definition seems to make sense. In multiplying RSRP by N, the number of Resource Blocks in the measurement bandwidth, the definition is striving to make an apples-to-apples comparison of RSRP in the numerator versus the contribution of RSRP to the RSSI in the denominator. Typically, RSSI is the measurement of all energy seen by a receiver in a given measurement bandwidth and includes thermal noise, additional self-induced noise as represented by the receiver noise figure, and all desired as well as undesired signals. The reference signals of the serving cell certainly qualify as desired signals, so they also contribute to the denominator. RSRP itself is an average measurement of the signal strength of individual Reference Signals over a measurement (channel) bandwidth, while RSSI is a cumulative measurement over the total channel bandwidth. Multiplying RSRP by N Resource Blocks would seem to make the numerator a cumulative quantity over the total channel bandwidth, lending itself to a logical comparison with the cumulative quantity expressed by RSSI in the denominator. Before going further, we should discuss the definition of RSRP itself.
Reference Signal Received Power (RSRP) is an average measurement of the signal strength of the Reference Signals only; its definition is also given in TS 36.214, per the text below.
Reference signal received power (RSRP), is defined as the linear average over the power contributions (in [W]) of the resource elements that carry cell-specific reference signals within the considered measurement frequency bandwidth.
For RSRP determination the cell-specific reference signals R0 according to TS 36.211 shall be used. If the UE can reliably detect that R1 is available it may use R1 in addition to R0 to determine RSRP.
The reference point for the RSRP shall be the antenna connector of the UE.
If receiver diversity is in use by the UE, the reported value shall not be lower than the corresponding RSRP of any of the individual diversity branches.
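A key subtlety is that the averaging happens in the linear (watts) domain, not in dB. A minimal sketch, using made-up per-resource-element powers purely for illustration:

```python
import math

def rsrp_dbm(rs_re_powers_w):
    """Linear average of the per-RE reference signal powers (in watts),
    reported in dBm."""
    avg_w = sum(rs_re_powers_w) / len(rs_re_powers_w)
    return 10 * math.log10(avg_w * 1000.0)  # watts -> milliwatts -> dBm

# 100 reference-signal REs, each received at 1e-11 W (-80 dBm), flat fading
print(rsrp_dbm([1e-11] * 100))  # -80.0 dBm
```

Under flat fading every RE arrives at the same power, so the average equals any individual RE's power; under frequency-selective fading the linear average will differ from a naive dB average.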
Let’s go through a quick example of how RSRP would be measured, assuming a single transmit antenna at the eNodeB and a single receive antenna at the UE. Let’s assume the operating parameters are:
- FDD operation
- 10 MHz carrier bandwidth (50 Resource Blocks)
- Transmit power setting of the downlink Reference Signals = +20 dBm
- Path loss between eNodeB transmit antenna and UE receive antenna = 100 dB
In LTE on the downlink, we set the Energy Per Resource Element (EPRE) for the different resource elements, with the reference signals set to an absolute transmit power and all other resource elements (carrying other channels and signals) set to an offset relative to the reference signal setting. Assuming flat fading for the channel, and not taking into account antenna gains and coaxial feeder/jumper losses, the RSRP in our example would simply be:
+20 dBm – 100 dB = -80 dBm
For our 10 MHz carrier, there would be 50 resource blocks with 2 reference signals per resource block in each reference-signal-bearing symbol (always OFDM symbols 0 and 4 for two-antenna operation in the downlink). If each reference signal is transmitted at +20 dBm (equal to 100 milliwatts in linear terms), the total power consumed in the eNodeB power amplifier by the reference signals is 100 Reference Signals x 100 mW/RS = 10 Watts (+40 dBm) for a single transmit branch. However, since RSRP is the received signal strength averaged over all the reference signals in the 10 MHz measurement bandwidth, we use +20 dBm in our example calculation and assume a path loss of 100 dB applied against each of the reference signals. What’s interesting to note here is that if we were to calculate a cumulative RSRP at the UE receiver, it would logically be:
+40 dBm – 100 dB = -60 dBm
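As a quick sanity check on the dB arithmetic above, here is a short Python sketch reproducing both numbers; the variable names are illustrative:

```python
import math

tx_per_rs_dbm = 20.0   # +20 dBm transmit power per reference signal
path_loss_db = 100.0   # flat-fading path loss assumed in the example
n_rs = 100             # 50 RBs x 2 RS per RB in one OFDM symbol

avg_rsrp_dbm = tx_per_rs_dbm - path_loss_db           # -80 dBm
cum_rsrp_dbm = avg_rsrp_dbm + 10 * math.log10(n_rs)   # -60 dBm

print(avg_rsrp_dbm, cum_rsrp_dbm)  # -80.0 -60.0
```

The 20 dB gap between the two is simply 10·log10(100), i.e., the number of reference signals being summed.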
In reality, for a non-flat fading scenario, the UE would take into account the fading characteristics of the channel and this would result in the equivalent of a frequency-dependent path loss over the channel bandwidth, and the measured RSRP would then differ somewhat from our example, but again, we are assuming flat fading for simplicity.
Given that this blog will extend into multiple parts, we can draw a first interesting observation before concluding. Comparing the cumulative RSRP with the average RSRP of the example, we see a difference of 20 dB, because the cumulative RSRP totals the contributions of 100 reference signals over one symbol time in 50 resource blocks rather than averaging them. So why does the definition use N RBs x RSRP in the numerator instead of N Reference Signals x RSRP? We also notice that the RSRQ definition spells out how RSSI is measured, and RSSI itself is an averaged measure. Stay tuned for Part 2 of this blog as we go deeper into our interpretation.
Thanks for the article.
Are any mobility decisions in LTE currently taken based on the measured RSRQ? Or are they simply based on RSRP levels?
Hi mjk, from my personal viewpoint RSRP seems to be the primary trigger for mobility decisions in the industry right now. Certainly the Standards and vendor implementations allow for the use of both RSRP and RSRQ as mobility triggers, and in the end it is an operator decision on what to choose, but for right now RSRQ seems to take a back seat to RSRP. Certainly this can change in the future. Thanks for the question!