Evaluation of LTE signal quality
We would like to implement LTE signal quality reporting in our products. Which LTE radio parameters are most relevant, and how should they be interpreted?
E.g. the command 'AT!="showphy"' returns the following statistics:
DL SYNCHRO STATISTICS
=====================
Synchro state                          : DRX_SLEEP
PPU SIB1 ACQ watchdog                  : 0
Frequency Hypothesis RF (Hz)           : 0
RSRP (dBm)                             : -82.43
RSRQ (dB)                              : -11.18
Channel estimation state (Cell-spec.)  : HIGH CINR
Channel estimation state (UE-spec.)    : LOW CINR
Channel estimation state (MBSFN)       : LOW CINR
Channel estimation CINR                : 19.00
Channel length                         : SHORT
AGC
AGC RX gain (dB)                       : 44.79
RX PSD BO (dBFs)                       : -21.21
RX PSD (dBm)                           : -85.27
Noise level RS (dBm)                   : -100.49
Digital gain (dB)                      : 3.68
CINR RS (dB)                           : 18.06
NARROWBANDS
Last DL NB                             : Central
Last UL NB                             : 0
AFC
Frequency offset RF (Hz)               : -376
Frequency offset BB (Hz)               : 0
PBCH
MIB received quantity                  : 0
MIB timeout quantity                   : 0
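If you want to consume this dump programmatically, the key-value layout is easy to scrape. A minimal sketch (the field names are taken from the output above; the function name and the choice of which fields to extract are my own):

```python
import re

# Shortened sample of the 'AT!="showphy"' dump quoted above.
SHOWPHY_SAMPLE = """
RSRP (dBm)                             : -82.43
RSRQ (dB)                              : -11.18
CINR RS (dB)                           : 18.06
"""

def parse_showphy(text):
    """Pull selected 'name : value' metrics out of the showphy dump."""
    patterns = {
        "rsrp_dbm":   r"RSRP \(dBm\)\s*:\s*(-?\d+(?:\.\d+)?)",
        "rsrq_db":    r"RSRQ \(dB\)\s*:\s*(-?\d+(?:\.\d+)?)",
        "cinr_rs_db": r"CINR RS \(dB\)\s*:\s*(-?\d+(?:\.\d+)?)",
    }
    metrics = {}
    for name, pattern in patterns.items():
        m = re.search(pattern, text)
        if m:
            metrics[name] = float(m.group(1))
    return metrics

print(parse_showphy(SHOWPHY_SAMPLE))
# e.g. {'rsrp_dbm': -82.43, 'rsrq_db': -11.18, 'cinr_rs_db': 18.06}
```

Note that this vendor-specific dump format is not guaranteed to be stable across firmware versions, so keep the regexes tolerant of whitespace changes.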
The command 'AT+CESQ' returns "not known or not detectable" for rxlev (received signal strength level), ber (channel bit error rate), rscp (received signal code power) and ecno (ratio of the received energy per PN chip to the total received power spectral density). Only the quality parameters rsrq (reference signal received quality) and rsrp (reference signal received power) are reported.
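Note that AT+CESQ reports rsrq and rsrp as index values, not directly in dB/dBm; the index-to-value mapping is defined in 3GPP TS 27.007 (255 means "not known or not detectable"). A decoding sketch (function name is my own; the mappings follow the spec, returning the lower bound of each reporting bin):

```python
def parse_cesq(line):
    """Decode a '+CESQ: <rxlev>,<ber>,<rscp>,<ecno>,<rsrq>,<rsrp>' response
    into RSRQ (dB) and RSRP (dBm), per 3GPP TS 27.007."""
    rxlev, ber, rscp, ecno, rsrq, rsrp = (
        int(v) for v in line.split(":", 1)[1].split(",")
    )
    # <rsrq>: 0 -> rsrq < -19.5 dB, 1..33 -> 0.5 dB steps, 34 -> rsrq >= -3 dB
    rsrq_db = None if rsrq == 255 else rsrq * 0.5 - 20.0
    # <rsrp>: 0 -> rsrp < -140 dBm, 1..96 -> 1 dBm steps, 97 -> rsrp >= -44 dBm
    rsrp_dbm = None if rsrp == 255 else rsrp - 141
    return {"rsrq_db": rsrq_db, "rsrp_dbm": rsrp_dbm}

# 99/255 in the first four fields is how the modem encodes "not known":
print(parse_cesq("+CESQ: 99,99,255,255,18,59"))
# {'rsrq_db': -11.0, 'rsrp_dbm': -82}
```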
Anything down to about -12 dB RSRQ is usually OK.
How exactly do you usually interpret RSRQ values in the range -3 to -20 dB?
In a table available at the following link, RSRQ < -11 dB is considered to indicate poor signal quality:
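If you need to turn RSRQ into a coarse quality label in code, one possible bucketing looks like this. Only the < -11 dB "poor" cut-off comes from the table referenced above; the other thresholds are common rules of thumb, not a standard, so treat them as illustrative:

```python
def classify_rsrq(rsrq_db):
    """Map an RSRQ reading (dB) to a rough quality label.
    Thresholds are illustrative; < -11 dB as 'poor' follows the
    table referenced above, the rest are rules of thumb."""
    if rsrq_db >= -9.0:
        return "excellent"
    if rsrq_db >= -11.0:
        return "good"
    if rsrq_db >= -15.0:
        return "fair"
    return "poor"

print(classify_rsrq(-11.18))  # the value from the showphy dump -> 'fair'
```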
The relation between attach/connect time and link quality is quite interesting.
I like to use RSRQ because its known range (-3 to -20 dB) makes it easy to filter out bogus values. The big problem is that connection reliability and data rate usually have a lot more to do with tower loading than with signal strength. Often, how long it takes to attach and connect says a lot more about what sort of data session you're likely to have.
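The bogus-value filter mentioned above can be as simple as a range check; the bounds come from the RSRQ reporting range quoted in the post (a sketch, with a little slack on the low end):

```python
def is_plausible_rsrq(value_db):
    """Accept only readings inside the RSRQ reporting range (about -20 to -3 dB).
    This catches uninitialised zeros, sign-flipped values, and RSRP values
    mistakenly fed in as RSRQ."""
    return -20.0 <= value_db <= -3.0

print(is_plausible_rsrq(-11.18))  # True
print(is_plausible_rsrq(0.0))    # False (uninitialised reading)
print(is_plausible_rsrq(-82.43)) # False (that's an RSRP, not an RSRQ)
```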