It's Raining Cats and Cats: A Performance Benchmark Study of 3GPP-Based IoT Devices in a Lab Environment
This report presents a benchmark study of 3GPP-based IoT devices, analyzing the power consumption and energy requirements of Cat M1 and Cat NB1 modules and devices operating across a wide range of network conditions and usage scenarios.
In collaboration with the H2020 EU TRIANGLE project and the Connectivity Section at Aalborg University, SRG conducted a lab benchmark study of 3GPP-based IoT technologies, with emphasis on eMTC (LTE Cat M1) and NB-IoT (LTE Cat NB1). Specifically, we sought to determine whether the proverbial 10-year battery life for 3GPP IoT-based solutions is possible. More importantly, we wanted to determine how varying the data payload (2 bytes to 1,600 bytes), the frequency of the reporting period (2 hours to monthly), and the RF conditions (-120 dBm < x < -100 dBm) impacts the estimated battery life of a commercial IoT module or device.
For this study, TRIANGLE, an EU-funded project focused on identifying new KPIs for next-generation networks and comprised of a consortium of vendors and academia, performed tests on our behalf, leveraging its lab facilities in Europe. SRG helped identify and procure the modules/devices and solidify the test plan, and we conducted the analysis of the data presented in this report. This Signals Ahead report marks our first foray into the world of IoT benchmarking. This document provides an update to the original study that we published in February 2019. Specifically, it includes LTE-M results which we did not have available at the time that we published the first report. As originally promised when we published the report, this updated version will not count against your organization's subscription since we counted the first report against your subscription. However, the licensing terms for Signals Ahead still apply. No external sharing of this report is permitted, either in whole or in part.
Our analysis indicates that a 10-year battery life (5 watt-hours) is feasible, and with many scenarios the estimated battery life greatly exceeds this threshold, not to mention the average life expectancy of a human residing in the US circa 1800. However, it is equally possible to have an expected battery life that is only a fraction of the 3GPP target, simply by introducing more challenging, albeit realistic, parameters (payload, reporting frequency, network conditions). It is also worth mentioning that we measured modem performance whenever possible. Cellular IoT devices require ancillary circuitry, not to mention the circuitry required to support other functions (sensors, display screen, etc.). These additional components have their own power requirements - something that we did not include as part of this study.
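To put the 3GPP target in perspective, the 5 watt-hour / 10-year combination implies a very small average current budget. A minimal back-of-the-envelope sketch (assuming a 3.6 V nominal lithium cell; the voltage is our assumption, not a figure from the study):

```python
# Back-of-the-envelope: average current budget implied by the 3GPP
# target of a 5 Wh battery lasting 10 years. The 3.6 V nominal cell
# voltage is an illustrative assumption.

BATTERY_WH = 5.0          # 3GPP reference battery energy (watt-hours)
NOMINAL_V = 3.6           # assumed nominal cell voltage (volts)
YEARS = 10
HOURS = YEARS * 365 * 24  # ignoring leap days for simplicity

capacity_mah = BATTERY_WH / NOMINAL_V * 1000   # usable charge, ~1,389 mAh
avg_current_ua = capacity_mah / HOURS * 1000   # average drain, ~15.9 uA

print(f"Usable capacity : {capacity_mah:.0f} mAh")
print(f"Average current : {avg_current_ua:.1f} uA")
```

Any sustained average drain above roughly 16 µA - modem plus any sensors and support circuitry - makes the 10-year target unreachable, which is one reason the floor current in the deep-sleep states matters so much.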
Thanks to multiple sensitivity studies and very precise measurements of power consumption for each IoT state, it was possible to determine the relative impact that each IoT state (PSM, synchronization, service request, etc.) has on the battery life. Needless to say, the relative importance of energy efficiency with each state greatly depends on the scenario. PSM, for example, can consume almost the entire energy budget (high double digits on a percentage basis) or it can be inconsequential (low single digits) with other IoT states having a greater influence.
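The per-state accounting described above can be expressed as a simple energy-budget model: sum the energy spent in each IoT state per reporting cycle, then divide the battery's energy by the per-cycle cost. The state currents and durations below are hypothetical placeholders for illustration, not measured values from the study:

```python
# Toy per-state energy-budget model for estimating IoT battery life.
# All state currents and durations are hypothetical, for illustration
# only; the 3.6 V nominal cell voltage is likewise an assumption.

NOMINAL_V = 3.6        # assumed nominal cell voltage (volts)
BATTERY_WH = 5.0       # 3GPP reference battery energy (watt-hours)

# (average current in mA, duration in seconds) per reporting cycle
states = {
    "sync":            (40.0,   1.0),   # re-acquire the network
    "service_request": (100.0,  0.5),   # signaling to send the report
    "tx_data":         (200.0,  0.3),   # uplink payload transmission
    "idle_tail":       (10.0,   20.0),  # connected/idle time before PSM
}
psm_current_ma = 0.005                  # PSM floor current (5 uA)

report_interval_s = 2 * 3600            # one report every 2 hours

active_s = sum(t for _, t in states.values())
psm_s = report_interval_s - active_s

# Energy per cycle in watt-hours: V * I(mA) * t(s) / (1000 * 3600)
e_active = sum(NOMINAL_V * i * t / 3.6e6 for i, t in states.values())
e_psm = NOMINAL_V * psm_current_ma * psm_s / 3.6e6
e_cycle = e_active + e_psm

cycles = BATTERY_WH / e_cycle
life_years = cycles * report_interval_s / (365 * 24 * 3600)

print(f"PSM share of energy budget: {e_psm / e_cycle:.0%}")
print(f"Estimated battery life    : {life_years:.1f} years")
```

With these made-up numbers and a 2-hour reporting period, PSM consumes under 10% of the budget; stretch `report_interval_s` toward a monthly report and PSM dominates instead - consistent with the report's observation that PSM's share can range from low single digits to high double digits depending on the scenario.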
We also found that a single-mode (Cat NB1 only) solution can deliver a much longer battery life than a multi-mode (Cat NB1 / M1) solution, largely due to the higher current in the PSM state. For the multi-mode solutions, it wasn't always the case that the Cat NB1 mode of operation delivered a longer battery life than the Cat M1 mode of operation, although this outcome was specific to just one multi-mode solution we tested.
In 2016 we published a Signals Ahead report on IoT (SA 12/19/16, "The Murky Underworld of IoT: How 3GPP-based Solutions Pave the Way to a Connected World"). In that report we traced the history and lineage of today's 3GPP-based IoT technologies and how they came into existence through the standardization process.
Although the report is a great read and we encourage subscribers to dust off a copy - we did prior to putting pen to paper for this report - we'll make it easy and provide a very short summary of the report. If nothing else, this summary helps clarify a few definitional terms, identifies important differences in device categories, and helps explain some of the challenges associated with doing an IoT benchmark study.
In 2014, 3GPP initiated two activities related to machine-type-communications (MTC). Both initiatives sought a solution that could enable massive numbers of low-cost/low-power devices, capable of transmitting small amounts of data with ubiquitous coverage. Although one initiative leveraged GSM/GPRS and another initiative leveraged LTE, they both shared a common characteristic - they only required 200 kHz of spectrum to operate.
Fast forward to the completion of Release 12 in March 2015, when 3GPP introduced a new LTE device category, called Category 0 (Cat 0). The characteristics of a Cat 0 device embodied many of the objectives of the MTC movement, including reduced device complexity and lower cost. Specifically, the baseband modem in Cat 0 devices was limited to a 1.4 MHz radio channel, although the RF portion retained support for a 20 MHz channel. Additionally, Cat 0 devices were limited to downlink/uplink peak speeds of 1 Mbps while trying to get by with a single receive antenna, lower transmit power (20 dBm versus 23 dBm) and the half-duplex mode of operation. Thanks to a series of unfortunate events and bad timing, the market for Cat 0 dried up before it materialized. It didn't help that some of its features intended to drive down costs also severely reduced coverage, thereby limiting the usefulness of the new device category. Furthermore, 3GPP didn't define any new enhancements to data or control channels for this device category that would optimize performance for infrequent low bit rate applications operating in challenging RF conditions.
Concurrent with the standardization work on LTE Cat 0 devices, 3GPP initiated several study items to evaluate new technologies and methods which could achieve the MTC objectives. Long story short, 3GPP settled on two new clean-slate approaches. First, both literally and figuratively, 3GPP specified LTE Cat M1 devices in Release 13. From an overarching 3GPP perspective, this approach is called eMTC (enhanced MTC). The GSMA uses the term LTE-M for marketing purposes, so to some extent there is equivalency between LTE Cat M1 (originally called Cat M), eMTC and LTE-M. At the very tail end of Release 13, 3GPP approved NB-IoT (Narrowband IoT) as another alternative means to achieve MTC functionality. In reality, 3GPP continued Release 13 standardization work on NB-IoT after the official completion date of Release 13, much like they did with 5G NR and the late drop of Release 15.
3GPP took longer to complete NB-IoT due to competing proposals from two consortiums. One consortium promoted NB-LTE and one consortium championed NB-CIoT. Merging the two competing acronyms proved to be somewhat easier than merging the two technologies: the initial way forward included elements from both proposals but offered no clear guidance on what a unified solution would entail, leaving lots of missing pieces and ambiguity. Even with the solidified proposal, there remained several optional features, which vendors could choose to implement. The use of a single tone (15 kHz or 3.75 kHz sub-carrier spacing) or an optional multi-tone in the uplink is one example that still exists. This debate is now water under the bridge, and 3GPP continues to evolve eMTC and NB-IoT in future releases, ultimately including them as part of the 5G standard.
Although Cat M1 and Cat NB1 devices target the same market opportunity, there are meaningful differences between their two sets of attributes. A Cat M1 device requires 1.4 MHz of spectrum (1.08 MHz radio channel) and supports peak data speeds of up to 1 Mbps. A Cat NB1 device