Ericsson LTE at Verizon and AT&T is delivering latency around 10 ms. Mei Meiyuan writes, "For 4G mode, the best round-trip radio latency is about 10 ms for FDD and about 13 ms for TDD (mode 2). In theory, with short TTI, the best round-trip radio latency is about 2 ms (FDD, 2 OFDM symbols) and 8 ms (TDD, 7 OFDM symbols, under planning, mode 2)."

(Editor's note: Those figures cover only the base station to the receiver. To reflect the real user experience, you must add the latency from the base station back to the relevant server. That's 20-50 milliseconds or much more. In the future, edge clouds will often reduce the combined latency to 20-30 ms total.)
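The editor's note amounts to simple addition: radio round trip plus network backhaul. A minimal sketch of that arithmetic, using only the values quoted above (the figures are the article's; the code and labels are illustrative):

```python
# User-experienced latency = radio round trip + base-station-to-server leg.
# Values below are the ones quoted in the text, not measurements.
radio_rtt_ms = {
    "4G FDD": 10,            # best round-trip radio latency, FDD
    "4G TDD (mode 2)": 13,   # best round-trip radio latency, TDD
    "4G short-TTI FDD": 2,   # theoretical, 2 OFDM symbols
}
network_ms = (20, 50)  # base station to server; can be much more

for mode, radio in radio_rtt_ms.items():
    low, high = radio + network_ms[0], radio + network_ms[1]
    print(f"{mode}: user-experienced round trip roughly {low}-{high} ms")
```

The point the note makes follows directly: even a 2 ms radio link still leaves a 22-52 ms user experience until edge clouds shorten the server leg.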

Many politicians and salesmen still think 5G latency is much better than 4G latency. As the new equipment reaches the field, people will discover the difference in latency is only a few milliseconds.

Both will be much lower than what is in the field today, but technology has advanced in 4G, not just 5G.

5G can get close to 1 ms in the lab using a short TTI (Transmission Time Interval). 4G using a short TTI is only 1-2 ms behind. Short TTI was developed before 2010 and is included in the 3GPP standards for both 4G & 5G. Today's equipment, 4G and 5G alike, does not support it. That will change. The illustration is from an Ericsson paper, which also has a good set of references. The fastest methods, on the left, are still in the lab.
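Why does shrinking the TTI shrink latency? A transmission cannot be scheduled, sent, or acknowledged in less than one TTI, so a shorter interval means a shorter minimum wait at every step. A quick sketch using standard LTE numerology (1 ms subframe of 14 OFDM symbols with normal cyclic prefix — my assumption, consistent with the 2- and 7-symbol figures quoted earlier):

```python
# LTE numerology: one 1 ms subframe carries 14 OFDM symbols (normal cyclic prefix).
SUBFRAME_MS = 1.0
SYMBOLS_PER_SUBFRAME = 14

def tti_ms(symbols: int) -> float:
    """Duration of a TTI spanning `symbols` OFDM symbols."""
    return SUBFRAME_MS * symbols / SYMBOLS_PER_SUBFRAME

for n in (14, 7, 2):
    print(f"{n}-symbol TTI = {tti_ms(n):.3f} ms")
# 14 symbols -> 1.000 ms (the legacy LTE TTI)
#  7 symbols -> 0.500 ms (one slot)
#  2 symbols -> 0.143 ms
```

Cutting the scheduling unit from 1 ms to ~0.14 ms is what lets the round trip fall from ~10 ms toward the 2 ms theoretical figure quoted above.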

Huawei has a goal for 4G "To be consistent with 5GNR in term of frequency utilization as much as possible."

On Oct 1, Verizon turned on the first $20B 5G mmWave network. It will soon offer a gigabit, or close to it, to 30M homes. Thousands of sites are live in Korea; AT&T is going live with mobile even though phones are lacking. The hype is unreal. Time for reporting closer to the truth.

The estimates you hear about 5G costs are wildly exaggerated. Verizon is building the most advanced wireless network while reducing capex. Deutsche Telekom and Orange/France Telecom also confirm they won't raise capex.

Massive MIMO in either 4G or "5G" can increase capacity 4X to 7X, including putting 2.3 GHz to 4.2 GHz to use. Carrier Aggregation, 256 QAM, and other tools double and triple that. Verizon sees cost/bit dropping 40% per year.
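Multiplying those figures out shows why capex can fall while capacity soars. The multipliers are the ones stated above; combining them by simple multiplication is my own back-of-envelope assumption:

```python
# Capacity multipliers quoted in the text (low, high estimates).
mimo_gain = (4, 7)    # Massive MIMO: 4X to 7X
other_gain = (2, 3)   # Carrier Aggregation, 256 QAM, etc.: double to triple

low = mimo_gain[0] * other_gain[0]
high = mimo_gain[1] * other_gain[1]
print(f"Combined capacity multiple: roughly {low}x to {high}x")

# A 40%/year drop in cost/bit compounds fast:
cost = 1.0
for year in range(1, 4):
    cost *= 0.6  # 40% cheaper each year
    print(f"After year {year}: cost/bit at {cost:.2f} of today's")
```

Roughly an 8x-21x capacity multiple from tools already in the standards, with cost/bit falling by nearly four-fifths over three years at Verizon's stated rate.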

Cisco & others see traffic growth slowing to 30%/year or less. I infer overcapacity almost everywhere.
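The overcapacity inference is a race between two compound growth rates. If a 40% annual drop in cost/bit means capacity per dollar grows ~1.67x per year while traffic grows 1.3x, the gap widens every year. This is my illustrative arithmetic, not a forecast:

```python
# Capacity per dollar implied by a 40%/year cost/bit decline vs. 30%/year traffic growth.
capacity_growth = 1 / 0.6  # ~1.67x per year (assumption: spend held flat)
traffic_growth = 1.30

headroom = 1.0
for year in range(1, 4):
    headroom *= capacity_growth / traffic_growth
    print(f"Year {year}: capacity headroom ~{headroom:.2f}x relative to demand")
```

At those rates, headroom roughly doubles in three years even with flat capex, which is the overcapacity story in miniature.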

Believe it or not, for several years 80% of 5G (mid-band) will be slower than good 4G, which is further developed.


"5G: Why Verizon thinks differently, and what to do about it" is a new report I wrote for STL Partners and their clients.

STL Partners, a British consulting outfit I respect, commissioned me to ask why. That report is now out. If you're a client, download it here. If not, and corporately priced research is interesting to you, ask me to introduce you to one of the principals.

It was fascinating work because the answers aren't obvious. Lowell McAdam's company is spending $20B to cover 30M+ homes in the first stage. The progress in low & mid-band, both "4G" and "5G," has been remarkable. In most territories, millimeter wave will not be necessary to meet expected demand.

McAdam sees a little further. mmWave has 3-4X the capacity of low and mid-band. He sees an enormous marketing advantage: unlimited services, even less congestion, reputation as the best network. Verizon testing found mmWave rate/reach was twice what had been estimated. All prior cost estimates need revision.

My take: even if mmWave doesn't fit in your current budget, telcos should expand trials and training to be ready as things change. The new cost estimates may be low enough to change your mind.