Recently, I was basking in my comfy command chair relishing the adverts on television. They’re really the only thing worth watching since the Gilmore Girls ended (my wife, Gina the Gorgeous, and I just finished watching the re-runs… again) and while one is waiting for the next season of Doctor Who starring Jodie Whittaker to launch in 2020.
One of these adverts showed a bunch of people streaming video at some weird game that seemed to keep on stopping and starting. Oh yes, I think it was called American football, where the “American” qualifier is used to distinguish it from the real football (a.k.a. soccer) that’s played in the rest of the world.
Unlike real football, where the same teams play for 90 minutes (split into two 45-minute halves separated by a 15-minute break), each side in American football has three teams – one for attack, one for defense, and one to do special things the other two can’t (my mother always used to tell me I was special, and I foolishly took it to be a compliment).
As a stranger to this country (and they don’t get much stranger than me), I sort of think of American football as a bit like Rugby football — so named because it started circa 1845 at Rugby School in Rugby, Warwickshire, England – not least because they both use an awkwardly shaped, non-spherical ball. The two main differences are that Rugby players can only throw the ball backwards and they wear only shorts and T-shirts, as opposed to American football players, who appear to wear the sporting equivalent of spacesuits.
Of course, there’s always the amusingly named Australian Rules Football. In addition to the fact that this antipodean gladiatorial contest is played on an elliptical field with three goals at each end, and that the players (who wear extremely small shorts and T-shirts with the arms cut off) may position themselves anywhere on the field and use any part of their bodies to move the ball, Australian Rules Football is primarily distinguished by the fact that there are no discernable rules.
But we digress…
While I was watching the advert showing the American football match (What? You forgot about that? Maybe you’d better go back to the beginning of this column and start again.), my mind started to wander, as it does. I began to ponder what would happen in the future — say Super Bowl 2025 — when the half-time show starts, and the 50,000-strong audience starts to stream everything to their friends back home using their super-duper 5G phones.
What? How should I know who’ll be playing? For all I know the competing teams could be the Jacksonville Jaguars and the Detroit Lions, with Elton John on his “Almost the Last Farewell Tour” heading the half-time festivities backed by The Who (singing My Generation) and the Rolling Stones (singing Not Fade Away).
As an aside, Gina and I saw Elton John here in Huntsville, Alabama, a couple of years ago. Gina was singing and dancing in the aisles. “Sit down woman,” I said, “the concert hasn’t started yet!” But seriously, Elton was fantastic, on his feet singing and playing the piano for two and a half hours straight. (I can play all the same notes, but not necessarily in the same order with the same timing).
But, once again, we digress…
The point of all this (yes, of course there’s a point), is that I started to wonder just what sort of bandwidth and infrastructure would be required to handle 50,000 people streaming 5G video from the same arena at the same time. Furthermore, if all of these people were streaming video to the outside world, what would happen if someone outside is trying to contact someone inside the stadium with a mission-critical call (“Your father’s having a midlife crisis!” or “The cat’s stuck in a tree!” or “Your mother’s stuck on the roof!”).
Another Aside (An Official One This Time)
Speaking of The Who and My Generation, what’s all this 5G stuff anyway? I like to think I know as much about 5G as the next man. Unfortunately, the next man doesn’t know much at all.
If you are wandering around England with a metal detector and you find a stash of buried treasure with a coin stamped “King Dilbert I,” how would you know if this was real or a forgery? In fact, there are two big giveaways. The first is that there never was a King Dilbert. The second is that they never put ‘I’ on a coin — it’s always just “King Richard” or “Queen Elizabeth” for the first monarch bearing that name; they only start adding numbers for the second and up; e.g., “Queen Elizabeth II” and “King Richard III.”
The point (yes, once again, there is a point; try to keep up) is that the first commercial automated cellular network, which was analog in nature and supported only voice calls, was launched in Japan by Nippon Telegraph and Telephone in 1979. We now refer to this as 1G, but they didn’t use this nomenclature at the time because they were too busy working on that one to worry about the possibility of more generations in the future.
It was only when 2G hit the scene circa 1991 that the 1G moniker was retroactively bestowed. 2G represented the transition from analog to digital voice transmission, and it also saw the introduction of the ability to transfer data on top of voice, albeit at ludicrously low rates compared to today.
If we skip over 2.5G and 2.75G, then 3G networks started to be introduced circa 1998 to 2000. 3G itself saw incremental evolution into 3.5G and 3.75G before we entered the current 4G era, which commenced circa 2008.
Faster and Better 5G (It Even Makes Your Teeth Brighter)
And so, we come to 5G. What exactly is 5G? Well, it’s a lot of things to a lot of people — a “wireless network for all seasons,” if you will. In fact, there are three main use cases: Enhanced Mobile Broadband (eMBB), Ultra-Reliable Low-Latency Communications (URLLC), and Massive Machine-Type Communications (mMTC).
With faster connections, higher throughput, and more capacity, eMBB is the one we tend to think about as general users. People throw numbers around, like “a minimum of 1 Gbps and perhaps up to 10 Gbps” (the latter sounds better if you say it breathlessly), but I’ll believe it when I see it.
URLLC is targeted at mission-critical applications that require uninterrupted and robust data exchange, such as surgeons performing teleoperations from remote locations where latency and reliability become key issues.
Last but not least, mMTC will be used to connect large numbers of low-power, low-cost devices spread over a wide area, with high scalability and long battery lifetimes. In a nutshell, mMTC is intended to subsume the low-bandwidth applications that are currently served by technologies like Sigfox, LoRa, and Zigbee in the sub-gigahertz ISM (Industrial, Scientific, and Medical) bands, by cellular offerings like LTE-M and NB-IoT, and by Wi-Fi HaLow, which is a relatively new kid on the block.
The eMBB part of 5G is supposed to start rolling out in 2020, with URLLC and mMTC following in 2021 or later. But wait…there’s more…
Frequency bands for 5G are separated into two different frequency ranges: Frequency Range 1 (FR1) is popularly claimed to embrace everything below 6 GHz (in fact, it’s been extended to cover potential new spectrum offerings from 410 MHz to 7.125 GHz), while Frequency Range 2 (FR2) embraces millimeter wave (a.k.a. mmW or mmWave) frequency bands from 24.25 GHz to 52.6 GHz.
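Purely to keep those numbers straight in my own head, here’s a tiny Python sketch that buckets a carrier frequency into FR1 or FR2 using the ranges above (the example frequencies at the bottom are ones I picked myself for illustration):

```python
# Bucket a carrier frequency into 5G NR FR1 or FR2 using the ranges quoted above
# (FR1: 410 MHz to 7.125 GHz; FR2 / mmWave: 24.25 GHz to 52.6 GHz).

def frequency_range(freq_ghz: float) -> str:
    """Return which 5G frequency range a carrier frequency (in GHz) falls into."""
    if 0.410 <= freq_ghz <= 7.125:
        return "FR1 (sub-7 GHz)"
    if 24.25 <= freq_ghz <= 52.6:
        return "FR2 (mmWave)"
    return "outside the FR1/FR2 ranges discussed here"

# Example frequencies chosen purely for illustration
for f in (0.7, 3.5, 28.0, 39.0, 60.0):
    print(f"{f:6.2f} GHz -> {frequency_range(f)}")
```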
With a range of 500 meters to a kilometer, depending on the environment, the mmWave (FR2) bands are much shorter range than their FR1 counterparts. Also, they are easily attenuated or distorted by the absorption, reflection, and refraction associated with buildings and other structures, to the point that they may as well be blocked. However, they offer much higher bandwidths than the FR1 bands. Furthermore, using advanced beamforming technology, I’ve been told that you could use mmWave to individually target two users sitting side-by-side in a football stadium with separate connections on exactly the same frequency. If this is true, then all I can say is “color me impressed.”
Take Me to the (Foot)Ball Game
I hope you noticed the way we casually returned to the topic of the football stadium at the end of the previous section. Much like Colonel John “Hannibal” Smith in The A-Team, I love it when a plan comes together.
As I mentioned earlier in this column, I was wondering what sort of bandwidth and infrastructure would be required to handle 50,000 people streaming 5G video from the same arena at the same time. Also, I was pondering the problem of someone outside the stadium trying to call someone inside the stadium while all this video streaming was going on.
Since I know so little about 5G, I decided to make this someone else’s problem, so I posed my questions to my chum, Nir Shapira, who is the Business Development Director for the Mobile Broadband Business Unit at CEVA. These guys create state-of-the-art DSP IP, including the 5G IP that will be used in the SoCs and FPGAs powering 5G mobile devices and 5G infrastructure.
First, Nir addressed the problem of someone trying to call in while everyone else was streaming out (by which I mean streaming video – not leaving the stadium). Nir says that 5G phones will support both the sub-6 GHz (FR1) and mmWave (FR2) frequency bands, and that my hypothetical inbound call would come in using the sub-6 GHz band, which can theoretically support phone-to-tower and tower-to-phone distances of up to 45 miles. Meanwhile, the 49,999 other supporters in the stadium would be using the mmWave bands to stream their videos. These bands will work only within the stadium, where they will be “gathered up” by base stations before being conveyed to the outside world.
Nir went on to say that my proposed use case pushes the network densification trend to its extreme, that it’s going to be very challenging, and that it will require sophisticated and costly systems. Regarding my 2025 scenario, Nir spake as follows:
Let’s start with 10,000 users. We can assume each user requires one uplink to stream one HD video. We can assume a total throughput of 4 Mbps per stream. We can assume that each 5G channel of 100 MHz, under optimal conditions, can support around 1 Gbps of data speed. This means that each 100 MHz spectrum slice (or carrier) can theoretically support 250 users.
This is a relatively high number of users to handle, protocol-wise, so let’s assume that a small base station (also called a Small Cell or Radio Head) can handle 250 users in 100 MHz. Thus, you would require 40 such base stations to handle every 10,000 users.
This assumes you do not have any spectrum issues. Let’s say you have 800 MHz available (as is available in mmWave), so you can service 250 * 8 = 2,000 of the users. When operating with mmWave, you can minimize interference between sectors to ease spectral re-use. With spectrum re-use throughout the stadium, you can easily re-use the spectrum five times (in five sectors in the stadium) to service 10,000 users.
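For those who, like me, need to see arithmetic written down before they believe it, here’s a back-of-the-envelope sketch of Nir’s numbers in Python. Every figure in it (4 Mbps per stream, 1 Gbps per 100 MHz carrier, 800 MHz of mmWave spectrum, five-sector re-use) is one of Nir’s stated working assumptions, not a hard 5G specification, and the 50,000-seat extension at the end is simply the same assumptions scaled up:

```python
# Back-of-the-envelope sketch of Nir's stadium arithmetic. All of the figures
# below are his stated working assumptions, not hard 5G specifications.

STREAM_MBPS = 4        # assumed uplink rate for one HD video stream
CARRIER_GBPS = 1.0     # assumed throughput of one 100 MHz carrier, optimal conditions
CARRIER_MHZ = 100      # carrier bandwidth
SPECTRUM_MHZ = 800     # mmWave spectrum assumed to be available
SECTORS = 5            # assumed spectrum re-use factor across the stadium

users_per_carrier = int(CARRIER_GBPS * 1000 / STREAM_MBPS)   # 250 users per carrier
carriers_available = SPECTRUM_MHZ // CARRIER_MHZ             # 8 carriers in 800 MHz
users_per_sector = users_per_carrier * carriers_available    # 2,000 users per sector
users_served = users_per_sector * SECTORS                    # 10,000 users with 5x re-use

print(f"Users per 100 MHz carrier  : {users_per_carrier}")
print(f"Users per sector (800 MHz) : {users_per_sector:,}")
print(f"Users served with 5x re-use: {users_served:,}")

# Scaling the same assumptions up to the full 50,000-seat scenario:
target_users = 50_000
small_cells_needed = target_users // users_per_carrier     # 200 small cells (vs. 40 per 10,000)
sectors_needed = -(-target_users // users_per_sector)      # 25 re-use sectors (ceiling division)
aggregate_uplink_gbps = target_users * STREAM_MBPS / 1000   # 200 Gbps of uplink video

print(f"Small cells for {target_users:,} users   : {small_cells_needed}")
print(f"Re-use sectors for {target_users:,} users: {sectors_needed}")
print(f"Aggregate uplink                 : {aggregate_uplink_gbps:.0f} Gbps")
```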
Of course, that’s “only” 10,000 users. My original scenario was based on a 50,000-strong throng of supporters, but the back-of-the-envelope sketch above scales the numbers up for you, and Nir was in full flow by this time, going on to say:
I had a chance to talk with a colleague who is in the business of mass indoor and stadium deployments. He points out that the future stadium experience will be very different from what we know today, and that requirements will be even more fantastic than what you have imagined.
Imagine 50,000 spectators, each one picking up a pair of augmented reality (AR) goggles upon entry. From the time of entry, each of these spectators will get a constant 30 Mbps downlink stream to augment his or her experience. This is extremely challenging, even for 5G, and will require highly complex coordination and interference mitigation between all service nodes. To be honest, we don’t know if 5G has the full answer to this — the answer will probably require a joint solution between multiple services like 5G and Wi-Fi 6, for example.
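Just to put that in perspective before we go any further, here’s the quick sum on what 50,000 spectators at 30 Mbps apiece adds up to (a rough sketch using the figures quoted above, plus the same assumed 1 Gbps per 100 MHz carrier as before):

```python
# Rough aggregate for the AR goggles scenario (illustrative figures from the quote above).
spectators = 50_000
downlink_mbps_each = 30                      # constant AR downlink per spectator

aggregate_gbps = spectators * downlink_mbps_each / 1000
carriers_equivalent = aggregate_gbps / 1.0   # assuming ~1 Gbps per 100 MHz carrier, as before

print(f"Aggregate downlink : {aggregate_gbps:,.0f} Gbps (~{aggregate_gbps / 1000:.1f} Tbps)")
print(f"Equivalent carriers: {carriers_equivalent:,.0f} x 100 MHz at 1 Gbps each")
```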
My knee-jerk reaction was to exclaim, “What on earth is Wi-Fi 6?” On reflection, however, I decided that I really didn’t want to know; at least, not today. Nir continued as follows:
One other thing you should check out is MEC (Mobile Edge Computing/Cloud), because this also relates to the stadium use case. With MEC, instead of routing the data all the way back to the core network, the servers are located close to, or inside, the edge venue. This enormously reduces latency (and relates to the URLLC pillar of 5G), which is of an essence in use cases like virtual reality (VR) and augmented reality (AR). It also relieves the pressure from the network backbone. So, in the case of the AR example, the servers for the AR application will typically be located in the stadium itself.
In a legacy 4G network, all data needs to be backhauled to the core network. So, if you were wearing an AR headset and you turned your head, this data would be transferred to the core, and the remote cloud server would need to respond and change your field of view. You can imagine that, with current technology, you would experience lags. With MEC, by comparison, the servers are local.
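To make that difference a little more concrete, here’s a toy round-trip comparison. I should stress that every number in it is a made-up placeholder of my own (not a figure from Nir or from any specification); the only point it illustrates is that the backhaul and remote-cloud hops drop out of the loop when the server sits in the stadium:

```python
# Toy motion-to-photon round-trip comparison. All hop times below are made-up
# placeholders for illustration only; the point is simply which hops disappear
# when the application server is hosted at the edge (MEC) instead of a remote cloud.

backhauled_to_remote_cloud_ms = {
    "radio access network":  10,
    "backhaul to core":      15,
    "core to remote cloud":  30,
    "render and return":     20,
}

served_from_in_stadium_mec_ms = {
    "radio access network":  10,
    "in-stadium MEC server":  5,
    "render and return":     10,
}

def round_trip(hops: dict) -> int:
    """Sum the per-hop delays to get a total round-trip time in milliseconds."""
    return sum(hops.values())

print(f"Backhauled to remote cloud : ~{round_trip(backhauled_to_remote_cloud_ms)} ms")
print(f"Served from in-stadium MEC : ~{round_trip(served_from_in_stadium_mec_ms)} ms")
```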
MEC is going to be a big thing in 5G and will be one of the major ways in which companies can monetize it. As 5G is still in its infancy, MEC is not yet fully developed, which is why actual deployment lags earlier expectations. The technology is fantastic, but if Mobile Network Operators (MNOs) or other stakeholders (e.g., stadium owners) cannot see how it directly relates to revenues, then they will have a hard time justifying the enormous infrastructure investment required.
Nir and his colleague do bring up an interesting point. I had completely forgotten about the possibility of using augmented reality in such an environment (see also Fundamentals: VR, MR, AR, DR, and HR). You know how, when you watch a football match on television, graphical elements are superimposed on the field, such as the yellow “down line” and the blue “line of scrimmage”? In extreme situations, such as snowstorm or blizzard conditions, an entire virtual field with yard and boundary markers can be projected onto the field in order to allow league officials, broadcasters, and viewers to follow the action when all field markings are obscured by snow, fog, or mud. Well, all of this — and much, much more — could be added to each live viewer’s augmented reality experience.
Good grief. Now I have something else to cogitate and ruminate over. What are all the different things that could be added to an augmented reality version of a live sports event (baseball, basketball, football, tennis… the list goes on)? Help me out here: what do you think about all this?
If you are interested in VR, AR, DR, MR, etc., you might want to check out this column on the Supplyframe Hardware site:
Augmented Reality is Poised to Change the World
https://resources.altium.com/news/augmented-reality-is-poised-to-change-the-world-news
Also, the forthcoming VRX Conference & Expo (Dec 12-13, San Francisco) looks set to be a “must attend” event.
See also my column: “Are Norberts and Norbertinas the Future of the Human Race?” (https://www.eetimes.com/author.asp?section_id=216&doc_id=1331340)
Hi Max! Another great article. As for AR, I feel that it has the potential to truly revolutionize our lives. That could be either a good thing or a bad thing, depending on how you look at it. On the one hand, I’d love to be able to roll up to a restaurant and see the Yelp ratings, wait times, etc. from the relative comfort of my car. On the other hand, we saw the mess that was Pokemon Go. People were so distracted by the AR that accidents, trespassing, and general mayhem ensued. But we still need to develop a reasonable alternative to Google’s Glass headset: something that can perform as a heads-up display and doesn’t throw your head off balance. Of course, we’d still have to solve the issue of unpermitted recording. Your football game actually falls under that rule, as the game and all of its images are technically property of the NFL. Sure, with 50,000 fans all streaming, this would be complicated to enforce, but I’m sure that there is some set of lawyers already working that sort of thing out.
Hi taichichuan — I’m sorry for the delay in my response — I tell you, every time I blink another year whizzes by. Don’t talk to me about Pokemon Go — my wife and 24-year-old (at that time) son had me driving them around to places with more Pokemons, at which time they would scamper around like … things that scamper around chasing the little scamps and trying to out-score each other. Did you ever see the “Hyper-Reality” video on YouTube (https://youtu.be/YJg02ivYzSs)? I fear this may be our future — the day may come when we long for a robot apocalypse LOL