This is a guest post by Chris Bronk, assistant professor of computer and information systems and associate director of the Center for Information Security Research and Education at the University of Houston.
In the development of new information technology (IT) there exists a degree of irrational exuberance. Indeed, the consultancy Gartner has described the innovation-to-adoption process of IT as a “hype cycle,” in which a peak of inflated expectations is soon followed by a trough of disillusionment and an eventual plateau of productivity, at which point a technology becomes suitably mature. 5G, the fifth generation of mobile wireless technology, is somewhere on that hype cycle. There has been much conversation about 5G, and it will produce some novel capabilities, but a lingering question remains: how much energy will it consume vis-à-vis prior wireless mobile networks?
Before discussing 5G’s energy consumption issues (along with some interesting energy features), it’s useful to know what will make it a significant improvement on the current Long Term Evolution (LTE) systems that our smartphones and other cellular wireless devices use today. The main event is speed: 5G will be faster, perhaps as much as twenty times as fast as current LTE networks.
It will also have very low latency, meaning the delay between sending and receiving a 5G signal will be effectively imperceptible. How fast? Researchers at Deutsche Telekom have reported latency figures of 3 milliseconds (ms). Consider that visual stimuli take about 10 ms to travel from the eye to the brain, while LTE latency runs about 50 ms.
This low latency makes applications that require near-instantaneous communication far more feasible; think of self-driving cars, sharing the road under distributed computer control, that react faster than a human brain could. Such systems could allow traffic flows to be fully automated. No more traffic lights!
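To see why those latency numbers matter for something like automated driving, here is a back-of-envelope sketch using the figures cited above; the 108 km/h highway speed is an illustrative assumption, not a figure from the text.

```python
# Back-of-envelope: how far a vehicle travels while waiting one
# network round trip, using the latency figures cited above.
# The 108 km/h highway speed is an illustrative assumption.
speed_mps = 108 / 3.6  # 108 km/h in meters per second

for label, latency_s in [("LTE (~50 ms)", 0.050), ("5G (~3 ms)", 0.003)]:
    print(f"{label}: {speed_mps * latency_s:.2f} m traveled before a reply")
```

At highway speed, a car moves about a meter and a half during one LTE round trip, but less than a tenth of a meter during a 5G round trip.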
So what makes 5G different?
The transformative nature of 5G will largely be achieved in how radio frequency is allocated and employed. Current U.S. mobile devices “talk” to the network at frequencies from 700 megahertz (MHz) to 6 gigahertz (GHz). This service will continue, because base stations (i.e., cell phone towers) operating in these bands can transmit data by radio over significant distances. What’s new lies in the millimeter wave bands, 24 to 86 GHz. This slice of radio spectrum can carry large amounts of data, but not nearly as far.
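Why millimeter waves don’t travel as far follows from basic propagation physics. As a rough illustration, the standard free-space path loss formula shows how much weaker a 28 GHz signal is than a 700 MHz one over the same distance; the 1 km distance here is an arbitrary choice for the comparison.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB (ideal: no walls, leaves, or rain)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

d = 1000.0  # 1 km, an illustrative distance
low, mmwave = 700e6, 28e9
print(f"700 MHz at 1 km: {fspl_db(d, low):.1f} dB")
print(f"28 GHz at 1 km:  {fspl_db(d, mmwave):.1f} dB")
print(f"extra loss at 28 GHz: {fspl_db(d, mmwave) - fspl_db(d, low):.1f} dB")
```

Even before walls, leaves, and rain enter the picture, moving from 700 MHz to 28 GHz costs roughly 32 dB over the same path, which is why coverage areas shrink so dramatically.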
That’s where energy usage comes in. A lot more equipment will need to be installed, potentially much more data will need to be processed, and energy efficiency will need to be a core design principle. First, however, it is important to note that millimeter wave communications are prone to interference. Radio above 20 GHz doesn’t go through walls well. It doesn’t go through leaves well. It doesn’t play nicely with rain. What does this mean? Many, many more antennas. Suddenly 5G starts sounding like WiFi, or maybe some evolution of WiMAX.
In other words, interference means different infrastructure, and LTE and 5G differ in several ways that matter for energy usage. First, because of the new millimeter-wave slices of spectrum, existing cellular networks will likely be densified through the massive addition of small cells, with a provision for peer-to-peer (P2P) communication. Second, 5G will allow simultaneous transmission and reception, which will likely require new investment in fiber optics to move the data. Finally, some wireless functions will move to cloud processing, and much more of the infrastructure will be virtual in nature.
So what’s the energy angle?
Computers use electricity. But how much? This is a question my colleague Krishna Palem and I worked on answering about a decade ago. The problem then was that computer microprocessors had developed a heat problem due to the high frequencies of electric current involved (upwards of 2 GHz). Per-device power consumption was increasing, which led us to wonder whether computing’s energy utilization was about to rise rapidly and gobble up much more of the world’s electricity.
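The reasoning behind that heat problem is the textbook relation for dynamic power in CMOS chips, which scales linearly with switching frequency. The capacitance and voltage values in this sketch are illustrative, not measurements from our study.

```python
# Textbook dynamic CMOS power: capacitance x voltage^2 x frequency.
# Values below are illustrative placeholders, not measured figures.
def dynamic_power_w(c_farads: float, volts: float, freq_hz: float) -> float:
    return c_farads * volts**2 * freq_hz

p1 = dynamic_power_w(1e-9, 1.2, 1e9)  # hypothetical 1 GHz part
p2 = dynamic_power_w(1e-9, 1.2, 2e9)  # same part clocked at 2 GHz
print(p2 / p1)  # -> 2.0: doubling the clock alone doubles dynamic power
```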
At the time we did the work, we assumed that about 3 percent of global energy use was in the IT sector, but some things were hard to measure – like energy usage in cell phone networks. We developed a term for pushing innovation in energy efficiency – a sustainability innovation quotient (SIQ). No, it didn’t take off like wildfire, but efficiency innovation is now widely considered when building new computing hardware. We moved on.
What about power consumption in networks?
Someone else picked up the ball on calculating energy usage for IT networks. Now, this is not a piece about Huawei, but it turns out that the person doing academic research in the same vein of energy analysis is Dr. Anders Andrae, a Sweden-based academic employed by Huawei. Because Huawei ships a large amount of networking hardware, it is able to produce solid estimates of electricity demand.
Measuring networking power consumption requires determining how much energy wired and wireless networks consume, which means accounting for very large numbers of devices and their power draw. According to Huawei’s Andrae, fixed access networks consumed about 167 terawatt-hours (TWh) of electricity in 2015, while wireless networks consumed roughly 50 TWh. Those are big numbers; 1 TWh is a trillion watt-hours. For perspective, the average American household consumes about 7,200 kWh of electricity per year, but remember that the networking numbers are global figures.
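To put those terawatt-hour figures in household terms, here is the quick conversion, using the 7,200 kWh household average cited above:

```python
# Putting Andrae's network figures in household terms,
# using the 7,200 kWh/year US household average cited above.
FIXED_TWH, WIRELESS_TWH = 167, 50
HOUSEHOLD_KWH_PER_YEAR = 7200

total_kwh = (FIXED_TWH + WIRELESS_TWH) * 1e9  # 1 TWh = 1e9 kWh
households = total_kwh / HOUSEHOLD_KWH_PER_YEAR
print(f"{households / 1e6:.0f} million US-household-years of electricity")
```

Roughly thirty million American households’ worth of annual electricity, just to move bits around the world’s networks.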
Because energy efficiency has become a priority, an efficiency measure, the number of bits transmitted per Joule of energy expended, has become a standard. Having an efficiency metric to work with is useful, especially as electricity costs represent about 70 percent of the bill for providing mobile phone and data service.
However, a common concern is that if 5G offers much greater speed, say twenty times as much, a similar rise in energy consumption could follow: “A general concern is that higher data rates can only be achieved by consuming more energy; if the EE [energy efficiency] is constant, then 100× higher data rate in 5G is associated with a 100× higher energy consumption.” This is where headlines like “Tsunami of data could consume one-fifth of global electricity by 2025” come from.
The data from R&D on this topic are not nearly as discouraging. A typical cellular site today delivering 28 Mbit/s consumes about 1.35 kW, for an EE of roughly 20 kbit/Joule. Recent papers report EE numbers on the order of 10 Mbit/Joule for 5G systems. So it is clearly understood that allowing unabated increases in power consumption is untenable, and the aim for industry is to push energy utilization down, significantly.
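A quick check of that arithmetic, plus what the quoted concern looks like once the reported 5G efficiency numbers are plugged in; the second figure is illustrative arithmetic only, not a real deployment number:

```python
# Sanity-checking the figures above: EE is data rate divided by power.
rate_bps = 28e6   # 28 Mbit/s per site today
power_w = 1350    # 1.35 kW per site today
print(f"today's EE: {rate_bps / power_w / 1e3:.1f} kbit/Joule")  # ~20.7

# Illustrative only: at the ~10 Mbit/Joule reported for 5G systems,
# even twenty times today's data rate implies far less power per site.
print(f"20x rate at 10 Mbit/J: {20 * rate_bps / 10e6:.0f} W")
```

The point is not the exact wattage, but the direction: if EE improves by orders of magnitude, data rates and energy consumption decouple.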
To the Future!
In addition to transmitting data, 5G networks can also move energy. One of the novel technologies being considered for 5G is radio frequency (RF) harvesting: converting the energy in transmitted radio waves into power for user devices or even wireless infrastructure (microcells, antenna arrays, etc.). Since RF signals can carry both energy and information, RF energy harvesting and information reception can, in theory, be performed on the same RF input signal. This scheme is referred to as simultaneous wireless information and power transfer (SWIPT).
The hardware to support this doesn’t exist yet, but it has promise. However, since the received power needed to operate an energy-harvesting component is much higher than that needed by an information-decoding component, the energy harvesting zone around a transmitter is smaller than the information transmission zone.
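One commonly studied SWIPT receiver design is power splitting, in which a fraction of the received signal power feeds the energy harvester and the remainder feeds the information decoder. The sketch below illustrates the idea; every parameter value in it is hypothetical.

```python
# A minimal sketch of a power-splitting SWIPT receiver: a fraction rho
# of the received signal power feeds an energy harvester, the rest the
# information decoder. All parameter values here are hypothetical.
import math

def swipt_split(p_recv_w: float, rho: float, eta: float = 0.5,
                noise_w: float = 1e-12) -> tuple[float, float]:
    """Return (harvested watts, decoder SNR in dB)."""
    harvested = eta * rho * p_recv_w          # eta: conversion efficiency
    snr = (1 - rho) * p_recv_w / noise_w      # remaining power for decoding
    return harvested, 10 * math.log10(snr)

h, snr_db = swipt_split(p_recv_w=1e-6, rho=0.6)  # 1 microwatt received
print(f"harvested: {h * 1e9:.0f} nW, decoder SNR: {snr_db:.0f} dB")
```

The trade-off is visible in the split itself: every bit of power diverted to harvesting is power unavailable for decoding, which is why the harvesting zone is the smaller of the two.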
The Data Center Blues
Unfortunately, another energy problem is afoot. Although efficiency is now one of the elements incorporated into designing the next generation of mobile telecommunications infrastructure, the vast proliferation of devices, including those labeled the Internet of Things (IoT), will add up to additional energy consumption. Our biggest area of concern, however, is data centers. Radoslav Danilak asserts that data centers will consume exponentially larger amounts of electricity, arguing that “consumption will double every four years.”
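Compounding shows how aggressive Danilak’s claim is. The 200 TWh baseline in this sketch is a hypothetical round number, not a figure from his analysis:

```python
# What "consumption doubles every four years" implies, compounded.
# The 200 TWh baseline is a hypothetical round number, not a source figure.
baseline_twh = 200
for years in (4, 8, 12):
    print(f"after {years} years: {baseline_twh * 2 ** (years / 4):.0f} TWh")
# Doubling every 4 years is ~19% growth per year: 2 ** (1/4) ~= 1.189
```

On that trajectory, whatever today’s baseline is, consumption is eight times larger within twelve years.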
While powering data centers with renewable sources is an aspirational goal of the IT industry, increasing energy efficiency is of equal importance. Yale’s Environment 360 program noted, “Insanely, most of the world’s largest centers are in hot or temperate climates, where vast amounts of energy are used to keep them from overheating.” Placement matters in keeping cooling costs down, but designing energy-efficient processors and other components for servers is also important. Global data processing does not appear anywhere near a plateau, and 5G will add to the global energy bill of both telecommunications firms and those that conduct computing in the cloud.
So what’s the bottom line?
A lot of hyperbole surrounds 5G. The energy consumption issue is being addressed by all of the major equipment manufacturers; carriers can’t afford massive new power costs and will not deploy technology they can’t afford to operate. Large, complete 5G networks will not be deployed overnight, and what constitutes 5G isn’t fully sorted out, but out-of-control growth in energy consumption is not in the cards. How energy is harnessed and transmitted across these networks is itself a promising area for innovation. Our assumptions can and will change.