Machine Learning Could Save 5G

Unlike previous wireless networks – 3G, 4G, and LTE – which all launched successfully and did their part in creating faster, more reliable cellular connections, the 5G networks that wireless companies are trying their hardest to roll out are having a bit more trouble. The radio access networks that 5G connections run on rely on two emerging technologies that haven't had all the kinks worked out yet.

One of these technologies is millimeter waves: spectrum that broadcasts at frequencies between 30 and 300 gigahertz, compared with the bands below 6 GHz used for mobile devices in the past. The name comes from the waves' length, about 1 to 10 millimeters, which is far smaller than the radio waves that currently serve smartphones and measure tens of centimeters.


The other new tech that 5G networks need to work to their potential is known as massive MIMO. MIMO, which stands for multiple-input multiple-output, is a system that uses multiple transmitters and receivers to relay more data back and forth than its predecessors. Massive MIMO takes this one step further by adding dozens more of these transmitters and receivers to the process. While MIMO can already be found in many 4G networks, massive MIMO has only been through a few trials at this point. Before it can be used in 5G networks, it must work far more reliably and continuously than it currently does.

Without these two major additions to improve data connectivity and transmission, 5G networks cannot survive – let alone be successfully rolled out to millions of cell phone users.

Machine learning can save the day

At the recent 5G summit at the Computex trade show in Taipei, Taiwan, Nokia’s CTO of Advanced Technologies Rajeev Agrawal announced the company is looking into ways to implement artificial intelligence to boost 5G connectivity and bypass many of the major problems plaguing the rollout process.

Agrawal, who is head of Nokia’s radio access network offerings, presented three possibilities for defeating the problems facing 5G connectivity – all of which include machine learning.

Enhancing MIMO networks

In a massive MIMO network, data is transferred using many more antennas than past networks. While this enables data to travel faster, it also causes these signals to interfere with one another. A process known as beamforming, which sends targeted beams of data to users and causes less interference between signals, could alleviate the issue. At least that’s what it looks like in theory.
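To make beamforming concrete, here is a minimal sketch of how a uniform linear antenna array steers a beam by applying phase weights to each antenna. The array geometry, angles, and function names are illustrative assumptions, not Nokia's design; the point is only that correctly phased signals add up in the target direction and interfere less elsewhere.

```python
import cmath
import math

def steering_weights(n_antennas, target_angle_deg, spacing=0.5):
    """Phase weights that point a uniform linear array's beam
    at target_angle_deg (antenna spacing in wavelengths)."""
    theta = math.radians(target_angle_deg)
    return [cmath.exp(-2j * math.pi * spacing * k * math.sin(theta))
            for k in range(n_antennas)]

def array_gain(weights, angle_deg, spacing=0.5):
    """Normalized power gain of the weighted array toward angle_deg."""
    theta = math.radians(angle_deg)
    response = sum(w * cmath.exp(2j * math.pi * spacing * k * math.sin(theta))
                   for k, w in enumerate(weights))
    return abs(response) ** 2 / len(weights) ** 2

w = steering_weights(8, 30)          # steer an 8-antenna array toward 30 degrees
print(round(array_gain(w, 30), 3))   # → 1.0: the phases align at the steered angle
print(array_gain(w, 30) > array_gain(w, -20))  # → True: off-target angles get less energy
```

The same idea, scaled up to 128 antennas and dozens of simultaneous beams, is what makes massive MIMO beamforming both powerful and hard to schedule.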


Nokia has a system with 128 antennas all working together to form 32 beams and wants to schedule up to four beams in a specified amount of time. The company also wants to schedule those beams in a sequence that will provide the highest spectral efficiency, which is a measure of how many bits per second a base station can send to a set of users.

The number of possible ways to schedule four of 32 beams adds up to more than 30,000 options, and a base station doesn't have enough processing power to evaluate every one of them in the short time available.
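The "more than 30,000 options" is simply the number of ways to choose 4 beams out of 32. A one-liner confirms the count:

```python
import math

# Choosing which 4 of the 32 beams to activate in one scheduling
# interval is a combinations problem: C(32, 4).
n_schedules = math.comb(32, 4)
print(n_schedules)  # → 35960, the "more than 30,000 options" in the text
```

Exhaustively scoring all 35,960 candidate schedules every scheduling interval is what the base station cannot afford, which is why Nokia reaches for a learned shortcut.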

Nokia says it was able to train neural networks offline to find the best schedule, and then quickly predict good schedules on demand. The company didn't disclose data to back up that performance or allow comparisons with other possible heuristics, though.
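The train-offline, predict-online pattern can be sketched in miniature. Everything below is a toy stand-in under stated assumptions: the problem is scaled down to choosing 2 of 8 beams, the "spectral efficiency" score and interference penalty are invented, and a nearest-neighbor lookup plays the role of the neural network's fast forward pass. None of this reflects Nokia's actual model.

```python
import itertools
import random

random.seed(0)
N_BEAMS, PICK = 8, 2   # scaled down from 32 beams / 4 picks

def schedule_score(channel, schedule):
    """Toy spectral-efficiency proxy: per-beam gain minus a penalty
    for scheduling adjacent (mutually interfering) beams together."""
    gain = sum(channel[b] for b in schedule)
    penalty = sum(1.0 for a, b in itertools.combinations(schedule, 2)
                  if abs(a - b) == 1)
    return gain - penalty

def best_schedule(channel):
    """Offline oracle: exhaustive search over all C(8, 2) schedules."""
    return max(itertools.combinations(range(N_BEAMS), PICK),
               key=lambda s: schedule_score(channel, s))

# "Train" offline: precompute best schedules for sampled channel states.
training = [[random.random() for _ in range(N_BEAMS)] for _ in range(200)]
lookup = [(ch, best_schedule(ch)) for ch in training]

def predict_schedule(channel):
    """Online step: reuse the stored answer of the most similar training
    channel -- the cheap stand-in for a trained network's prediction."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    _, sched = min(lookup, key=lambda pair: dist(pair[0], channel))
    return sched

fresh = [random.random() for _ in range(N_BEAMS)]
print(predict_schedule(fresh), best_schedule(fresh))  # fast guess vs. exact answer
```

The expensive exhaustive search happens only during the offline phase; at serving time the predictor returns a schedule in a fraction of the work, at the cost of sometimes missing the true optimum.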

Closer transmissions of data

Another way to help 5G networks transmit data more efficiently is to add smaller transmitters along the way, so that data travels in shorter hops rather than over one long link. This lets networks deliver signals closer to a user's physical location, and it can also help carriers solve another problem: finding the location of indoor objects, such as sensors or smart speakers in a home. GPS signals can typically identify an object's indoor location no more accurately than within about 50 meters.

According to Agrawal, a small cell network’s radiofrequency data can be used to train a machine learning algorithm to infer the positions of network users’ equipment. A slide from his presentation claimed mean positioning errors of 10 centimeters (cm), 13 cm, and 9 cm using LTE eNB radiofrequency data from cells on different floors of a mall in China.


First, Nokia measures received signal strength throughout a small space to map how signals weaken along the way. It then uses these maps to train neural networks to predict the location of a device based on the strength of the signals it receives from nearby cells.
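This is the classic radio-fingerprinting recipe, and a small sketch shows the two steps. The cell positions, the log-distance path-loss model, and the 1-meter survey grid are all illustrative assumptions, and a nearest-fingerprint lookup stands in for the neural network Nokia actually trains:

```python
import math

# Hypothetical small-cell positions (meters) inside a venue.
CELLS = [(0.0, 0.0), (20.0, 0.0), (0.0, 20.0), (20.0, 20.0)]

def rssi_vector(pos):
    """Toy path-loss model: received signal strength (dBm) falls off
    with log-distance to each cell; a stand-in for real measurements."""
    return [-30.0 - 20.0 * math.log10(max(math.dist(pos, c), 1.0))
            for c in CELLS]

# Step 1: survey the space -- build a fingerprint map on a 1 m grid.
fingerprints = [((x, y), rssi_vector((x, y)))
                for x in range(21) for y in range(21)]

def locate(measured):
    """Step 2: predict position from the closest stored fingerprint
    (nearest neighbor here; Nokia trains a neural network instead)."""
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    pos, _ = min(fingerprints, key=lambda fp: dist(fp[1], measured))
    return pos

print(locate(rssi_vector((7.3, 12.6))))  # recovers a grid point near the true spot
```

In this noise-free toy the estimate lands within a grid cell of the true position; real deployments must cope with fading and measurement noise, which is where a learned model earns its keep.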

Agrawal said a machine learning system would first predict user equipment characteristics, such as mobility. Then, the system would predict what the uplink/downlink throughputs would be under different settings, and pick the best setting.
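The final selection step is simple once the model has produced its per-setting predictions. A minimal sketch, where the setting names and throughput numbers are hypothetical placeholders for a trained model's output:

```python
def pick_best_setting(predicted_throughput):
    """Choose the setting whose predicted downlink throughput (Mbps)
    is highest; predicted_throughput maps setting name -> prediction."""
    return max(predicted_throughput, key=predicted_throughput.get)

# Illustrative predictions a trained model might emit for one user.
predictions = {"setting_a": 42.0, "setting_b": 55.5, "setting_c": 48.1}
print(pick_best_setting(predictions))  # → setting_b
```

The hard part, of course, is the prediction model itself; the argmax over candidate settings is the cheap final step.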

Agrawal said he’s “not trying to say all of these [applications I presented] are right,” but to him, machine learning will be a key part of 5G networks.