Consequently, one of the biggest challenges when it comes to designing the next generation of wireless systems and networks is simply managing their complexity.
For those designing next-generation wireless networks, traditional predefined designs are both inadequate and inflexible.
According to Houman Zarrinkoub, Principal Product Manager for Wireless Communications at MathWorks, predefined designs struggle to cope with system complexity and cannot adapt to rapidly changing requirements and environments.
“What we are seeing today is the emergence of AI-native technologies that are going to be able to address the issue of complexity,” Zarrinkoub explained. “They are a hot topic today and everyone is talking about them. AI techniques have traditionally been used to optimise existing components or systems as part of modern engineering techniques. Now we are seeing AI systems being used to replace legacy components and that’s a very different approach.
“It is true to say that the wireless industry has been behind many others when it comes to embracing AI, but we’ve recently seen standards bodies, such as 3GPP, introducing AI into the foundational aspects of 5G Advanced on the optimisation side.
“It has been very vocal about AI’s significance and its role in the forthcoming 5G Advanced and now 6G standards, and it talks about AI’s functionality for enhanced positioning, beam management and channel state information (CSI) feedback.”
Elsewhere, the Wi-Fi Alliance has established a study group on machine learning, while the ITU has incorporated AI into its ‘pillars’ for the development of 6G.
“Although there was some reticence at the beginning, AI is now being embraced by all the standardisation organisations,” said Zarrinkoub.
Consequently, engineers will have to integrate AI-native concepts when deploying next-generation wireless systems. So, what are the benefits?
“AI-native wireless systems inherently incorporate AI algorithms into their operational framework and, for wireless engineers, the benefits are going to be better coverage, higher capacity and greater robustness,” explained Zarrinkoub. He continued, “They are designed to learn from and adapt to their environment.”
By contrast, traditional designs are based on more rigid, predefined models that have scalability limitations and often require costly, time-consuming signal processing resources.
“AI-native systems can manage complexity in a more efficient way, and that’s critical. The traditional rule-based system, in terms of design, is all about human intelligence and studying the environment and tends to be quite rigid.
“By contrast, AI-native design can evolve – it’s not static, and it responds to changes in the real world. It can develop and learn from changes that are made in the design process. It’s a dynamic system. But while this is all very exciting, it’s scary for some because by using AI you lose the explainability of a system. For example, when something goes wrong it’s much harder to understand why and to locate the issue. An AI-native system is constantly changing and is opaque. So, identifying the problem will be much harder than with the traditional design process,” Zarrinkoub suggested.
“We’ll need to build confidence going forward and create stronger boundaries between sub-systems, so it will be easier to identify and then explain a failure. But once engineers are more confident using AI-native design, those barriers between sub-systems can be eliminated.”
No longer optional, but essential
According to Zarrinkoub, because the upcoming rollout of 5G Advanced and 6G standards will require AI-native technologies, the engineers designing these modern wireless systems will have to understand that integrating AI is no longer optional but essential.
These systems are complex and will require the gathering of data, the training and testing of models, and the implementation and integration of those models into the wireless system.
Engineers will need large, real-world measured datasets to develop AI-native wireless systems, and that data can only be collected by acquiring over-the-air (OTA) signals or by synthesising it from a digital twin.
“AI-native systems are at the mercy of data, but it isn’t practical to have all these different situations being captured, so you need digital twins and synthetic data to help create accurate signals that can then be used in training and to represent the real world.
“Synthetic data can be especially useful,” according to Zarrinkoub, “as it facilitates scalability testing, fault tolerance and anomaly detection while also aiding in environment modelling and system configuration optimisation. Most engineers will need to use digital twins to augment data to train AI-native systems as it will ensure that the system has sufficient data to handle adverse situations and efficiently manage system elements.”
Maximising model efficiency requires that training data is representative of the real-world scenarios the system will face. Engineers can use the collected data to perform training and validation of AI models, testing and simulation, as well as optimisation and performance tuning.
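As a rough illustration of how such synthetic data might be produced, the sketch below draws channel state information (CSI) samples from a simple statistical fading model with added estimation noise. It is a minimal stand-in for a digital twin, not a MathWorks workflow; the array sizes, SNR range and file name are assumptions made purely for illustration.

```python
# Sketch: synthesising CSI samples from a simple statistical channel model,
# as an illustrative stand-in for digital-twin data.
# All dimensions and the SNR range are assumptions, not a reference setup.
import numpy as np

rng = np.random.default_rng(seed=0)

n_samples = 10_000          # number of training examples
n_rx_antennas = 4           # receive antennas
n_subcarriers = 64          # OFDM subcarriers per example

# Rayleigh-fading channel coefficients: complex Gaussian with unit average power.
shape = (n_samples, n_rx_antennas, n_subcarriers)
h = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

# Add estimation noise at a random SNR per sample to mimic imperfect
# channel estimates captured in the field.
snr_db = rng.uniform(0, 20, size=(n_samples, 1, 1))
noise_power = 10 ** (-snr_db / 10)
noise = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) * np.sqrt(noise_power / 2)
h_noisy = h + noise

# Store real/imaginary parts as separate channels, as most ML frameworks expect.
dataset = np.stack([h_noisy.real, h_noisy.imag], axis=1).astype(np.float32)
np.save("synthetic_csi.npy", dataset)
print(dataset.shape)        # (10000, 2, 4, 64)
```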
With the data gathered, the next step involves simulation and modelling. When training an AI model for a wireless system, it will be vital to determine parameters such as bandwidth allocation, latency, signal strength, modulation and coding.
“By using these parameters and comprehensive datasets, the engineer will be able to select and optimise machine learning algorithms for key system functions like autoencoders, channel estimation, channel feedback optimisation and resource allocation,” explained Zarrinkoub.
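One of the functions Zarrinkoub mentions, compressing CSI feedback with an autoencoder, can be sketched in a few lines. The sketch below assumes a PyTorch implementation; the class name, layer sizes and latent dimension are illustrative choices rather than a reference design.

```python
# Sketch: an autoencoder for compressing CSI feedback. In practice the encoder
# would run on the device and the decoder at the base station; sizes are illustrative.
import torch
import torch.nn as nn

class CSIAutoencoder(nn.Module):
    def __init__(self, n_inputs=2 * 4 * 64, latent_dim=32):
        super().__init__()
        # Encoder: flattens the CSI tensor and compresses it to a short codeword.
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_inputs, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),
        )
        # Decoder: reconstructs the full CSI tensor from the codeword.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, n_inputs),
        )
        self.output_shape = (2, 4, 64)

    def forward(self, x):
        code = self.encoder(x)          # compressed feedback report
        recon = self.decoder(code)
        return recon.view(x.shape[0], *self.output_shape)

model = CSIAutoencoder()
n_params = sum(p.numel() for p in model.parameters())
print(f"parameters: {n_params}, feedback compression: 512 values -> 32")
```

Here the 32-value codeword stands in for the compressed feedback report; the ratio between input size and latent dimension is the design lever that trades feedback overhead against reconstruction accuracy.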
During the training process, engineers will need to consider factors that affect real-time performance, including computational complexity, memory usage, and parallel processing on GPUs or clusters. After an AI model is trained, it is tested to ensure reliable performance in real-world systems.
At this stage, the model’s performance is iteratively adapted to correct for biases, errors and inefficiencies.
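Continuing the two sketches above, a minimal training and validation loop for the hypothetical CSIAutoencoder might look like the following, with reconstruction quality tracked as normalised mean-squared error (NMSE) on a held-out split. The epoch count, batch size and learning rate are again arbitrary illustrations.

```python
# Sketch: training the illustrative CSIAutoencoder on the synthetic dataset
# and checking reconstruction quality (NMSE) on a held-out split.
import numpy as np
import torch
import torch.nn as nn

data = torch.from_numpy(np.load("synthetic_csi.npy"))   # (10000, 2, 4, 64), float32
train, val = data[:8000], data[8000:]                    # simple train/validation split

model = CSIAutoencoder()          # the illustrative class from the earlier sketch
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(10):
    model.train()
    for i in range(0, len(train), 256):                  # plain mini-batching
        batch = train[i:i + 256]
        loss = loss_fn(model(batch), batch)
        optimiser.zero_grad()
        loss.backward()
        optimiser.step()

    # Validation: normalised mean-squared error in dB, a common CSI-recovery metric.
    model.eval()
    with torch.no_grad():
        recon = model(val)
        nmse = ((recon - val).pow(2).sum() / val.pow(2).sum()).item()
        print(f"epoch {epoch}: NMSE = {10 * np.log10(nmse):.2f} dB")
```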
An AI model is only useful when it is implemented as part of a real-world system. Scaling and resource assessment are critical, and this involves evaluating the processing power, memory requirements and data throughput needed for the AI models to operate efficiently.
Then developers will need to use automatic code generation for deploying pretrained AI models on desktop or embedded targets using low-level code. This automates the implementation process and reduces manual coding errors.
The final implementation step is the validation process that compares the performance of the implemented system to that of the original AI model, which gives engineers the opportunity to address any discrepancies or performance issues.
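Assuming the same hypothetical PyTorch workflow, these deployment and validation steps can be illustrated by exporting a model to ONNX (one possible stand-in for automatic code generation) and checking that the exported version reproduces the original model’s outputs within a tolerance. A small untrained network is used so the sketch runs on its own; the file names, shapes and tolerance are arbitrary.

```python
# Sketch: exporting a model for deployment and validating the exported artefact
# against the original, as an analogue of the code-generation and validation steps.
import numpy as np
import torch
import torch.nn as nn
import onnxruntime as ort

# A tiny untrained network stands in for the trained model so the sketch runs alone.
model = nn.Sequential(nn.Flatten(), nn.Linear(512, 32), nn.ReLU(), nn.Linear(32, 512))
model.eval()

# "Code generation" analogue: export the network to ONNX so it can be compiled
# or optimised for a desktop or embedded target.
example_input = torch.randn(1, 2, 4, 64)
torch.onnx.export(model, example_input, "csi_model.onnx",
                  input_names=["csi"], output_names=["recon"],
                  dynamic_axes={"csi": {0: "batch"}, "recon": {0: "batch"}})

# Validation: run the deployed artefact and compare it with the original model.
session = ort.InferenceSession("csi_model.onnx", providers=["CPUExecutionProvider"])
test_input = torch.randn(16, 2, 4, 64)

with torch.no_grad():
    reference = model(test_input).numpy()
deployed = session.run(None, {"csi": test_input.numpy()})[0]

max_error = np.max(np.abs(reference - deployed))
print(f"max deviation between original and deployed model: {max_error:.2e}")
assert max_error < 1e-4, "deployed model drifts from the trained one"
```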
The last phase is the integration of the implemented AI models within the overall wireless system, which is necessary to ensure that the AI solution works with the rest of the legacy system. Engineers must ensure interoperability with existing system components by analysing end-to-end system performance rather than individual algorithms and subsystems.
“Integrating AI into wireless systems presents a variety of hurdles, but engineers can use modelling and simulation to explore various scenarios and configurations,” Zarrinkoub explained.
“The big issue when it comes to AI is its explainability and accuracy at the training stage – was everything done that was required? The engineering aspect will not be the main issue.”
Transitioning from legacy wireless systems to enhanced systems without disruption will be challenging, but using AI will be the key to delivering this transition.