AI and machine learning in communication networks workshop



The ITU vision document for IMT-2030 (6G) calls out ubiquitous intelligence as one of the overarching aspects commonly available to all usage scenarios of IMT-2030 (6G). AI-related capabilities such as distributed data processing, distributed learning, AI computing, AI model execution, and AI model inference are to be supported throughout IMT-2030 (6G). This session analyses applications in IMT-2030 through the critical lens of the requirements they impose on AI-related capabilities in the network. How does artificial intelligence enable new usage scenarios in IMT-2030 (6G) that previous generations of IMT were not designed to support? How can generative AI bring virtually generated or remote experiences closer to the user while at the same time bridging digital divides? Can AI help achieve the twin goals of replicating the real world in a digital world and providing virtual experiences to humans while satisfying sustainability goals? The session chair will kick off the discussion, and the experts will present their views on the topic, including important challenges such as the expectations on AI to enable new-age applications, AI-empowered features in 6G, and the current state of AI/ML. Discussions will focus on some of the important questions to answer, such as the relationship between applications and services, AI, and networks, and AI-enabled management of applications, and will identify some of the major areas that need further study, including current standards and gaps.

AI/ML overlays may not be the preferred approach in IMT-2030 (6G). ML pipeline-based integration of intelligence is often module-specific, focusing on particular components or tasks, which may or may not interact harmoniously with the rest of the system, and retrofitting AI capabilities into an existing framework leads to potential inefficiencies. This points us towards a system where AI pervades numerous functions, features, and user interactions, making it an integral part of the user experience. Hence, AI-native design seems to be an important technical enabler for IMT-2030 (6G) usage scenarios. Applications and services impose expectations on the network, such as intelligent placement of workloads, management of energy vs. performance, and proactive resilience, while assuring privacy, trust, and transparency. To satisfy these expectations, networks transform from providers of AI-as-a-Service into AI-native platforms where application developers, integrators, and consumers can meet. Data is moved and processed, at the edge or in the core, and AI algorithms are pulled and prompted, efficiently and on demand, while applications and services remain agnostic to these complexities. Discussions in this session will focus on some of the important considerations in AI-native design and on network architectures that can host those designs. How would the creation, configuration, and management of network functions be done by the network itself? How can a network be the platform for the training and inference of models while enabling distributed edge nodes for inference? What does it take to host AI as a native service in the network?
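As an illustration of the kind of decision such a platform would automate, the sketch below weighs an application's latency requirement against the energy cost of running a workload at the edge versus the core. All function names, thresholds, and figures are hypothetical; a real AI-native platform would learn these trade-offs from live telemetry rather than hard-code them.

```python
# Toy sketch of AI-native workload placement: choose edge or core so the
# latency requirement is met at the lowest energy cost. All numbers are
# illustrative, not drawn from any real deployment.

def place_workload(latency_req_ms: float,
                   edge_latency_ms: float = 5.0,
                   core_latency_ms: float = 30.0,
                   edge_energy_j: float = 3.0,
                   core_energy_j: float = 1.0) -> str:
    """Return 'edge' or 'core' for a single inference workload."""
    candidates = []
    if edge_latency_ms <= latency_req_ms:
        candidates.append(("edge", edge_energy_j))
    if core_latency_ms <= latency_req_ms:
        candidates.append(("core", core_energy_j))
    if not candidates:
        raise ValueError("latency requirement cannot be met anywhere")
    # Among the feasible sites, pick the one with the lowest energy cost.
    return min(candidates, key=lambda c: c[1])[0]

print(place_workload(10.0))   # only the edge meets a 10 ms budget -> "edge"
print(place_workload(50.0))   # both sites feasible; core wins on energy -> "core"
```

In a real network the same decision would fold in many more dimensions (privacy constraints, link load, model availability), which is precisely why the session frames it as a platform concern rather than an application one.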

On the sidelines of day 1, demos will be scheduled that showcase specific open-source or other implementations, bringing out the applications, platforms, or architectures related to AI in xG.

Digital Twins for Communications: How to create and use them

Jakob Hoydis

A possible vision for 6G networks is that they can autonomously specialize to the radio environment in which they are deployed. I will discuss two key tools that are required to make this happen, namely differentiable ray tracing for the creation of digital twin networks, and machine learning. Differentiable ray tracing allows for gradient-based optimization of many scene parameters and enables data-driven calibration of ray tracing models to measurements. Such digital twins can then be used as “gyms” for the training of environment-specific communication schemes and applications. As examples, I will show how one can learn radio material parameters from channel measurements and present the architecture and performance of a recently developed 5G-compliant neural receiver which is not only compatible with different bandwidth allocations and numbers of layers but could possibly be implemented in real time.
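The talk's differentiable ray tracers are far richer than anything reproducible here, but the underlying idea of gradient-based calibration of a propagation model to measurements can be sketched on a one-parameter toy: fitting the exponent of a log-distance path-loss model. Everything below (powers, distances, learning rate) is a made-up illustration, not the talk's method.

```python
import math

# Toy analogue of differentiable calibration: fit the path-loss exponent n of
# a log-distance model  P_rx = P_tx - 10*n*log10(d)  to synthetic measurements
# by gradient descent. Real digital twins optimise many scene parameters this
# way through a differentiable ray tracer instead of a closed-form model.

P_TX = 30.0                      # transmit power in dBm (illustrative)
dists = [10.0, 50.0, 100.0, 200.0, 500.0]
true_n = 3.0                     # ground-truth exponent used to synthesise data
meas = [P_TX - 10.0 * true_n * math.log10(d) for d in dists]

n = 2.0                          # free-space initial guess
lr = 1e-4
for _ in range(300):
    # Analytic gradient of the squared error with respect to n.
    grad = sum(2.0 * ((P_TX - 10.0 * n * math.log10(d)) - p)
               * (-10.0 * math.log10(d))
               for d, p in zip(dists, meas))
    n -= lr * grad

print(round(n, 3))  # converges to the true exponent, 3.0
```

The same loop, with the hand-written gradient replaced by automatic differentiation through a ray tracer, is what turns a static scene model into a measurement-calibrated digital twin.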

 

AI-Based Approaches in Network Security

Lucian Petrica 

AI-based classifiers can proactively protect against new forms of previously unseen cyber-attacks. Augmenting and replacing existing rule-based approaches, they reduce the risk of “zero-day vulnerabilities”. However, these workloads are computationally intense, and network deployments dictate extreme requirements for high streaming throughput and low processing latency. In this talk, we illustrate how highly customizable FPGA-based inference engines support these novel, demanding use cases.

 

How close (far) are we from AI-native Wi-Fi? 

Francesc Wilhelmi 

Wi-Fi technology has increasingly gained complexity over the last decades to fulfill a broad set of requirements, now targeting ultra-high reliability within Wi-Fi 8. To address such complexity and to create new opportunities, AI and ML have gained a lot of attention in recent years, as they can address problems that are hard to solve by hand. In this talk, we will provide an overview of the status of AI/ML integration in Wi-Fi and depict the evolution path for the upcoming years towards AI nativeness, identifying the main challenges to be addressed and the potential steps to be taken by industry and standardization. 

 

Convergence of the Spheres: Autonomous Networks and Their Role in the (Telco) Multiverse

Paul Harvey  

Whether a smart pipe or an integrated platform for content, communication, and computation, our telecommunications networks are becoming more complex. The wide-scale deployment of virtualisation has unlocked the power of programmability, and with it intelligent control. 

The breadth of technologies available to realise this intelligent control is growing by the day, encompassing static rules, ML technologies, and now large language models (so-called 'generative AI'). This explosion of abilities is leading to new scalability and conflict challenges in managing not only network operation but also these intelligent technologies themselves.  

This talk explores the role that autonomous networks have to play in this sea of technologies and describes how the recently published ITU-T "Autonomous Networks-Architecture Framework" can serve as a harmonising design language to bring these different intelligences and use cases together, as well as what is missing for the future. This is complemented by some examples of research activities at the University of Glasgow related to how we are trying to realise this convergence. 

 

Towards AI native future networks with composable digital twins

Gyu Myoung Lee 

Unlike traditional network designs that simply add AI capabilities on top of existing networks, future networks will embed AI deep into the network architecture to drive innovation and efficiency, especially in beyond-5G and 6G technologies. The design of a digital twin, as a virtual representation of an object or system, will require AI techniques to reflect physical environments more accurately. AI-powered digital twins can be used to perform various simulations, analyze performance issues, and identify potential improvements for future networks and services with many different purposes. This talk will briefly introduce recent technical trends from a data- and user-centric perspective and then highlight the importance of composable digital twin approaches, along with the composable architecture concept, for scaling up AI-powered digital twins to support monitoring, simulation, and predictive maintenance for future networks as well as intelligent vertical services. In addition, this talk presents several challenges on the way to collaboratively building a virtual continuum supporting dynamic workflows with digital twin networks, towards AI-native future networks and services. The proposed approach will enable building large-scale federated digital twins with AI-native capabilities while supporting interoperability, composability, and orchestration. 

 

A benchmark dataset to assess large language models' telecommunications knowledge 

Antonio De Domenico 

In this talk, I will introduce TeleQnA, the first benchmark dataset tailored to assess the knowledge of Large Language Models (LLMs) in the telecommunications domain. Comprising 10,000 questions and answers sourced from diverse references, including standards and research articles, TeleQnA serves as a comprehensive evaluation tool for LLMs. 

I will present the automated question generation framework responsible for creating TeleQnA, detailing how human input was incorporated at various stages to ensure the quality of the questions. Using this dataset, we conducted an evaluation to assess the capabilities of LLMs, including GPT-3.5 and GPT-4. Our results reveal that while these models excel in addressing general telecom-related inquiries, they struggle with complex standards-related questions. 

Furthermore, we demonstrated how integrating telecom knowledge context significantly enhances the performance of LLMs, highlighting the need for specialized telecom foundation models. Our findings show that LLMs can rival the knowledge of active professionals in telecom, thanks to their ability to process vast amounts of information. 
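At its core, the evaluation loop behind such a benchmark reduces to multiple-choice scoring with a per-category breakdown, which is how a gap between general and standards-related questions would surface. The sketch below is purely illustrative: the record schema and the toy data are hypothetical and do not reflect TeleQnA's actual format.

```python
# Illustrative multiple-choice scoring loop for a telecom QA benchmark.
# The record schema below is hypothetical, not TeleQnA's actual format.

def score(questions, predictions):
    """Return (overall_accuracy, per_category_accuracy)."""
    per_cat = {}
    correct = 0
    for q, pred in zip(questions, predictions):
        ok = (pred == q["answer"])
        correct += ok
        hits, total = per_cat.get(q["category"], (0, 0))
        per_cat[q["category"]] = (hits + ok, total + 1)
    overall = correct / len(questions)
    return overall, {c: h / t for c, (h, t) in per_cat.items()}

questions = [
    {"category": "general",   "answer": "B"},
    {"category": "general",   "answer": "A"},
    {"category": "standards", "answer": "C"},
    {"category": "standards", "answer": "D"},
]
predictions = ["B", "A", "C", "A"]   # model nails general, misses a standards item

overall, by_cat = score(questions, predictions)
print(overall)              # 0.75
print(by_cat["standards"])  # 0.5
```

A category-level view like this is what lets an evaluation distinguish broad telecom fluency from the harder, standards-specific knowledge the talk highlights.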

 

Unleashing the Power of Cloud Platforms—Accelerating AI Adoption via Cloud-native Capabilities

Karim Rabie 

In today's fast-paced and data-driven world, organizations increasingly embrace artificial intelligence's transformative potential in driving innovation and growth. However, realizing the full benefits of AI requires robust and scalable computing infrastructure, which can be effectively achieved by utilizing cloud/AI platforms. This session aims to explore the untapped potential of cloud platforms in accelerating AI adoption by harnessing their native capabilities.

 

Intelligent telepresence robot prototyping: Challenges and Solutions 

Ammar Muthanna 

The advent of telepresence robots has revolutionized remote communication by offering users immersive experiences that transcend geographical limitations. However, the development of intelligent telepresence robots poses multifaceted challenges, spanning technical, social, and user-centric domains. This abstract delves into the intricacies of prototyping such robots, identifying key hurdles, and proposing innovative solutions. The journey of constructing an intelligent telepresence robot begins with navigating the complexities of hardware design. Challenges arise in integrating sensory systems, locomotion mechanisms, and power management while ensuring cost-effectiveness and scalability. Concurrently, software development presents its own set of obstacles, from designing intuitive user interfaces to implementing robust navigation algorithms capable of autonomous operation in diverse environments. 

Beyond technical considerations, social factors profoundly influence the efficacy of telepresence robots. Human-robot interaction (HRI) must be carefully orchestrated to foster natural communication and mitigate user discomfort. Cultural nuances, etiquette, and privacy concerns further complicate the design process, necessitating interdisciplinary collaboration between engineers, psychologists, and sociologists. Moreover, the deployment of telepresence robots introduces ethical dilemmas surrounding privacy, surveillance, and data security. Balancing the benefits of remote presence with the protection of personal information demands meticulous attention to privacy-by-design principles and regulatory compliance. 

In response to these challenges, innovative solutions are emerging to propel the field forward. Advances in artificial intelligence facilitate adaptive behavior, enabling telepresence robots to learn from user interactions and dynamically adjust their responses. Furthermore, modular design frameworks streamline prototyping, empowering developers to rapidly iterate and customize robots for specific applications. Additionally, emerging technologies such as augmented reality (AR) and 6G connectivity enhance the telepresence experience by providing high-fidelity audiovisual feedback and reducing latency. Collaborative teleoperation platforms leverage the collective intelligence of human operators to navigate complex environments and solve unforeseen challenges in real-time. 

Summing up, the prototyping of intelligent telepresence robots demands a holistic approach that transcends technical prowess to encompass human-centric design, ethical considerations, and interdisciplinary collaboration. By surmounting these challenges with innovative solutions, researchers and engineers can unlock the full potential of telepresence robots to reshape how we communicate, collaborate, and connect across distances.

 

AI powered integrated 6G Networks: R&D at Meganetlab 6G Lab 

Artem Volkov

The IMT-2030 framework adopted by the ITU in 2023 defines several vectors for the development of communication networks: networks with ultra-low latency; ultra-dense networks; the internet of skills (tactile internet); flying networks; unmanned vehicles; integrated network segments (STIN); etc. Based on these vectors, 6G networks will be ultra-dense with ultra-low delays, and their structure will continue the decentralization trend, including the computing infrastructure. At the same time, the following services are expected: holographic-type communication and network personalization (based on digital twins); flying networks and swarm intelligence. Thus, deeply integrated and distributed AI techniques are expected to address these challenges and achieve these goals. This presentation shows the architecture and methods for building a 6G network, along with our solutions and prototypes at the “Meganetlab 6G Lab”.

 

Role of Predictive and Generative AI in building Intelligent Telco Infrastructure  

Azhar Sayeed 

Generative AI has taken the world by storm recently, while predictive AI has been around for some time, evolving from the data analytics space. Telcos have experimented with predictive AI for various use cases, while generative AI has opened many new doors with respect to how AI can be leveraged and embedded in their operations and services. This presentation will take a deeper look at the role predictive and generative AI can play in building intelligent autonomous infrastructure for telcos and large enterprises. It will examine the autonomous-infrastructure paradigm through the lens of building, operations, and governance. 

 

Principles of Green AI - A Framework for Energy Optimization in Future Networks   

James Agajo 

The WINEST (Wireless Networking and Embedded System Technology) Research Team at the Federal University of Technology Minna, Nigeria, has collaborated with ITU ML5G initiatives to advance key elements for future networks. This collaboration includes exploring use-case scenarios concerning energy optimization in 5G/6G and the integration of AI in future networks. Our recent focus on Green AI has produced a practical framework for monitoring and optimizing energy usage in telecom networks. This presentation discusses the requirements for environmentally sustainable artificial intelligence, the corresponding architecture components, impacts on standards, proposed solutions, and performance metrics. We conclude by outlining our future research plans and inviting collaboration for pre-standard research in ITU. 

 

How AI and LLM are transforming Programming   

Wei Meng 

Artificial Intelligence (AI) continues to redefine the boundaries of technology, with significant impacts across various domains including programming. This presentation delves into the transformative role of AI in programming, with a particular focus on AI-assisted programming facilitated by open source large language models (LLMs) that have undergone enhanced training and fine-tuning. 

We begin by examining the foundation of AI-assisted programming tools that leverage the capabilities of LLMs. These tools not only assist developers by providing code completion suggestions but also enhance problem-solving strategies through context-aware documentation assistance and real-time error correction. We will explore how enhanced training and fine-tuning of these models have significantly improved their accuracy and utility in complex programming scenarios. 

The discussion will then pivot to specific case studies where open source LLMs have been integrated into development environments, demonstrating their impact on productivity and code quality. We will analyze the methodologies involved in training these models, including the selection of training data, the tuning of model parameters for specific programming languages or frameworks, and the ethical considerations in their deployment. 

Further, we will highlight the collaborative aspect of these AI tools in programming teams, facilitating more efficient peer reviews and knowledge sharing. The potential for these models to democratize programming skills, making advanced coding techniques accessible to a broader range of developers, will also be discussed. 

Finally, we will address the challenges and future prospects of AI-assisted programming. This includes navigating the issues of dependency on AI, maintaining security when integrating AI into coding environments, and the ongoing need to balance AI assistance with human oversight. 

Attendees will leave with a comprehensive understanding of how enhanced and fine-tuned open source large language models are shaping the future of programming. They will gain insights into leveraging these advancements to enhance their development practices, improve software quality, and prepare for future technological shifts.

 

Data-driven Modelling and Optimization of Green Future Mobile Networks 

Nicola Piovesan 

The introduction of fifth-generation (5G) radio technology has revolutionized communications, bringing unprecedented automation, capacity, connectivity, and ultra-fast, reliable communications. Despite these advancements, energy efficiency remains a critical challenge. While 5G networks are approximately four times more energy-efficient than 4G networks, their energy consumption is still up to three times higher, leading to increased carbon emissions and operational costs.  

This talk addresses this pressing issue by exploring the integration of big data and machine learning to enhance the energy efficiency of 5G networks. In particular, we explore the application of these technologies in analyzing energy consumption data from thousands of base stations, paving the way for the development of machine learning models that can precisely quantify energy use in the network—a pivotal step in boosting energy efficiency. 
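At its simplest, quantifying energy use from base-station telemetry is a regression problem. The sketch below fits a one-feature least-squares model on synthetic numbers; the talk's models operate on far richer data (thousands of sites, many features such as bandwidth, configuration, and hardware), so treat every figure here as illustrative.

```python
# Minimal sketch of a data-driven energy model: fit  energy = a + b * load
# by ordinary least squares on synthetic per-base-station samples.

def fit_energy_model(loads, energies):
    """Return (intercept a, slope b) of the least-squares line."""
    n = len(loads)
    mx = sum(loads) / n
    my = sum(energies) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(loads, energies))
         / sum((x - mx) ** 2 for x in loads))
    a = my - b * mx
    return a, b

# Synthetic data: ~2 kW idle baseline plus 10 W per percentage point of load.
loads    = [0.0, 20.0, 40.0, 60.0, 80.0]             # cell load in %
energies = [2000.0, 2200.0, 2400.0, 2600.0, 2800.0]  # watts

a, b = fit_energy_model(loads, energies)
print(a, b)          # recovers the synthetic baseline and slope: 2000.0 10.0
print(a + b * 50.0)  # predicted draw at 50% load: 2500.0
```

Even this caricature makes the key point of the talk concrete: a calibrated consumption model separates the fixed idle cost (the intercept) from the load-dependent cost (the slope), and it is the large idle term that energy-saving features must attack.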

 

Turn Off the 5G Lights without Service Going Blind 

Paul Patras 

The telecoms sector is responsible for ~2% of global carbon emissions, and radio access networks (RANs) account for ~80% of all mobile network energy consumption. Energy consumption is further responsible for 20–40% of network operating expenses, according to the GSMA. In this talk I will show how the power of artificial intelligence can be harnessed to tackle these sustainability and operational challenges. I will present a solution that forecasts user traffic demand with high accuracy and controls the number of RAN resources active at any point in time, thereby reducing the infrastructure's energy consumption by up to 60% without impacting service quality. Finally, I will explain how this solution can be packaged for the emerging Open RAN mobile networking paradigm, and I will demonstrate its performance in action, considering an urban deployment with over 200 cells. 
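The control half of such a solution can be caricatured in a few lines: given an hourly traffic forecast, keep only as many cells awake as demand requires and compare the energy bill against an always-on baseline. The capacities, power figures, and forecast below are all hypothetical, and this simple threshold rule stands in for the talk's far more sophisticated forecasting and control.

```python
import math

# Toy sketch of forecast-driven RAN sleeping: keep ceil(demand/capacity)
# cells awake each hour (at least one, for coverage) and compare the energy
# bill against an always-on baseline. All numbers are illustrative.

TOTAL_CELLS = 10
CELL_CAPACITY_GB = 50.0      # per-cell hourly capacity (hypothetical)
CELL_POWER_KW = 1.0          # per-cell draw when active (hypothetical)

def active_cells(demand_gb: float) -> int:
    return max(1, min(TOTAL_CELLS, math.ceil(demand_gb / CELL_CAPACITY_GB)))

# A 24-hour forecast with a quiet night and a busy evening (GB per hour).
forecast = [30, 20, 15, 15, 20, 40, 120, 260, 310, 280, 250, 240,
            260, 250, 240, 260, 300, 380, 430, 400, 320, 200, 110, 60]

used_kwh = sum(active_cells(d) * CELL_POWER_KW for d in forecast)
baseline_kwh = TOTAL_CELLS * CELL_POWER_KW * len(forecast)
saving = 1.0 - used_kwh / baseline_kwh
print(round(saving * 100), "% saved vs always-on")  # roughly half the bill here
```

The hard part, and the subject of the talk, is making the forecast accurate enough that sleeping cells never translates into degraded service.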

 

On Measurement and Modeling for Joint Communications and Sensing

Nada Golmie

As the need to sense the physical world to support detection, tracking, AR/VR, and a slew of other applications increases, the use of communications waveforms for sensing is becoming more attractive and is likely to emerge as one of the main features in NextG development. In this talk we discuss key challenges in measurement science and modeling approaches for advanced communications and sensing going forward. We will review the main building blocks for joint communications and sensing in terms of efficient spectrum use, higher frequency bands, and the use of machine learning. The millimeter-wave and terahertz bands hold the promise of significant bandwidth and speed due to large swaths of untapped spectrum. In addition, as massive data volumes are being collected, analyzed, and delivered, communications and sensing systems have become too complex to develop, manage, and operate. The insights that are “mined” from the data using machine learning (ML) techniques have become standard practice. We discuss the state of the art and key challenges in measurement and modeling techniques to expedite development and pave the way for the next “G”. 

 

Communications in the AI Era: What Role will 6G play? 

Zoran Utkovski

The next generation (6G) of mobile communication networks will integrate communications, sensing, and AI capabilities. This integration will introduce a paradigm shift in the way data is acquired, processed, and transmitted in the network. In this context, several challenges arise. First, the exponential growth of generated data requires the processing and transmission of vast amounts of data at the network edge. Second, large AI models show increasing demand for computing power, yet hardware development does not keep pace with these requirements. Third, the widespread adoption of conventional artificial neural networks (ANNs) leads to a notable increase in energy consumption, resulting in higher costs for training and inference. Against this background, this talk discusses the intersection of wireless communications and AI for edge intelligence applications. One aspect that will be covered is related to the integration of wireless communications with neuromorphic sensing and neuromorphic processing, mimicking the workings of the human brain. The considered approach allows for reduction in communication overhead, implementation complexity, and energy consumption, making it amenable for applications such as human-robot interaction, collaborative robotics, and health monitoring. The work has been conducted within the “6G Research and Innovation Cluster” (6G-RIC), funded by the German Ministry of Education and Research (BMBF) in the program “Souverän. Digital. Vernetzt.” 

 

AI for 6G Network Management and Control Optimisation

Qi Wang

Artificial Intelligence (AI) coupled with beyond 5G and 6G networking capabilities can empower highly autonomous networking solutions for data-driven network management and control optimisation. This talk will discuss several challenges and promising solutions to this end, such as the generation of the necessary dataset from realistic operational environments, which is usually the first and a major barrier for machine learning; the identification of the most relevant features for different network management and control use cases, from a myriad of observable features in a complicated beyond 5G towards 6G networking infrastructure implementing virtualisation, softwarisation, cloudification, edge computing and various network management and control tools; architecture and enablers that can facilitate the dataset extraction, visualisation and labelling process for AI model training; and an experimental testbed for empirical deployment and evaluation of this approach. The talk will further introduce related work in recent European projects, and suggest potential topics for standardisation opportunities.

 

Transforming Tomorrow: Exploring the Frontiers of Next-Generation AI Technologies in Telecommunication

Buse Bilgin

In the rapidly evolving landscape of telecommunications, the integration of next-generation AI technologies is paving the way for AI-native systems that promise to redefine network architectures, service delivery, and user experiences. This presentation will delve into the transformative potential of Large Language Models (LLMs), Generative Adversarial Networks (GANs), quantum machine learning, and digital twins, highlighting their application in the context of emerging 6G networks. LLMs are set to revolutionize automated customer support and content customization, while GANs can enhance data simulation and network security. Quantum ML offers solutions for complex optimization problems with unprecedented speed and efficiency, and digital twins provide real-time network management and predictive maintenance. Despite these advantages, the deployment of such technologies poses challenges, including integration complexity, increased security vulnerabilities, and ethical concerns, necessitating rigorous testing, adaptive algorithms, and strong ethical standards. The presentation will conclude with a call for collaborative efforts among AI researchers, telecom experts, and policymakers to establish guidelines that ensure safe, effective, and fair AI implementations, setting the stage for a future where 6G technology enables truly innovative and inclusive telecommunications solutions.

 

Empowering Radio Access Networks with Large-Scale Generative AI

Lina Bariah

In the era of rapid technological advancements, Radio Access Networks (RAN) face increasing demands for higher efficiency, lower latency, and optimal resource utilization. In this talk, the speaker will explore the transformative potential of large-scale generative AI in addressing these challenges and revolutionizing the performance of RAN. We will explore how integrating advanced generative AI models can lead to significant improvements in network operations, from dynamic spectrum management to intelligent traffic prediction and enhanced signal processing. The speaker will also discuss key strategies for deploying generative AI in RAN, real-world applications, and the future impact on next-generation networks. The speaker aims to give insights into the future landscape of next-generation networks, where AI-driven solutions play a pivotal role in enabling more robust, adaptive, and efficient communication systems.

 

 

