Call for Papers (Final Issue)
The global demand for data traffic has grown explosively in recent years. In the era of next-generation communication systems, data traffic is expected to continue straining the capacity of future communication networks. Alongside this remarkable growth, new communication applications, such as wearable devices, autonomous systems, drones, and the Internet of Things (IoT), continue to emerge and generate even more traffic with vastly different performance requirements. This growth in the application domain creates an inevitable need for more intelligent processing, operation, and optimization of tomorrow’s communication networks.
To realize this vision of intelligent processing and operation, machine learning, a principal driver of artificial intelligence, must be integrated into the design, planning, and optimization of future communication networks. In particular, the emerging framework of deep learning can be a key enabler of intelligent processing in a broad range of scenarios. Modern machine learning techniques provide ample opportunities for intelligent communication designs that address problems ranging from signal detection, classification, and sparse signal recovery to channel modeling, network optimization, resource management, routing, transport protocol design, and application/user behavior analysis.
Beyond intelligent network management, machine learning will allow future communication networks and their applications, e.g., the IoT, to exploit big data analytics to enhance situational awareness and overall network operation. In particular, the massive amounts of data generated from sources ranging from network measurements to IoT sensor readings and drone and surveillance imagery can be used to build a comprehensive operational view of the massive number of devices within the network. This comprehensive view can, in turn, be exploited to detect anomalous events in communication networks.
This JSAC Series will focus on machine learning solutions to problems in communication networks, across the various layers and within a broad range of applications. Topics of interest include, but are not limited to, machine learning, especially deep learning, for signal detection, channel modeling, resource optimization, routing protocol design, transport-layer optimization, user/application behavior prediction, software defined networking, congestion control, communication network optimization, and security and anomaly detection. The objective of this Series is to bring together state-of-the-art research results and industrial applications of machine learning for intelligent communications. Original, previously unpublished contributions not currently under review by another journal are solicited in relevant areas, including (but not limited to) the following:
- Machine/deep learning for signal detection, channel modeling, estimation, interference mitigation, and decoding.
- Resource and network optimization using machine learning techniques.
- Distributed learning algorithms and implementations over realistic communication networks.
- Machine learning techniques for application/user behavior prediction and user experience modeling and optimization.
- Machine learning techniques for anomaly detection in communication networks.
- Machine learning for emerging communication systems and applications, such as drone systems, IoT, edge computing, caching, smart cities, and vehicular networks.
- Machine learning for transport-layer congestion control.
- Machine learning for integrated radio frequency/non-radio frequency communication systems.
- Machine learning techniques for information-centric networks and data mining.
- Machine learning for network slicing, network virtualization, and software defined networking.
- Performance analysis and evaluation of machine learning techniques in wired/wireless communication systems.
- Scalability and complexity of machine learning in networks.
- Techniques for efficient hardware implementation of neural networks in communications.
- Synergies between distributed/federated learning and communications.
- Secure machine learning over communication networks.
Manuscript Submission: 15 December 2021 (Firm, No Extension)
First Notification: 15 February 2022
Revised Paper Due: 10 March 2022
Acceptance Notification: 15 April 2022
Final Manuscript Due: 25 April 2022
Publication Date: July 2022
Imperial College London
Khaled B. Letaief
Hong Kong University of Science and Technology
Area 1: Signal Processing
Queen Mary University of London
Lu Lu, Intel Labs
Michalis Matthaiou, Queen’s University Belfast
Slawomir Stanczak, Fraunhofer Heinrich Hertz Institute
Tommy Svensson, Chalmers University of Technology
Kanapathippillai Cumanan, University of York
Area 2: Learn to Transmit and Receive
Alexios Balatsoukas-Stimming, Eindhoven University of Technology
Gaojie Chen, University of Surrey
Elisabeth de Carvalho, Aalborg University
Hao Ye, Qualcomm
Area 3: Resource Management and Network Optimization
University of Houston
Carlo Fischione, KTH Royal Institute of Technology
Le Liang, Southeast University
Dusit Niyato, Nanyang Technological University
Beibei Wang, Origin Wireless Inc.
Area 4: Distributed/Federated Learning and Communications
Imperial College London
Kaibin Huang, University of Hong Kong
Navid Naderializadeh, University of Pennsylvania
Yuanming Shi, ShanghaiTech University
Nguyen Tran, University of Sydney
Area 5: Selected Topics
University of Leeds
Yong Li, Tsinghua University
Omid Semiari, University of Colorado
Xiangwei Zhou, Louisiana State University
Guanding Yu, Zhejiang University
Qiang Ni, Lancaster University