Improving the throughput of a cellular network using machine learning - A case study of LTE
P. Gaikwad, S. Amuru
Published in Institute of Electrical and Electronics Engineers Inc.
2021
Abstract
Long Term Evolution (LTE) focused on providing high data rates at low latency compared to previous-generation technologies. Recent research and development in machine learning for wireless communication networks focuses on making these networks more efficient, intelligent, and optimal. We propose a machine learning algorithm to improve the performance of LTE in real-time deployments. Specifically, we focus on the single-user multiple-input multiple-output transmission mode (known as TM4 in LTE). In this transmission mode, the channel quality feedback from the user to the base station plays a crucial role in ensuring successful communication with a low error rate. The feedback from the user includes a precoding matrix indicator (PMI) and a rank indicator in addition to the channel quality indicator. However, in practical systems, because the base station must support several users, a delay is expected between the time a user sends feedback and the time it is scheduled. This time lag can cause significant performance degradation depending on the channel conditions, especially when the user is mobile. Hence, to eliminate this adverse impact, we present a machine learning model that predicts future channels, and the feedback from the user is calculated based on these predictions. Via several numerical simulations, we show the effectiveness of the proposed algorithms under a variety of scenarios. Without loss of generality, the same approach can be applied in the context of 5G NR; LTE is used only as a case study due to its vast prevalence and widespread deployments even today. © 2021 IEEE.
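The core idea, computing PMI feedback on a *predicted* future channel rather than the stale measured one, can be sketched as follows. This is a minimal illustration, not the paper's actual model: a simple linear extrapolation over past channel snapshots stands in for the machine learning predictor, and PMI selection uses the standard LTE two-antenna-port rank-1 codebook. The function names and the noise-variance parameter are hypothetical.

```python
import numpy as np

# LTE rank-1 codebook for 2 transmit antenna ports (TS 36.211):
# four precoding vectors, each normalized to unit power.
CODEBOOK = [
    np.array([1, 1]) / np.sqrt(2),
    np.array([1, -1]) / np.sqrt(2),
    np.array([1, 1j]) / np.sqrt(2),
    np.array([1, -1j]) / np.sqrt(2),
]

def predict_channel(history):
    """Predict the next channel matrix from past observations.

    A stand-in for the paper's ML model: one-step linear
    extrapolation over the two most recent channel snapshots.
    """
    if len(history) < 2:
        return history[-1]
    return 2 * history[-1] - history[-2]

def select_pmi(H, noise_var=0.1):
    """Return the codebook index (PMI) maximizing received signal
    power over noise for a channel H of shape (n_rx, n_tx)."""
    gains = [np.linalg.norm(H @ w) ** 2 / noise_var for w in CODEBOOK]
    return int(np.argmax(gains))

# Simulate a slowly varying 2x2 channel; feedback is computed on the
# predicted channel, compensating for the scheduling delay.
rng = np.random.default_rng(0)
h0 = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
h1 = h0 + 0.05 * (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2)))
history = [h0, h1]

H_pred = predict_channel(history)
pmi = select_pmi(H_pred)
print("predicted PMI:", pmi)
```

In a real system the extrapolation step would be replaced by the trained predictor, and the rank indicator and channel quality indicator would be derived from the same predicted channel.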
About the journal
Journal: 2021 National Conference on Communications, NCC 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.