Joint optimization of fronthaul compression and bandwidth allocation in heterogeneous CRAN

Wenchao Xia, Jun Zhang, Tony Q.S. Quek, Shi Jin, Hongbo Zhu

Research output: Contribution to journal › Conference article › peer-review

Abstract

In this paper, we consider the uplink of a heterogeneous cloud radio access network in which a macro base station (BS) and many remote radio heads (RRHs) coexist to serve user equipment units. To save cost, only the BS is connected to the baseband unit (BBU) pool via fiber links, whereas the RRHs reach the BBU pool through wireless fronthaul links of limited capacity. Employing the Wyner-Ziv (WZ) coding scheme, each RRH first compresses its received signal and then forwards the quantized version to the BBU pool. We derive a deterministic equivalent of the ergodic uplink sum rate and use this result to jointly optimize the quantization noise matrix and the bandwidth allocation between the radio access network and the fronthaul links. An algorithm based on Dinkelbach's method is proposed to find the optimal solution. Numerical results show that as the normalized fronthaul capacity increases, more bandwidth is allocated to the radio access network. Moreover, uniform quantization with WZ coding across the RRHs achieves near-optimal performance in the high signal-to-quantization-noise-ratio regime.
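The abstract mentions Dinkelbach's method, the standard iteration for fractional programs of the form max f(x)/g(x) with g(x) > 0: one repeatedly solves the parameterized subproblem max f(x) - λ·g(x) and updates λ to the current ratio until the subproblem's optimal value hits zero. Below is a minimal, self-contained Python sketch of that fixed-point logic. The functions f and g and the one-dimensional grid-search inner solver are illustrative stand-ins, not the paper's actual sum-rate expression, quantization-noise variables, or fronthaul constraints.

```python
import numpy as np

# Hedged sketch of Dinkelbach's method for max_x f(x)/g(x), g(x) > 0.
# f and g are hypothetical stand-ins: a concave "rate-like" numerator
# and a positive "resource-like" denominator, not the paper's formulas.
f = lambda x: np.log1p(x)
g = lambda x: 1.0 + x

grid = np.linspace(0.0, 10.0, 100_001)  # discretized feasible set

def inner_argmax(lam):
    # Parameterized subproblem: argmax_x f(x) - lam * g(x).
    return grid[np.argmax(f(grid) - lam * g(grid))]

lam = 0.0
for _ in range(50):
    x = inner_argmax(lam)
    if abs(f(x) - lam * g(x)) < 1e-6:  # F(lam*) = 0 at the optimum
        break
    lam = f(x) / g(x)                  # tighten the ratio estimate

print(f"x* ≈ {x:.4f}, ratio ≈ {lam:.6f}")  # expect x* ≈ e - 1 ≈ 1.7183
```

In the paper's setting, the inner subproblem would be the joint quantization-noise and bandwidth optimization under the fronthaul-capacity constraints, but the outer iteration on the ratio parameter λ follows this same pattern and converges superlinearly.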

Original language: English
Pages (from-to): 1-6
Number of pages: 6
Journal: Proceedings - IEEE Global Communications Conference, GLOBECOM
Volume: 2018-January
DOIs
Publication status: Published - 2017
Event: 2017 IEEE Global Communications Conference, GLOBECOM 2017 - Singapore, Singapore
Duration: 2017 Dec 4 - 2017 Dec 8

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Computer Networks and Communications
  • Hardware and Architecture
  • Signal Processing
