Enabling intelligence in fog computing to achieve energy and latency reduction

Quang Duy La, Mao V. Ngo, Thinh Quang Dinh, Tony Q.S. Quek, Hyundong Shin

Research output: Review article › peer-review

128 Citations (Scopus)


Fog computing is an emerging architecture intended to alleviate the burdens on the cloud and the core network by moving resource-intensive functionalities such as computation, communication, storage, and analytics closer to the End Users (EUs). To address the energy-efficiency and latency requirements of time-critical Internet-of-Things (IoT) applications, fog computing systems can apply intelligence features in their operations to take advantage of the readily available data and computing resources. In this paper, we propose an approach that involves device-driven and human-driven intelligence as key enablers for reducing energy consumption and latency in fog computing, demonstrated via two case studies. The first uses machine learning to detect user behaviors and perform adaptive, low-latency Medium Access Control (MAC)-layer scheduling among sensor devices. In the second case study, on task offloading, we design an algorithm with which an intelligent EU device selects its offloading decision in the presence of multiple nearby fog nodes while minimizing its own energy and latency objectives. Our results show a huge but largely untapped potential for intelligence in tackling the challenges of fog computing.
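The second case study describes an EU device choosing between local execution and offloading to one of several nearby fog nodes so as to minimize its energy and latency objectives. A minimal sketch of that kind of decision rule is below, assuming a weighted-sum objective and simple transmission/computation cost models; all class names, parameters, and values here are illustrative assumptions, not the paper's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class FogNode:
    """Hypothetical description of a nearby fog node, as seen by the EU."""
    name: str
    uplink_rate_bps: float   # achievable uplink rate from the EU to this node
    cpu_rate_hz: float       # CPU cycles per second the node grants the EU's task
    tx_power_w: float        # EU transmit power used when sending to this node

def local_cost(task_bits, cycles_per_bit, eu_cpu_hz, eu_energy_per_cycle_j,
               w_energy, w_latency):
    """Weighted energy+latency cost of computing the task on the EU itself."""
    cycles = task_bits * cycles_per_bit
    latency = cycles / eu_cpu_hz
    energy = cycles * eu_energy_per_cycle_j
    return w_energy * energy + w_latency * latency

def offload_cost(task_bits, cycles_per_bit, node, w_energy, w_latency):
    """Weighted cost of uploading the task to a fog node and waiting for execution."""
    tx_time = task_bits / node.uplink_rate_bps
    exec_time = task_bits * cycles_per_bit / node.cpu_rate_hz
    energy = node.tx_power_w * tx_time  # the EU spends energy only on transmission
    return w_energy * energy + w_latency * (tx_time + exec_time)

def best_decision(task_bits, cycles_per_bit, eu_cpu_hz, eu_energy_per_cycle_j,
                  nodes, w_energy=0.5, w_latency=0.5):
    """Return ('local', cost) or (node_name, cost), whichever minimizes the objective."""
    best = ("local", local_cost(task_bits, cycles_per_bit, eu_cpu_hz,
                                eu_energy_per_cycle_j, w_energy, w_latency))
    for node in nodes:
        c = offload_cost(task_bits, cycles_per_bit, node, w_energy, w_latency)
        if c < best[1]:
            best = (node.name, c)
    return best
```

The weights `w_energy` and `w_latency` let the device trade one objective against the other; the actual paper's formulation of the multi-fog-node selection problem may differ.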

Pages (from-to): 3-9
Journal: Digital Communications and Networks
Publication status: Published - Feb 2019

All Science Journal Classification (ASJC) codes

  • Hardware and Architecture
  • Computer Networks and Communications
