Enabling intelligence in fog computing to achieve energy and latency reduction

Quang Duy La, Mao V. Ngo, Thinh Quang Dinh, Tony Q.S. Quek, Hyundong Shin

Research output: Contribution to journal › Review article › peer-review

132 Citations (Scopus)


Fog computing is an emerging architecture intended to alleviate network burdens at the cloud and the core network by moving resource-intensive functionalities such as computation, communication, storage, and analytics closer to the End Users (EUs). To address the energy-efficiency and latency requirements of time-critical Internet-of-Things (IoT) applications, fog computing systems can apply intelligence features in their operations to take advantage of readily available data and computing resources. In this paper, we propose an approach that uses device-driven and human-driven intelligence as key enablers to reduce energy consumption and latency in fog computing, illustrated via two case studies. The first uses machine learning to detect user behaviors and perform adaptive, low-latency Medium Access Control (MAC)-layer scheduling among sensor devices. In the second case study, on task offloading, we design an algorithm that lets an intelligent EU device select its offloading decision in the presence of multiple nearby fog nodes while minimizing its own energy and latency objectives. Our results show a large but untapped potential of intelligence in tackling the challenges of fog computing.
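The offloading decision in the second case study can be sketched as a cost-minimization over the local and remote execution options. The following is a minimal illustrative sketch, not the paper's actual algorithm: the cost model (transmit energy only, a simple weighted energy-latency sum) and all parameter names are assumptions made for illustration.

```python
def offload_decision(task_bits, cpu_cycles, local_rate, local_power,
                     fog_nodes, w_energy=0.5, w_latency=0.5):
    """Pick the execution option minimizing a weighted energy-latency cost.

    Hypothetical model: the EU device either computes locally (spending
    CPU energy) or offloads to a fog node (spending only transmit energy,
    then waiting for remote execution). Returns ('local', cost) or
    (fog_name, cost) for the cheapest option.
    """
    # Local execution: latency = cycles / CPU rate; energy = power * time
    local_time = cpu_cycles / local_rate
    local_cost = w_energy * local_power * local_time + w_latency * local_time
    best = ('local', local_cost)

    # Each fog node: (uplink rate in bit/s, device TX power in W, fog CPU rate)
    for name, (uplink_rate, tx_power, fog_rate) in fog_nodes.items():
        tx_time = task_bits / uplink_rate    # time to upload the task
        exec_time = cpu_cycles / fog_rate    # remote execution time
        energy = tx_power * tx_time          # device energy spent transmitting
        latency = tx_time + exec_time
        cost = w_energy * energy + w_latency * latency
        if cost < best[1]:
            best = (name, cost)
    return best


# Illustrative usage with made-up numbers: a 1 Mb task needing 1 Gcycles,
# with two nearby fog nodes differing in uplink rate and CPU speed.
choice, cost = offload_decision(
    task_bits=1e6, cpu_cycles=1e9, local_rate=1e9, local_power=2.0,
    fog_nodes={'fog1': (5e6, 0.5, 5e9), 'fog2': (1e7, 0.5, 2e9)})
```

Under these assumed numbers the faster fog CPU wins despite its slower uplink, which reflects the trade-off the abstract describes: the intelligent device weighs both energy and latency rather than always offloading or always computing locally.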

Original language: English
Pages (from-to): 3-9
Number of pages: 7
Journal: Digital Communications and Networks
Issue number: 1
Publication status: Published - 2019 Feb

All Science Journal Classification (ASJC) codes

  • Hardware and Architecture
  • Computer Networks and Communications


