TY - JOUR
T1 - Enabling intelligence in fog computing to achieve energy and latency reduction
AU - La, Quang Duy
AU - Ngo, Mao V.
AU - Dinh, Thinh Quang
AU - Quek, Tony Q.S.
AU - Shin, Hyundong
N1 - Publisher Copyright:
© 2018 Chongqing University of Posts and Telecommunications
PY - 2019/2
Y1 - 2019/2
N2 - Fog computing is an emerging architecture intended to alleviate the network burden on the cloud and the core network by moving resource-intensive functionalities such as computation, communication, storage, and analytics closer to the End Users (EUs). To address the energy efficiency and latency requirements of time-critical Internet-of-Things (IoT) applications, fog computing systems can apply intelligence features in their operations to take advantage of the readily available data and computing resources. In this paper, we propose an approach that involves device-driven and human-driven intelligence as key enablers to reduce energy consumption and latency in fog computing, via two case studies. The first makes use of machine learning to detect user behaviors and perform adaptive low-latency Medium Access Control (MAC)-layer scheduling among sensor devices. In the second case study, on task offloading, we design an algorithm that enables an intelligent EU device to select its offloading decision in the presence of multiple nearby fog nodes while minimizing its own energy and latency objectives. Our results show a huge but untapped potential of intelligence in tackling the challenges of fog computing.
AB - Fog computing is an emerging architecture intended to alleviate the network burden on the cloud and the core network by moving resource-intensive functionalities such as computation, communication, storage, and analytics closer to the End Users (EUs). To address the energy efficiency and latency requirements of time-critical Internet-of-Things (IoT) applications, fog computing systems can apply intelligence features in their operations to take advantage of the readily available data and computing resources. In this paper, we propose an approach that involves device-driven and human-driven intelligence as key enablers to reduce energy consumption and latency in fog computing, via two case studies. The first makes use of machine learning to detect user behaviors and perform adaptive low-latency Medium Access Control (MAC)-layer scheduling among sensor devices. In the second case study, on task offloading, we design an algorithm that enables an intelligent EU device to select its offloading decision in the presence of multiple nearby fog nodes while minimizing its own energy and latency objectives. Our results show a huge but untapped potential of intelligence in tackling the challenges of fog computing.
UR - http://www.scopus.com/inward/record.url?scp=85056659319&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85056659319&partnerID=8YFLogxK
U2 - 10.1016/j.dcan.2018.10.008
DO - 10.1016/j.dcan.2018.10.008
M3 - Review article
AN - SCOPUS:85056659319
SN - 2468-5925
VL - 5
SP - 3
EP - 9
JO - Digital Communications and Networks
JF - Digital Communications and Networks
IS - 1
ER -