A simple cluster-scaling policy for MapReduce clouds

Sheng Wei Huang, Ce Kuen Shieh, Syue Ru Lyu, Tzu Chi Huang, Chien Sheng Chen, Ping Fan Ho, Ming Fong Tsai

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

With the rise of cloud computing, many cloud services have been developed. Google proposed a programming model called MapReduce for processing large amounts of data. After Yahoo! introduced Hadoop, an open-source implementation of this model, many companies and enterprises began building their own clusters with this programming model to handle large amounts of data.
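
For readers unfamiliar with the programming model the abstract refers to, the sketch below shows the canonical Hadoop word-count job: the map function emits (word, 1) pairs and the reduce function sums the counts for each word, which the framework then distributes across the cluster nodes. This is the standard Hadoop MapReduce tutorial example, not code from this paper; the class name and input/output paths are illustrative.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: split each input line into tokens and emit (word, 1).
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reduce phase: sum the counts emitted for each word.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);  // local pre-aggregation on each mapper
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Because a job like this is split into many independent map and reduce tasks, its running time depends directly on how many cluster nodes are available, which is the resource a cluster-scaling policy adjusts.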

Original language: English
Title of host publication: 2013 International Symposium on Wireless and Pervasive Computing, ISWPC 2013
Publication status: Published - 2013
Event: 2013 International Symposium on Wireless and Pervasive Computing, ISWPC 2013 - Taipei, Taiwan
Duration: 2013 Nov 20 - 2013 Nov 22

Publication series

Name: 2013 International Symposium on Wireless and Pervasive Computing, ISWPC 2013

Other

Other: 2013 International Symposium on Wireless and Pervasive Computing, ISWPC 2013
Country: Taiwan
City: Taipei
Period: 13-11-20 - 13-11-22

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Software

