Face2Music : Implementing a music control system based on facial emotion recognition using the OM2M framework

  • 周 純靜

Student thesis: Master's Thesis


In recent years, AI and the Internet of Things (IoT) have developed rapidly, and people can use these technologies to meet their own needs, including emotional health. When people face excessive pressure in their lives, their emotions are affected and mental illness may even occur. To reduce emotional problems and keep the service convenient to use, this study combines AI and IoT techniques to implement an interactive human-machine system, Face2Music, consisting of an Emotion Controller Module, a Speaker Manager Module, and a Music Manager Module. In the Emotion Controller Module, AI emotion recognition lets the user control the music playback device immediately through facial expressions (for example, the system gives feedback by playing music when a happy expression is detected). The Speaker Manager Module adjusts the power and volume settings of the music players and controls them individually or in groups. The Music Manager Module is equipped with a music database and an HTTP server, and provides a web management page for modifying the music playlist and controlling the speakers through the music player. To make the service more widely available and to realize the concepts of smart homes and smart cities, we use OM2M as the framework and organize the three modules into a four-layer structure: Application Layer, Network Layer, Gateway Layer, and Device Layer.
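The core behavior described above (a recognized expression immediately driving the playback device) can be sketched as a small control loop. This is an illustrative sketch only, not code from the thesis: the class and function names are hypothetical, and the handling of non-happy expressions is an assumption; the abstract only states that music is played when a happy expression is detected.

```python
def playback_command(emotion: str) -> str:
    """Map a recognized facial expression to a speaker action.

    "happy" -> "play" is stated in the abstract; the other mappings
    are assumptions added purely for illustration.
    """
    commands = {
        "happy": "play",     # stated behavior: happy face triggers music
        "neutral": "pause",  # assumed behavior
        "sad": "pause",      # assumed behavior
    }
    return commands.get(emotion, "pause")


class EmotionController:
    """Hypothetical controller: polls an emotion recognizer and
    forwards the resulting command to a speaker."""

    def __init__(self, recognizer, speaker):
        self.recognizer = recognizer  # callable returning an emotion label
        self.speaker = speaker        # object exposing play() / pause()

    def step(self) -> str:
        """Run one recognition-and-control cycle; return the command sent."""
        emotion = self.recognizer()
        command = playback_command(emotion)
        if command == "play":
            self.speaker.play()
        else:
            self.speaker.pause()
        return command
```

In the actual system, the recognizer would be the AI emotion-recognition component and the speaker would be reached through the OM2M resource tree rather than a direct method call; this sketch only shows the mapping logic.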
Date of Award: 2019
Original language: English
Supervisor: Chuan-Ching Sue