In this paper, we propose a novel system for automatically summarizing home videos based on a user experience model. The model takes into account users' spontaneous behaviors while they view videos; from these reactions, we construct a systematic framework for automating video summarization. In this work, we analyze variations in a viewer's eye movement and facial expression while he or she watches the raw home video, and we transform these behaviors into clues for determining the important part of each video shot. With the aid of music analysis, the developed system automatically generates a music video (MV) style summary of the home video. Experiments show that this new type of editing mechanism can effectively generate home video summaries and greatly reduce the effort of manual summarization.