Efficient access to game videos is in urgent demand due to the emergence of live streaming platforms and the explosive growth in the numbers of gamers and viewers. In this work we facilitate efficient access from two aspects: game event detection and highlight detection. By recognizing predefined text displayed on screen when certain events occur, we associate game events with timestamps to enable direct access. We jointly consider visual features, events, and viewers' reactions to construct two highlight models, enabling compact game presentation. Experimental results demonstrate the effectiveness of the proposed methods. As one of the early attempts at analyzing broadcast game videos from the perspective of multimedia content analysis, our contributions are twofold. First, we design and extract game-specific features that capture visual content, event semantics, and viewers' reactions. Second, we integrate cues from these three domains through a psychologically grounded approach and a data-driven approach to characterize game highlights.
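The event-detection idea above, associating recognized on-screen text with timestamps, can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes per-frame text recognition has already been performed (e.g. by an OCR engine), and the event phrases, function names, and timestamps below are hypothetical examples.

```python
from typing import List, Tuple

# Hypothetical set of predefined on-screen phrases that signal game events.
EVENT_PHRASES = {"DOUBLE KILL", "TOWER DESTROYED", "VICTORY"}

def detect_events(frame_texts: List[Tuple[float, str]]) -> List[Tuple[float, str]]:
    """Map recognized per-frame text to (timestamp, event) pairs.

    frame_texts: (timestamp_in_seconds, recognized_text) per sampled frame.
    Consecutive frames showing the same event phrase are collapsed into
    a single event, so each on-screen announcement yields one timestamp.
    """
    events: List[Tuple[float, str]] = []
    last_hit = None
    for timestamp, text in frame_texts:
        # Case-insensitive match against the predefined phrases.
        hit = next((p for p in EVENT_PHRASES if p in text.upper()), None)
        if hit is not None and hit != last_hit:
            events.append((timestamp, hit))
        last_hit = hit
    return events
```

For example, `detect_events([(0.0, ""), (1.0, "Double Kill"), (1.5, "DOUBLE KILL"), (5.0, "VICTORY")])` returns `[(1.0, "DOUBLE KILL"), (5.0, "VICTORY")]`, giving the timestamps a viewer could jump to directly.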