Maker education takes "hands-on" activity as its core concept and combines various educational theories to redefine the interactions between learners and teachers in a learning environment. Identifying meaningful "hands-on" behaviors is crucial for evaluating students' learning performance, but an instructor cannot feasibly observe every student. Such observation becomes possible with the aid of artificial intelligence (AI) image processing techniques: an AI learning behavior recognition system can serve as a second pair of eyes for teachers, thereby accounting for individual differences. In previous studies, however, learning behavior recognition was applied only to traditional, static classrooms; a behavior recognition system for identifying "hands-on" actions in a learning context has not yet been developed. This study therefore designed a human posture evaluation system, obtained human joint (articulation node) information from images of the learning field, and built a learning behavior recognition model suitable for maker education based on a convolutional neural network (CNN). A learning behavior model was defined, along with several student behavior indexes, and the effectiveness of the model and indexes was then verified through practical learning activities. The evaluation results indicated that the proposed model achieved a training accuracy of 0.99 and a model accuracy of 0.83. Thus, the model can be applied to dynamic maker activity learning environments.
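The pipeline the abstract describes (pose keypoints extracted from classroom images, then mapped to behavior indexes) can be sketched in miniature as follows. This is only an illustration of the general idea, not the authors' implementation: the joint names, the desk-line heuristic, and the `hands_on_ratio` index are all hypothetical, and the paper's actual classifier is a trained CNN rather than a threshold rule.

```python
# Hypothetical sketch: derive a "hands-on" behavior index from per-frame
# pose keypoints. Each frame maps joint names to (x, y) image coordinates,
# where a larger y means lower in the image.

def hands_on_frame(frame):
    """Label one frame as 'hands-on' if either wrist is at or below a
    hypothetical desk line (a stand-in for the learned CNN decision)."""
    desk_y = frame["desk_level"][1]
    return (frame["left_wrist"][1] >= desk_y) or (frame["right_wrist"][1] >= desk_y)

def hands_on_ratio(frames):
    """A toy behavior index: the fraction of frames labeled 'hands-on'."""
    labels = [hands_on_frame(f) for f in frames]
    return sum(labels) / len(labels)

# Two synthetic frames: wrists above the desk line, then below it.
frames = [
    {"left_wrist": (40, 60), "right_wrist": (60, 62), "desk_level": (0, 80)},
    {"left_wrist": (42, 85), "right_wrist": (58, 90), "desk_level": (0, 80)},
]
print(hands_on_ratio(frames))  # 0.5
```

In the study's setting, the threshold rule above would be replaced by the CNN's per-frame behavior classification, with the index aggregated over a learning activity.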