Analysis of Learning Behavior of Human Posture Recognition in Maker Education

Yueh Min Huang, An Yen Cheng, Ting Ting Wu

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)


Maker education takes “hands-on” work as its core concept and combines various educational theories to redefine the interactions between learners and teachers in a learning environment. Identifying meaningful “hands-on” behaviors is crucial for evaluating students’ learning performance, yet it is not feasible for an instructor to observe every student. Such observation becomes possible with the aid of artificial intelligence (AI) image processing: an AI learning behavior recognition system can serve as a teacher’s second pair of eyes and thereby account for individual differences. In previous studies, however, learning behavior recognition was applied only to traditional or static classrooms; a behavior recognition system for identifying “hands-on” actions in a maker learning context had not yet been developed. Therefore, this study designed a human posture evaluation system, obtained human articulation node information from images of the learning field, and built a learning behavior recognition model suitable for maker education based on a convolutional neural network (CNN). A learning behavior model was defined, along with a set of student behavior indexes, and the effectiveness of the model and indexes was verified through practical learning activities. The evaluation results indicated that the proposed model achieved a training accuracy of 0.99 and a model accuracy of 0.83, showing that the model can be applied to dynamic maker activity learning environments.
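The pipeline the abstract describes (pose keypoints extracted from classroom images, then a CNN classifying them into behavior indexes) can be sketched roughly as follows. This is a minimal illustrative sketch only: the joint count, layer sizes, and behavior labels are assumptions, not the paper’s actual architecture, and the keypoints are assumed to come from an OpenPose-style pose estimator.

```python
import numpy as np

# Hypothetical behavior labels; the paper's actual behavior index set
# is not specified in the abstract.
BEHAVIORS = ["hands-on", "observing", "off-task"]

def conv1d_relu(x, kernels, bias):
    """Valid 1-D convolution over the joint sequence, followed by ReLU.
    x: (channels, length); kernels: (filters, channels, k)."""
    f, c, k = kernels.shape
    out_len = x.shape[1] - k + 1
    out = np.zeros((f, out_len))
    for i in range(f):
        for t in range(out_len):
            out[i, t] = np.sum(kernels[i] * x[:, t:t + k]) + bias[i]
    return np.maximum(out, 0.0)

def classify(keypoints, kernels, bias, w, b):
    """keypoints: (18, 2) array of (x, y) joint coordinates for one
    person (18 joints assumed, as in OpenPose's COCO layout)."""
    x = keypoints.T                        # (2 channels, 18 joints)
    h = conv1d_relu(x, kernels, bias)      # convolutional feature maps
    pooled = h.max(axis=1)                 # global max pooling
    logits = w @ pooled + b                # dense classification layer
    p = np.exp(logits - logits.max())
    return p / p.sum()                     # softmax over behaviors

# Randomly initialized (untrained) weights, just to show the shapes.
rng = np.random.default_rng(0)
kernels = rng.normal(size=(4, 2, 3)) * 0.1
bias = np.zeros(4)
w = rng.normal(size=(len(BEHAVIORS), 4)) * 0.1
b = np.zeros(len(BEHAVIORS))

probs = classify(rng.uniform(size=(18, 2)), kernels, bias, w, b)
```

In practice the weights would be trained on labeled classroom footage; the sketch only shows how per-frame joint coordinates could flow through a small convolutional classifier to a probability over behavior indexes.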

Original language: English
Article number: 868487
Journal: Frontiers in Psychology
Publication status: Published - 2022 May 30

All Science Journal Classification (ASJC) codes

  • Psychology (all)


