TY - GEN
T1 - Use BCI to Generate Attention-Based Metadata for the Assessment of Effective Learning Duration
AU - Shen, Yang Ting
AU - Chen, Xin Mao
AU - Lu, Pei Wen
AU - Wu, Ju Chuan
N1 - Funding Information:
The financial support from Ministry of Science and Technology (MOST) including project “SyncBIM Platform” (MOST106-2221-E-035-038-MY2) and project “Resilient Livable Smart City” (MOST106-2627-M-035-003-), is greatly acknowledged.
PY - 2018
Y1 - 2018
N2 - This paper proposes a novel method for evaluating video-based learning performance by using a brain-computer interface (BCI). We develop the Interactive Brain Tagging System (IBTS) to collect learners’ physiological affective metadata: attention. IBTS uses an EEG headset to measure learners’ brainwaves and convert them into an evaluable attention value. While learners watch a video, their attention values are recorded every second and marked on each corresponding video clip. We visualize the variation of attention and try to identify continuous durations of higher attention levels within a video. We used a 15-min video to conduct the experiment with 31 subjects. The results show the differences between individual and collective attention durations. Moreover, in our case, the collected results suggest that the appropriate video length sustaining higher attention may be around 232 s.
AB - This paper proposes a novel method for evaluating video-based learning performance by using a brain-computer interface (BCI). We develop the Interactive Brain Tagging System (IBTS) to collect learners’ physiological affective metadata: attention. IBTS uses an EEG headset to measure learners’ brainwaves and convert them into an evaluable attention value. While learners watch a video, their attention values are recorded every second and marked on each corresponding video clip. We visualize the variation of attention and try to identify continuous durations of higher attention levels within a video. We used a 15-min video to conduct the experiment with 31 subjects. The results show the differences between individual and collective attention durations. Moreover, in our case, the collected results suggest that the appropriate video length sustaining higher attention may be around 232 s.
UR - http://www.scopus.com/inward/record.url?scp=85050478724&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85050478724&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-91152-6_31
DO - 10.1007/978-3-319-91152-6_31
M3 - Conference contribution
AN - SCOPUS:85050478724
SN - 9783319911519
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 407
EP - 417
BT - Learning and Collaboration Technologies. Learning and Teaching - 5th International Conference, LCT 2018, Held as Part of HCI International 2018, Proceedings
A2 - Zaphiris, Panayiotis
A2 - Ioannou, Andri
PB - Springer Verlag
T2 - 5th International Conference on Learning and Collaboration Technologies, LCT 2018 Held as Part of HCI International 2018
Y2 - 15 July 2018 through 20 July 2018
ER -