Citing as an online learning support tool for student-generated assessment

Fu Yun Yu, Ju Ko Wei

Research output: Contribution to journal › Article › peer-review


While enabling students to refer to and build on peer-produced work (termed ‘citing’ herein) during content creation appears pedagogically promising, its associated learning effects remain under-studied. This research examined the effects of online citing of peer-generated assessment items during student test-construction on learning and task performance. Additionally, any negative effects this approach may have in inducing cognitive and emotional burdens were examined. A pretest-posttest quasi-experimental research design was adopted with two treatment groups (i.e. a citing group and a no-citing group). A group of fifth-graders from six classes participated in an 11-week study. Analysis of covariance showed that students in the citing group scored significantly higher than those in the no-citing group on both academic achievement and question-generation performance. In addition, citing did not induce significantly higher cognitive load or learning anxiety than the no-citing condition.

Original language: English
Pages (from-to): 165-186
Number of pages: 22
Journal: Technology, Pedagogy and Education
Issue number: 2
Publication status: Published - 2024

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Education
  • Communication
  • Computer Science Applications

