Summary: Online discussions have proven to be a powerful platform for collaborative learning: students interact online, and this interaction contributes to each student's learning process. However, online discussions raise the issue of how to assess students' participation and level of activity across numerous discussion threads. Current assessment of online discussion is based on either content or interaction, and neither has standardized, detailed descriptions or rubrics for determining the level of participation among online interactants. To address this assessment problem, this research investigated and verified the use of content combined with interaction as assessment criteria. The proposed framework uses a Quantitative Log File (QLF) and rubrics to gauge the level of students' online participation. The QLF criteria for content include novelty and key knowledge, whereas those for interaction include pair response, final response, and interaction rate. The framework was implemented in a prototype based on the MOODLE environment, called the Rubric Assessment Participation System (RAPS). Questionnaires were distributed to fifty respondents to validate the assessment criteria for online participation. Six users were selected to test the prototype, which combines content and interaction as assessment criteria in the rubrics; the results show that RAPS can be used as an assessment tool for online discussions.