Improving Peer Evaluation Quality in Massive Open Online Courses

dc.contributor.advisor: Chaudhuri, Swarat
dc.contributor.committeeMember: Warren, Joe
dc.contributor.committeeMember: Jermaine, Chris
dc.creator: Lu, Yanxin
dc.date.accessioned: 2016-01-25T16:13:26Z
dc.date.available: 2016-01-25T16:13:26Z
dc.date.created: 2015-05
dc.date.issued: 2015-05-26
dc.date.submitted: May 2015
dc.date.updated: 2016-01-25T16:13:27Z
dc.description.abstract: As several online course providers such as Coursera, Udacity, and edX emerged in 2012, Massive Open Online Courses (MOOCs) gained attention across the globe. While MOOCs provide learning opportunities for many people, they also pose several challenges, one of which is ensuring the quality of peer grading. The Interactive Programming in Python (IPP) course that Rice has offered on Coursera for a number of years has suffered from low-quality peer evaluations. In this thesis, we propose a solution that improves the quality of peer evaluations by motivating peer graders. Specifically, we ask: when students know that their own peer grading efforts are being examined, and when they are able to grade other students' peer evaluations, do these conditions motivate them to do a better job when grading assignments? We implemented a web application in which students can grade peer evaluations, and we conducted a series of controlled experiments. We find a strong effect on peer evaluation quality arising simply from students knowing that they are being studied through software intended to help with peer grading. In addition, we find strong evidence that grading peer evaluations leads students to give better peer evaluations themselves. The strongest effect, however, appears to come from the act of grading others' evaluations rather than from the knowledge that one's own peer evaluations will be examined.
dc.format.mimetype: application/pdf
dc.identifier.citation: Lu, Yanxin. "Improving Peer Evaluation Quality in Massive Open Online Courses." (2015) Master's Thesis, Rice University. https://hdl.handle.net/1911/88103.
dc.identifier.uri: https://hdl.handle.net/1911/88103
dc.language.iso: eng
dc.rights: Copyright is held by the author, unless otherwise indicated. Permission to reuse, publish, or reproduce the work beyond the bounds of fair use or other exemptions to copyright law must be obtained from the copyright holder.
dc.subject: MOOC
dc.subject: Peer Evaluation
dc.subject: Education
dc.title: Improving Peer Evaluation Quality in Massive Open Online Courses
dc.type: Thesis
dc.type.material: Text
thesis.degree.department: Computer Science
thesis.degree.discipline: Engineering
thesis.degree.grantor: Rice University
thesis.degree.level: Masters
thesis.degree.name: Master of Science
Files

Original bundle (1 file):
  LU-DOCUMENT-2015.pdf (862.92 KB, Adobe Portable Document Format)

License bundle (2 files):
  PROQUEST_LICENSE.txt (5.84 KB, Plain Text)
  LICENSE.txt (2.6 KB, Plain Text)