Data-Driven Tools for Introductory Computer Science Education

dc.contributor.advisor: Warren, Joe D.
dc.creator: Tang, Lei
dc.date.accessioned: 2019-05-17T16:09:05Z
dc.date.available: 2019-05-17T16:09:05Z
dc.date.created: 2018-08
dc.date.issued: 2018-08-10
dc.date.submitted: August 2018
dc.date.updated: 2019-05-17T16:09:05Z
dc.description.abstract: The software industry spends a tremendous amount of effort and resources on software testing and maintenance to improve the quality of software. A large portion of this cost could be saved by training high-quality software developers through better Computer Science education: skilled developers not only produce code with fewer bugs and better design, but also identify and fix issues more effectively. This thesis therefore investigates building useful educational tools to facilitate Computer Science education, particularly in introductory programming courses. Since understanding code execution is the first step toward writing high-quality code and testing software, in the first study we built a web-based interactive tool that teaches students the comprehension and analysis skills needed to understand program execution. Second, we built an automated tool with which students interactively practice writing test cases and debugging programs. The tool gauges the coverage of a student's test set against a large corpus of buggy programs collected in previous course sessions and returns, as immediate feedback, the buggy programs that the student's tests failed to catch; students study these returned programs to gradually improve the coverage of their test sets. In the third project, we built a tool that automatically generates high-quality test cases to construct concise test sets for testing students' coding assignment solutions. The tool uses heterogeneous historical incorrect student implementations to guide the test case search, and its generated test cases are expected to provide better coverage than instructor-built test cases. To validate the effectiveness of our tools, we conducted studies in introductory programming courses among students at Rice and online students in our Massive Open Online Courses (MOOCs).
The studies showed that, compared with studying a traditional Computer Science curriculum, students who interacted with our educational tools made significant improvements in understanding basic Computer Science concepts and in their software testing skills.
dc.format.mimetype: application/pdf
dc.identifier.citation: Tang, Lei. "Data-Driven Tools for Introductory Computer Science Education." (2018) Diss., Rice University. <a href="https://hdl.handle.net/1911/105845">https://hdl.handle.net/1911/105845</a>.
dc.identifier.uri: https://hdl.handle.net/1911/105845
dc.language.iso: eng
dc.rights: Copyright is held by the author, unless otherwise indicated. Permission to reuse, publish, or reproduce the work beyond the bounds of fair use or other exemptions to copyright law must be obtained from the copyright holder.
dc.subject: Computer Science Education
dc.subject: Interactive Learning
dc.subject: Software Testing
dc.subject: Visualization
dc.subject: Web-based Tool
dc.subject: Automatic Test Case Generation
dc.subject: Data-Driven
dc.subject: Automated Programming Assessment System
dc.title: Data-Driven Tools for Introductory Computer Science Education
dc.type: Thesis
dc.type.material: Text
thesis.degree.department: Computer Science
thesis.degree.discipline: Engineering
thesis.degree.grantor: Rice University
thesis.degree.level: Doctoral
thesis.degree.name: Doctor of Philosophy
Files

Original bundle
Name: TANG-DOCUMENT-2018.pdf
Size: 9.17 MB
Format: Adobe Portable Document Format

License bundle
Name: PROQUEST_LICENSE.txt
Size: 5.84 KB
Format: Plain Text

Name: LICENSE.txt
Size: 2.6 KB
Format: Plain Text