Exercises are an essential part of software engineering courses, as they allow students to apply the knowledge taught in class to specific problems. When exercises are graded or even part of an exam, it is crucial to ensure that students have worked on the assignments independently. Especially for large courses, comparing submissions by hand is too time-consuming, and instructors should use automated tools to detect plagiarism efficiently.
The current integration of plagiarism detection in Artemis requires instructors to use external software to review plagiarism incidents and sort out false positives manually. This process involves managing dozens of files and is error-prone, as accidentally mixing up student submissions can lead to false accusations. Moreover, differences in each instructor's workflow can lead to inconsistent assessment results.
This thesis improves the existing integration by processing the plagiarism results directly in Artemis and highlighting similarities between submissions with color. Instructors can confirm or deny a plagiarism incident and leave feedback for students whose submissions were confirmed as plagiarized. Artemis displays a similarity distribution of the detected results, which helps instructors evaluate the identified cases. Instructors can also filter out submissions that are irrelevant for comparison to improve the performance of the plagiarism detection.