In the evolving landscape of computer science education, the growing student population necessitates a shift from traditional teaching paradigms. The Artemis teaching platform, with its interactive learning approach, offers a scalable solution for large courses. However, a notable challenge remains: programming exercises are distributed uniformly, regardless of individual student proficiency. Recognizing the diverse skill sets and learning needs of students, this master thesis proposes a transformative solution: the use of Large Language Models (LLMs), such as Generative Pre-trained Transformer (GPT) models, for adaptive programming exercise generation.
The primary objective is to dynamically create exercises tailored to students’ learning objectives and proficiency levels, enhancing both engagement and knowledge retention. Central to this approach are the four foundational components of a programming exercise: the problem statement, template repository, solution repository, and test repository. Depending on the desired difficulty level, the problem statement and template repository are adjusted either by providing detailed guidance for beginners or by consolidating tasks and reducing scaffolding for advanced learners.
Harnessing prompt engineering in conjunction with an LLM, the model contextualizes the inputs from the exercise components and generates suitably tailored exercises. Instructors are provided with an intuitive interface that allows them to either generate variant exercises or adapt existing ones to cater to diverse learning needs. The proposed solution not only streamlines the exercise creation process but also ensures a more personalized and effective learning experience for students.
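To illustrate, the contextualization step could be sketched as assembling the four exercise components and a target difficulty into a single prompt. This is a minimal, hypothetical sketch: the class, function, and prompt wording are assumptions for illustration, not the actual Artemis implementation.

```python
from dataclasses import dataclass


@dataclass
class ExerciseComponents:
    """The four foundational components of a programming exercise."""
    problem_statement: str
    template_repository: str   # e.g. concatenated template source files
    solution_repository: str
    test_repository: str


def build_adaptation_prompt(components: ExerciseComponents, difficulty: str) -> str:
    """Assemble an LLM prompt that adapts the exercise to a difficulty level.

    "beginner" asks for detailed guidance and scaffolding;
    "advanced" asks to consolidate tasks and reduce scaffolding.
    """
    if difficulty == "beginner":
        instruction = ("Rewrite the problem statement with step-by-step guidance "
                       "and keep detailed scaffolding in the template.")
    elif difficulty == "advanced":
        instruction = ("Consolidate the tasks into fewer, broader steps and "
                       "reduce scaffolding in the template.")
    else:
        raise ValueError(f"unknown difficulty: {difficulty}")

    # Provide all four components as context so the model can keep the
    # statement, template, solution, and tests consistent with each other.
    return (
        f"{instruction}\n\n"
        f"## Problem statement\n{components.problem_statement}\n\n"
        f"## Template repository\n{components.template_repository}\n\n"
        f"## Solution repository\n{components.solution_repository}\n\n"
        f"## Test repository\n{components.test_repository}\n"
    )


example = ExerciseComponents(
    problem_statement="Implement a stack with push and pop operations.",
    template_repository="public class Stack { /* TODO */ }",
    solution_repository="public class Stack { /* reference solution */ }",
    test_repository="class StackTest { /* JUnit tests */ }",
)
prompt = build_adaptation_prompt(example, "beginner")
```

The resulting string would then be sent to the model; the design choice of passing all four components at once is one plausible way to keep the generated variant internally consistent.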
Artemis is open source and available at https://github.com/ls1intum/Artemis.