Development of a Tool to Analyze Source Code Submitted by Novice Programmers and Provide Learning Support Feedback With Comments

Abstract

Novice students make various mistakes in the process of learning computer programming. In courses with more than 100 students, it is difficult to provide accurate and detailed feedback regarding errors in the source code submitted for their assignments. Therefore, we created a source code analyzer and developed a tool that provides detailed feedback to each student. It performs unit tests even on code containing misspelled class and method names. From the results, the tool generates comments such as "Let us check the method name" or "Let us check the execution result." In an actual programming lecture with more than 100 students, the tool generated an average of more than 8,000 Japanese characters of feedback per assignment. In this study, we report on the developed tool, its adaptation to an existing learning management system, and its evaluation.
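The abstract describes the feedback mechanism only at a high level. As a rough illustration, the Python sketch below shows one way name-tolerant checking and comment generation could work; the language choice, the expected method names, the similarity cutoff, and the comment wording are all assumptions made for illustration and do not reflect the authors' actual implementation.

import difflib

# Hypothetical expected method names for an assignment (illustrative only).
EXPECTED_METHODS = ["calculateArea", "printResult"]

def check_method_names(submitted_methods):
    """Compare submitted method names with the expected ones and
    build feedback comments for missing or misspelled names."""
    comments = []
    for expected in EXPECTED_METHODS:
        if expected in submitted_methods:
            continue  # Name is correct; no comment needed.
        # Look for a submitted name that is a close misspelling of the expected one.
        close = difflib.get_close_matches(expected, submitted_methods, n=1, cutoff=0.6)
        if close:
            comments.append("Let us check the method name: '%s' may be a "
                            "misspelling of '%s'." % (close[0], expected))
        else:
            comments.append("Let us check whether the method '%s' is defined."
                            % expected)
    return comments

if __name__ == "__main__":
    # Example submission in which one method name is misspelled.
    for comment in check_method_names(["calclateArea", "printResult"]):
        print(comment)

A matching step like this could sit in front of the unit-test stage described in the abstract, so that tests can still be located and run, and comments generated from their results, even when a class or method name is slightly wrong.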



Author Information
Tatsuyuki Takano, Kanto Gakuin University, Japan
Osamu Miyakawa, Tokyo Denki University, Japan
Takashi Kohama, Tokyo Denki University, Japan

Paper Information
Conference: ACEID2023
Stream: Design

This paper is part of the ACEID2023 Conference Proceedings.


To cite this article:
Takano, T., Miyakawa, O., & Kohama, T. (2023). Development of a Tool to Analyze Source Code Submitted by Novice Programmers and Provide Learning Support Feedback With Comments. The Asian Conference on Education & International Development 2023: Official Conference Proceedings (ISSN: 2189-101X). https://doi.org/10.22492/issn.2189-101X.2023.64
To link to this article: https://doi.org/10.22492/issn.2189-101X.2023.64

