Modeling Exercise
Overview
Conducting a modeling exercise consists of three steps:
- Instructor prepares exercise: Creates and configures the modeling exercise in Artemis.
- Student solves exercise: Student works on the exercise and submits the solution.
- Tutors assess submissions: Review the submitted solutions and create results for the students.
Note: Artemis uses an npm package called Apollon as its modeling editor. A standalone version is available at https://apollon.ase.in.tum.de/. It is free to use, does not require an account, and offers additional features, including sharing and exporting diagrams. For more information, visit Apollon Standalone.
Setup
The following sections describe the supported features and the process of creating a new modeling exercise.
- Open Course Management
- Navigate to the Exercises section of your preferred course
Create New Modeling Exercise
Click on Create new modeling exercise.
The following screenshot illustrates the first section of the form. It consists of:
- Title: Title of an exercise
- Categories: Category of an exercise
- Difficulty: Difficulty of an exercise (No level, Easy, Medium or Hard)
- Mode: Solving mode of an exercise (Individual or Team). This cannot be changed afterwards
- Release Date: Date after which students can access the exercise
- Due Date: Date until which students can work on the exercise
- Assessment Due Date: Date after which students can view the feedback of the assessments from the instructors
- Inclusion in course score calculation: Option that determines whether the exercise is included in the course score calculation
- Points: Total points of an exercise
- Bonus Points: Bonus points of an exercise
- Diagram Type: Type of diagram that is used throughout an exercise
Fields marked in red are mandatory.
The field Diagram Type determines the components that students and instructors can use while working on the exercise. This option cannot be changed after creating the exercise. For example, if you select Class Diagram as the diagram type, users (instructors/students) will only be able to use class diagram components throughout the exercise.
The following screenshot illustrates the second section of the form. It consists of:
- Enable automatic assessment suggestions: When enabled, Artemis tries to automatically suggest assessments for diagram elements based on previously graded submissions for this exercise
- Enable feedback suggestions from Athena: When enabled, Artemis tries to automatically suggest assessments for diagram elements using the Athena service
- Problem Statement: The task description of the exercise as seen by students
- Assessment Instructions: Instructions for instructors while assessing the submission
If a field is unclear, you can view additional hints for many of them by hovering over the help icon next to the field.
The following screenshot illustrates the last section of the form. It consists of:
- Example Solution: Example solution of an exercise
- Example Solution Explanation: Explanation of the example solution
- Example Solution Publication Date: Date after which the example solution is accessible for students. If you leave this field empty, the solution will only be published to tutors
Once you are done defining the exercise, you can create it by clicking the create button.
You will then be redirected to the Example Submissions for Assessment Training page.
On this page, you can either Create Example Submission or Use as Example Submission for Assessment Training. Example submissions can be used to assess the submissions of students semi-automatically. Artemis uses those submissions to automatically apply the known assessment comments to similar model elements in other submissions as well.
- Select Create Example Submission if you want to create an example submission from scratch
- Alternatively, after the exercise has started, you can also use submissions handed in by students as example submissions. For that, click on Use as Example Submission for Assessment Training
Artemis supports semi-automatic grading of modeling exercises using machine learning. You can train the model by selecting the Use in Assessment Training checkbox while creating an example submission.
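The idea behind reusing example assessments can be pictured roughly as follows. This is only a conceptual sketch, not Artemis's actual implementation (which relies on a machine-learning similarity measure); the element and feedback structures used here are hypothetical.

```python
# Conceptual sketch only: Artemis's real semi-automatic assessment uses a
# machine-learning similarity measure. Here, elements are naively considered
# "similar" when they share the same type and name.
def suggest_feedback(student_elements, example_elements, example_feedback):
    """Copy known feedback from example-submission elements to similar elements."""
    suggestions = {}
    for element in student_elements:
        for example in example_elements:
            if (element["type"], element["name"]) == (example["type"], example["name"]):
                known = example_feedback.get(example["id"])
                if known is not None:
                    suggestions[element["id"]] = known  # reuse points and comment
                break
    return suggestions
```

The more example submissions are assessed and marked for assessment training, the more elements such a matching step can cover automatically.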
Import Modeling Exercise
Alternatively, you can also create a modeling exercise by importing an existing one: click on Import Modeling Exercise.
An import dialog will pop up, where you can select a previous modeling exercise from the list and import it by clicking the import button.
Once you import one of the exercises, you will be redirected to a form similar to the Create new modeling exercise form, with all fields pre-filled from the imported exercise. You can modify the fields as needed to create the new modeling exercise.
Result
- Click the edit button of the modeling exercise to adapt the interactive problem statement. There you can also set release and due dates
- Click the scores button to see the scores achieved by the students
- Click the participations button to see the list of students who participated in the exercise
- Click the submissions button to see the list of submissions handed in by students
- Click the example submissions button to modify or add example submissions of the exercise
- Click the delete button to delete the exercise
- You can get an overview of the exercise by clicking on the title
Assessment
Once the due date has passed, you can assess the submissions.
To assess the submissions, first click on Assessment Dashboard.
Then click on Submissions of the modeling exercise.
You will then be redirected to Submissions and Assessments Page.
Click the assessment button of a specific student's submission. You will then be redirected to the assessment page, where you can assess the submission of that student.
You can now start assessing the elements of the model by double-clicking them. Once you double-click an element, an assessment dialog appears where you can assign points, give feedback, and navigate through all other assessable components.
Alternatively, you can also assess the diagram by dragging and dropping assessment instructions from the Assessment Instructions section.
Feedback on the entire submission can also be added by clicking the corresponding button.
Once you're done assessing the solution, you can either:
- Click the save button to save the incomplete assessment so that you can continue it afterwards
- Click the submit button to submit the assessment
- Click the cancel button to cancel the assessment and release the lock
- Click the back button to navigate to the exercise dashboard page
Automatic Assessment Suggestions
If the checkbox Automatic assessment suggestions enabled is checked for a modeling exercise, Artemis generates assessment suggestions for submissions using the Athena service.
To learn how to set up an instance of the Athena service and configure your Artemis installation accordingly, please refer to the section Athena Service.
After clicking the assessment button on one of the submission entries on the Submissions and Assessments page, assessment suggestions are loaded automatically, as indicated by the following loading indicator:
Once assessment suggestions have been retrieved, a notice on top of the page indicates that the current submission contains assessment suggestions created via generative AI.
The suggestions themselves are shown as follows. If a suggestion directly references a diagram element, a dialog showing the suggested grading score for this specific suggestion as well as a suggestion on what could be improved is attached to the corresponding element. In this example, a remark is made that an element is present in the evaluated BPMN diagram without being mentioned in the problem statement.
If a suggestion addresses a more general aspect of the diagram, multiple diagram elements at once, or elements that are missing from the diagram, the suggestion is shown in a card overview below the diagram. These unreferenced suggestions can be accepted or discarded via buttons on the individual suggestion cards.
How Suggestion Generation Works
This section provides insights into how automated feedback suggestions are generated for modeling exercises using Athena. The module uses a Large Language Model (LLM) internally to generate feedback through the following process:
1. Feedback Request Reception: Upon receiving a feedback request, the corresponding modeling submission is serialized into an appropriate exchange format depending on the diagram type. For BPMN diagrams, BPMN 2.0 XML is used as it is a commonly used exchange format for process models and proved to be well understood by LLMs. IDs of diagram elements are shortened during serialization to minimize the token count of the input provided to the language model.
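As a rough illustration of the ID-shortening step (a sketch only, not the actual Athena code; the element structure and function name are assumptions):

```python
def shorten_element_ids(elements: list[dict]) -> tuple[list[dict], dict[str, str]]:
    """Replace long element UUIDs with short IDs ("E1", "E2", ...) to save prompt tokens."""
    id_map: dict[str, str] = {}  # short ID -> original UUID
    shortened = []
    for index, element in enumerate(elements, start=1):
        short_id = f"E{index}"
        id_map[short_id] = element["id"]
        shortened.append({**element, "id": short_id})
    return shortened, id_map
```

The reverse map is kept so that feedback referring to the short IDs can later be attached back to the original diagram elements (see step 5).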
2. Prompt Input Collection: The module gathers all required input to query the connected language model; a minimal sketch of bundling these inputs follows the list below. The input includes:
- Number of points and bonus points achievable
- Grading instructions
- Problem statement
- Explanation of the submission format
- Optional example solution
- Serialized submission
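As referenced above, the collected inputs can be pictured as a simple container like the following (the field names are assumptions for illustration, not Athena's actual data model):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PromptInput:
    """Hypothetical container for the prompt inputs listed above."""
    max_points: float
    bonus_points: float
    grading_instructions: Optional[str]  # omittable if the prompt gets too long
    problem_statement: Optional[str]     # omittable if the prompt gets too long
    submission_format: str               # explanation of the submission format, e.g. BPMN 2.0 XML
    example_solution: Optional[str]      # optional; dropped first when trimming
    serialized_submission: str           # diagram serialized with shortened IDs
```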
3. Prompt Template Filling: The collected input is used to fill in the prompt template. If the prompt exceeds the language model's token limit, omittable features are removed in the following order: example solution, grading instructions, and problem statement. The system can still provide improvement suggestions without detailed grading instructions.
4. Token Limit Check: Feedback generation is aborted if the prompt is still too long after removing omittable features. Otherwise, the prompt is executed on the connected language model.
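A simplified sketch of steps 3 and 4, trimming omittable inputs in the documented order and aborting if the prompt still does not fit. The template rendering and token counting below are crude placeholders, not Athena's actual implementation:

```python
OMISSION_ORDER = ("example_solution", "grading_instructions", "problem_statement")

def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer (e.g. the model's own tokenizer).
    return len(text.split())

def render_prompt(inputs: dict) -> str:
    # Placeholder template: join all non-empty inputs into one prompt string.
    return "\n\n".join(str(value) for value in inputs.values() if value)

def build_prompt(inputs: dict, token_limit: int) -> str | None:
    prompt = render_prompt(inputs)
    for feature in OMISSION_ORDER:
        if count_tokens(prompt) <= token_limit:
            break
        inputs = {**inputs, feature: None}  # step 3: drop the omittable feature
        prompt = render_prompt(inputs)
    if count_tokens(prompt) > token_limit:
        return None                         # step 4: abort feedback generation
    return prompt
```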
5. Response Parsing: The model's response is parsed into a dictionary representation. Feedback items are mapped back to their original element IDs, ensuring that the feedback suggestions can be attached to the referenced elements in the original diagram.
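Continuing the sketch from step 1, the reverse ID map can be used to attach parsed feedback items back to the original elements (again an illustration; the field names are assumptions):

```python
def restore_element_ids(feedback_items: list[dict], id_map: dict[str, str]) -> list[dict]:
    """Map feedback from shortened IDs back to the original element UUIDs."""
    restored = []
    for item in feedback_items:
        short_id = item.get("element_id")
        original_id = id_map.get(short_id, short_id)  # unreferenced feedback stays as-is
        restored.append({**item, "element_id": original_id})
    return restored
```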
Optimizing Exercises for Automated Assessment
A few best practices should be considered to get the best possible assessment suggestions for a modeling exercise. Since the current version of the suggestion module for modeling exercises is based on a large language model, it is advisable to follow strategies similar to prompt engineering for an LLM when composing the grading instructions for an exercise.
One strategy for optimizing the output of an LLM is to instruct the model as clearly as possible about the expected result of the task at hand. The following example shows grading instructions for an exemplary BPMN process modeling exercise optimized for automatic assessment. The instructions explicitly list all aspects Athena should assess and how credits should be assigned accordingly, ensuring consistent suggestions across all submissions.
Example: Optimized Grading Instructions for BPMN Exercises
Evaluate the following 10 criteria:
1. Give 1 point if all elements described in the problem statement are present in the submission, 0 otherwise.
2. Give 1 point if the outgoing flows from an exclusive gateway are also labeled if there is more than one outgoing flow from the exclusive gateway, 0 otherwise.
3. Give 1 point if a start-event is present in the student's submission, 0 otherwise.
4. Give 1 point if an end-event is present in the student's submission, 0 otherwise.
5. Give 0 points if the activities in the diagram are not in the correct order according to the problem statement, 1 otherwise.
6. Give 1 point if all pools and swimlanes are labeled, 0 otherwise.
7. Give 1 point if the submission does not contain elements that are not described in the problem statement, 0 otherwise.
8. Give 1 point if all diagram elements are connected, 0 otherwise.
9. Give 1 point if all tasks are named in the "Verb Object"-format where a name consists of a verb followed by the object, 0 otherwise.
10. Give 1 point if no sequence flows connect elements in two different pools, 0 otherwise.
Automatic Student Feedback
Why Automatic Student Feedback: In large courses, providing timely and personalized feedback on modeling exercises is challenging. Automated student feedback helps learners identify misconceptions early, iterate on their work, and refine diagram modeling skills—all without waiting for an instructor or tutor to be available.
Overview:
When a modeling exercise is configured with Allow automatic AI preliminary feedback requests enabled, students can request preliminary AI feedback for their modeling submissions. The feedback is generated through the Athena Service, which analyzes both the structure and layout of the diagrams and produces feedback based on the provided Grading Instructions, Problem Statement, and Sample Solution.
It is recommended that comprehensive Grading Instructions be provided in the form of Structured Grading Instructions and that a Sample Solution be included (although this is not mandatory). This ensures that the AI-generated feedback aligns with the intended grading criteria and offers targeted, meaningful hints.
How to Request Automatic Feedback:
1. Requesting Feedback
- 1.1. Navigate to a Modeling Exercise with the Automatic Student Feedback feature enabled
- 1.2. Create a diagram in the modeling editor and submit it
- 1.3. Feedback may be requested either from the exercise overview page or directly within the modeling editor
Screenshot: Request Feedback Button in Exercise Overview and Modeling Editor
2. Viewing Feedback
- 2.1. After a feedback request is made, the system processes the diagram and generates preliminary feedback
- 2.2. An alert appears at the top of the page to indicate that the feedback is ready
Screenshot: Notification Alert When AI Feedback is Ready
- 2.3. A preliminary score is displayed in the top-right corner of the screen
Screenshot: Preliminary Score in Modeling Editor
- 2.4. Clicking on the score reveals detailed, inline feedback that highlights specific issues and provides suggestions directly within the diagram
Screenshot: Detailed AI Feedback
3. Submission History
- Feedback can be requested multiple times before the submission due date. All feedback requests are recorded in the submission history
- To review previous feedback, access the submission history section and click on an entry to display its detailed feedback
Screenshot: Submission History Section in Modeling Editor
Demo
A demonstration of the automated generation of student feedback for a class diagram: