A follow-up oral process interview: What it is and what it should not be

Recommendations for teachers at Utrecht University

Written by: Laura Koenders, Educational Development & Training, Utrecht University & Ya-Ping (Amy) Hsiao, Teaching & Learning Center, Tilburg University

Detecting potential breaches of academic integrity is part of your responsibility as an examiner. For this, you need guidance from the university on how to respond, the procedures you must follow, how to report your suspicions, the type of evidence you need to provide, and how to submit it. You also need to be informed about the next steps in the process, such as a decision made by the Board of Examiners (Glendinning, 2024). This is of course not new. However, the presence of genAI makes it more difficult to detect these breaches of academic integrity, since genAI can be used in so many different ways. This article aims to offer this guidance to both you as an examiner and the Board of Examiners, with a special focus on possible steps to take for coursework assessment methods (e.g., individual writing assignments, group work).

To illustrate this, we would like to start with a case: 

When grading essays, you suspect genAI use in the essays of three students due to the generic phrases and ‘empty’ use of language. There were no clear guidelines for genAI use, although copying a text in full is considered fraud. You decide to ask the students to come by to hear their side of the story. Two students tell you they used genAI; one student tells you they did not. You decide that the students who used genAI should improve their work and hand it in again for a grade. You grade the work of the student who said they did not use genAI without penalization.

This approach is problematic for several reasons. Firstly, it can create inequalities between students, since you are not sure whether the third student indeed did not use genAI. There might also be students who used genAI whom you are not aware of. Secondly, there is the risk that students feel falsely accused, which might damage your relationship with them. In addition, in this case it is unclear whether the actions of the teacher can be considered sanctions. Should the Board of Examiners be informed and/or involved?

What the case above illustrates is that you must consider quite carefully how to approach conversations like this. In this article we hope to offer some guidelines. We will use the term ‘oral process interview’ for a conversation with the student, used either to gain insight into their learning process or to verify the authenticity of their work. For this oral process interview, the approach you follow is closely tied to the goal of the assessment. Therefore, this is the first thing you need to consider.

The goal of the assessment defines the goal of the oral process interview

For this article, we define an assessment as everything you do to get information about the learning of a student (Nitko & Brookhart, 2013). This means an assessment can take many forms, from written assignments to multiple-choice exams. What you do with this information is an important distinction, called the function of the assessment (Wiliam & Black, 1996). For example, you can use the information from the assessment to make decisions about student learning (summative assessment, also called ‘assessment of learning’). You can also use the information to guide student learning and to get to know your students better (formative assessment, also called ‘assessment for learning’ or ‘assessment as learning’). For a good overview of the terms ‘assessment of/as/for learning’ and their relationship to educational outcomes, see the review by Schellekens et al. (2021).

To illustrate this: 

  • A student can use the information from an exam to gauge how well they understand the content of the course. In this case, the student uses the assessment as a learning opportunity to further guide their study efforts (assessment as/for learning).
  • A teacher can use the information from the assessment to decide if the student has enough knowledge to complete a course (assessment of learning).

In higher education, we have many assessments throughout our courses with combined functions. For example: a student will receive feedback on a writing assignment (for learning), use the feedback to reflect and improve the subsequent work (as learning), and finally the assignment also counts toward the final grade (assessment of learning). With genAI in the picture, it is becoming increasingly important to be very clear about these different functions in the design of your assessments. 

In assessments designed to help students learn (assessment as/for learning), the impact of a breach of academic integrity is different than in assessments designed to make a formal decision (assessment of learning). Since the former involves no consequences (no or low stakes), a breach does not impact decisions about the level of the students, at either course or curriculum level. There is, of course, a negative impact on the learning process of the student who commits fraud.

For example, suppose you’ve designed a writing assessment to teach students to write a concise and clear paragraph. Here the learning objective is a writing skill, so you don’t want students to outsource it to genAI. If a student decides to do so, your feedback will be less valuable and the student won’t learn as much.

If you want to use the same assignment for a formal decision about how well a student can write a paragraph, genAI use by the student is also problematic. It will impact the validity of your judgement, because you are not sure what the results show: good writing skills or good genAI use (if permitted). Your decision may also not be reliable, because you cannot be certain that the student could demonstrate the same results if they worked on the same writing assignment again in a different context (e.g., under supervision).

In both cases above, a breach in academic integrity will harm the goal that you have with the assessment. So, in both cases, changes to the assessment are needed to make them valid and worthwhile again. In some cases, it could be valuable to combine the assessment with some sort of oral process interview.


Questions you can ask during the oral interview

Ensuring academic conduct and responding to cheating is challenging. The difficulty lies in the balance between punishing and persuading (Ellis & Murdoch, 2024). During the oral process interview, you balance gaining a better understanding of your student (and persuading them to use genAI in an academically sound way) with determining when punishment is warranted.

With the goal to help students learn (assessment as/for learning) 

As stated above, in assessments designed to steer student learning, there is no formal decision involved. Therefore, the questions you ask are designed to give yourself and the student insight into their learning process, and into the impact of genAI use on this process. It is very important to create an open atmosphere, so be open about the goal of the meeting and what you will do with the answers the students give.

Use eliciting, clarifying, justifying, defending, and explaining questions. These types of questions should resemble the formative feedback that teachers give during the process of student work (Wellington, 2010). A question like “What else can you say about your paper?” is vague and may therefore leave the student unsure of what is being asked, leading them to guess or make up an answer rather than giving an account of their writing process.

Try to align the questions with the genAI use that you have permitted for the assignment. For example questions, see the column ‘example questions to increase student learning’ of Table 1.

With the goal to determine student learning outcomes (assessment of learning) 

As stated above, the goal of this type of interview is to establish authorship and authenticity of the work. For this, you will need to ask enough questions to increase the reliability of your judgement. A conversation like this is also known as an ‘oral inspection’. However, be aware that you can never be 100% sure that the student did not use genAI more extensively than allowed. Full certainty is therefore not the goal of this meeting. If you come across irregular results during the oral process interview or while grading the assignment and suspect fraud, please follow the regular procedures and report this to the Board of Examiners and the student.

To increase the reliability of your judgement, it is important (as with the other type of oral process interview) that you create an open atmosphere and leave room for dialogue. Transparency about your intentions and the goal of the meeting will help.

Don’t use leading or vague questions (Pearce & Chiavaroli, 2020). If you use leading or vague questions, you may unintentionally help the student provide answers that align with what you expect, rather than allowing the student to demonstrate their own independent understanding of their writing. For example, if you say: "Your paper argues that technology has more benefits than drawbacks, correct?", this leads the student toward agreeing with this statement, even if they are unsure or did not originally frame their argument that way. Inauthentic students (who may have used AI or external help) might simply agree without being able to explain their reasoning, while genuine students might unconsciously follow the examiner’s wording rather than recalling their original thought process.

For example questions, see the column ‘example questions to verify authorship and authenticity’ of Table 1.


How to organize a follow-up oral process interview?

This will take additional resources and time from you as a teacher, so it is important to think this through thoroughly. Oral process interviews could be done during (group) presentations, or as individual conversations with all students, or just with a few selected students. Knowing the goal of the oral process interview will also help you make choices regarding the procedures. 

In practice, various assessment methods and interview formats can be integrated into a course design to strengthen educators' dual role as both teachers and examiners. 

After submitting assessment components (e.g., drafts, sections, logbooks) 

In addition to written feedback, oral process interviews are great opportunities to give students oral feedback. When you see unauthorized use of AI tools, it is also a moment to persuade students to adopt the expected behavior. The oral feedback can be integrated into in-class discussion group activities.

Random interview before grading 

As an examiner, you can conduct a complementary post-assignment interview as an anti-fraud measure. It is important to mention that this is not a grade-determining part of the examination; it is only applied to check whether the submitted work was actually done by the student themselves. Only a sample of the students (e.g., 5-20%) will be selected for the complementary oral check.

In case you do this based on a sample of your student population, please consider the following: 

  • We recommend doing the interview shortly after the assignment has been submitted and before you start to grade.
  • Preferably, pick the students completely at random (so not the first 20% in alphabetical order, but, for example, based on a randomly picked last digit of their student number).
  • If you decide to use an algorithm to select students, make sure to make the selection criteria explicit to prevent bias.

Targeted cases after grading 

  • During grading, we recommend documenting the irregularities and listing the questions you plan to ask during this follow-up, as you would have done before genAI. See Box 1 for an example of how to communicate this to the student.
  • If you cannot find convincing answers to these irregularities after the oral process interview, please follow the regular procedures and report this to the Board of Examiners.
  • Students have the right to be informed about these processes, so if they are not (yet) described in the Education and Examination Regulations, make sure to inform them about it.
  • At the UU, there are few guidelines regarding the recording and storing of oral exams (see the Education and Examination Regulations of your department/faculty). There are no regulations so far on oral process interviews, so we recommend following the procedures for oral exams. In addition, we recommend that the oral interview may be recorded, but that the recording should only be stored if there is a suspicion of fraud.

How to effectively communicate these interviews to students?

When implementing follow-up oral process interviews - especially those conducted after grading - it is important to consider their potential to induce anxiety and stress in students, which may negatively affect cognitive performance (Ringeisen et al., 2019). Factors such as fear of punishment for poor performance and social pressure can contribute to this anxiety (Ryan & Deci, 2017; Schürmann et al., 2022). To mitigate these effects, educators should clearly communicate the purpose and procedures of the interviews while fostering a supportive, non-threatening environment. This approach promotes fairness and reduces the risk of false positives. 

It is crucial to inform students about the organization of the follow-up oral process interview: why it is conducted, when it will take place, how long each session will be, and what is expected from students (e.g., answering questions or explaining how the work was done). Additionally, we recommend that students are informed whether the follow-up will be recorded and what happens with the recording and the ‘results’ of the interview afterwards. 

While it is the teacher’s responsibility to ensure the authenticity of student work, the growing integration of human and machine intelligence means that AI’s role will continue to evolve (Luo, 2024). Therefore, the emphasis should be on ensuring student accountability for their work, rather than attempting to ‘determine where the human ends and where the artificial intelligence begins’ (Eaton, 2024, p.10). This requires a balance between the teacher’s responsibility and the student’s accountability, which should be established from the very beginning of the assignment process. 

Acknowledgements 

We would like to thank Rein Cozijn, Dries Deweer, Brenda den Oudsten, Niels Bosma, Ruud Custers, Astrid Poorthuis, Marieke den Otter and Maarten de Boer for their review and feedback. 

In writing this piece, the authors used ChatGPT (Version March 7th, 2025) to improve English language. Full responsibility for the contents lies with the authors.

Table 1. Examples of questions to increase student learning or to verify authorship and accountability.

(Hacker et al., 2009; Voerman & Faber, 2021)

Questions regarding content and learning strategies, if genAI was not permitted during (part of) the assignment

Assessment as/for learning: example questions to increase student learning

To stimulate learning, it is important to ask activating questions that prompt students to think. As a teacher, this helps because these questions allow you to give feedback as well. You can categorise these questions as follows (Voerman & Faber, 2021):

Questions regarding the content (the product itself):

  • What are the main ideas of this article? What is your argument?
  • What sources did you use? Where did you find your sources?

Questions regarding the self-regulation strategies of the student:

  • Do you have previous experiences with an approach to writing? What worked? What didn’t?
  • How do you check if what you’re handing in fits the learning outcomes and requirements of the assignment?

Questions regarding the learning strategies of the student:

  • How did you decide on this structure or approach for your ideas?
  • Can you explain how you developed your main idea or argument? Are you content with it?
  • How did you assess the credibility of your sources?

Assessment of learning: example questions to verify authorship and authenticity

Based on your observations of the student’s earlier work, if the content and writing appear significantly more advanced, you might ask:

Questions regarding the content (the product itself):

  • Looking at this specific section, can you now explain it in your own words?
  • Can you explain your main argument and line of argumentation in your own words?
  • What are the major differences between the draft and final version?

Questions regarding the self-regulation strategies of the student:

  • Can you explain more about your writing process? Was this similar to other assignments?
  • What challenges did you face when structuring your thoughts? What did you do to overcome these?

Questions regarding the learning strategies of the student:

  • How did you weigh these different arguments, and how did you get to your conclusion?
  • What strategies did you use to improve your writing?

Questions to ask regarding content and learning strategies, if genAI use was permitted during (part of) the assignment

Assessment as/for learning: example questions to increase student learning

Questions regarding the content (the product itself):

  • Did you use genAI? In what way?
  • What specific sections of your product were generated by AI?
  • What type of edits did genAI suggest? Where did you incorporate them?

Questions regarding the learning strategies of the student:

  • How did you evaluate the output of genAI?
  • Can you explain why/how you integrated AI-generated ideas with your own ideas this way?
  • How did you check the accuracy of AI-generated references or data?
  • How did you ensure that AI did not introduce biases or errors into your work?

Questions regarding the self-regulation strategies of the student:

  • How do you ensure that the parts of your work adopted from genAI still align with the learning outcomes and assignment requirements?
  • How do you ensure that the AI-generated ideas suit your own ideas?
  • Were there certain things (grammar, spelling, structure, etc.) that genAI consistently changed? What do you learn from that for future assignments?
  • How do you ensure that the work still reflects your own voice?

Assessment of learning: example questions to verify authorship and authenticity

Questions regarding the content (the product itself):

  • Can you point out a part of your product that you wrote or revised manually?
  • Where did you document and acknowledge AI’s contributions in your work?
  • Can you show me how and where AI influenced the direction of your work?
  • What additional databases did you use beyond AI-generated content?

Questions regarding the learning strategies of the student:

  • Which AI tools did you use, and why did you choose them for this task?
  • Did the AI suggest any sentence rewrites? If so, how did you decide which ones to accept?
  • How did you modify AI-generated ideas to make them your own?
  • Did you disagree with any AI-generated content? How did you adjust it?

Questions regarding the self-regulation strategies of the student:

  • How can you ensure that your work reflects your own understanding and critical thinking?
  • How can you ensure that AI did not change the meaning of your ideas or the style of your voice?
  • What human input is needed to keep authorship of your work, and how do you ensure accuracy?