Is Interactive Oral assessment the answer to AI concerns?

As a way to ensure academic integrity, one proposed solution is for any written submission to be accompanied by an Interactive Oral assessment component, or Viva (Dobson, 2017). The oral discussion provides extra assurance that the learner’s work is authentic and representative of their understanding of the content (Sotiriadou et al., 2019).

But is it as simple as that?

No!

There are several things to consider if the Interactive Oral is to be useful.

The weighting of the Interactive Oral  

Whatever weighting is assigned to the task must be sufficient to affect a student who decides not to attend. For example, if a student uses AI to write an essay worth 25% of their final grade, they may be happy to forgo a 5% weighting by not attending the Viva. To ensure attendance, delay releasing the essay grade until the student has participated in the Viva. There are two ways you can design a rubric that facilitates this:

  • A single criterion at the end of the rubric that essentially acts as a hurdle: ‘the student is able to demonstrate they understand what they wrote.’ The issue, however, is that this criterion would need to carry a weighting greater than 50% to take effect, which distorts the focus of the assessment.
  • A clause in each criterion that says: ‘….and is able to explain this in the Viva.’ This approach ensures that all of the key components of the essay can be spoken to. If the student does not attend the Viva, they are awarded the level in the rubric that says ‘…and was not able to explain their answer in the Viva’, which would equate to a fail, ‘does not meet requirements’, and so on.

Equity in the questions asked of each learner

Standardising the questions asked by each marker will be important to ensure the process is fair for all. Questions should seek good explanations of some of the key ideas presented in the written submission. It is possible to design questions that are standardised but that still allow you to check that the submission is the student’s own work. For example, you could ask every student: ‘Could you please expand on the idea you mentioned in the second paragraph related to……’, or ‘When you say …… in this section of the essay, can you elaborate on what you mean?’

If you add the ‘oral clause’ to each criterion, then each learner will be asked the same number of questions, but you reserve the right to ask for further clarification if you judge that a response needs more detail. This approach also mitigates any concerns about students colluding over what the questions will be in the Oral: each question will relate to the individual essay being discussed, so students will need to know their own work.

Other prompts could include: ‘Tell me more about…..’, ‘What would you recommend….’, ‘In coming to your decision/recommendation did you consider…..’, ‘I noted…I am curious as to why…’, ‘I am not clear on…’, ‘I would be interested to hear your rationale for…’ Pearce and Chiavaroli (2020) provide an overview of various types of prompts and their implications.

The time it takes to do it

Of course, a 5-minute conversation with every student in your marking group is going to be time-consuming, which is why this must be embedded into the marking time. A solution is to reduce the original word count of the essay enough to compensate for the extra time needed to authenticate understanding in the Oral. Although the average reader reads around 300 words per minute, marking is far slower: it will likely take approximately 5 minutes to mark 250 words. Trimming the word count by about 250 words therefore frees the time needed for each Oral.
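To make the trade-off concrete, here is a back-of-envelope sketch in Python. The marking rate comes from the estimate above, and the group size is a hypothetical figure for illustration, not a measured one:

```python
# Back-of-envelope check of the word-count trade-off.
# Assumed figures: marking ~250 words per 5 minutes (estimate from the
# text), a 5-minute Oral per student, and a hypothetical group of 30.

MINUTES_PER_250_WORDS = 5   # estimated marking time per 250 words
ORAL_MINUTES = 5            # length of each Interactive Oral
STUDENTS = 30               # hypothetical marking group size

# Words to trim per essay so that saved marking time covers the Oral:
words_to_trim = 250 * ORAL_MINUTES / MINUTES_PER_250_WORDS
print(f"Trim ~{words_to_trim:.0f} words per essay")

# Total time the Orals add across the group before the trim offsets it:
total_oral_minutes = STUDENTS * ORAL_MINUTES
print(f"Orals add {total_oral_minutes} minutes across the group")
```

On these assumptions the numbers balance exactly, which is the point of the design: the Oral is cost-neutral only if the word-count reduction matches the marking time it displaces.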

Training students in the Interactive Oral

It will be necessary to prepare students for this type of assessment, which will increase the validity of the approach. You don’t want a situation where a student is unable to speak to their understanding because of nerves or some other factor unrelated to an academic integrity breach. A solution is to embed this type of activity as a formative task in a tutorial session, satisfying the development of graduate attributes at the same time (Cranmer, 2006).

The organisational logistics

Organising where and when students attend their Interactive Oral will take some logistical management. The Oral would need to take place soon after the essay is submitted, and there are technologies that can assist students in nominating a time to attend. It may be easier for students to complete their Oral via Zoom or Teams.

References

Cranmer, S. (2006). Enhancing graduate employability: Best intentions and mixed outcomes. Studies in Higher Education, 31(2), 169–184.

Dobson, S. (2017). The life and death of the viva. In The Enabling Power of Assessment (pp. 1–22). https://doi.org/10.1007/978-3-319-64016-7_1

Pearce, J., & Chiavaroli, N. (2020). Prompting candidates in oral assessment contexts: A taxonomy and guiding principles. Journal of Medical Education and Curricular Development, 7. https://doi.org/10.1177/2382120520948881

Sotiriadou, P., Logan, D., Daly, A., & Guest, R. (2019). The role of authentic assessment to preserve academic integrity and promote skill development and employability. Studies in Higher Education, 45(11), 1–17. https://doi.org/10.1080/03075079.2019.1582015

I’m Paul Moss. I’m a learning designer at the University of Adelaide. Follow me on Twitter @edmerger
