Thu 21 Mar 2024 10:45 - 11:10 at Meeting Rooms B110-112 - Assessment & Grading Chair(s): Amanpreet Kapoor

In this experience paper we introduce the concept of 'diverging assessments': process-based assessments designed so that each student's assessment becomes unique even though all students start from a common skeleton. We report our experiences with diverging assessments in the contexts of computer networks, operating systems, ethical hacking, and software development. All of the examples permit the use of generative-AI-based tools, are authentic, and are designed to create learning opportunities that foster students' meta-cognition. Finally, we reflect on these experiences across five courses at four universities, showing how diverging assessments enhance students' learning while respecting academic integrity.

Thu 21 Mar

Displayed time zone: Pacific Time (US & Canada)

10:45 - 12:00
Assessment & Grading (Papers) at Meeting Rooms B110-112
Chair(s): Amanpreet Kapoor (University of Florida, USA)
10:45
25m
Talk
Diverging assessments: What, Why, and Experiences
Papers
Amin Sakzad (Monash University), David Paul (University of New England), Judy Sheard (Monash University), Ljiljana Brankovic (University of New England), Matthew P. Skerritt (RMIT University), Nan Li (University of Wollongong, Australia), Sepehr Minagar (Monash University), Simon, William Billingsley (University of New England)
11:10
25m
Talk
Mechanical TA 2: Peer Grading With TA and Algorithmic Support
Papers
Hedayat Zarkoob (University of British Columbia), Kevin Leyton-Brown (University of British Columbia)
11:35
25m
Talk
Rubric for the Quality of Answers to Student Queries about Code
Papers
Svana Esche (Technical University of Darmstadt)