Creating fraud-resistant exams

Necessary fraud prevention measures per exam type

Here is an overview of which measures you need to take per remote assessment type (columns). The remote assessment types are divided into open-book and closed-book types, and the measures are divided into exam construction and procedural measures.

| Measure | Projects / larger assignments (open book, no declarative knowledge) | Remote exams using assignment tool (open book) | Remote exams using digital exam tool (open book) | Oral exams (closed book, declarative knowledge) | Online proctored exams (closed book) |
|---|---|---|---|---|---|
| **Exam construction measures** | | | | | |
| Open-book exam | ✓ | ✓ | ✓ | | |
| Open-ended questions | ✓ | ✓ (!!) | | | |
| Variation in questions and cases / random questions from pool | | | ✓ | | ✓ |
| Parametrization | | | ✓ | | ✓ |
| **Procedural measures** | | | | | |
| Honor pledge | ✓ | ✓ | ✓ | ✓ | ✓ |
| Oral authenticity check | ✓ | ✓ | | | |
| Timeslots | | ✓ | ✓ | | |
| Plagiarism check (text editor) or handwriting comparison (handwritten) | ✓ | ✓ | | | |
| Manual plagiarism check | | | ✓ | | |
| Online proctoring | | | | | ✓ |
| Identity check | | Login with netID | Login with netID | Visual identity check | |

In order to give students grades that reliably represent how well they master the learning objectives (LOs) of a course, we must:

  1. Assess these learning objectives (and nothing more).
  2. Make sure that students deliver work that reflects their own level of mastery of the LOs. In other words, we must prevent and detect fraud, and make sure that students can perform optimally during the exam.

Exam construction measures

There are four ways to prevent fraud when developing the test:

Transform your exam into an open-book exam instead of a closed-book exam. Questions on facts (declarative knowledge) are easy for students to look up during a remote exam, and it is difficult to prevent students from using their books or ‘cheat sheets’. Therefore, do not ask for factual knowledge, but aim for questions at higher levels of Bloom’s taxonomy (Bloom, 1956; see the TU Delft adaptation for engineering education), which students can only answer if they master the learning objectives. The questions should be constructively aligned with the learning activities and learning objectives.

Authorize students to use all available help, and provide them with a list of sources that they are advised to have available.

In case a learning objective requires reproduction of factual knowledge, consider whether this factual knowledge is crucial (for example in their professional lives) or not. In case a learning objective cannot be tested, discuss with the Board of Examiners and Programme Director whether this is acceptable.

If you need to ask factual questions and an open-book exam is not possible, you could consider oral examinations (depending on the number of students), or online proctored exams as a last resort (only an option if online proctoring is allowed by your Board of Examiners).

Change the exam into an open-book exam with open-ended questions (i.e. no multiple choice, multiple select, true/false, etc.). This implies that you cannot ask remember-level questions. The questions should be constructively aligned with the learning activities and learning objectives.

Answers to open-ended questions, especially those that require longer answers at higher levels of Bloom’s taxonomy (see the TU Delft adaptation for engineering education), are not straightforward to share amongst peers. Furthermore, similarities in answers to open questions can be used to detect fraud.

Here is a manual on how to construct open-ended questions for a remote exam.

If you want to know why closed-ended questions (multiple choice, yes-no, true-false, multiple select, etc.) are sensitive to fraud, and whether there are exceptions, read the text below.

Unless you are using a question bank (see below), which gives each student a unique exam, you are strongly discouraged from using multiple-choice questions (MCQs). The same holds for true-false questions and other closed-ended questions.

Why shouldn’t I use (proctored) multiple choice questions?
Multiple choice questions are more sensitive to fraud, since it is relatively simple to communicate the answers to other students.

Why shouldn’t I use (proctored) True-False questions?
True-false questions are poor from an educational point of view: students will start looking for an error in any statement that they think is ‘true’, tend to overthink especially true statements, and will continuously suspect that they have overlooked a detail that makes the statement false. This uncertainty may diminish their performance.

What if I randomize the order of the answer options?
Changing the order of the options (answers) does not help, since students can still communicate ‘the answer that begins with/ends on/is the shortest/…’.

What if I ask them the same question at the same time?
Students can still communicate the answers.

What if I don’t allow them to go back to the previous question?
In this case, students will perform worse than on a standard exam: they will either waste too much time on a question, or skip it and lose the point, possibly remembering the answer later and being frustrated that they cannot go back to the previous question. The grade will be lower and will not reflect how well they master the learning objectives.

What if I randomize the order of the questions?
Changing the order of the questions will create illogical question orders for some students and logical orders for others. At least keep related questions together, and if there is a logical order within a subject/learning objective, keep that order intact. Students can still communicate answers.

What if I ask each student a unique set of questions from a database of questions?
That is possible, but it is a lot of work to create that database (‘question bank’), and it must contain good-quality questions that have proven to discriminate between well-performing and less-well-performing students (the p-value and Rir-value should be known). This is normally done by analyzing (test result analysis) how well these or similar questions performed on previous, regular exams. The reason you need questions of proven quality is that, with unique exams, you cannot rely on a test result analysis afterwards to change the scoring of flawed questions (e.g. by giving all students full points because something was wrong with the question).

What about multi-select questions? (These are multiple-choice questions where you can select each individual option, and that often do not indicate how many options should be selected.)
It is really difficult to develop good multi-select questions, and they can only be used in some cases. See below for more information on when you can use them and how.

Multi-select questions

These questions consist of a number of true/false statements and are prone to cheating. True/false statements are rather difficult to develop without giving students the feeling that they might have overlooked something, and that somewhere in the statement you put a clue that the statement was incorrect after all.

Furthermore, if you are using Möbius, the grading is not transparent: incorrectly selecting an option is penalized more heavily than failing to select a correct option. This is neither clear to lecturers nor communicated to students, so scores are relatively low. In our experience, scores for multi-select questions correlate (nearly) not at all with the student’s grade.
The reason is that it is very difficult to write good-quality multi-select questions. With wordy questions, the main point is to ask a single question that can be answered without looking at the options.

When to use multi-select: The only type of question which is suitable for multi-select, is a question like “Which of the three geometric structures below is/are topologically identical to this structure?” (insert a picture of a structure, and then 3 pictures of structures that are undeniably similar or dissimilar). In this case, each option should be indisputably correct or incorrect and students will not have the feeling that you are trying to trick them.

Be transparent about the scores: be transparent about how students can earn points, and figure out how Möbius or Brightspace assigns or deducts points for each incorrectly chosen or omitted option.

In the case of multiple choice with 3 alternatives (3 is the preferred number of alternatives for multiple-choice questions), each exam should consist of ~54 questions in order for the grade to be reliable. This number of questions is considerably higher than for open questions.

What about guessing? Do I adjust the grade?
Students can earn points by randomly guessing the correct answer, so you need to take guessing into account when calculating the grade from the score. Be transparent about this to your students and communicate it before, during (cover page) and after the test. More information can be found in the reader of UTQ module ASSESS.
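
For illustration, here is a minimal sketch of one common correction for guessing (‘formula scoring’). This is our assumption of a typical approach, not necessarily the exact method prescribed in the UTQ ASSESS reader:

```python
# A minimal sketch of "formula scoring", one common correction for
# blind guessing (assumed here; check the UTQ ASSESS reader for the
# method used at your faculty).
def corrected_score(right: int, wrong: int, k: int) -> float:
    """Number-right score corrected for blind guessing.

    With k alternatives, a blind guess is correct with probability 1/k,
    so a guesser gets about one question right for every k - 1 wrong;
    subtracting wrong / (k - 1) removes that expected bonus.
    """
    return right - wrong / (k - 1)

# Example: 40 of 54 three-alternative questions right, 14 wrong:
print(corrected_score(right=40, wrong=14, k=3))  # 33.0
```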

How do I analyze the results to check for problems with questions/answers? How do I adjust the grading if test result analysis shows bad results?
Do a test result analysis to assess the quality of the individual questions, and use the information to change the scoring of individual questions. More information can be found in the reader of UTQ module ASSESS. In Brightspace quizzes and Möbius, the most relevant information is available in the test statistics. See ‘test result analysis’ below for how to assess the results and use them to adapt the scoring.
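
To illustrate, here is a minimal sketch of such an analysis on a 0/1 score matrix, assuming you have exported per-question scores from Brightspace or Möbius; the data are invented and this is not the built-in test statistics tooling:

```python
# A minimal sketch of a basic test result analysis: the p-value
# (proportion correct) and Rir (item-rest correlation) per question.
import numpy as np

scores = np.array([  # toy data: 6 students x 4 questions, 1 = correct
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
])

for item in range(scores.shape[1]):
    p = scores[:, item].mean()                      # p-value: proportion correct
    rest = scores.sum(axis=1) - scores[:, item]     # total score minus this item
    rir = np.corrcoef(scores[:, item], rest)[0, 1]  # Rir: item-rest correlation
    print(f"Q{item + 1}: p = {p:.2f}, Rir = {rir:.2f}")
```

A low p-value indicates a difficult question; a low or negative Rir suggests the question does not discriminate between well-performing and less-well-performing students.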

In order to make it more difficult for students to share answers to questions (specifically answers to knowledge questions), you can give each student different exam questions, chosen from a question pool with interchangeable questions. This can be done by using different exam versions or by creating unique individual exams from question banks:

  1. Exam versions: Divide the students into groups and give each group a different version of the exam. The exam questions are different for each group, and the same within a group.
    • Pro: easy to set up.
    • Con: if students find out in which group they are, they can communicate within the group.
    • Work-around: if the exam is divided into parts, you can change the grouping for the second part compared to the first.
      You can keep the first questions in each exam part the same, so it is harder for students to find out which group they belong to.
  2. Question pools: Give each student a unique exam, by drawing interchangeable questions from question pools. Each question pool contains questions that are interchangeable in terms of learning objective/topic and difficulty.
    • Pro: unpredictable questions; easy to set up in Brightspace quizzes.
    • Con: test result analysis is only usable if you have large numbers of students.

In both cases, you need interchangeable questions, which will take you more time to develop than in the case of a traditional exam.
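
For illustration, here is a minimal sketch of how a unique exam per student can be assembled by drawing one question from each pool; the pool names and question IDs are invented, and Brightspace does this for you internally:

```python
# A minimal sketch of drawing a unique exam from question pools.
import random

pools = {
    "LO1, low difficulty": ["Q1a", "Q1b", "Q1c"],
    "LO1, medium difficulty": ["Q2a", "Q2b", "Q2c"],
    "LO2, medium difficulty": ["Q3a", "Q3b", "Q3c"],
}

def draw_exam(student_id: str) -> list[str]:
    rng = random.Random(student_id)  # seed per student: reproducible draw
    return [rng.choice(questions) for questions in pools.values()]

print(draw_exam("s1234567"))  # e.g. ['Q1c', 'Q2a', 'Q3b']
```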

What does an exam with question pools look like?
Per learning objective or topic, you formulate a number of questions at the same level of difficulty and of the same question type. This pool of interchangeable questions is called a question pool.
Examples of interchangeable questions in the same question pool:

  1. Fill in the blanks, automatically graded using regular expressions: naming parts of a machine (if the answer can be copied from a book, this is only possible for proctored exams). The machine is different for each question.
  2. Short answer, automatically graded using regular expressions: writing out the applicable formula for a situation shown in a figure. The situation is different for each question.
  3. Arithmetic question, automatically graded (all or nothing): calculate the force on a beam in a construction. The construction or the beam is different for each question.

For each student, a unique exam will be formed with randomly drawn questions from the question pools.
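
For illustration, a minimal sketch of the regular-expression grading mentioned in examples 1 and 2; the accepted machine-part spellings are invented:

```python
# A minimal sketch of grading a short answer with a regular expression.
import re

# Accept "camshaft", "cam shaft" or "cam-shaft", case-insensitively:
PATTERN = re.compile(r"^\s*cam[\s-]?shaft\s*$", re.IGNORECASE)

for answer in ["Camshaft", "cam shaft", "crankshaft"]:
    verdict = "correct" if PATTERN.match(answer) else "incorrect"
    print(f"{answer}: {verdict}")
```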

Example 1

  • LO1: 3 question pools of unrelated questions
    • Pool 1a: low difficulty: matching question
    • Pool 1b: low difficulty: short open question
    • Pool 1c: medium difficulty: arithmetic question
  • LO2: 3 question pools of unrelated questions
    • Pool 2a: low difficulty: open question, giving an explanation
    • Pool 2b: medium difficulty: arithmetic question
    • Pool 2c: medium difficulty: open question, analyzing a problem
  • LO3: 2 question pools of unrelated questions
    • Pool 3a: medium difficulty: arithmetic question, analyzing
    • Pool 3b: high difficulty: analyzing data and drawing a conclusion

Example 2 (solving engineering problems)

  • LO1: 1 question pool that comprises questions with 5 subquestions each. The subquestions are of increasing difficulty.
  • LO2: 1 question pool that comprises questions with 4 subquestions each. The subquestions are of increasing difficulty.

Can I change the order of the questions?
The order of the exam questions should be logical, in order to enable students to perform optimally. Keep questions on the same topic/learning objective together.

Ask students the same numerical questions, but with different numbers, so they cannot exchange answers.

Parametrization is used in numerical questions. For each question, students work with different numbers, chosen from a range that you determine. The outcomes therefore differ between students, and it is not possible to commit fraud simply by sharing answers; to help each other, students would have to share the calculation steps, which is more cumbersome. Parametrization is possible in Brightspace quizzes (arithmetic question types), in Möbius, and in Grasple (available for math service education only).

If you want to use parametrization in a Brightspace Assignment, you can determine the values students should work with based on a figure (digit) of their student number. Change the figure that you base the values on for each (larger) assignment, to prevent grouping. Example: use the 3rd figure of the student number in question 1, the last in question 2, and the second in question 3. The table below gives an example mapping, and a code sketch follows after it.

| 3rd figure in your student number | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
|---|---|---|---|---|---|---|---|---|---|---|
| Value for x | 1 | 3 | 2 | 4 | 3 | 4 | 2 | 1 | 2 | 3 |
| Value for y | 6 | 6 | 4 | 6 | 4 | 3 | 3 | 5 | 4 | 5 |
Important: be aware that students might unintentionally use incorrect values. Try to make sure that the values lead to equally difficult calculations, and have students practice with this system beforehand, to reduce the stress of seeing it for the first time during the exam.
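
For illustration, here is a minimal sketch of the digit-based lookup described above, using the x and y values from the table; the student number is invented:

```python
# A minimal sketch of deriving parameters from a student number digit.
X = [1, 3, 2, 4, 3, 4, 2, 1, 2, 3]  # value of x for digits 0..9
Y = [6, 6, 4, 6, 4, 3, 3, 5, 4, 5]  # value of y for digits 0..9

def parameters(student_number: str, position: int = 3) -> tuple[int, int]:
    digit = int(student_number[position - 1])  # 3rd figure -> index 2
    return X[digit], Y[digit]

print(parameters("4739201"))  # 3rd figure is 3, so (x, y) = (4, 6)
```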


Procedural measures

Next to fraud prevention measures in the development of exam questions, you can take the following procedural measures:

Make students promise that they will only hand in their own work, that they will not use help or unauthorized tools, and that they will not help other students. The promise can be made by having students type out the honor pledge, or by reading it aloud at the start of an oral exam. In the case of a written exam, this can be done a day before taking the exam.

We trust the integrity of the student. During your course, ask students to read the TU Delft code of conduct and explain that you expect them to adhere to it. Indicate that you will ask them to make an honor pledge and what it will say. Inform them whether they will be asked to make the pledge before, at the start of, and/or at the end of their assessment. Students can either copy or vocalize the honor pledge.

You can change the pledge to make it more applicable for your assessment. Here are two examples:

  1. Online exam:
    “I promise that I will not use unauthorized help from people or other sources during my exam. I will create the answers on my own and I will create them only during the allocated exam time slots. I will not provide help to other students during their exam.”
  2. Timed take-home exam:
    “I promise that I have not used unauthorized help from people or other sources for completing my exam. I created the submitted answers all by myself during the time slot that was allocated for that specific exam part. I will not provide nor have I provided help to other students during their exam.”

For oral exams, students can promise that they will not receive questions from students who took the exam earlier, nor provide questions to the students who will take the exam later.

For written remote exams, the honor pledge could also be administered one day before the actual examination, for example by having students type the text of the pledge into a Brightspace Quiz short-answer question and grading it automatically (students can have another go if they make a spelling mistake).
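
For illustration, a minimal sketch of such an automatic pledge check, tolerant of capitalization and extra whitespace; it uses the first sentence of example 1 above, and the normalization rules are our assumption rather than a built-in Brightspace feature:

```python
# A minimal sketch of auto-checking a typed honor pledge.
import re

PLEDGE = ("I promise that I will not use unauthorized help from people "
          "or other sources during my exam.")

def normalize(text: str) -> str:
    # Collapse whitespace and ignore capitalization before comparing.
    return re.sub(r"\s+", " ", text).strip().lower()

def pledge_ok(submission: str) -> bool:
    return normalize(submission) == normalize(PLEDGE)

print(pledge_ok("  i promise that I will not use unauthorized help "
                "from people or other sources during my exam."))  # True
```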

For oral exams, you can do it at the start of the recording (if applicable).

Complementary oral check for randomly sampled students

Contact a random 10% (or more) of the students immediately after the exam and ask them to explain a couple of their answers, to confirm that they authored the work. Choose these students completely at random, or by using selection criteria that are clearly unbiased towards specific groups of students. Let the students know the timeslot in which the oral check takes place, to prevent unnecessary waiting. Furthermore, take the Security and Privacy of Oral Examinations into account during oral authenticity checks.

In case of group projects or group assignments, let the group describe who contributed what, for example in the report. Provide them with a tool (Buddycheck) to stimulate them to give each other intermediate peer feedback on contribution, especially in larger projects. Make the group small enough so that everybody can contribute and that each contribution will be valued by their peers.

Most examiners will use the complementary post-exam oral check as an anti-fraud measure. It is important to note that this is not a grade-determining part of the examination; it is only applied to check whether the student has been honest in submitting their work. Only a sample of the students (e.g. 5-20%) will be selected for the online complementary oral check.

In case you do a complementary oral check on a sample of your student population, please consider the following:

  • We recommend doing the check shortly after the exam has finished and before you have graded the exams.
  • Preferably pick the students completely at random (so not the first 20% in alphabetical order, but for example based on a randomly picked last digit of their student number); see the sketch after this list.
  • If you decide to use an algorithm to select students, make sure to make the selection criteria explicit to prevent bias.
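
For illustration, a minimal sketch of an unbiased random draw; the student numbers are invented, and in practice you would load the list of exam participants:

```python
# A minimal sketch of randomly sampling 10% of students for the check.
import random

students = [f"s{4000000 + i}" for i in range(120)]   # invented cohort
sample_size = max(1, round(0.10 * len(students)))    # 10% of the cohort

rng = random.Random(2024)  # fix and log the seed so the draw is auditable
selected = rng.sample(students, sample_size)
print(sorted(selected))
```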

What to ask:

  • Ask for an explanation of some of their answers.

The checks can be recorded but should only be stored if there is a suspicion of fraud. The recording and storing should be done in a similar way as described in Security and Privacy Guidelines for Oral Exams.

In case you come across irregular results while you are scoring the assignment/exam and suspect fraud, please follow the regular processes and report this to the Board of Examiners and the involved student(s).

  1. Informing students
  2. Timing
    1. Preparing for the oral check of handwritten exam answers takes about 5 minutes per question due to readability issues; typed answers take less time.
    2. Doing an oral check takes about 10 minutes, unless you run into problematic cases.
  3. Identity check
    1. Inform students to keep their student ID ready.
    2. Check the students’ identity before you start the oral check or oral exam.
    3. Do not record campus cards or other proofs of identity.
  4. Questioning
    1. Ask for an explanation of some of their answers to check whether it is plausible that their work is their own.
    2. Be clear about whether students will be allowed to look at their answers and/or drafts during the oral check.
    3. If students are allowed to use draft paper during the exam, ask them to write on clean sheets and inform them that you may ask them to show these drafts during the oral check.
  5. Recordings
    1. Do not forget to delete all recordings two months after grading, except for students who filed complaints; delete their recordings two months after the procedure has been finished. For details on how to record in a privacy-compliant way, please read this document.
  6. Tool and TA support
    1. Use a tool with a main room and break-out rooms, like YouSeeU, or a tool with waiting rooms.
    2. Have TAs invite the (random) students, put them in the waiting room, help them with their audio and video, check their identity, and move them to your break-out room when you are available for the oral check.
    3. Goal: to diminish start-up time.

Split the exam into 2-4 consecutive parts (30-90 minutes per part). Each exam part is visible during a timeslot and needs to be finished before the end of that timeslot. This diminishes the extent to which answers can be exchanged. You can schedule breaks in between. Note that students tend to become very stressed by intermediate deadlines, which diminishes their performance and the reliability of their grade. Therefore, make sure that the timeslots are long enough for students to get into a flow of concentration. Preferably give them an opportunity to practice with similar timeslots in a practice exam, and use long timeslots, as few as possible. Make sure that the length of each timeslot is realistic under exam conditions, and provide students who are entitled to extra time with correspondingly elongated timeslots.

Tip: make the examination available only during the examination timeslot, and only for the students who subscribed for the exam. If different groups have different exams, make each exam available only to the correct group. Close the exam after the timeslot (due date) plus a short extra time window (grace period, end date). Brightspace then flags all exams that were submitted late, while still allowing students to submit their work. See the how-to for assignments: add timeslots to set up the available time window, and use ‘release conditions’ to make assignments available to specific groups only.

In case of Brightspace assignments that are written digitally, use the built-in plagiarism check in Brightspace (TurnItIn) and open each similarity report to check for larger matches in the student’s text.

In the case of Brightspace Quizzes or Möbius, you have to download the students’ answers and look for similarities manually, using a spreadsheet programme.
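
For illustration, a minimal sketch of such a similarity check using Python’s difflib instead of a spreadsheet; the answers are invented:

```python
# A minimal sketch of flagging near-identical answers for human review.
from difflib import SequenceMatcher

answers = {
    "s1001": "The beam fails because the bending moment exceeds its capacity.",
    "s1002": "The beam fails because the bending moment exceeds its capacity!",
    "s1003": "Buckling of the flange governs the failure mode.",
}

students = list(answers)
for i, a in enumerate(students):
    for b in students[i + 1:]:
        ratio = SequenceMatcher(None, answers[a], answers[b]).ratio()
        if ratio > 0.8:  # threshold chosen for illustration
            print(f"{a} vs {b}: similarity {ratio:.2f}")
```

Flagged pairs still require a manual look before you conclude anything; identical correct answers to short questions are often legitimate.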

When you use, for example, Brightspace assignments, you can manually check whether students have the same handwriting as in a previous assignment, whether they copied handwritten notes from a peer, or whether they seem to have copied texts from peers. Ask your students to keep the original handwritten papers, in case of legibility issues.

This is the only form of online video surveillance that is allowed, and it must comply with the TU Delft Online Proctored Examination Regulation. Online proctoring is only available for digital knowledge exams that need to be taken as closed-book exams, in cases where it is not possible to change the exam into an oral examination due to student numbers.

  • Permission of the Board of Examiners: Your Board of Examiners needs to give explicit permission to use online proctoring for each exam. Online proctoring may only be used as a last resort, and the Board of Examiners assesses whether this is the case. Some Boards of Examiners have indicated that they will never give permission for online proctored examinations.
  • Online Proctored Examination Regulation: If you use online proctoring for your exam, you need to adhere to the Online Proctored Examination Regulations.
  • Online proctoring is the only option for video surveillance: if you need to use video surveillance, you are only allowed to use online proctoring via Digital Exams, because it ensures that recorded data will be stored, processed and destroyed according to privacy regulations.
  • Availability: Online proctoring is in principle only available for knowledge question exams that are administered as digital exams. The reason why exams need to be digital is that the camera can only record the student’s face and not their handwriting, since students need to read the assignments from the screen. This implies that the camera faces their heads, not hands.
  • Maximum duration: The maximum duration of a proctored exam is 90 minutes. After 90 minutes, students need a toilet break, and the occurrence of technical issues increases.
  • Maximum number of students: The maximum number of students per group is increased to 150.
  • Available assessment tools with proctoring: Currently, proctoring is only available in combination with digital assessment tools Möbius and Grasple using the tool RPNow. Grasple is only available for mathematics (service education).
  • Practice test: Have all students do a practice exam a couple of days before the exam, to detect technical issues and procedural issues, and have students familiarize themselves with the tools.
  • Costs: Online proctoring is a paid service, with costs ranging between 10 and 15 euros per student in the exam. The costs of a proctored exam are borne by the faculty.
  • Multiple-choice questions: See the explanation above of why you should avoid using multiple-choice questions in an online proctored remote exam, and how to do so appropriately if you must use them.

For oral exams and (project) presentations, you can do an identity check using the student’s campus card (do not record this!).

For exams in Brightspace Quizzes or Assignments, students need to login with their netID. It is not allowed to share the login credentials with other people.

Fraud prevention should not interfere with assessment guidelines

Make sure that fraud prevention measures do not interfere with the guidelines for remote assessment. Remember, the guiding principle for remote assessment is that during assessments, we should enable our students to demonstrate how well they master the learning objectives. The eight key elements below are requirements to achieve that; they are based on two other guiding principles for assessment: constructive alignment and fairness.

Eight key elements for developing fair and aligned remote assessment

  1. The assessment should assess all learning objectives in a reliable way.
  2. Take fraud prevention measures that do not hinder student performance.
  3. Helpdesk: be available for students during the assessment.
  4. Practice exam: give students and yourself practice with the setting, questions and tools.
  5. Feasible: the assessment should be feasible for both students and you.
  6. Extra time: give students with disabilities extra time.
  7. Privacy: comply with privacy regulations.
  8. Communicate assessment details to your students via Brightspace and email.

Below, we sketch some ‘don’ts’ in fraud prevention, with an explanation of why not to do this, based on these key elements.

1. Don’t ask students to upload or email their (campus) ID

Do not ask students to upload or email their campus ID, nor any other ID. It will not certify that they are the ones taking the exam and therefore provides no ground to process these privacy-sensitive data.

2. Don’t ask students to upload a picture of themselves

Do not ask students to upload or email pictures of themselves. It will not certify that they are the ones taking the exam and therefore provides no ground to process these privacy-sensitive data.

3. Don’t add ‘surprise elements’ to the exam

Do not add ‘surprise elements’ to the exam, for example to check the student’s identity. Students will be distracted and startled, or might miss the ‘call for surprise action’, which will influence their grade.
