Creating fraud-resistant exams


Necessary fraud prevention measures per exam type

Here is an overview of the measures you need to take per remote assessment type. The remote assessment types are divided into open-book and closed-book types, and the measures into exam construction measures and procedural measures.

Open-book exam types (no declarative knowledge):
  • Projects / larger assignments
  • Remote exams (using an assignment tool)
  • Remote exams (using a digital exam tool)

Closed-book exam types (declarative knowledge):
  • Oral exams
  • Online proctored exams

Exam construction measures:
  • Open-book exam (open-book exam types)
  • Open-ended questions (remote exams)
  • Variation in questions and cases, or random questions from question pools
  • Parametrization (remote exams with numerical questions)

Procedural measures:
  • Honour pledge
  • Oral authenticity check
  • Time-locks (remote exams)
  • Plagiarism check (typed work) or handwriting comparison (handwritten work); manual plagiarism check for digital exam tools
  • Online proctoring (online proctored exams only)
  • Identity checks: visual identity check (oral exams) or login with netID (Brightspace tools)

In order to give students grades that reliably represent how well they master the learning objectives (LOs) of a course, we must:

  1. Assess these learning objectives (and nothing more).
  2. Make sure that students deliver work that reflects their own level of LO mastery. In other words, we must prevent and detect fraud, and make sure that students can perform optimally during the exam.

Exam construction measures

There are four ways to prevent fraud when developing the test:

Transform your exam into an open-book exam instead of a closed-book exam. Questions on facts (declarative knowledge) are easy for students to look up during a remote exam, and it is difficult to prevent students from using their books or ‘cheat sheets’. Therefore, do not ask for factual knowledge, but aim for questions at higher levels of Bloom’s taxonomy, which students can only answer if they master the learning objectives. The questions should be constructively aligned with the learning activities and learning objectives.

Authorize students to use all available help, and provide them with a list of sources that they are advised to have at hand.

In case a learning objective requires reproduction of factual knowledge, consider whether this factual knowledge is crucial (for example in their professional lives) or not. In case a learning objective cannot be tested, discuss with the Board of Examiners and Programme Director whether this is acceptable.

In case you need to ask factual questions and an open book exam is not a possibility, you could consider oral examinations (depending on the number of students), or online proctored exams (only an option if online proctoring is allowed by your Board of Examiners).

Change the exam into an open-book exam with open-ended questions (i.e. no multiple choice, multiple select, true/false, etc.). This implies that you cannot ask remember-level questions. The questions should be constructively aligned with the learning activities and learning objectives.

Answers to open-ended questions, especially those that require longer answers, are not straightforward to share amongst peers. Furthermore, similarities in answers to open questions can be used to detect fraud.

Here is a manual on how to construct open-ended questions for a remote exam.

If you want to know why closed-ended questions (multiple choice, yes-no, true-false, multiple select, etc.) are sensitive to fraud, and whether there are exceptions, read the text below.

Unless you are using a question bank (see below), which gives each student a unique exam, you are strongly discouraged from using multiple-choice questions (MCQs). The same holds for true-false questions and other closed-ended questions.

Why shouldn’t I use (proctored) multiple choice questions?
Multiple choice questions are more sensitive to fraud, since it is relatively simple to communicate the answers to other students.

Why shouldn’t I use (proctored) True-False questions?
True-false questions are also weak from an educational point of view: students tend to look for an error in any statement they believe is ‘true’, continuously suspecting that they have overlooked a detail that makes the statement false. This uncertainty may diminish their performance.

What if I randomize the order of the answer options?
Changing the order of the options (answers) does not help, since students can still communicate ‘the answer that begins with/ends on/is the shortest/…’.

What if I ask them the same question at the same time?
Students can still communicate the answers.

What if I don’t allow them to go back to the previous question?
In this case, students will perform worse than on a standard exam: they either waste too much time on a question, or skip it, regret the lost point, remember the answer later, and become frustrated that they cannot go back. The grade will be lower and will not reflect how well they master the learning objectives.

What if I randomize the order of the questions?
Changing the order of the questions will create illogical question orders for some students, and logical question orders for others. You should at least keep related questions together and if there is a logical order within a subject/learning objective, keep that order intact. Students can still communicate answers.

What if I ask each student a unique set of questions from a database of questions?
That is possible, but it is a lot of work to create such a database (‘question bank’), and it must contain good-quality questions that have proven to discriminate between well-performing and less-well-performing students (the p-value and Rir-value should be known). This is normally established by analyzing (test result analysis) how well these or similar questions performed on previous, regular exams. The reason you need questions of proven quality is that with unique exams you cannot use a test result analysis afterwards to change the scoring of individual questions (for example, giving all students full points because something was wrong with a question).

What about multi-select questions? (These are multiple-choice questions in which students can select any number of options, often without being told how many to select.)
It is really difficult to develop good multi-select questions, and they can only be used in some cases. See below for more information on when and how you can use them.

Multi-select questions

A multi-select question is effectively a set of true/false decisions, and is therefore prone to cheating. True/false statements are also difficult to write without giving students the feeling that they might have overlooked something and that, somewhere in the statement, you hid a clue that makes the statement incorrect after all.

Furthermore, if you are using MapleTA, the grading is very opaque: incorrectly selecting an option is punished harder than forgetting to select an option. This is not clear to teachers, nor communicated to students. Therefore, scores are relatively low. Our experience is that scores for multi-select questions hardly correlate with the student’s overall grade.
The reason is that it is very difficult to write good-quality multi-select questions. The key is to ask a single, self-contained question that can be answered without looking at the options.

When to use multi-select: The only type of question which is suitable for multi-select, is a question like “Which of the three geometric structures below is/are topologically identical to this structure?” (insert a picture of a structure, and then 3 pictures of structures that are undeniably similar or dissimilar). In this case, each option should be indisputably correct or incorrect and students will not have the feeling that you are trying to trick them.

Be transparent about the scores: Be transparent in how students can earn points, and figure out how MapleTA or Brightspace assigns/deducts points from the score for each incorrectly chosen or omitted option.

In the case of multiple choice with 3 alternatives (3 is the preferred number of alternatives for multiple-choice questions), each exam should consist of ~54 questions in order for the grade to be reliable. This number of questions is considerably higher than for open questions.

What about guessing? Do I adjust the grade?
Students can earn points by randomly guessing the correct answer. You will need to take guessing into account when calculating the grade from the score. Be transparent about this to your students and communicate this before, during (cover page) and after the test. More information can be found in the reader of UTQ module ASSESS.
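
To illustrate the size of the guessing effect: a standard correction from classical test theory (a general formula, not necessarily the exact procedure prescribed in the UTQ ASSESS reader) subtracts the expected number of lucky guesses from the raw score:

    \[ S_{\text{corrected}} = R - \frac{W}{k - 1} \]

where R is the number of correct answers, W the number of incorrect answers, and k the number of alternatives per question. With k = 3, a student who blindly guesses all 54 questions gets on average 18 right and 36 wrong, so the corrected score is 18 - 36/2 = 0.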

How do I analyse the results to check for problems with questions/answers? How do I adjust the grading if test result analysis shows bad results?
Do a test result analysis to assess the quality of the individual questions and use the information to change the scoring of individual questions. More information can be found in the reader of UTQ module ASSESS. In Brightspace quizzes and MapleTA, most relevant information is available in the test statistics. See ‘test result analysis’ below how to assess the results and use this to adapt the scoring.
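
As an illustration of such a test result analysis, here is a minimal Python sketch (assuming you have exported the item scores as a 0/1 matrix; the data below is hypothetical) that computes the p-value (fraction of students answering correctly) and the Rir-value (item-rest correlation) per question:

    import numpy as np

    # One row per student, one column per question; 1 = correct, 0 = incorrect.
    # Hypothetical data; in practice, export the item scores from Brightspace or MapleTA.
    scores = np.array([
        [1, 1, 0, 1],
        [1, 0, 0, 1],
        [0, 1, 1, 1],
        [1, 1, 1, 0],
        [0, 0, 0, 1],
    ])

    p_values = scores.mean(axis=0)  # p-value: fraction of correct answers per question

    for i in range(scores.shape[1]):
        rest = scores.sum(axis=1) - scores[:, i]     # total score minus the item itself
        rir = np.corrcoef(scores[:, i], rest)[0, 1]  # item-rest correlation
        print(f"Q{i + 1}: p = {p_values[i]:.2f}, Rir = {rir:.2f}")

A low p-value indicates a difficult question; a low or negative Rir-value suggests the question does not discriminate between well-performing and less-well-performing students.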

Ask all students different questions. That way, there is no use in communicating the answers. All students get different questions, randomly chosen from question pools with interchangeable questions. You could for example ask similar questions on different cases or datasets.

In order to make it more difficult for students to share answers to questions (specifically answers to knowledge questions), you can give each student different exam questions. This can be done by using different exam versions or by creating unique individual exams from question banks:

  1. Exam versions: Divide the students into groups and give each group a different version of the exam. The exam questions differ between groups and are the same within a group.
    • Pro: easy to set up.
    • Con: if students find out which group they are in, they can communicate within the group.
    • Work-around: if the exam is divided into parts, change the grouping for the second part compared to the first.
      You can keep the first questions in each exam part the same, making it harder and riskier for students to work out which group they belong to.
  2. Question pools: Give each student a unique exam by drawing interchangeable questions from question pools. Each question pool contains questions that are interchangeable in terms of learning objective/topic and difficulty.
    • Pro: unpredictable questions; easy to set up in Brightspace quizzes.
    • Con: test result analysis is only usable if you have large numbers of students.

In both cases, you need interchangeable questions, which take more time to develop than the questions of a traditional exam.

What does an exam with question pools look like?
Per learning objective or topic, you formulate a number of questions at the same level of difficulty and of the same question type. Such a set of interchangeable questions is called a question pool.
Examples of interchangeable questions in the same question pool:

  1. Fill in the blanks, automatically graded using regular expressions: Naming parts of a machine (if the answer can be copied from a book, this is only possible for proctored exams). The machine is different for each question.
  2. Short answer, automatically graded using regular expressions: Writing out the applicable formula for a situation shown in a figure. The situation is different for each question.
  3. Arithmetic question, automatically graded (all or nothing): Calculate the force on a beam in a construction. The construction or the beam is different for each question.
    For each student, a unique exam is formed from questions drawn randomly from the question pools, as the sketch below illustrates.
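
Brightspace quizzes do this drawing for you; the following Python sketch (with hypothetical pool and question names) merely illustrates the principle of drawing one question per pool:

    import random

    # Hypothetical question pools: each pool holds interchangeable questions
    # (same learning objective, same difficulty, same question type).
    pools = {
        "LO1_low_matching":   ["Q1a", "Q1b", "Q1c"],
        "LO1_med_arithmetic": ["Q2a", "Q2b", "Q2c", "Q2d"],
        "LO2_med_open":       ["Q3a", "Q3b", "Q3c"],
    }

    def draw_exam(student_id):
        """Draw one question per pool: unique per student, comparable in difficulty."""
        rng = random.Random(student_id)  # seed on the student id for reproducibility
        return [rng.choice(questions) for questions in pools.values()]

    print(draw_exam("s1234567"))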

Example 1

  • LO1: 3 question pools of unrelated questions
    • Pool 1a: low difficulty: matching question
    • Pool 1b: low difficulty: short open question
    • Pool 1c: medium difficulty: arithmetic question
  • LO2: 3 question pools of unrelated questions
    • Pool 2a: low difficulty: open question, giving an explanation
    • Pool 2b: medium difficulty: arithmetic question
    • Pool 2c: medium difficulty: open question, analyzing a problem
  • LO3: 2 question pools of unrelated questions
    • Pool 3a: medium difficulty: arithmetic question, analyzing
    • Pool 3b: high difficulty: analyzing data and drawing a conclusion

Example 2 (solving engineering problems)

  • LO1: 1 question pool that comprises questions with 5 subquestions each. The subquestions are of increasing difficulty.
  • LO2: 1 question pool that comprises questions with 4 subquestions each. The subquestions are of increasing difficulty.

Can I change the order of the questions?
The order of the exam questions should be logical, in order to enable students to perform optimally. Keep questions on the same topic/learning objective together.

Ask students the same numerical questions, but with different numbers, so that sharing answers is useless.

Parametrization is used in numerical questions. For each question, every student works with different numbers, chosen from a range that you determine. The outcomes therefore differ, and it is not possible to commit fraud by sharing answers: to help each other, students would have to share the calculation steps, which is far more cumbersome. Parametrization is possible in Brightspace quizzes (arithmetic question types), in MapleTA, and in Grasple (available for math service education only).

If you want to use parametrization in Brightspace Assignments, you can derive the values students should work with from a digit of their student number. You can vary which digit you use per (larger) assignment, to prevent grouping. Example: use the 3rd digit of the student number in question 1, the last digit in question 2, and the second digit in question 3.

3rd digit of your student number:  0   1   2   3   4   5   6   7   8   9
Value for x:                       1   3   2   4   3   4   2   1   2   3
Value for y:                       6   6   4   6   4   3   3   5   4   5

Important: Be aware that students might unintentionally use incorrect values. Make sure that all values lead to equally difficult calculations, and let students practice with this system beforehand to reduce the stress of seeing it for the first time during the exam.
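
As a minimal sketch of this digit-based scheme (the lookup values mirror the table above; the helper name and the example student number are made up for illustration):

    # x and y values per digit 0-9, as in the table above.
    X_BY_DIGIT = [1, 3, 2, 4, 3, 4, 2, 1, 2, 3]
    Y_BY_DIGIT = [6, 6, 4, 6, 4, 3, 3, 5, 4, 5]

    def parameters_for(student_number, digit_index=2):
        """Return (x, y) based on the digit at digit_index (0-based; 2 = 3rd digit)."""
        digit = int(student_number[digit_index])
        return X_BY_DIGIT[digit], Y_BY_DIGIT[digit]

    print(parameters_for("4512345"))  # 3rd digit is 1, so x = 3 and y = 6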


Procedural measures

Next to fraud prevention measures in the development of exam questions, you can take the following procedural measures:

Make students promise that they will only hand in their own work, that they will not use unauthorized help or tools, and that they will not help other students. The promise can be made by having students copy out the honour pledge, or by reading it aloud at the start of an oral exam. In the case of a written exam, this can be done a day before taking the exam.

We trust the integrity of the student. During your course, ask students to read the TU Delft code of conduct and discuss that you expect them to adhere to it. Indicate that you will ask them to make an honour pledge and what it will say. Inform them whether they will be asked to make the pledge before, at the start of, and/or at the end of their assessment. Students can either copy or vocalize the honour pledge.

You can change the pledge to make it more applicable for your assessment. Here are two examples:

  1. Online exam:
    “I promise that I will not use unauthorized help from people or other sources during my exam. I will create the answers on my own and I will create them only during the allocated exam time slots. I will not provide help to other students during their exam.”
  2. Timed take-home exam:
    “I promise that I have not used unauthorized help from people or other sources for completing my exam. I created the submitted answers all by myself during the time slot that was allocated for that specific exam part. I will not provide nor have I provided help to other students during their exam.”

For oral exams, students can promise that they will not receive questions from students who took the exam earlier, nor provide questions to the students who will take the exam later.

Complementary oral check for randomly sampled students

Contact a random 10% (or more) of the students immediately after the exam and ask them to explain a couple of their answers, to confirm that they authored the work. Make sure that you choose these students completely at random, or by using selection criteria that are clearly unbiased towards specific groups of students. Furthermore, take the Security and Privacy of Oral Examinations into account during oral authenticity checks.

In case of group projects or group assignments, let the group describe who contributed what, for example in the report. Provide them with a tool (Buddycheck) to stimulate them to give each other intermediate peer feedback on contribution, especially in larger projects. Make the group small enough so that everybody can contribute and that each contribution will be valued by their peers.

Some examiners use the complementary post-exam oral check as an anti-fraud measure. It is important to mention that this is not a grade-determining part of the examination; it is only applied to check whether the student has been honest in submitting his/her work. Only a sample of the students (e.g. 5-20%) will be selected for the online complementary oral check.

In case you do a complementary oral check on a sample of your student population, please consider the following:

  • We recommend doing the check shortly after the exam has finished and before you have graded the exams.
  • Preferably pick the students completely at random (so not the first 20% in alphabetical order, but, for example, based on a randomly picked last digit of their student number); see the sketch below.
  • If you decide to use an algorithm to select students, make the selection criteria explicit to prevent bias.
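
A minimal sketch of such an unbiased draw (the roster below is hypothetical; in practice you would export the participant list from Brightspace):

    import random

    # Hypothetical roster of exam participants.
    students = ["s100001", "s100002", "s100003", "s100004", "s100005",
                "s100006", "s100007", "s100008", "s100009", "s100010"]

    sample_size = max(1, round(0.10 * len(students)))  # 10% of the population
    selected = random.sample(students, k=sample_size)  # uniform, without replacement
    print(selected)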

What to ask:

  • Ask for an explanation of some of their answers.

The checks can be recorded, but should only be stored if there is a suspicion of fraud. The recording and storing should be done in a similar way as described in Security and Privacy Guidelines for Oral Exams.

In case you come across irregular results while you are scoring the assignment/exam and suspect fraud, please follow the regular processes and report this to the Board of Examiners and the involved student(s).

Split the exam into 2-4 consecutive parts (30-90 minutes per part). Each exam part is visible during a time slot and needs to be finished before the end of that time slot. This diminishes the extent to which answers can be exchanged. You can schedule breaks in between. Please note that students tend to become very stressed by the intermediate deadlines, which diminishes their performance and the reliability of their grade. Therefore, give them an opportunity to practice with similar time-locks in a practice exam, and use time-locks that are long and as few in number as possible. Make sure that the length of each time-lock is realistic for students in exam conditions. Provide students who are entitled to extra time with correspondingly extended time-locks (for example, a 60-minute time-lock becomes 80 minutes for a student entitled to 33% extra time).

In the case of Brightspace Assignments that are written digitally, use the built-in plagiarism check in Brightspace (TurnItIn) and open each similarity report to check for larger matches in the student’s text.
In the case of Brightspace Quizzes or MapleTA, you can download the students’ answers and look for similarities using a spreadsheet programme or a short script (see the sketch below). Make sure to remove personal information (student numbers and names) from the file after establishing the grades.
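
As a sketch of such a similarity check (the CSV export with ‘id’ and ‘answer’ columns is hypothetical, and the 0.9 threshold is a judgment call, not an official cut-off):

    import csv
    from difflib import SequenceMatcher
    from itertools import combinations

    # Hypothetical export: one row per student, with an anonymized id and the answer text.
    with open("answers.csv", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    # Flag pairs of answers that are suspiciously similar.
    for a, b in combinations(rows, 2):
        ratio = SequenceMatcher(None, a["answer"], b["answer"]).ratio()
        if ratio > 0.9:
            print(f"{a['id']} vs {b['id']}: similarity {ratio:.2f}")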

When you use, for example, Brightspace Assignments, you can manually check whether a student’s handwriting matches that of a previous assignment, whether they copied handwritten notes from a peer, or whether they appear to have copied texts from peers. Ask your students to keep the original handwritten papers, in case of legibility issues.

Online proctoring is the only form of online video surveillance that is allowed, and it should comply with the TU Delft Online Proctored Examination Regulation. Online proctoring is only available for digital knowledge exams that need to be taken as closed-book exams, in cases where it is not possible to change the exam into an oral examination due to student numbers.

  • Permission of the Board of Examiners: Your Board of Examiners needs to give explicit permission to use online proctoring for each exam. Online proctoring may only be used as a last resort, and the Board of Examiners assesses whether this is the case. Some Boards of Examiners have indicated that they will never give permission for online proctored examinations.
  • Online Proctored Examination Regulation: If you use online proctoring for your exam, you need to adhere to the Online Proctored Examination Regulations.
  • Online proctoring only option for video surveillance:  If you need to use video surveillance, you are only allowed to use online proctoring via Digital Exams, because it ensures that recorded data will be stored, processed and destroyed according to privacy regulations.
  • Availability: Online proctoring is in principle only available for knowledge-question exams that are administered as digital exams. Exams need to be digital because the camera can only record the student’s face, not their handwriting: students read the assignments from the screen, so the camera faces their heads, not their hands.
  • Maximum duration: The maximum duration of a proctored exam is 90 minutes. After 90 minutes, students need a toilet break, and the occurrence of technical issues increases.
  • Maximum number of students: The maximum number of students per group is increased to 150.
  • Available assessment tools with proctoring: Until week 4.6 (midterms), proctoring will only be available in MapleTA. After week 4.6, proctored Grasple will be available for mathematics (service education) only.
  • Practice test: Have all students do a practice exam a couple of days before the exam, to detect technical issues and procedural issues, and have students familiarize themselves with the tools.
  • Costs: Online proctoring is a paid service, costing between 10 and 15 euros per student in the exam. The costs of a proctored exam are borne by the faculty.
  • See the exam construction measures above for why you should avoid multiple-choice questions in an online proctored remote exam, and for how to use them appropriately if you must.

For oral exams and (project) presentations, you can do an identity check using the student’s campus card (do not record this!).

For exams in Brightspace Quizzes or Assignments, students need to log in with their netID. Sharing login credentials with other people is not allowed.

Automatic grading in Brightspace quizzes

It is possible to save quite some time in grading and still use a range of question types, by using the automated grading options in Brightspace quizzes. In this section, you will find a description of how to do that.

Automatic grading possibilities

The following question types allow for automatic grading (in order of applicability). Warning: automatically graded questions can only receive full or no points. If you have large questions, either split them into smaller questions, or use the one question type that must be graded by hand:

Written response: students can type a longer text, and you can add instructions for graders on how to give students full or partial points.

Numerical questions:

  1. Arithmetic: for calculations. Parametrization is possible (all students get random values for variables, chosen from a range that you set).
  2. Significant figures: same as arithmetic, but with exponential (10^) notation to force students to determine the precision of an answer.

Textual questions:

  3. Short answer: questions that require a short answer, such as a single word, that can be automatically graded. Answers can be case-sensitive or not. If you have more complicated answers (such as the vector (1,0,0)), you can ‘program’ different versions of correct answers by using regular expressions; see below for examples.
  4. Multi-short answer: multiple short answers.
  5. Fill in the blanks: like multi-short answer, but with blanks that students need to fill in.

Ordering and matching questions:

  6. Matching: matching items in a list on the left of the screen to items in a list on the right.
  7. Ordering: putting a list of actions in the correct order.

Closed-ended questions (checking boxes is very sensitive to fraud, and therefore not recommended):

  8. Multiple choice: not recommended.
  9. Multi-select: multiple-choice questions in which multiple checkboxes can be selected, often without telling the student how many; not recommended.
  10. True-false: a statement that can either be true or false.

Regular expressions and examples

You can use regular expressions to determine whether short answers are correct or incorrect. Here is a list of syntax you can use. Test your patterns carefully (first on https://regex101.com/ to check for syntax errors, and later with a colleague in your Brightspace Sandbox): it is not possible to do automatic regrading after your students have taken the test. You would have to go over the incorrect responses and manually award points for answers that were correct anyway.

Question: Give the vector, using only ‘0’ and ‘1’, that is orthogonal to the vectors (1,0,0) and (0,1,0). Do not insert blanks in your answer.

Answer using Text:

  • (0,0,1)

However, if you want to forgive students for inserting blanks, you could use the following RegEx:

  • \(\s?0\s?,\s?0\s?,\s?1\s?\)
    \: escape sign indicating that the next character should be treated as a literal character.
    \(: the literal character (
    \s: a whitespace character
    ?: indicates that the preceding item may or may not be there.
    \s?: an optional whitespace character

If you want to allow multiple spaces:

  • \(\s*0\s*,\s*0\s*,\s*1\s*\)
    \: escape sign indicating that the next character should be treated as a literal character.
    \(: the literal character (
    \s: a whitespace character
    *: indicates that the preceding item may occur any number of times, including zero.
    \s*: no space, one space, or multiple spaces
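
Before putting a pattern into Brightspace, you can also check it locally; here is a minimal Python sketch for the whitespace-tolerant vector pattern above (Python’s re module supports the constructs used here):

    import re

    # The whitespace-tolerant pattern from the example above.
    pattern = re.compile(r"\(\s*0\s*,\s*0\s*,\s*1\s*\)")

    # Hypothetical student answers: the first two should match, the last should not.
    for answer in ["(0,0,1)", "( 0, 0, 1 )", "(0,1,0)"]:
        print(answer, "->", bool(pattern.fullmatch(answer)))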

Question: How many levels of Bloom are there?

Answer using RegEx:

(6|six|Six|SIX)
|: or
(): brackets contain alternatives, separated by |

Question: Name all levels, separated by a semicolon and a space (; ), starting with the lowest level, without capitals.

Answer using RegEx:

  • ([Rr]emember)(; )([Uu]nderstand)(; )([Aa]pply)(; )([Aa]naly[sz]e)(; )([Ee]valuate)(; )([Cc]reate)
    (): brackets contain groups
    []: square brackets contain a set of characters, exactly one of which should be present.
    [Rr]: either R or r should be present, so both Remember and remember are considered correct.
    [Aa]naly[sz]e: accepts both the British (analyse) and the American (analyze) spelling.
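
The same local check works for the Bloom-levels pattern (the sample answers are hypothetical student input):

    import re

    pattern = re.compile(r"([Rr]emember)(; )([Uu]nderstand)(; )([Aa]pply)(; )"
                         r"([Aa]naly[sz]e)(; )([Ee]valuate)(; )([Cc]reate)")

    answers = [
        "remember; understand; apply; analyse; evaluate; create",  # correct (British spelling)
        "remember; understand; apply; analyze; evaluate; create",  # correct (American spelling)
        "remember, understand, apply, analyse, evaluate, create",  # wrong separator
    ]
    for answer in answers:
        print(answer, "->", bool(pattern.fullmatch(answer)))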
