Good to know about tests

It is easier said than done to create a test. Are you measuring what you need to according to the course plan? Is the test a reliable way of measuring knowledge of the course material? To provide support along the way, we have collected some examples of the most common problems that arise when creating tests.

General

Based on the learning aims and the material presented in the course, you should be clear about what the test is measuring. There should be a connection between the test questions and the collective aims of the course, and the questions should cover the kind of knowledge you want to measure. How many questions you need can depend, for example, on the number of credits the course carries. Don't forget to test the test yourself to validate it.

Questions

Be critical of the types of questions you will pose:

Purpose

Normally a course can have two types of tests: formative or summative. When you want to identify the level of knowledge a student has, for example, you can use a self-corrected test. When you want to measure whether the student has met the aims of the course plan, you can use a test graded by you, the teacher.

Process

Developing a test is a process that consists of designing, asking questions and testing. 

Design

Take into consideration the relevant parts of the course plan and its aims, the content of the course, and which learning outcomes are planned for the test. Think about the students as a target group: for example, which term they are in and the program in which they are enrolled.

The next step is to think about the size of the course, which aspects of the course material you want to focus on, and which level of difficulty the questions will have. How will you administer the test and how will it be reported? Lastly, will you be giving the same test next time? Will it be adapted to a revised course plan? Will it be validated anew?

When it comes to levels of learning, it can be a good idea to look at the SOLO taxonomy (or Bloom's taxonomy). Is the student expected to show understanding? What about analysis? Where you place questions in the taxonomy can affect the results of the test.

Another part of design is deciding how you will weight the questions. You can, for example, group questions into categories, rank them in order of importance, or create questions that are decisive for the result.

While it is easy to begin creating questions directly in Ping Pong, it can be a good idea to first draft a control document with an overview of the questions and where they belong in terms of weight and level, before inserting the questions into the test.
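
As a purely illustrative example, such a control document can also be kept as a small piece of data and summed up automatically. The sketch below is an assumption about how it could look; the field names, categories and weights are made up and are not part of Ping Pong.

```python
# A minimal sketch of a question overview ("control document"), assuming
# made-up fields: id, category, taxonomy level and weight.
questions = [
    {"id": "Q1", "category": "terminology", "level": "understand", "weight": 1},
    {"id": "Q2", "category": "terminology", "level": "understand", "weight": 1},
    {"id": "Q3", "category": "case study",  "level": "analyse",    "weight": 3},
]

# Summarize the total weight and how it is spread over the categories.
total_weight = sum(q["weight"] for q in questions)
by_category = {}
for q in questions:
    by_category[q["category"]] = by_category.get(q["category"], 0) + q["weight"]

print(f"Total weight: {total_weight}")
for category, weight in by_category.items():
    print(f"  {category}: {weight} points ({weight / total_weight:.0%} of the test)")
```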

Ask questions

How a question is worded will affect what it measures. Avoid factors that unnecessarily increase the level of difficulty, and break a question into smaller parts if it risks becoming too complex. Be as clear as possible. It is important that the student understands the question within the context of the topic being discussed.

Try randomizing the order in which questions and answer alternatives are presented, if that is possible in the tool you are using. Otherwise you risk placing the correct answer in the same position too often, something that students are quick to notice.
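
If the tool cannot randomize for you, the underlying idea is simply to shuffle before presenting. A minimal sketch, with a made-up question and alternatives, could look like this:

```python
import random

# Made-up alternatives for one question; the first entry is the correct answer.
alternatives = ["Paris", "Lyon", "Marseille", "Toulouse"]

# Present the alternatives in a fresh random order for each student, so the
# correct answer does not end up in the same position every time.
presented = random.sample(alternatives, k=len(alternatives))
print(presented)
```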

Consider the following when writing test questions:

And remember the following when creating alternative answers:

Testing

Review the test according to these criteria before publishing it for those who will test it.

Consider who will be part of the pilot group. What do they need to know before testing your test? Be aware that the pilot group's results, such as their answer frequencies and averages, can inform decisions about adjusting the level of difficulty and your requirements.
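
As a rough illustration of what answer frequencies and averages can mean in practice, the sketch below summarizes made-up pilot results; questions that almost everyone, or almost no one, answers correctly are natural candidates for adjustment.

```python
# Made-up pilot results: 1 = correct answer, 0 = incorrect answer.
pilot_results = {
    "Student A": {"Q1": 1, "Q2": 0, "Q3": 1},
    "Student B": {"Q1": 1, "Q2": 1, "Q3": 0},
    "Student C": {"Q1": 1, "Q2": 0, "Q3": 0},
}

question_ids = ["Q1", "Q2", "Q3"]
n_students = len(pilot_results)

# Answer frequency per question: the share of students who answered correctly.
for qid in question_ids:
    correct = sum(answers[qid] for answers in pilot_results.values())
    print(f"{qid}: {correct}/{n_students} correct ({correct / n_students:.0%})")

# Average total score across the pilot group.
average_total = sum(sum(a.values()) for a in pilot_results.values()) / n_students
print(f"Average total score: {average_total:.1f} of {len(question_ids)}")
```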