Assessment Items
The specification of items
During the preparation phase of a test, we must specify the characteristics of the items or exercises in order to produce the instructions, the stimuli and the types of expected responses, all adapted to the selected item types.
A. Choice of item types
Several factors come into play when choosing the types of items to use in a test. We must consider the purpose of the test, the dimensions to be measured and the type of response expected from the student. Before finalizing the choice of item types, we must also consider two further factors: the type of marking and the administration time.
B. Determination of the number of items
Depending on the expected duration of the test administration and the types of items selected for each dimension evaluated, we must determine the total number of items to draft, taking into account the relative importance indicated in the specification table, adjusted according to circumstances where necessary. When preparing a test that will, as far as possible, be tried out beforehand, a larger number of items should be drafted so that the best items can be retained for the final administration.
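As an illustration only, here is a minimal Python sketch of one way to distribute a total item count across the dimensions of a specification table in proportion to their weights. The dimension names, weights and rounding rule are assumptions made for the example, not prescriptions of this document.

# Hypothetical sketch: allocating a total item count across the dimensions of a
# specification table in proportion to their weights. The dimension names and
# weights below are illustrative, not taken from this document.

def allocate_items(weights: dict[str, float], total_items: int) -> dict[str, int]:
    """Distribute total_items across dimensions proportionally to their weights,
    using largest-remainder rounding so the counts sum exactly to total_items."""
    total_weight = sum(weights.values())
    raw = {dim: total_items * w / total_weight for dim, w in weights.items()}
    counts = {dim: int(value) for dim, value in raw.items()}  # floor first
    leftover = total_items - sum(counts.values())
    # Give the remaining items to the dimensions with the largest fractional parts.
    for dim in sorted(raw, key=lambda d: raw[d] - counts[d], reverse=True)[:leftover]:
        counts[dim] += 1
    return counts

if __name__ == "__main__":
    spec_table = {"Reading": 0.40, "Writing": 0.35, "Grammar": 0.25}  # illustrative weights
    # Draft more items than needed (here 1.5x) so the best can be kept after the tryout.
    print(allocate_items(spec_table, total_items=30))
    print(allocate_items(spec_table, total_items=45))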
C. Parts of an item
An item is usually composed of the following parts; note that the last part, concerning the answer key and marking instructions, is not included in the student booklet. A schematic sketch of these parts follows the list.
- Stimulus – The stimulus presents the task that students must accomplish. It consists essentially of a question or directive. In some cases, a scenario precedes the question or directive.
- Administrative instructions – This is the information provided to students to enable them to answer the question. It also includes instructions and directives reserved for those responsible for administering the instrument.
- Response – For selected-response items, the answer is chosen by the student from the options proposed. For constructed-response items, the answer is formulated by the student himself or herself.
- Answer key and marking instructions – For selected-response items, the answer key indicates the correct answer, usually identified by a letter or number. For constructed-response items, the answer key provides the correct answer, an example of a good response, or the characteristics that a correct answer must have. For items marked subjectively, a rating scale or scoring guide should be provided. The answer key and the marking instructions must be prepared at the time the item is written, even though they will be used only after the test has been administered.
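As a purely illustrative complement, here is a minimal Python sketch showing how these parts might be represented so that the answer key and marking instructions travel with the item while staying out of the student booklet. The field names and the sample item are invented for the example.

# Hypothetical sketch of the parts of an item described above. Field names and the
# sample item are invented; the answer key and marking instructions are kept in the
# same record even though they would not appear in the student booklet.
from dataclasses import dataclass

@dataclass
class AssessmentItem:
    stimulus: str                     # question or directive (optionally preceded by a scenario)
    instructions: str                 # administrative instructions shown to the student
    options: list[str] | None = None  # answer choices for selected-response items, None otherwise
    answer_key: str = ""              # correct answer, or characteristics of a good response
    marking_instructions: str = ""    # scoring guide, prepared at writing time

    def student_view(self) -> dict:
        """Return only the parts that belong in the student booklet."""
        return {"stimulus": self.stimulus,
                "instructions": self.instructions,
                "options": self.options}

# Illustrative selected-response item.
item = AssessmentItem(
    stimulus="Which planet is closest to the Sun?",
    instructions="Circle the letter of the correct answer.",
    options=["A. Venus", "B. Mercury", "C. Mars", "D. Earth"],
    answer_key="B",
    marking_instructions="1 point for the correct letter; no partial credit.",
)
print(item.student_view())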
D. Some writing rules
Here are some rules that should guide the drafting of the items or questions that make up data collection instruments.
- Write the items in clear, correct language that is accessible to students.
- Ensure that the stimulus is complete: the student must know exactly what is expected of them. It is preferable to present the item as a question rather than as a sentence to be completed.
- In the stimulus, avoid giving the student clues to the answer, such as wording borrowed verbatim from textbooks.
- In the stimulus, avoid using phrases like "you think", "for you" or "do you think ...", which may suggest that the student is being asked for an opinion.
- Do not write sentences with negatives or double negatives.
- Explain to students how to indicate their answer.
- Avoid having the answer to one question provide the answer to another.
- Present students with situations different from those they examined or worked on in class, or those they could find in their textbook.
- Choose answer options that are equally plausible and attractive to students.
E. Item revision grids
Here are criteria that apply to all types of items (a short illustrative sketch of how such a grid might be recorded appears after these lists).
- Does the item properly measure the content and skills involved?
- Is the type of item selected appropriate for measuring the dimension being evaluated?
- Is the item clearly formulated?
- Does the level of language used correspond to the students' language level?
- Is the stimulus concise?
- Have you avoided providing clues to the correct answer (adverbs of time, number or type of determiners)?
- Are the texts and documents used adapted to the students?
- Is the quality of the documents or illustrations impeccable?
- Have you avoided using extracts from textbooks?
- Is the item free of gender, cultural or other stereotypes?
Below are the criteria specific to "multiple choice" items.
- Are all the proposed answer options plausible?
- Are the responses independent of each other?
- Are the item's answer options written in the same grammatical form?
- Are the answer options of the same length and complexity?
- Are the answers to the item placed in a logical order (alphabetical or numerical)?
- Have you avoided using the options "none of the above" and "all of the above"?
Here are the criteria specific to "matching" items.
- Are the contents of the stimuli and responses consistent?
- Is each answer a plausible choice for each of the stimuli?
- Are the stimuli and responses presented in a logical order (alphabetical or numerical)?
- Are there more answers than stimuli?
- Is the number of stimuli less than 5?
- Have you indicated whether the same response can be used more than once?
- Have the columns of stimuli and responses been labelled?
- Does the student know how to indicate the answer?
Here are the criteria specific to "short or extended answer" items.
- Is the content of the stimulus sufficiently limited that, in the case of short-answer questions, it leads to a single answer?
- Is the expected degree of accuracy (where applicable) indicated to the student?
- In the case of extended-response items, does the stimulus include a context or situation that gives the student all the information needed to answer?
- Are the length of the expected response, the marking criteria and the points to be allocated specified?
- Are all acceptable answers given in the answer key?
- Have you indicated, in the answer key, the conditions for success (the number of correct responses expected to pass the item)?
Finally, here are the criteria that must be met by the test as a whole.
- Is the purpose of the test clear?
- Does the test cover all the dimensions included in the specification table?
- Have the weights shown in the specification table been respected?
- Have the principles to be respected, as set out in the specification table, actually been observed?
- Is the number of points given to each item indicated?
- Is the answer clearly indicated for each item?
- Is the answer indicated for each item correct, and the only correct one?
- Is the order of the items appropriate?
- Is the length of the test appropriate?
- Are the instructions for the student clear, complete, accurate and prominently displayed?
- Have we avoided splitting an item across two pages?
- Is the test free of grammatical, spelling and typographical errors?
- Is the print legible?
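Purely as an illustration of how such a grid could be used in practice, here is a small hypothetical Python sketch that records yes/no verdicts for one item against an abridged paraphrase of the general criteria above; the criterion wording and function names are invented for the example.

# Hypothetical sketch: recording the verdicts of a revision grid for one item.
# The criteria listed are abridged paraphrases of the general criteria above.

GENERAL_CRITERIA = [
    "Measures the intended content and skills",
    "Item type suits the dimension evaluated",
    "Clearly formulated",
    "Language level matches the students",
    "Stimulus is concise",
    "No clues to the correct answer",
    "Free of stereotypes",
]

def review_item(item_id: str, verdicts: dict[str, bool]) -> list[str]:
    """Return the criteria that the item fails; an empty list means the item passes."""
    failed = [c for c in GENERAL_CRITERIA if not verdicts.get(c, False)]
    if failed:
        print(f"Item {item_id}: revise ({len(failed)} criteria not met)")
    else:
        print(f"Item {item_id}: accepted")
    return failed

# Illustrative use: one criterion marked as not met.
verdicts = {c: True for c in GENERAL_CRITERIA}
verdicts["Stimulus is concise"] = False
review_item("Q07", verdicts)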