Short Answer (Fill in the Blank) and Essay
With constructed-response items, the test-taker supplies an answer rather than selecting from options already provided. Because students supply the answers, this type of item reduces the chance of guessing. Constructed-response items include short answer and essay. Short-answer items can be answered by a word, phrase, or number. There are two types of short-answer items: question and completion.
One format presents a question that students answer in a few words or phrases. With the other format, completion or fill-in-the-blank, students are given an incomplete sentence that they complete by inserting a word or words in the blank space.
In an essay item, the student develops a more extended response to a question or statement. Essay tests and written assignments use writing as the means of expressing ideas, although with essay items the focus of assessment is the content of the answer rather than writing ability.
Short Answer
Short-answer items can be answered by a word, phrase, or number. The two types of short-answer items, question and completion (also referred to as fill-in-the-blank), are essentially the same except for format. With the question format, students answer a question in a few words or phrases. Calculations may be included for the teacher to review the process that the student used to arrive at an answer.
The questions may stand alone and have no relationship to one another, or comprise a series of questions in a similar content area. Completion items consist of a statement with a key word or words missing; students fill in the blank to complete it. Other types of completion items ask students to perform a calculation and record the answer, or to order a list of responses. Completion items are appropriate for recall of facts and specific information and for calculations.
To complete the statement, the student recalls missing facts, such as a word or short phrase, or records the solution to a calculation problem. Although completion items appear easy to construct, they should be designed in such a way that only one answer is possible. If students provide other correct answers, the teacher needs to accept them.
Fill-in-the-blank and ordered-response (also called drag-and-drop) items are two of the alternate item formats used on the NCLEX. Fill-in-the-blank items ask candidates to answer a question or to perform a calculation and type in the answer. With drag-and-drop or ordered-response items, candidates answer a question by rank ordering options or placing a list of responses in the proper order (National Council of State Boards of Nursing, 2007).
For example, students might be given a list of Erikson’s stages of development and asked to put them in the order they occur. On a computerized test, such as the NCLEX, candidates can click an option and drag it or highlight an option and use the arrow keys to arrange the options in the correct order.
However, this same format can be used on a paper-and-pencil test, with students writing the order in their test booklets or on teacher-made answer sheets, or indicating it on a machine-scannable answer sheet.
Short-answer items are useful for measuring student ability to interpret data, use formulas correctly, complete calculations, and solve mathematical-type problems. Items may ask students to label a diagram, name anatomical parts, identify various instruments, and label other types of drawings, photographs, and the like.
Brookhart and Nitko (2008) described another type of short-answer format, association variety, which provides a list of terms or pictures for which students recall relevant labels, numbers, or symbols (p. 130). For example, students might be given a list of medical terms and be asked to recall their abbreviations.
Writing Short-Answer Items
Suggestions for developing short-answer items are as follows:
- Questions and statements should not be taken verbatim from textbooks, other readings, and lecture notes. These materials may be used as a basis for designing short-answer items, but taking exact wording from them may result in testing only recall of meaningless facts out of context. Such items measure memorization of content and may or may not be accompanied by the student’s comprehension of it.
- Phrase the item so that a unique word, series of words, or number must be supplied to complete it. Only one correct answer should be possible to complete the statement.
- Write questions that are specific and can be answered in a few words, phrases, or short sentences. The question, “What is insulin?” does not provide sufficient direction as to how to respond; asking instead “What is the peak action time of NPH insulin?” results in a more specific answer.
- Before writing the item, think of the correct answer first and then write a question or statement for that answer. Although the goal is to develop an item with only one correct response, students may identify other correct answers. For this reason, it is necessary to develop a scoring sheet with all possible correct answers, and re-score student responses as needed if students provide additional correct answers that the teacher did not anticipate.
- Fill-in-the-blank items requiring calculations and solving mathematical-type problems should include in the statement the type of answer and degree of specificity desired, for instance, “Convert pounds to kilograms, rounding your answer to one decimal place.”
- For a statement with a key word or words missing, place the blank at the end of the statement. This makes it easier for students to complete. It is also important to watch for grammatical clues in the statement, such as “a” versus “an” and singular versus plural forms prior to the blank, which might give clues to the intended response. If more than one blank is included in the statement, the blanks should be of equal length.
- When students need to write longer answers, provide for sufficient space or use a separate answer sheet. In some situations, longer responses might indicate that the item is actually an essay item, and the teacher should then follow principles for constructing and evaluating essay items.
- Even though a blank space is placed at the end of the statement, the teacher may direct the student to record one-word answers in blanks arranged in a column to the left or right of the items, thereby facilitating scoring.
- For example:
1. Streptococcus pneumoniae and Staphylococcus aureus are examples of __________ bacteria.
Following are some examples of question and completion (or fill-in-the-blank) formats of short-answer items:
What congenital cardiac defect results in communication between the pulmonary artery and the aorta?
Two types of metered-dose inhalers used for the treatment of bronchial asthma are:
List three methods of assessing patient satisfaction in an acute care setting.
1.
2.
3.
You are caring for a patient who weighs 128 lb. She is ordered 20 mcg/kg of an IV medication. What is the correct dose in micrograms? Answer:
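To illustrate the arithmetic that calculation-type fill-in-the-blank items demand, the sketch below works through a pounds-to-kilograms conversion and a weight-based dose. The numbers are hypothetical and are not the answer to any item above:

```python
# Illustrative sketch of the arithmetic behind calculation-type
# fill-in-the-blank items; the weight and dose here are hypothetical.

def lb_to_kg(pounds):
    """Convert pounds to kilograms (1 kg = 2.2 lb), rounded to one decimal place."""
    return round(pounds / 2.2, 1)

def weight_based_dose_mcg(pounds, mcg_per_kg):
    """Dose in micrograms for a weight given in pounds."""
    return lb_to_kg(pounds) * mcg_per_kg

print(lb_to_kg(154))                  # 70.0 (kg)
print(weight_based_dose_mcg(154, 5))  # 350.0 (mcg)
```

A scoring sheet for such an item should state whether rounding is expected at the intermediate step (as here) or only in the final answer, since the two conventions can yield slightly different results.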
Essay Item
In an essay test, students construct responses to items based on their understanding of the content. With this type of test item, varied answers may be possible depending on the concepts selected by the student for discussion and the way in which they are presented. Essay items provide an opportunity for students to select content to discuss, present ideas in their own words, and develop an original and creative response to an item.
This freedom of response makes essay items particularly useful for complex learning outcomes (Oermann, 1999). Higher level responses, however, are more difficult to score than responses reflecting recall of facts. Although some essay items are developed around recall of facts and specific information, they are more appropriate for higher levels of learning.
Miller, Linn, and Gronlund (2009) recommended that essay items be used primarily for learning outcomes that cannot be adequately measured through selected-response items. Essay items are effective for assessing students’ ability to apply concepts, analyze theories, evaluate situations and ideas, and develop creative solutions to problems, drawing on multiple sources of information.
Although essay items use writing as the medium for expression, the intent is to evaluate student understanding of specific content rather than to judge writing ability in and of itself. Low-level essay items are similar to short-answer items and require precise responses.
An example of a low-level essay item is “Describe three signs of increased intracranial pressure in children under 2 years old.” Broader and higher level essay items, however, do not limit responses in this way and differ clearly from short-answer items, for example, “Defend the statement ‘access to health care is a right.’”
Essay items may be written to assess a wide range of learning outcomes. These include:
■ Comparing, such as comparing the side effects of two different medications
■ Outlining steps to take and protocols to follow
■ Explaining and summarizing in one’s own words a situation or statement
■ Discussing topics
■ Applying concepts and principles to a clinical scenario and explaining their relevance to it
■ Analyzing patient data and clinical situations through use of relevant concepts and theories
■ Critiquing different interventions and nursing management
■ Developing plans and proposals drawing on multiple sources of information
■ Analyzing nursing and health care trends
■ Arriving at decisions about issues and actions to take, accompanied by a rationale
■ Analyzing ethical issues, possible decisions, and their consequences
■ Developing arguments for and against a particular position or decision.
As with other types of test items, the objective or outcome to be assessed provides the framework for developing the essay item. From the learning outcome, the teacher develops a clear and specific item to elicit information about student achievement. If the outcome to be assessed focuses on application of concepts to clinical practice, then the essay item should examine ability to apply knowledge to a clinical situation.
The item should be stated clearly so that the students know what they should write about. If it is ambiguous, the students will perceive the need to write all they know about a topic. Bierer and colleagues described the development and use of concept appraisals (CAPPs) for assessing medical students’ ability to integrate core concepts presented during class and from prior weeks of the course (Bierer, Dannefer, Taylor, Hall, & Hull, 2008).
These are essay items that require synthesis of learning and application of concepts to a case. Students answer the CAPPs on a weekly basis and receive two types of feedback on their responses: a standard answer that is posted online and individualized feedback from a faculty member.
Issues with Essay Tests
Although essay items are valuable for examining the ability to select, organize, and present ideas and they provide an opportunity for creativity and originality in responding, they are limited by low reliability and other issues associated with their scoring. The teacher should have an understanding of these issues because they may influence the decision to use essay items.
Limited Ability to Sample Content
By their nature essay items do not provide an efficient means of sampling course content as compared to objective items. Often only a few essay items can be included on a test, considering the time it takes for students to formulate their thoughts and prepare an open-ended response, particularly when the items are intended for assessing higher levels of learning.
As a result, it is difficult to assess all of the different content areas in a nursing course using essay items. When the learning outcomes are memorization and recall of facts, essay items should not be used because there are more efficient means of measuring such outcomes. Instead, essay items should be developed for measuring complex achievement and the ability to conceptualize, develop, integrate, and relate ideas (Miller et al., 2009). Essay items are best used for responses requiring originality.
Unreliability in Scoring
The major limitation of essay items is the lack of consistency in evaluating responses. Scoring responses is a complex process, and studies have shown that essay responses are scored differently by different teachers (Miller et al., 2009). Some teachers are more lenient or critical than others regardless of the criteria established for scoring.
Even with preset criteria, teachers may evaluate answers differently, and scores may vary when the same teacher reads the paper again. Miller et al. (2009) suggested that frequently the reasons for unreliability in scoring are the failure of the faculty member to identify the specific outcomes being assessed with the essay item and lack of a well-defined rubric for scoring (p. 242).
Factors such as misspelled words and incorrect grammar may affect scoring beyond the criteria to which they relate. Mertler (2003) suggested that there is a tendency to give lower scores to papers that have illegible writing, spelling errors, or poor grammar. The unreliability in scoring, though, depends on the type of essay item. When the essay item is highly focused and structured, such as “List three side effects of bronchodilators,” there is greater reliability in scoring.
Of course, these lower level items could also be classified as short-answer. Less restrictive essay items allowing for freedom and creativity in responding have lower rater reliability than more restricted ones. Items asking students to analyze, defend, judge, evaluate, criticize, and develop products are less reliable in terms of scoring the response. There are steps the teacher can take, though, to improve reliability, such as defining the content to be included in a “correct” answer and using a scoring rubric.
Carryover Effects
Another issue in evaluating essay items is a carryover effect in which the teacher develops an impression of the quality of the answer from one item and carries it over to the next answer. If the student answers one item well, the teacher may be influenced to score subsequent responses at a similarly high level; the same situation may occur with a poor response.
For this reason, it is best to read all students’ responses to one item before evaluating the next one. Miller et al. (2009) suggested that reading all the answers to one item at a time improves scoring accuracy by keeping the teacher focused on the standards of each item. It also avoids carrying over an impression of the quality of the student’s answer to one item onto the scoring of the next response.
The same problem can occur with tests as a whole as well as with written assignments. The teacher’s impression of the student can carry over from one test to the next or from one paper to the next. When scoring essay tests and grading papers, the teacher should not know whose paper it is.
Halo Effect
There may be a tendency in evaluating essay items to be influenced by a general impression of the student or feelings about the student, either positive or negative, that create a halo effect when judging the quality of the answers.
For instance, the teacher may hold favorable opinions about the student from class or clinical practice and believe that this learner has made significant improvement in the course, which in turn might influence the scoring of responses. For this reason, essay tests should be scored anonymously by asking students to identify themselves by an assigned or selected number rather than by their names. Names can be matched with numbers after scoring is completed.
Effect of Writing Ability
It is difficult to evaluate student responses based on content alone even with clear and specific scoring guidelines. The teacher’s judgment is often influenced by sentence structure, grammar, spelling, punctuation, and overall writing ability.
Some students write well enough to cover up their lack of knowledge of the content; longer answers may be scored higher regardless of the content. The teacher, therefore, needs to evaluate the content of the learner’s response and not be influenced by the writing style. When writing is also evaluated, it should be scored separately (Miller et al., 2009).
Order-of-Scoring Effect
The order in which essay tests are read and scored may influence the assessment (Chase, 1999). Essay tests read early tend to be scored higher than those read near the end. As such, teachers should read papers in random order and read each response twice before computing a score. After reading and scoring all student answers to an item, the teacher should rearrange the papers so that they are in a different order (Oosterhof, 2001).
Nitko and Brookhart (2007) described the problem of “rater drift,” the tendency of the teacher to gradually stray from the scoring criteria. In scoring essay items the teacher needs to check that the rubric and standards for grading are implemented equally for each student.
Time
One other issue in using essay items is the time it takes for students to answer them and for teachers to score them. In writing essay items, the teacher should estimate how long it will take to answer each item, erring on allowing too much time rather than too little. Students should be told approximately how long to spend on each item so they can pace themselves (Miller et al., 2009).
Scoring essay items also can be a pressing issue for teachers, particularly those responsible for large numbers of students. Because responses should be read twice, the teacher should account for the time required for scoring when planning essay tests. Scoring software is available that can scan an essay and score the response.
Intelligent Essay Assessor
One example is the Intelligent Essay Assessor™, which automatically evaluates and scores electronically submitted essays (Pearson Education Inc., 2007). Rudner, Garcia, and Welch (2006) evaluated the reliability of the IntelliMetric automated essay scoring system for evaluating essays from the Analytic Writing Assessment of the Graduate Management Admission Test.
Scoring with the IntelliMetric system was reliable when compared to human raters, to a system based on word counts, and to a weighted probability model. The Pearson correlations between human raters and the IntelliMetric system had a mean of 0.83 (Rudner et al., 2006). Nursing faculty members need to assess, however, whether such software is appropriate for use in nursing courses and whether its use is cost-effective.
Student Choice of Items
Some teachers allow students to choose a subset of essay items to answer, often because of limited time for testing and to provide options for students. For example, the teacher may include four items on the care of patients with heart disease and ask students to answer two of them.
However, Miller et al. (2009) cautioned against this practice because when students choose different items to answer, they are actually taking different tests. The option to choose items to answer also may affect measurement validity.
Restricted-Response Essay Items
There are two types of essay items: restricted response and extended response. Although the notion of freedom of response is inherent in essay items, there are varying degrees of freedom in responding to them. At one end of the continuum is the restricted-response item, in which a few sentences are required for an answer; these are similar to short-answer items.
At the other end is the extended-response item, in which students have complete freedom of response, which often requires extensive writing (Oermann, 1999). Responses to essay items typically fall between these two extremes.
In a restricted-response item, the teacher limits the student’s answer by indicating the content to be discussed and frequently the amount of discussion allowed, for instance, limiting the response to one paragraph or page. With this type of essay item, the way in which the student responds is structured by the teacher.
A restricted-response item may be developed by posing a specific problem to be addressed and asking questions about that problem (Miller et al., 2009). For example, specific material, such as patient data, a description of a clinical situation, research findings, a description of issues associated with clinical practice, and extracts from the literature, to cite a few, may be included with the essay item.
Students read, analyze, and interpret this accompanying material, then answer questions about it. Nitko and Brookhart (2007) referred to essay items of this type as interpretive exercises or context-dependent tasks. Examples of restricted-response items follow:
■ Define patient-focused care. Limit your definition to one paragraph.
■ Select one environmental health problem and describe its potential effects on the community. Do not use an example presented in class. Limit your discussion to one page.
■ Compare metabolic and respiratory acidosis. Include the following in your response: definitions, precipitating factors, clinical manifestations, diagnostic tests, and interventions.
■ Your patient is 76 years old and 1 day postoperative following a femoral popliteal bypass graft. Name two complications the patient could experience at this time and discuss why they are potential problems. List two nursing interventions for this patient during the initial recovery period with related evidence.
■ Describe five physiological changes associated with the aging process.
Extended-Response Essay Items
Extended-response essay items are less restrictive and as such provide an opportunity for students to decide how to respond: they can organize ideas in their own ways, arrive at judgments about the content, and demonstrate ability to communicate ideas effectively in writing.
With these types of items, the teacher may assess students’ ability to develop their own ideas and express them creatively, integrate learning from multiple sources in responding, and evaluate the ideas of others based on predetermined criteria. Because responses are not restricted by the teacher, assessment is more difficult. This difficulty, however, is balanced by the opportunity for students to express their own ideas.
As such, extended-response essay items provide a means of assessing more complex learning not possible with selected-response items. The teacher may decide to allow students to respond to these items outside of class. Sample items include:
■ Select an article describing a nursing research study. Critique the study, specifying the criteria used. Based on your evaluation, describe how the research findings could be used in clinical practice.
■ The fall rate on your unit has increased in the last 3 months. Develop a plan for analyzing this occurrence with a rationale to support your action plan.
■ Develop a plan for saving costs in the wound clinic.
■ You receive a call at the allergy clinic from a mother who describes her son’s problems as “having stomach pains” and “acting out in school.” She asks you if these problems may be due to his allergies. How would you respond to this mother? How would you manage this call? Include a rationale for your response.
■ You are caring for a child recently diagnosed with acute lymphocytic leukemia who lives with his parents and two teenage sisters. Describe how the family health-and-illness cycle would provide a framework for assessing this family and planning for the child’s care.
Writing Essay Items
Essay items should be reserved for learning outcomes that cannot be assessed effectively through multiple-choice and other selected-response formats. With essays, students can demonstrate their critical thinking, ability to integrate varied sources of information, and creativity. Suggestions for writing essay items follow.
Develop essay items that require synthesis of the content.
Avoid items that students can answer by merely summarizing the readings and class discussions without thinking about the content and applying it to new situations. Assessing students’ recall of facts and specific information may be accomplished more easily using selected-response formats rather than essay.
Phrase items clearly
The item should direct learners in their responses and should not be ambiguous. Exhibit 6.1 provides sample stems for essay items based on varied types of learning outcomes. Framing the item to make it as specific as possible is accomplished more easily with restricted-response items. With extended-response items, the teacher may provide directions as to the type of response intended without limiting the student’s own thinking about the answer.
In the example that follows, there is minimal guidance as to how to respond. The revised version, however, directs students more clearly as to the intended response without limiting their freedom of expression and originality. Example: Evaluate an article describing a nursing research study.
Revised Version: Select an article describing a nursing research study. Critique the study, specifying the criteria you used to evaluate it. Based on your evaluation, describe whether or not the research provides evidence for nursing practice. Include a rationale supporting your decision.
Prepare students for essay tests
This can be accomplished by asking thought-provoking questions in class; engaging students in critical discussions about the content; and teaching students how to apply concepts and theories to clinical situations, compare approaches, and arrive at decisions and judgments about patients and issues. Practice in synthesizing content from different sources, presenting ideas logically, and using creativity in responding to situations will help students prepare to respond to essay items in a testing situation.
This practice may come through discussions in class, clinical practice, and online; written assignments; and small-group activities. For students lacking experience with essay tests, the teacher may use sample items for formative purposes, providing feedback to students about the adequacy of their responses.
Tell students how to apportion their time so that they have sufficient time for answering each essay item. In writing a series of essay items, consider carefully the time needed for students to answer them, and inform students of the estimated time before they begin the examination.
In this way students may gauge their time appropriately. Indicating the point value of each essay item will also guide students to use their time appropriately, spending more time on and writing longer responses to items that carry greater weight.
Score essay items that deal with the analysis of issues according to the rationale that students develop rather than the position they take on the issue. Students should provide a sound rationale for their position, and the evaluation should focus on the rationale rather than on the actual position.
Avoid the use of optional items and student choice of items to answer. As indicated previously, this practice results in different subsets of items, creating tests that may not be comparable.
In the process of developing the item, write an ideal answer to it. The teacher should do this while drafting the item to determine if it is appropriate, clearly stated, and reasonable to answer in the allotted time frame. Save this ideal answer for use later in scoring students’ responses.
If possible, have a colleague review the item and explain how he or she would respond to it. Colleagues can assess the clarity of the item and whether it will elicit the intended response.
Scoring Essay Items: Holistic Versus Analytic
There are two methods of scoring essay items: holistic and analytic. The holistic method involves reading the entire answer to each item and evaluating its overall quality. With the analytic method of scoring, the teacher separately scores individual components of the answer.
Holistic Scoring
With holistic scoring, the teacher assesses and scores the essay response as a whole without judging each part separately. There are different ways of scoring essays using the holistic method.
Relative Scoring. One method of holistic scoring is to compare each student’s answer with the responses of others in the group, using a relative standard. To score essay items using this system, the teacher quickly reads the answers to each item to gain a sense of how the students responded overall, then re-reads the answers and scores them. Papers may be placed in a number of piles reflecting degrees of quality with each pile of papers receiving a particular score or grade.
Model Answer. Another way is to develop a model response for each item and then compare each student’s response to that model. The model answer does not have to be written in narrative form, but can be an outline with the key points and elements that should be in the answer. Before using a model answer for scoring responses, teachers should read a few papers to confirm that students’ answers are consistent with what was intended.
Holistic Scoring Rubric. A third way of implementing holistic scoring is to use a scoring rubric, which is a guide for scoring essays, papers, written assignments, and other open-ended responses of students. Rubrics also can be used for grading posters, concept maps, presentations, and projects completed by students. The rubric consists of predetermined criteria used for assessing the quality of the student’s work (Mertler, 2003).
With holistic scoring, the rubric includes different levels of responses, with characteristics or descriptions of each and the related score. The student’s answer is assigned the score associated with the one description within the rubric that best reflects its quality.
The important concept in this method is that holistic scoring yields one overall score that considers the entire response to the item rather than scoring its component parts separately (Miller et al., 2009; Nitko & Brookhart, 2007). Holistic rubrics are quicker to use for scoring because the teacher evaluates the overall response rather than each part of it.
One disadvantage, though, is that they do not provide students with specific feedback about their answers.
Analytic Scoring
In the analytic method of scoring, the teacher identifies the content that should be included in the answer and other characteristics of an ideal response. Each of these areas is evaluated and scored separately. With analytic scoring the teacher focuses on one characteristic of the response at a time (Miller et al., 2009).
Often a detailed scoring plan is used that lists the content to be included in the answer and other characteristics of the response to be judged. Students earn points based on how well they address each content area and the other characteristics, not on their overall response. This method of scoring is effective for essay items that require structured answers (Mertler, 2003).
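The analytic method described above can be sketched as a simple point tally, with each component judged separately and the points summed. The criteria and maximum point values below are hypothetical examples, not a prescribed rubric:

```python
# Minimal sketch of analytic scoring: each component of the answer is
# judged and scored separately, then the points are summed.
# Criteria and maximum point values are hypothetical.

MAX_POINTS = {"content": 10, "organization": 5, "process": 5}

def analytic_score(awarded):
    """Sum the points awarded per criterion, capping each at its maximum."""
    return sum(min(points, MAX_POINTS[criterion])
               for criterion, points in awarded.items())

score = analytic_score({"content": 8, "organization": 4, "process": 5})
print(score)  # 17 of a possible 20
```

Reporting the per-criterion points alongside the total is what gives students the specific feedback that holistic scoring lacks.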
Analytic Scoring Rubric
A scoring rubric can also be developed with points assigned for each of the content areas that should be included in the response and other characteristics to be evaluated. An analytical scoring rubric provides at least two benefits in assessing essays and written work. First, it guides the teacher in judging the extent to which specified criteria have been met.
Second, it provides feedback to students about the strengths and weaknesses of their response (Miller et al., 2009). There are many Web sites to assist faculty in creating and using rubrics for evaluating student learning. Although most of these pertain to general education, the information can be easily adapted for assessment in nursing courses.
Criteria for Assessing Essay Items
The criteria for assessing essay items, regardless of the method, often address three areas: (a) content, (b) organization, and (c) process.
Questions that guide assessments of each of these areas are:
■ Content: Is relevant content included? Is it accurate? Are significant concepts and theories presented? Are hypotheses, conclusions, and decisions supported? Is the answer comprehensive?
■ Organization: Is the answer well organized? Are the ideas presented clearly? Is there a logical sequence of ideas?
■ Process: Was the process used to arrive at conclusions, actions, approaches, and decisions logical? Were different possibilities and implications considered? Was a sound rationale developed using relevant literature and theories?
Suggestions for Scoring
- Identify the method of scoring to be used prior to the testing situation and inform the students of it.
- Specify in advance an ideal answer. In constructing this ideal answer, review the readings, classroom discussions of the content, and other instructional activities completed by students. Identify the content and characteristics required in the answer and assign points to them if using the analytical method of scoring.
- If using a scoring rubric, discuss it with the students ahead of time so that they are aware of how their essay responses will be judged. Students should understand the scoring rubric and criteria being used and the number of points for each element in the rubric (Moskal, 2003).
- Read a random sample of papers to get a sense of how the students approached the items and an idea of the overall quality of the answers.
- Score the answers to one item at a time. For example, read and score all of the students’ answers to the first item before proceeding to the second item. This procedure enables the teacher to compare responses to an item across students, resulting in more accurate and fairer scoring, and saves time by only needing to keep in mind one ideal answer at a time (Miller et al., 2009).
- Read each answer twice before scoring. In the first reading, note omissions of major points from the ideal answer, errors in content, problems with organization, and problems with the process used for responding. Record corrections or comments on the students’ papers. After reading through all the answers to the question, begin the second reading for scoring purposes.
- Read papers in random order.
- Use the same scoring system for all papers.
- Read essay answers and other written assignments anonymously. Develop a system for implementing this in the nursing education program, for instance, by asking the students to choose a code number.
- Cover the scores of the previous answers to avoid being biased about the student’s ability.
- For important decisions or if unsure about the evaluation, have a colleague read and score the answers to improve reliability. A sample of answers might be independently scored rather than the complete set of student tests.
- Adopt a policy on writing (sentence structure, spelling, punctuation, grammar, neatness, and writing style in general) and determine whether the quality of the writing will be part of the test score. Inform students of the policy in advance of the test. If writing is assessed, then it should be scored separately, and the teacher should be cautious not to let the writing style bias the evaluation of content and other characteristics of the response.
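When a colleague independently scores a sample of answers, the two sets of scores can be compared in a simple way by computing the proportion of papers on which the readers agree exactly or within a small tolerance. The scores below are hypothetical, used only to illustrate the calculation.

```python
# Hypothetical scores assigned by two independent readers to the
# same ten essay answers; values are illustrative only.
teacher_scores = [18, 22, 15, 20, 25, 17, 19, 23, 16, 21]
colleague_scores = [17, 22, 14, 20, 24, 18, 19, 22, 16, 21]

def percent_agreement(a, b, tolerance=0):
    """Proportion of papers on which the two readers' scores differ
    by no more than `tolerance` points."""
    matches = sum(1 for x, y in zip(a, b) if abs(x - y) <= tolerance)
    return matches / len(a)

print(percent_agreement(teacher_scores, colleague_scores))     # exact agreement: 0.5
print(percent_agreement(teacher_scores, colleague_scores, 1))  # within one point: 1.0
```

Low agreement on the sample suggests the ideal answer or scoring plan needs clarification before the remaining papers are scored.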
Conclusion
Short-answer items can be answered by a word, phrase, or number. There are two types of short-answer items: question and completion, also referred to as fill-in-the-blank. These items are appropriate for recall of facts and specific information. With short-answer items, students can be asked to interpret data, use formulas, complete calculations, and solve mathematical-type problems.
In an essay test, students construct responses to items based on their understanding of the content. With this type of test item, varied answers may be possible depending on the concepts selected by the student for discussion and the way in which they are presented. Essay items provide an opportunity for students to select content to discuss, integrate concepts from various sources, present ideas in their own words, and develop original and creative responses to items.
This freedom of response makes essay items particularly useful for complex learning outcomes. There are two types of essay items: restricted response and extended response. In a restricted-response item, the teacher limits the student’s answer by indicating the content to be discussed and frequently the amount of discussion allowed, for instance, limiting the response to one paragraph or page.
In an extended-response item, students have complete freedom of response, often requiring extensive writing. Although essay items use writing as the medium for expression, the intent is to assess student understanding of specific content rather than to judge writing ability in and of itself. Other types of assignments are better suited to assessing the ability of students to write effectively.