BY SITI NAWANGSIH ROHANA BARIT
What is the purpose of the test? What are the objectives of the test? How will the test specification reflect both the purpose and the objectives? How will the test tasks be selected and the separate items arranged? What kind of scoring, grading, and/or feedback is expected?
ANA BACHRUN
10/17/2009
2
1. Assessing Clear and Unambiguous Objectives

In addition to knowing the purpose of the test you're creating, you need to know as specifically as possible what it is you want to test.
2. Drawing Up Test Specifications

Based on the curriculum, the specifications should include:
a. An outline of the test
b. The skills to be included
c. Item types and tasks
3. Devising Test Tasks

In revising the draft, consider the following important questions:
a. Are the directions to each section absolutely clear?
b. Is there an example item for each section?
c. Does each item measure a specific objective?
d. Is each item stated in clear, simple language?
e. Does each multiple-choice item have appropriate distractors; that is, are the wrong options clearly wrong?
f. Is the difficulty of each item appropriate for your students?
g. Is the language of each item sufficiently authentic?
h. Do the sum of the items and the test as a whole adequately reflect the learning objectives?
Multiple-choice items, which may appear to be the simplest kind of item to construct, are extremely difficult to design correctly. Hughes (2003, pp. 76-78) stated a number of weaknesses of MC items:
1. The technique tests only recognition knowledge.
2. Guessing may have a considerable effect on test scores.
3. The technique severely restricts what can be tested.
4. It is very difficult to write successful items.
5. Washback may be harmful.
6. Cheating may be facilitated.
Consider the following guidelines for designing MC items:
1. Design each item to measure a specific objective.
2. State both stem and options as simply and directly as possible.
3. Make certain that the intended answer is clearly the only correct one.
4. Use item indices to accept, discard, or revise items.
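Guideline 4 refers to classical item analysis. A minimal sketch in Python, using made-up response data; the function names item_facility and item_discrimination are illustrative, not terms from the source:

```python
# Each row is one student's responses to four items (1 = correct, 0 = wrong).
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
]

def item_facility(responses, item):
    """Proportion of test-takers who answered the item correctly."""
    return sum(row[item] for row in responses) / len(responses)

def item_discrimination(responses, item, k=2):
    """Difference in facility on one item between the k highest- and the k
    lowest-scoring test-takers; values near zero (or negative) flag items
    to discard or revise."""
    ranked = sorted(responses, key=sum, reverse=True)
    top, bottom = ranked[:k], ranked[-k:]
    return (sum(r[item] for r in top) - sum(r[item] for r in bottom)) / k

print(item_facility(responses, 0))        # 0.75
print(item_discrimination(responses, 0))  # 0.5
```

An item that almost everyone answers correctly (facility near 1.0) or that low scorers answer as well as high scorers (discrimination near 0) is a candidate for revision.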
Here are the decisions about scoring the test:

Section          Percent of Total Grade    Possible Total Correct
Oral interview   40%                       4 scores, 5 to 1 range, x 2 = 40
Listening        20%                       10 items @ 2 points each = 20
Reading          20%                       10 items @ 2 points each = 20
Writing          20%                       2 scores, 5 to 1 range, x 2 = 20
Total            100%                      100
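Because each section's possible total already equals its percentage weight (40 + 20 + 20 + 20 = 100), raw section scores sum directly to a grade out of 100. A small sketch with hypothetical raw scores:

```python
# Hypothetical raw scores against the weighting table above.
# Each section's possible total equals its percent weight, so the
# raw points add directly to a score out of 100.
sections = {
    "oral interview": (32, 40),  # (raw score, possible total)
    "listening":      (16, 20),
    "reading":        (18, 20),
    "writing":        (14, 20),
}

total = sum(raw for raw, _possible in sections.values())
percent = 100 * total / sum(p for _raw, p in sections.values())
print(total, percent)  # 80 80.0
```

If section totals did not match the weights, each raw score would first have to be scaled by (weight / possible total) before summing.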
Your first thought might be that assigning grades to students' performance on this test would be easy: just give an "A" for 90-100 percent, a "B" for 80-89 percent, and so on. NOT SO FAST! Grading is a thorny issue (see Chapter 11).
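The fixed-band scheme the slide cautions against can be sketched as follows; the cutoffs are the illustrative 90/80/70/60 bands, not a recommendation from the source:

```python
def letter_grade(percent):
    """Naive fixed-band grading: 90-100 = A, 80-89 = B, and so on."""
    for cutoff, letter in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if percent >= cutoff:
            return letter
    return "F"

print(letter_grade(92))  # A
print(letter_grade(85))  # B
```

The simplicity of this rule is exactly the problem: it ignores test difficulty, class norms, and institutional grading conventions, which is why grading deserves its own discussion.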
Scoring and grading would not be complete without some consideration of the form in which you will offer feedback to your students, feedback that you want to become beneficial washback. Washback is achieved when students can, through the testing experience, identify their areas of success and challenge. When a test becomes a learning experience, it achieves washback.
Brown, H. Douglas. (2004). Language Assessment: Principles and Classroom Practices. New York: Pearson Education.

Wegener, Delano P. Test Construction. http://www.delweg.com/dpwessay/tests.htm. Accessed October 10, 2009.