Ong, Saw Lan (2007). Comparing Two Language Version Of Science Achievement Tests Using Differential Item Functioning. The Asia Pacific Journal of Educators and Education (formerly known as Journal of Educators and Education), 22 (1), pp. 1-15. ISSN 2289-9057
Abstract
At the national level, the Ministry of Education in Malaysia assesses the
achievement of primary school students in reading and writing, mathematics and science.
The results of the assessments are used for selection decisions as well as for grading
students. Since the implementation of the new language policy of teaching science and
mathematics in English, both Malay and English have been used as the language of
assessment. The validity of interpreting test results across different language
versions is an important issue that needs to be investigated. Translating a test from a
source language to a target language does not necessarily produce two psychometrically
equivalent tests. The purpose of this study is to identify item(s) in translated achievement
tests that may function differently across languages. Differential Item Functioning (DIF)
analysis is useful to reveal items with psychometric characteristics that have been altered
by the translation. Two statistical analyses were conducted to identify and evaluate DIF
item(s). The simultaneous item bias test (SIBTEST), a nonparametric statistical method
for assessing DIF in an item, was used. The result obtained was then compared with that of the one-parameter logistic model, analyzed using BILOG-MG V3.0, in assessing DIF in the translated
items. Both statistical analyses identified approximately 50% of the science items as
displaying DIF. This result suggests that substantial psychometric differences exist
between the two language versions of the science test at the item level.
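To illustrate the core idea behind SIBTEST, the sketch below computes a simplified, uncorrected beta statistic: examinees from the reference and focal groups are matched on their total score over the remaining items, and the weighted difference in proportion-correct on the studied item is averaged across matching strata. This is a generic illustration only, not the authors' analysis; the actual SIBTEST procedure additionally applies a regression correction to the matching scores, which is omitted here.

```python
import numpy as np

def sibtest_beta(responses_ref, responses_foc, item):
    """Uncorrected SIBTEST-style beta statistic for one studied item.

    responses_ref, responses_foc: 0/1 arrays of shape (examinees, items)
    for the reference and focal groups. Returns a weighted average
    difference in proportion-correct on the studied item, matching
    examinees on their total score over the other items.
    NOTE: simplified sketch; real SIBTEST adds a regression correction.
    """
    # Matching score = total score over all items except the studied one
    match_ref = responses_ref.sum(axis=1) - responses_ref[:, item]
    match_foc = responses_foc.sum(axis=1) - responses_foc[:, item]

    beta, weight_total = 0.0, 0
    for k in np.union1d(match_ref, match_foc):
        ref_k = responses_ref[match_ref == k, item]
        foc_k = responses_foc[match_foc == k, item]
        if len(ref_k) == 0 or len(foc_k) == 0:
            continue  # a stratum must contain examinees from both groups
        n_k = len(ref_k) + len(foc_k)
        beta += n_k * (ref_k.mean() - foc_k.mean())
        weight_total += n_k
    return beta / weight_total if weight_total else 0.0
```

A positive beta indicates the item favours the reference group at matched ability levels; in practice the statistic is divided by its standard error and tested against a normal distribution to flag DIF.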