Master's Thesis
Differential item functioning (DIF) deals with how test items perform in different demographic groups. Although many operationalizations of DIF have been proposed, no single approach has proven exclusively superior. The current study investigated the properties of DIF indices in the differential functioning of items and tests (DFIT) framework and further tested the power of the item parameter replication (IPR) method in identifying biased items within the item response theory (IRT) paradigm. The results indicated that the IPR method is a useful tool in DIF detection. The source of the item parameter covariance structure was found to affect the results only slightly. The results also indicated that the proportion of biased items on the test may not be as influential a factor as the actual number of biased items. This finding, along with low false negative rates when 99.9th percentile cut-off scores are used, supports the view that sequential elimination of biased items is preferable to simultaneous detection.
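The abstract names the IPR method and its 99.9th percentile cut-off without detail. As a rough illustration of the general idea behind IPR-style cut-offs (not the thesis's actual procedure), the sketch below draws replicated parameters for a single 3PL item under a no-DIF null and takes an empirical percentile of the resulting NCDIF-type statistic. All parameter values, the diagonal covariance, and the sample sizes are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def p3pl(theta, a, b, c):
    # 3PL item response function with the D = 1.7 scaling constant
    return c + (1 - c) / (1 + np.exp(-1.7 * a * (theta - b)))

# Hypothetical reference-group estimates (a, b, c) for one item and an
# assumed (diagonal) parameter covariance matrix, for illustration only.
est = np.array([1.2, 0.0, 0.2])
cov = np.diag([0.02, 0.01, 0.001])

theta = rng.standard_normal(2000)      # simulated focal-group abilities
p_ref = p3pl(theta, *est)              # reference-group response curve

# IPR idea: replicate "focal" parameters under the no-DIF null and
# compute a squared-difference (NCDIF-type) statistic per replication.
ncdifs = []
for _ in range(1000):
    a, b, c = rng.multivariate_normal(est, cov)
    c = float(np.clip(c, 0.0, 1.0))    # keep the guessing parameter valid
    ncdifs.append(float(np.mean((p3pl(theta, a, b, c) - p_ref) ** 2)))

# Empirical 99.9th percentile serves as the significance cut-off;
# observed NCDIF values above it would flag an item as biased.
cutoff = float(np.percentile(ncdifs, 99.9))
print(0.0 < cutoff < 1.0)
```

Under the null, the statistic stays small, so the cut-off is a tiny positive number; an item whose observed statistic exceeds it would be flagged and, per the thesis's conclusion, removed before re-running the detection sequentially.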
Defended on August 8, 2008, in partial fulfillment of the requirements for the degree of Master of Arts, in the Quantitative Psychology program of Middle Tennessee State University.