مرکز اطلاعات علمی Scientific Information Database (SID) - Trusted Source for Research and Academic Resources


View: 377
Download: 0
Cites: 0

Information Journal Paper

Title

Assessing the optimal method of detecting Differential Item Functioning in Computerized Adaptive Testing

Pages

  23-51

Abstract

Background: Test fairness is one of the main challenges in the transition from paper-and-pencil testing to Computerized Adaptive Testing (CAT). Aim: This study investigated Differential Item Functioning (DIF), assessed the factors that affect DIF detection, and sought the optimal method for detecting DIF in CAT. Method: Given the nature of the research question, an empirical approach was used; data generation and manipulation of the variables were carried out by simulation. The responses of 1,000 examinees (reference and focal groups, 500 each) to a bank of 55 dichotomous items were simulated under the 3-parameter logistic model with 20 replications. Fifteen items were manipulated in terms of DIF type and magnitude, and test impact was modeled as a mean ability difference between the comparison groups. A 30-item computerized adaptive test was administered via the Firestar software package. Analyses used logistic regression (LR) and the item response theory likelihood ratio test (IRT-LRT), and the two methods were compared on statistical power and Type I error rate. Results: The Type I error rate of the likelihood ratio test was lower than that of logistic regression. The power of both methods depended on DIF type, DIF magnitude, and test impact. Compared with logistic regression, the IRT likelihood ratio test had greater power for detecting uniform DIF under both the impact and no-impact conditions, and its power increased with DIF magnitude. The two methods did not differ in detecting non-uniform DIF, and both performed poorly. Conclusion: Considering power and Type I error rate, the likelihood ratio test is the optimal approach for detecting uniform DIF; detecting non-uniform DIF requires further investigation.
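As a rough illustration of the logistic-regression DIF procedure the abstract describes, the following Python sketch simulates dichotomous responses under a 3-parameter logistic model for a reference and a focal group, then applies a likelihood-ratio chi-square test for uniform DIF (score-only model vs. score + group model). All parameter values here (group sizes, discrimination, difficulty shift, guessing) are illustrative assumptions, not the study's actual simulation settings, and the true ability is used directly as the matching score for simplicity.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

rng = np.random.default_rng(0)

def p_3pl(theta, a, b, c):
    """3-parameter logistic item response probability."""
    return c + (1 - c) / (1 + np.exp(-a * (theta - b)))

# Simulate a reference and a focal group (hypothetical sizes and parameters).
n = 1000
theta = np.concatenate([rng.normal(0.0, 1.0, n), rng.normal(0.0, 1.0, n)])
group = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = reference, 1 = focal

# Two studied items: one clean, one with uniform DIF (harder for focal group).
a, c = 1.2, 0.2
y_clean = (rng.random(2 * n) < p_3pl(theta, a, 0.0, c)).astype(float)
b_dif = np.where(group == 0, 0.0, 1.0)  # difficulty shifted by 1.0 for focal
y_dif = (rng.random(2 * n) < p_3pl(theta, a, b_dif, c)).astype(float)

def neg_loglik(beta, X, y):
    """Negative log-likelihood of a binary logistic regression."""
    eta = X @ beta
    return np.sum(np.logaddexp(0.0, eta) - y * eta)

def max_loglik(X, y):
    res = minimize(neg_loglik, np.zeros(X.shape[1]), args=(X, y), method="BFGS")
    return -res.fun

def lr_dif_chi2(y, score, group):
    """Likelihood-ratio chi-square (1 df) for uniform DIF:
    compact model (intercept + score) vs. augmented model (+ group)."""
    ones = np.ones_like(score)
    X0 = np.column_stack([ones, score])
    X1 = np.column_stack([ones, score, group])
    return 2.0 * (max_loglik(X1, y) - max_loglik(X0, y))

chi2_clean = lr_dif_chi2(y_clean, theta, group)
chi2_dif = lr_dif_chi2(y_dif, theta, group)

print(f"clean item: chi2 = {chi2_clean:.2f}, p = {chi2.sf(chi2_clean, 1):.3f}")
print(f"DIF item:   chi2 = {chi2_dif:.2f}, p = {chi2.sf(chi2_dif, 1):.3f}")
```

Adding a score-by-group interaction term to the augmented model would extend the same test to non-uniform DIF, the case the abstract reports both methods handle poorly.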

Cites

  • No record.

References

  • No record.

Cite

    APA:

    SHARIFI YEGANEH, NEGAR, FALSAFI NEJAD, MOHAMMAD REZA, FAROKHI, NOORALI, & JAMALI, EHSAN. (2018). Assessing the optimal method of detecting Differential Item Functioning in Computerized Adaptive Testing. TRAINING MEASUREMENT, 9(33), 23-51. SID. https://sid.ir/paper/214689/en

    Vancouver:

    SHARIFI YEGANEH NEGAR, FALSAFI NEJAD MOHAMMAD REZA, FAROKHI NOORALI, JAMALI EHSAN. Assessing the optimal method of detecting Differential Item Functioning in Computerized Adaptive Testing. TRAINING MEASUREMENT [Internet]. 2018;9(33):23-51. Available from: https://sid.ir/paper/214689/en

    IEEE:

    NEGAR SHARIFI YEGANEH, MOHAMMAD REZA FALSAFI NEJAD, NOORALI FAROKHI, and EHSAN JAMALI, "Assessing the optimal method of detecting Differential Item Functioning in Computerized Adaptive Testing," TRAINING MEASUREMENT, vol. 9, no. 33, pp. 23-51, 2018. [Online]. Available: https://sid.ir/paper/214689/en
