
Browsing by Author "Kahraman, Nilufer"

Now showing 1 - 2 of 2
    Item
    An Explanatory Item Response Theory Approach for a Computer-Based Case Simulation Test
    (2014) Kahraman, Nilufer; S-9457-2018
    Problem: Practitioners working with multiple-choice tests have long used Item Response Theory (IRT) models to evaluate the performance of test items for quality assurance. Similar applications for performance tests, however, are often encumbered by complicated data sets for which local calibrations alone provide a poor model fit.
    Purpose: The purpose of this study was to investigate whether the item calibration process for a performance test, the computer-based case simulations (CCS) taken from the United States Medical Licensing Examination® (USMLE®) Step 3® examination, may be improved through explanatory IRT models. It was hypothesized that explanatory IRT may improve data modeling for performance assessment tests by allowing important predictors to be added to a conventional IRT model, which is otherwise limited to item predictors alone.
    Methods: The responses of 767 examinees on a six-item CCS test were modeled using the Partial Credit Model (PCM) and four explanatory model extensions, each incorporating one predictor variable of interest. The predictor variables were the examinee's gender, the order in which the examinee encountered an individual item (item sequence), the time the examinee took to respond to each item (response time), and the examinee's ability score on the multiple-choice part of the examination.
    Results: Results demonstrate superior model fit for the explanatory PCM that includes the examinee ability score from the multiple-choice portion of Step 3. Explanatory IRT model extensions may prove useful in complex performance assessment settings where item calibrations are often problematic because of short tests and small samples.
    Recommendations: The findings have practical value and implications for researchers working with small or complicated response data. Explanatory IRT methodology not only improves data modeling for performance assessment tests but also enhances the inferences that can be drawn, by allowing important person predictors to be incorporated into a conventional IRT model.
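
    As a rough illustration of the explanatory PCM described in this abstract, the sketch below computes PCM category probabilities for one item when ability is driven by a latent regression on person predictors. This is not the authors' code: the function name, the regression weights, the predictor values, and the step difficulties are all hypothetical, chosen only to make the mechanics concrete.

        import numpy as np

        def pcm_probs(theta, deltas):
            # Partial Credit Model: category k in {0, ..., M} has probability
            # proportional to exp(sum_{j<=k} (theta - delta_j)); the empty sum
            # for k = 0 is defined as 0.
            steps = np.concatenate(([0.0], np.cumsum(theta - np.asarray(deltas))))
            num = np.exp(steps - steps.max())  # subtract max for numerical stability
            return num / num.sum()

        # Explanatory ("latent regression") extension: rather than estimating a
        # free ability theta_p, model it from person predictors z_p (for example,
        # the ability score from the multiple-choice portion of the examination).
        beta = np.array([0.0, 0.8])   # hypothetical regression weights
        z_p = np.array([1.0, 0.35])   # intercept term + a standardized MCQ score
        theta_p = beta @ z_p          # model-implied ability for person p
        print(pcm_probs(theta_p, deltas=[-0.5, 0.2, 1.1]))  # P(score = 0), ..., P(score = 3)

    In practice such models are estimated jointly rather than with fixed weights; the point here is only the structure: person predictors enter the ability term, whereas the conventional PCM estimates theta_p freely and uses item predictors alone.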
    Item
    Using Multigroup Confirmatory Factor Analysis to Test Measurement Invariance in Raters: A Clinical Skills Examination Application
    (2015) Kahraman, Nilufer; Brown, Crystal B.; S-9457-2018
    Psychometric models based on the structural equation modeling framework are commonly used in multiple-choice test settings to assess the measurement invariance of test items across examinee subpopulations. The premise of this article is that they may also be useful in the context of performance assessment tests, for testing the measurement invariance of raters. The modeling approach, and how it can be used for performance tests with less-than-optimal rater designs, is illustrated using a data set from a performance test designed to measure medical students' patient management skills. The results suggest that group-specific rater statistics can help spot differences in rater performance that may be due to rater bias, identify specific weaknesses and strengths of individual raters, and inform decisions about future task development, rater training, and test scoring processes.
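
    For reference, the invariance hierarchy tested in a multigroup CFA can be written in its textbook form (a standard formulation, not taken verbatim from the article); here p indexes examinees, i indexes rated items, and g indexes rater groups:

        x_{pig} = \tau_{ig} + \lambda_{ig}\,\eta_{pg} + \varepsilon_{pig}

        \text{configural: same loading pattern in every } g; \quad
        \text{metric: } \lambda_{ig} = \lambda_i \ \forall g; \quad
        \text{scalar: } \lambda_{ig} = \lambda_i \text{ and } \tau_{ig} = \tau_i \ \forall g.

    A rater group whose fit deteriorates sharply when its loadings \lambda_{ig} or intercepts \tau_{ig} are constrained to the common values is flagged as non-invariant, which is how group-specific rater statistics point to possible rater bias.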
