Year 2018, Volume 5, Issue 2, Pages 263 - 273, 2018-03-18

Effects of Various Simulation Conditions on Latent-Trait Estimates: A Simulation Study

Hakan Koğar


The aim of this simulation study was to determine the relationship between true latent scores and estimated latent scores by including various control variables and different statistical models. The study also aimed to compare the statistical models and to determine the effects of different distribution types, response formats and sample sizes on latent score estimations. A total of 108 different datasets, comprising three distribution types (positively skewed, normal, negatively skewed), three response formats (three-, five- and seven-point Likert) and four sample sizes (100, 250, 500, 1000), were used in the present study. Results show that distribution types and response formats had a significant effect on determination coefficients in almost all simulations. When the general performance of the models is evaluated, MR and GRM display a better performance than the other models. Particularly when the distribution is either negatively or positively skewed and when the sample size is small, these models perform rather well.
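The simulation design described above can be illustrated with a minimal sketch. This is not the authors' actual pipeline (the study used WinGen and R packages such as `ltm` and `psych`); it is a hypothetical Python example showing one condition of the design: traits drawn from a chosen distribution, graded (Likert) responses generated under Samejima's graded response model, and a determination coefficient (R²) computed between the true latent scores and a simple sum-score estimate. All parameter values here are illustrative assumptions, not the study's.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_graded(theta, a, b):
    """Generate graded (Likert) responses under Samejima's graded response model.

    theta: (n,) true latent traits
    a:     (k,) item discriminations
    b:     (k, c-1) ordered category thresholds per item
    Returns an (n, k) array of responses in categories 0..c-1.
    """
    # Boundary curves P(X >= m | theta) as 2PL logistic functions
    logits = a[None, :, None] * (theta[:, None, None] - b[None, :, :])
    p_ge = 1.0 / (1.0 + np.exp(-logits))        # shape (n, k, c-1), decreasing in m
    # Inverse-CDF trick: one uniform draw per response; the number of
    # boundaries it falls below is the sampled category
    u = rng.random((len(theta), len(a), 1))
    return (u < p_ge).sum(axis=2)

# One illustrative cell of the design: normal traits, 5-point Likert, n = 500
n_items, n_cats, n = 10, 5, 500
theta = rng.standard_normal(n)                                   # true latent scores
a = rng.uniform(0.8, 2.0, n_items)                               # discriminations
b = np.sort(rng.uniform(-2, 2, (n_items, n_cats - 1)), axis=1)   # ordered thresholds

data = simulate_graded(theta, a, b)
total = data.sum(axis=1)                        # naive sum-score estimate of the trait
r2 = np.corrcoef(theta, total)[0, 1] ** 2       # determination coefficient
print(f"R^2 between true theta and sum score: {r2:.3f}")
```

In the full design, this cell would be repeated over every crossing of distribution type, response format and sample size, and the sum score would be replaced by each statistical model's latent-score estimate (e.g., GRM trait scores) before comparing R² values across conditions.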

Item response theory, Classical test theory, Factor analysis, Latent trait scores, Data simulation
  • Allahyari, E., Jafari, P., & Bagheri, Z. (2016). A simulation study to assess the effect of the number of response categories on the power of ordinal logistic regression for differential item functioning analysis in rating scales. Computational and Mathematical Methods in Medicine, vol. 2016, Article ID 5080826. doi.org/10.1155/2016/5080826
  • Bartholomew, D. J., Knott, M., & Moustaki, I. (2011). Latent variable models and factor analysis: A unified approach (Vol. 904). John Wiley & Sons. doi.org/10.1002/9781119970583
  • Borsboom, D., & Mellenbergh, G. J. (2002). True scores, latent variables, and constructs: A comment on Schmidt and Hunter. Intelligence, 30(6), 505-514. doi.org/10.1016/S0160-2896(02)00082-X
  • Brzezińska, J. (2016). Latent variable modelling and item response theory analyses in marketing research. Folia Oeconomica Stetinensia, 16(2), 163-174. doi.org/10.1515/foli-2016-0032
  • Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. Orlando, FL: Holt, Rinehart and Winston.
  • Cyr, A., & Davies, A. (2005). Item response theory and latent variable modeling for surveys with complex sampling design: The case of the national longitudinal survey of children and youth in Canada. In conference of the Federal Committee on Statistical Methodology, Office of Management and Budget, Arlington, VA.
  • Dawber, T., Rogers, W. T., & Carbonaro, M. (2009). Robustness of Lord's formulas for item difficulty and discrimination conversions between classical and item response theory models. Alberta Journal of Educational Research, 55(4), 512.
  • DeCoster, J. (1998). Overview of factor analysis. Retrieved June 12, 2017 from http://www.stat-help.com/factor.pdf
  • Dumenci, L., & Achenbach, T. M. (2008). Effects of estimation methods on making trait-level inferences from ordered categorical items for assessing psychopathology. Psychological assessment, 20(1), 55-62. doi.org/10.1037/1040-3590.20.1.55
  • Han, K. T. (2007). WinGen: Windows software that generates item response theory parameters and item responses. Applied Psychological Measurement, 31(5), 457-459. doi.org/10.1177/0146621607299271
  • Hauck Filho, N., Machado, W. D. L., & Damásio, B. F. (2014). Effects of statistical models and items difficulties on making trait-level inferences: A simulation study. Psicologia: Reflexão e Crítica, 27(4), 670-678. doi.org/10.1590/1678-7153.201427407
  • Jafari, P., Bagheri, Z., Ayatollahi, S. M. T., & Soltani, Z. (2012). Using Rasch rating scale model to reassess the psychometric properties of the Persian version of the PedsQL TM 4.0 Generic Core Scales in school children. Health and Quality of Life Outcomes, 10(1), 27. doi.org/10.1186/1477-7525-10-27
  • Kline, R. B. (2005). Principles and practice of structural equation modeling (Second Edition). New York: The Guilford Publications.
  • Li, C. H. (2016). Confirmatory factor analysis with ordinal data: Comparing robust maximum likelihood and diagonally weighted least squares. Behavior Research Methods, 48(3), 936-949. doi.org/10.3758/s13428-015-0619-7
  • Lord, F. M., & Novick, M. R. (1968). Statistical theories of mental test scores. Reading, MA: Addison-Wesley.
  • Lorenzo-Seva, U., & Ferrando, P. J. (2006). FACTOR: A computer program to fit the exploratory factor analysis model. Behavior research methods, 38(1), 88-91. doi.org/10.3758/BF03192753
  • Lozano, L. M., García-Cueto, E., & Muñiz, J. (2008). Effect of the number of response categories on the reliability and validity of rating scales. Methodology, 4(2), 73-79. doi.org/10.1027/1614-2241.4.2.73
  • Maydeu-Olivares, A., Kramp, U., García-Forero, C., Gallardo-Pujol, D., & Coffman, D. (2009). The effect of varying the number of response alternatives in rating scales: Experimental evidence from intra-individual effects. Behavior Research Methods, 41(2), 295-308. doi.org/10.3758/BRM.41.2.295
  • Mellenbergh, G. J. (1996). Measurement precision in test score and item response models. Psychological Methods, 1, 293 – 299. doi.org/10.1037/1082-989X.1.3.293
  • Raykov, T., & Marcoulides, G. A. (2000). A first course in structural equation modeling. London: Lawrence Erlbaum Associates, Inc.
  • Revelle, W. (2017). Package ‘psych’. Retrieved from https://cran.r-project.org/web/packages/psych/psych.pdf
  • Rizopoulos, D. (2017). Package ‘ltm’. Retrieved from https://cran.r-project.org/web/packages/ltm/ltm.pdf
  • Samejima, F. (1968). Estimation of latent ability using a response pattern of graded scores. Psychometrika Monographs, 34(Suppl. 17).
  • Saporta, G., & Niang, N. (2009). Principal component analysis: Application to statistical process control. Data analysis, 1-23. doi.org/10.1002/9780470611777.ch1
Primary Language: en
Subjects: Educational Sciences
Publication Date: July
Journal Section: Articles
Authors

ORCID: 0000-0001-5749-9824
Author: Hakan Koğar (Corresponding Author)
E-mail: hkogar@gmail.com
Institution: Akdeniz University
Country: Turkey


BibTeX @article{ijate377138, journal = {International Journal of Assessment Tools in Education}, issn = {}, address = {İzzet KARA}, year = {2018}, volume = {5}, pages = {263 - 273}, doi = {}, title = {Effects of Various Simulation Conditions on Latent-Trait Estimates: A Simulation Study}, key = {cite}, author = {Koğar, Hakan} }
APA Koğar, H. (2018). Effects of Various Simulation Conditions on Latent-Trait Estimates: A Simulation Study. International Journal of Assessment Tools in Education, 5(2), 263-273. Retrieved from http://dergipark.gov.tr/ijate/issue/35703/377138
MLA Koğar, H. "Effects of Various Simulation Conditions on Latent-Trait Estimates: A Simulation Study". International Journal of Assessment Tools in Education 5 (2018): 263-273 <http://dergipark.gov.tr/ijate/issue/35703/377138>
Chicago Koğar, H. "Effects of Various Simulation Conditions on Latent-Trait Estimates: A Simulation Study". International Journal of Assessment Tools in Education 5 (2018): 263-273
RIS TY - JOUR T1 - Effects of Various Simulation Conditions on Latent-Trait Estimates: A Simulation Study AU - Hakan Koğar Y1 - 2018 PY - 2018 N1 - DO - T2 - International Journal of Assessment Tools in Education JF - Journal JO - JOR SP - 263 EP - 273 VL - 5 IS - 2 SN - 2148-7456 M3 - UR - Y2 - 2018 ER -
EndNote %0 International Journal of Assessment Tools in Education Effects of Various Simulation Conditions on Latent-Trait Estimates: A Simulation Study %A Hakan Koğar %T Effects of Various Simulation Conditions on Latent-Trait Estimates: A Simulation Study %D 2018 %J International Journal of Assessment Tools in Education %P 2148-7456 %V 5 %N 2 %R %U