Establishing the Validity and Reliability of a Program Evaluation Questionnaire using Rasch Measurement Model

Su Ling Loh, Nur Aisah Jamil, Ng Khar Thoe


This study was conducted to establish the validity and reliability of the evaluation questionnaire for the 12th Regional Congress of the Search for SEAMEO Young Scientists (SSYS) 2022 using the Rasch Measurement Model, aided by the Winsteps software.  The questionnaire contains 24 items that evaluate the Congress's objectives, inputs, and event management and administration, each rated on a 4-point scale.  The instrument was administered at the end of the three-day SSYS Congress, held virtually, during which 1,891 participants submitted their responses.  Establishing the validity and reliability of this questionnaire is crucial before further analysis is carried out.  The Rasch Model analysis showed a person reliability index of 0.87 with a person separation of 2.60, and an item reliability index of 0.96 with an item separation of 5.08.  Item polarity indicates that the point-measure correlations (PTMEA CORR) for the 24 items range from 0.67 to 0.76.  In terms of item fit, the results identified one misfit item that needs improvement in the future.  The Principal Component Analysis (PCA) shows that almost all the items are unidimensional and measure a similar trait.  Together, these results support the reliability of the questionnaire, and the researchers can proceed with further data analysis to evaluate the 12th Regional Congress of the SSYS 2022.
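As an aside for readers checking the reported figures: in Rasch measurement the reliability index R and the separation index G are algebraically linked (G = sqrt(R / (1 - R)), equivalently R = G^2 / (1 + G^2)), so each pair of values above can be cross-checked. The following minimal Python sketch (not part of the study's analysis) illustrates the relation:

```python
import math

def separation(reliability: float) -> float:
    """Separation index G implied by a Rasch reliability R: G = sqrt(R / (1 - R))."""
    return math.sqrt(reliability / (1.0 - reliability))

def reliability(separation: float) -> float:
    """Inverse relation: reliability R implied by a separation index G."""
    return separation**2 / (1.0 + separation**2)

# Cross-check the person statistics reported in the abstract:
# a person reliability of 0.87 implies a separation of about 2.59,
# consistent with the reported person separation of 2.60.
print(round(separation(0.87), 2))
print(round(reliability(2.60), 2))
```

The same check applies to the item statistics: an item reliability of 0.96 implies a separation of roughly 4.9, in the neighbourhood of the reported 5.08 (the gap is attributable to rounding of the reliability to two decimals).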


Technological Knowledge; Technological Content Knowledge; Technological Pedagogical Knowledge; Science; Augmented Reality



DOI: 10.30595/dinamika.v14i2.14786

Copyright (c) 2022 Su Ling Loh, Nur Aisah Jamil, Ng Khar Thoe

This work is licensed under a Creative Commons Attribution 4.0 International License.

ISSN: 2655-870X