Testing invariance between web and paper students satisfaction surveys: A case study

  1. Monica Martinez-Gomez 1
  2. Juan A. Marin-Garcia 1
  3. Martha Giraldo O'Meara 2

  1. Universidad Politécnica de Valencia, Valencia, Spain (ROR: https://ror.org/01460j859)
  2. Universitat de València, Valencia, Spain (ROR: https://ror.org/043nxc105)

Journal:
Intangible Capital

ISSN: 1697-9818

Year of publication: 2017

Volume: 13

Issue: 5

Pages: 879-901

Type: Article

DOI: 10.3926/IC.1049 (open access)


Abstract

Purpose: This paper studied the measurement invariance (MI) of a satisfaction survey across web-based and paper-based administration, to establish whether the two data-collection techniques can be regarded as equivalent.

Design/methodology/approach: We developed a multigroup confirmatory factor analysis (MGCFA) with maximum likelihood estimation to assess the measurement invariance of the Job Diagnostic Survey (JDS) adapted to teaching, with data collected from paper and web surveys. The paper-survey sample consisted of 294 students of a Spanish public university in the academic years 2007-08, 2008-09 and 2009-10. The internet surveys were administered through LimeSurvey, an open-source survey application, and yielded 241 completed questionnaires.

Findings: The results show that metric invariance, covariance invariance, invariance of latent-factor variances and invariance of measurement errors can be established between the two groups. We conclude that both methods of collecting data can be considered equivalent.

Research limitations/implications: This study was conducted with a particular sample and a single, narrowly focused questionnaire, so the findings may not generalize. Future work should extend it to other universities and to graduate students.

Originality/value: The results showed that the factor structure remained invariant across the internet-based and paper-based groups; that is, both methods of collecting data can be considered equivalent, with the same factor structure, factor loadings, measurement errors and reliability. These findings are useful for researchers, since they add a new sample in which web and paper questionnaires are equivalent, and for teachers who wish to change the teaching methodology at university and to encourage students' participation and teamwork through active methodologies.
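The MGCFA procedure summarized in the abstract fits a sequence of increasingly constrained models (configural, metric, and so on) and tests each added constraint with a chi-square difference test and the ΔCFI ≤ .01 criterion of Cheung and Rensvold (2002, listed in the references). As a minimal sketch of the decision rule applied at one such step — assuming plain (unscaled) ML chi-square values, and with fit statistics that are invented for illustration, not taken from the paper:

```python
import math

def chi2_sf(x, df):
    """Survival function (upper tail) of the chi-square distribution for
    integer df, via the regularized upper incomplete gamma Q(df/2, x/2)."""
    y = x / 2.0
    if df % 2 == 0:
        s, q = 1.0, math.exp(-y)             # base case Q(1, y) = exp(-y)
    else:
        s, q = 0.5, math.erfc(math.sqrt(y))  # base case Q(1/2, y) = erfc(sqrt(y))
    while s < df / 2.0:                      # recurrence: Q(s+1, y) = Q(s, y) + y^s e^-y / Gamma(s+1)
        q += y ** s * math.exp(-y) / math.gamma(s + 1.0)
        s += 1.0
    return q

def invariance_step(chi2_c, df_c, cfi_c, chi2_f, df_f, cfi_f,
                    alpha=0.05, cfi_cutoff=0.01):
    """Compare a constrained MGCFA model (_c) against the freer nested
    model (_f). The constraint is retained if the chi-square difference
    is non-significant or the CFI drop stays within the cutoff
    (Cheung & Rensvold, 2002)."""
    d_chi2, d_df = chi2_c - chi2_f, df_c - df_f
    p = chi2_sf(d_chi2, d_df)
    d_cfi = cfi_f - cfi_c
    return {"delta_chi2": d_chi2, "delta_df": d_df, "p": p,
            "delta_cfi": d_cfi,
            "invariance_supported": p > alpha or d_cfi <= cfi_cutoff}

# Hypothetical fit statistics (NOT the paper's): metric vs. configural model.
step = invariance_step(chi2_c=110.4, df_c=53, cfi_c=0.951,
                       chi2_f=105.2, df_f=50, cfi_f=0.955)
```

In this invented example Δχ² = 5.2 on 3 df is non-significant and ΔCFI = .004 ≤ .01, so the metric-invariance constraint would be retained; the study applies the same logic successively through the covariance-, variance- and error-invariance levels.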

Funding information

This paper has been written with financial support from the project "Validación de las Competencias Transversales de Innovación mediante un enfoque formativo" (GV/2016/004) of the Conselleria d'Educació, Investigació, Cultura i Esport (Generalitat Valenciana).

Bibliographic References

  • American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: American Educational Research Association
  • Aster, A.Z. (2004). Consumer research goes online. Marketing Magazine, 7, 13-14
  • Aydin, B., & Ceylan, A. (2008). The employee satisfaction in metalworking manufacturing: How do organizational culture and organizational learning capacity jointly affect it?. Journal of Industrial Engineering and Management, 1(2), 143-168
  • Barak, M., Ben-Chaim D., & Zoller, U. (2007). Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Research in Science Education, 37, 353-369. https://doi.org/10.1007/s11165-006-9029-2
  • Bartram, D. (2005). The great eight competencies: A criterion-centric approach to validation. Journal of Applied Psychology, 90, 1185-1203. https://doi.org/10.1037/0021-9010.90.6.1185
  • Bentler, P.M., & Bonett, D.G. (1980). Significance tests and goodness of fit in the analysis of covariance structures. Psychological Bulletin, 88, 588-606. https://doi.org/10.1037/0033-2909.88.3.588
  • Bollen, K.A., & Long, J.S. (1993). Testing Structural Equation Models. Newbury Park, California: Sage
  • Bosnjak, M., Tuten, T.L., & Wittmann, W.W. (2005). Unit (non)response in web-based access panel surveys: An extended planned-behavior approach. Psychology & Marketing, 22(6), 489-505. https://doi.org/10.1002/mar.20070
  • Bowling, A. (2005). Mode of questionnaire administration can have serious effects on data quality. Journal of Public Health, 27, 281-291. https://doi.org/10.1093/pubmed/fdi031
  • Brown, M.W., & Cudeck, R. (1993). Alternative ways of assessing model fit. In K.A. Bollen & J.S. Long (Eds.), Testing structural equation models (pp.136-162). Newbury Park, California: Sage
  • Buchanan, T., & Smith, J.L. (1999). Using the internet for psychological research: Personality testing on the World Wide Web. British Journal of Psychology, 90, 125-144. https://doi.org/10.1348/000712699161189
  • Byrne, B.M. (1989). Multigroup comparisons and the assumption of equivalent construct validity across groups: Methodological and substantive issues. Multivariate Behavioural Research, 24, 503-523. https://doi.org/10.1207/s15327906mbr2404_7
  • Byrne, B.M., & Stewart, S.M. (2006).The MACS approach to testing for multigroup invariance of a second-order structure: A walk through the process. Structural Equation Modeling, 13, 287-321. https://doi.org/10.1207/s15328007sem1302_7
  • Byrne, B.M., & Van De Vijver, F.J.R. (2010). Testing for measurement and structural equivalence in large-scale cross-cultural studies: Addressing the issue of nonequivalence. International Journal of Testing, 10, 107-132. https://doi.org/10.1080/15305051003637306
  • Chen, F.F. (2007). Sensitivity of goodness of fit indices to lack of measurement invariance. Structural Equation Modeling, 14, 464-504. https://doi.org/10.1080/10705510701301834
  • Chen, F.F., Sousa, K.H., & West, S.G. (2005). Testing measurement invariance of second-order factor models. Structural Equation Modeling, 12, 471-492. https://doi.org/10.1207/s15328007sem1203_7
  • Cheung, G.W. (2008). Testing equivalence in the structure, means, and variances of higher-order constructs with structural equation modeling. Organizational Research Methods, 11(3), 593-613. https://doi.org/10.1177/1094428106298973
  • Cheung, G.W., & Rensvold, R.B. (2002). Evaluating goodness-of-fit indices for testing measurement Equivalence. Structural Equation Modeling, 9, 233-255. https://doi.org/10.1207/S15328007SEM0902_5
  • Cohen, S., Kamarck, T., & Mermelstein, R. (1983). A global measure of perceived stress. Journal of Health and Social Behavior, 24, 385-396. https://doi.org/10.2307/2136404
  • Cole, M.S., Bedeian, A.G., & Feild, H.S. (2006). The measurement equivalence of web-based and paper-and-pencil measures of transformational leadership: A multinational test. Organizational Research Methods, 9(2), 339-368. https://doi.org/10.1177/1094428106287434
  • Cook, C., Heath, F., & Thompson, R.L. (2000). A meta-analysis of response rates in Web- or Internet-based surveys. Educational and Psychological Measurement, 60, 821-836. https://doi.org/10.1177/00131640021970934
  • Curran, P.J., West, S.G., & Finch, J.F. (1996). The robustness of test statistics to nonnormality and specification error in confirmatory factor analysis. Psychological Methods, 1(1), 16-29. https://doi.org/10.1037/1082-989X.1.1.16
  • Davidov, E., & Depner, F. (2011). Testing for measurement equivalence of human values across online and paper-and-pencil surveys. Quality & Quantity, 45(2), 375-390. https://doi.org/10.1007/s11135-009-9297-9
  • De Beuckelaer, A., & Lievens, F. (2009). Measurement equivalence of paper-and-pencil and Internet organisational surveys: A large scale examination in 16 countries. Applied Psychology, 58(2), 336-361. https://doi.org/10.1111/j.1464-0597.2008.00350.x
  • Dillman, D.A. (2000). Mail and Internet Surveys: The Tailored Design Method (2nd Eds.). New York: John Wiley & Sons
  • Dillman, D., Smyth, J., & Christian, L. (2009). Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. New York: Wiley
  • Dimitrov, D.M. (2006). Validation of cognitive operations and processes across ability levels and individual test items. In T.E. Scruggs & M.A. Mastropieri (Eds.), Advances in Learning and Behavioral Disabilities (pp. 55-81). San Diego, CA: Elsevier Ltd. https://doi.org/10.1016/S0735-004X(06)19003-5
  • Drasgow, F., & Schmidt, N. (2002). Measuring and Analyzing Behaviour in Organizations: Advances in Measurement and Data Analysis. San Francisco: Jossey Bass
  • Ebenezer, J.V., Columbus, R., Kaya, O.N., Zhang, L., & Ebenezer, D.L. (2012). One science teacher's professional development experience: A case study exploring changes in students' perceptions of their fluency with innovative technologies. Journal of Science Education and Technology, no, xx-xx. Retrieved from: http://www.springerlink.com/content/q03j2118040t6863
  • Elosua, P. (2005). Evaluación progresiva de la invarianza factorial entre las versiones original y adaptada de una escala de autoconcepto. Psicothema, 17(2), 356-362
  • Epstein, J., Klinkenberg, W.D., Wiley, D., & McKinley, L. (2001). Ensuring sampling equivalence across Internet and paper-and-pencil assessments. Computers in Human Behavior, 17, 339-346. https://doi.org/10.1016/S0747-5632(01)00002-4
  • Fan, W., & Yan, Z. (2010). Factors affecting response rates of the web survey: A systematic review. Computers in Human Behavior, 26(2), 132-139. https://doi.org/10.1016/j.chb.2009.10.015
  • Fang, J., Wen, C., & Pavur, R. (2012). Participation willingness in web surveys: Exploring effect of sponsoring corporation's and survey provider's reputation. Cyberpsychology, Behavior, and Social Networking, 15(4), 195-199. https://doi.org/10.1089/cyber.2011.0411
  • Fang, J., Wen, C., & Prybutok, V.R. (2014). An assessment of equivalence between internet and paper-based surveys: Evidence from collectivistic cultures. Quality & Quantity, 48(1), 493-506. https://doi.org/10.1007/s11135-012-9783-3
  • Fouladi, R.T., McCarthy, C.J., & Moller, N.P. (2002). Paper-and-pencil or online? Evaluating mode effects on measures of emotional functioning and attachment. Assessment, 9(2), 204-215. https://doi.org/10.1177/10791102009002011
  • Gangestad, S., & Snyder, M. (1985). 'To carve nature at its joints': On the existence of discrete classes in personality. Psychological Review, 92, 317-349. https://doi.org/10.1037/0033-295X.92.3.317
  • Giraldo-O'Meara, M., Marin-Garcia, J.A., & Martínez-Gómez, M. (2014). Validation of the JDS satisfaction scales applied to educational university environments. Journal of Industrial Engineering and Management, 7(1), 72-99. https://doi.org/10.3926/jiem.906
  • Göritz, A.S. (2006). Incentives in web studies: methodological issues and a review. International Journal of Internet Science, 1(1), 58-70
  • Hackman, J.R., & Oldham, G.R. (1975). Development of the Job Diagnostic Survey. Journal of Applied Psychology, 60(2), 159-170. https://doi.org/10.1037/h0076546
  • Hackman, J.R., & Oldham, G.R. (1976). Motivation through the design of the work: Test of a theory. Organizational Behaviour and Human Performance, 16, 250-279. https://doi.org/10.1016/0030-5073(76)90016-7
  • Hackman, J.R., & Oldham, G.R. (1980). Work Redesign. Reading, MA: Addison-Wesley
  • Hair, J.F., Anderson, R.E., Tatham, R.L., & Black, W.C. (1998). Multivariate Data Analysis (6th ed.). New York: Prentice Hall International
  • Herrero, J., & Meneses, J. (2006). Short Web-based versions of the perceived stress (PSS) and Center for Epidemiological Studies-Depression (CESD) Scales: A comparison to pencil and paper responses among Internet users. Computers in Human Behavior, 22(5), 830-846. https://doi.org/10.1016/j.chb.2004.03.007
  • Hogg, A. (2003). Web efforts energize customer research. Electronic Perspectives, 6, 81-83
  • Hohwü, L., Lyshol, H., Gissler, M., Jonsson, S.H., Petzold, M., & Obel, C. (2013). Web-based versus traditional paper questionnaires: A mixed-mode survey with a Nordic perspective. Journal of Medical Internet Research, 15(8), e173. https://doi.org/10.2196/jmir.2595
  • Hu, L., & Bentler, P.M. (1995). Evaluating model fit. In R. H. Hoyle (Ed.), Structural Equation Modeling. Concepts, Issues, and Applications (pp.76-99). London: Sage
  • Hu, L., & Bentler, P.M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6(1), 1-55. https://doi.org/10.1080/10705519909540118
  • Ismail, A., Mashkuri, A., Sulaiman, A., & Kee Hock, W. (2011). Interactional justice as a mediator of the relationship between pay for performance and job satisfaction. Intangible Capital, 7(2), 213-235
  • Jöreskog, K.G. (1971). Simultaneous factor analysis in several populations. Psychometrika, 36, 409-426. https://doi.org/10.1007/BF02291366
  • King, W., & Miles, E. (1995). Quasi-experimental assessment of the effect of computerizing noncognitive paper and pencil measurements: A test of measurement equivalence. Journal of Applied Psychology, 80(6), 643-651. https://doi.org/10.1037/0021-9010.80.6.643
  • Kline, R.B. (2010). Principles and Practice of Structural Equation Modeling. NY, London: The Guilford Press
  • Kraut, A.I., & Saari, L.M. (1999). Organizational surveys: Coming of age for a new era. In A.I. Kraut & A.K. Korman (Eds), Evolving practices in human resource management (pp. 302-327). San Francisco, CA: Jossey-Bass
  • Lautenschlager, G.J., & Flaherty, V.L. (1990). Computer administration of questions: More desirable or more social desirability?. Journal of Applied Psychology, 75, 310-314. https://doi.org/10.1037/0021-9010.75.3.310
  • Leung, D., & Kember, D. (2005). Comparability of data gathered from evaluation questionnaires on paper and through the internet. Research in Higher Education, 46(5), 571-591. https://doi.org/10.1007/s11162-005-3365-3
  • Marbach-Ad, G., & Sokolove, P.G. (2002). The use of e-mail and in-class writing to facilitate student-instructor interaction in large-enrollment traditional and active learning classes. Journal of Science Education Technology, 11 (2), 109-119. https://doi.org/10.1023/A:1014609328479
  • Martínez-Gomez, M., & Marin-Garcia, J.A. (2009). Como medir y guiar el cambio hacia entornos educativos universitarios más motivadores para los alumnos. Formación Universitaria, 2, 3-14. https://doi.org/10.4067/S0718-50062009000400002
  • Martínez Gómez, M., Marin-Garcia, J., & Giraldo-O'Meara, M. (2016). The measurement invariance of job diagnostic survey (jds) across three university student groups. Journal of Industrial Engineering and Management, 9(1), 17-34. https://doi.org/10.3926/jiem.1783
  • Martins, N. (2010). Measurement model equivalence in web- and paper-based surveys. Southern African Business Review, 14(3), 77-107
  • Meade, A.W., Michels, L.C., & Lautenschlager, G.J. (2007). Are internet and paper-and-pencil personality tests truly comparable? An experimental design measurement invariance study. Organizational Research Methods, 10(2), 322-345. https://doi.org/10.1177/1094428106289393
  • Miles, E.W., & King, W.C. (1998). Gender and administration mode effects when pencil-and-paper personality tests are computerized. Educational and Psychological Measurement, 58, 66-74. https://doi.org/10.1177/0013164498058001006
  • Nulty, D.D. (2008). The adequacy of response rates to online and paper surveys: What can be done?. Assessment & Evaluation in Higher Education, 33(3), 301-314. https://doi.org/10.1080/02602930701293231
  • Orgambídez-Ramos, A., Borrego-Alés, Y., & Mendoza-Sierra, I. (2014). Role stress and work engagement as antecedents of job satisfaction in Spanish workers. Journal of Industrial Engineering and Management, 7(1), 360-372. https://doi.org/10.3926/jiem.992
  • Radloff, L. (1977). The CES-D Scale: A self-report depression scale for research in the general population. Applied Psychological Measurement, 1, 385-401. https://doi.org/10.1177/014662167700100306
  • Reips, U.D. (2000). The web experiment method: Advantages, disadvantages and solutions. In M.H. Birnbaum (Ed.), Psychological Experiments on the Internet (pp. 89-117). San Diego, CA: Academic Press. https://doi.org/10.1016/B978-012099980-4/50005-8
  • Riva, G., Teruzzi, T., & Anolli, L. (2003). The use of the internet in psychological research: Comparison of online and offline questionnaires. CyberPsychology & Behavior, 6, 73-80. https://doi.org/10.1089/109493103321167983
  • Roberts, L.L., Konczak, L.J., & Macan, T.H. (2004). Effects of data collection method on organizational climate survey results. Applied H.R.M Research, 9, 13-26
  • Santos Rego, M.A., Godás Otero, A., Lorenzo Moledo, M., & Gómez Fraguela, J.A. (2010). Eficacia y satisfacción laboral de los profesores no universitarios: Revisión de un instrumento de medida. Revista Española de Pedagogía, 245, 151-168
  • Satorra, A., & Bentler, P.M. (1994). Corrections to test statistic and standard errors in covariance structure analysis. In A. von Eye & C.C. Clogg (Eds.), Analysis of Latent Variables in Developmental Research (pp. 399-419). Thousand Oaks, CA: Sage
  • Satorra, A., & Bentler, P.M. (2001). A Scaled difference chi-square test statistic for moment structure analysis. Psychometrika, 66(4), 507-514. https://doi.org/10.1007/BF02296192
  • Schaeffer, D., & Dillman, D. (1998). Development of a standard e-mail methodology: Results of an experiment. Public Opinion Quarterly, 62, 378-397. https://doi.org/10.1086/297851
  • Schonlau, M., Fricker, R.D., & Elliott, M.N. (2002). Conducting research surveys via e-mail and the web. Santa Monica, CA: Rand Corporation
  • Simsek, Z., & Veiga, J.F. (2001). A primer on internet organizational surveys. Organizational Research Methods, 4, 218-235. https://doi.org/10.1177/109442810143003
  • Sproull, L.S. (1986). Using electronic mail for data collection in organizational research. Academy of Management Journal, 29, 159-169. https://doi.org/10.2307/255867
  • Stanton, J.M. (1998). An empirical assessment of data collection using the internet. Personnel Psychology, 51, 709-725. https://doi.org/10.1111/j.1744-6570.1998.tb00259.x
  • Steenkamp, J.B.E.M., & Baumgartner, H. (1998). Assessing measurement invariance in crossnational consumer research. Journal of Consumer Research, 25, 78-90. https://doi.org/10.1086/209528
  • Steinmetz, H., Schmidt, P., Tina-Booh, A., Wieczorek, S., & Schwartz, S. (2009). Testing measurement invariance using multigroup CFA: differences between educational groups in human values measurement. Quality & Quantity, 43(4), 599-616. https://doi.org/10.1007/s11135-007-9143-x
  • Trullas, I., & Enache, M. (2011). Theoretical analysis of the antecedents and the consequences of students' identification with their university and their perception of quality. Intangible Capital, 7(1), 170-212. https://doi.org/10.3926/ic.2011.v7n1.p170-212
  • Ullman, J.B., & Bentler, P.M. (2004). Structural equation modeling. In M. Hardy & A. Bryman (Eds.), Handbook of Data Analysis (pp. 431-458). London: Sage
  • Van Gelder, M.M., Bretveld, R.W., & Roeleveld, N. (2010). Web-based questionnaires: The future in epidemiology?. American Journal of Epidemiology, 172(11), 1292-1298. https://doi.org/10.1093/aje/kwq291
  • Vandenberg, R.J. (2002). Toward a further understanding of and improvement in measurement invariance methods and procedures. Organizational Research Methods, 5(2), 139-158
  • Vandenberg, R.J., & Lance, C.E. (2000). A review and synthesis on the measurement invariance literature: Suggestions, practices and recommendations for organisational research. Organizational Research Methods, 3, 4-70. https://doi.org/10.1177/109442810031002
  • Van de Schoot, R., Lugtig, P., & Hox, J. (2012). A checklist for testing measurement invariance. European Journal of Developmental Psychology, 9(4), 486-492. https://doi.org/10.1080/17405629.2012.686740
  • Walt, N., Atwood, K., & Mann, A. (2008). Does Survey Medium Affect Responses? An Exploration of Electronic and Paper Surveying in British Columbia Schools. Journal of Technology, Learning, and Assessment, 6(7). Retrieved from: http://www.jtla.org/
  • Young, S.A., Daum, D.L., Robie, C., & Macey, W.H. (2000). Paper vs. web survey administration: Do different methods yield different results?. Proceedings of the 15th Annual Conference of the Society for Industrial and Organizational Psychology, New Orleans, LA
  • Yu, S.C., & Yu, M.N. (2007). Comparison of Internet-based and paper-based questionnaires in Taiwan using multisample invariance approach. CyberPsychology & Behavior, 10(4), 501-507. https://doi.org/10.1089/cpb.2007.9998
  • Yun, G.W., & Trumbo, C.W. (2000). Comparative response to a survey executed by post, e-mail, & web form. Journal of Computer-Mediated Communication, 6(1). https://doi.org/10.1111/j.1083-6101.2000.tb00112.x