Revisiones sistemáticas y meta-análisis en educación: un tutorial

Sánchez-Meca, Julio
Universidad de Murcia, Murcia, Spain
ROR: https://ror.org/03p3aeb86

Journal:
Revista Interuniversitaria de Investigación en Tecnología Educativa

ISSN: 2529-9638

Year of publication: 2022

Issue title: Metodologías aplicadas a la Tecnología Educativa

Issue: 13

Pages: 5-40

Type: Article

DOI: 10.6018/RIITE.545451

Abstract

Systematic reviews (SRs) and meta-analyses (MAs) are a well-established methodology in the Social and Health Sciences. Their purpose is to synthesize the results of empirical studies in order to answer a question of interest. Based on a comprehensive review of the literature on SRs and MAs, this article presents a tutorial on how to carry out this type of research. The process is described in seven stages: (1) formulating the research question; (2) defining the criteria for selecting the studies; (3) searching for the studies through formal and informal sources; (4) extracting the characteristics of the studies; (5) defining the study outcomes, with emphasis on effect size indices (e.g., the d family and the r family); (6) synthesis methods, distinguishing between meta-analytic synthesis and other synthesis methods; and (7) writing up and publishing the SR/MA. Recommendations are also offered on how to critically read SRs/MAs conducted by others, together with checklists and guidelines on how to report them, such as the PRISMA, AMSTAR-2, MOOSE, and REGEMA checklists, among others. Finally, the advantages and limitations of SRs/MAs are discussed and some closing reflections are offered, focusing on the importance of assessing possible biases in the results of this type of research.
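As a minimal illustration of stages (5) and (6) summarized above, the sketch below computes a d-family effect size (standardized mean difference) for each study and pools the estimates with a random-effects model. The per-study means, standard deviations, and sample sizes are hypothetical, and the DerSimonian-Laird estimator of the between-studies variance is used as one common choice; it is not necessarily the specific procedure recommended in the article.

```python
# Minimal sketch (not from the article): d-family effect sizes plus
# DerSimonian-Laird random-effects pooling. All study data are hypothetical.
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference (d family) and its large-sample variance."""
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    return d, var_d

def random_effects_pool(effects, variances):
    """DerSimonian-Laird pooling: returns (pooled d, 95% CI, tau^2)."""
    w = [1 / v for v in variances]                      # fixed-effect weights
    mean_fe = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - mean_fe)**2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)       # between-studies variance
    w_star = [1 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), tau2

# Hypothetical per-study summaries: (mean, SD, n) for treatment, then control.
studies = [(82, 10, 30, 75, 11, 30), (78, 9, 45, 74, 10, 44), (85, 12, 25, 77, 12, 26)]
ds, vs = zip(*(cohens_d(*s) for s in studies))
pooled, ci, tau2 = random_effects_pool(ds, vs)
print(f"pooled d = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f}), tau^2 = {tau2:.3f}")
```

Running the sketch prints the pooled standardized mean difference with its 95% confidence interval and the estimated between-studies variance.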

References

  • Badenes-Ribera, L., Rubio-Aparicio, M. y Sánchez-Meca, J. (2020). Meta-análisis de generalización de la fiabilidad. Informació Psicològica, 119, 17-32. https://doi.org/10.14635/IPSIC.2020.119.6
  • Bahadivand, S., Doosti-Irani, A., Karami, M., Qorbani, M. y Mohammadi, Y. (2021). Prevalence of high-risk behaviors among Iranian adolescents: A comprehensive systematic review and meta-analysis. Journal of Education and Community Health, 8(2), 135-142. https://doi.org/10.29252/jech.8.2.135
  • Becker, B.J. (1988). Synthesizing standardized mean-change measures. British Journal of Mathematical and Statistical Psychology, 41, 257-278. https://doi.org/10.1111/j.2044-8317.1988.tb00901.x
  • Borenstein, M. y Hedges, L.V. (2019). Effect sizes for meta-analysis. En Cooper, H., Hedges, L.V. y Valentine, J.C. (Eds.), The handbook of research synthesis and meta-analysis (3ª ed.) (pp. 207-243). Russell Sage Foundation.
  • Borenstein, M., Hedges, L.V., Higgins, J.P.T. y Rothstein, H.R. (2019). Introduction to meta-analysis (2ª ed.). Wiley.
  • Botella, J. y Sánchez Meca, J. (2015). Meta-análisis en ciencias sociales y de la salud. Síntesis.
  • Breidbord, J. y Croudace, T.J. (2013). Reliability generalization for Childhood Autism Rating Scale. Journal of Autism and Developmental Disorders, 43(12), 2855-2865. https://doi.org/10.1007/s10803-013-1832-9
  • Campbell, M., Katikireddi, S.V., Sowden, A. y Thomson, H. (2019). Lack of transparency in reporting narrative synthesis of quantitative data: A methodological assessment of systematic reviews. Journal of Clinical Epidemiology, 105, 1-9. https://doi.org/10.1016/j.jclinepi.2018.08.019
  • Campbell, M., McKenzie, J.E., Sowden, A., Katikireddi, S.V., Brennan, S.E., Ellis, S., Hartmann-Boyce, J., Ryan, R., Shepperd, S., Thomas, J., Welch, V. y Thomson, H. (2020). Synthesis without meta-analysis (SWiM) in systematic reviews: Reporting guideline. British Medical Journal, 368(l6890). http://dx.doi.org/10.1136/bmj.l6890
  • Card, N.A. (2012). Applied meta-analysis for social science research. Guilford Press.
  • Cheng, L., Ritzhaupt, A.D. y Antonenko, P. (2019). Effects of the flipped classroom instructional strategy on students’ learning outcomes: A meta‑analysis. Educational Technology Research and Development, 67, 793-824. https://doi.org/10.1007/s11423-018-9633-7
  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2ª ed.). Erlbaum.
  • Conn, V.S. y Rantz, M.J. (2003). Research methods: Managing primary study quality in meta-analyses. Research in Nursing and Health, 26, 322-333. https://doi.org/10.1002/nur.10092
  • Cooper, H. (2016). Research synthesis: A step-by-step approach (5ª ed.). Sage.
  • Cooper, H., Hedges, L.V. y Valentine, J.C. (Eds.) (2019). The handbook of research synthesis and meta-analysis (3ª ed.). Russell Sage Foundation.
  • Cortina, J.M. y Nouri, H. (2000). Effect size for ANOVA designs. Sage.
  • Cumming, G. (2012). Understanding the new statistics: Effect sizes, confidence intervals, and meta-analysis. Routledge.
  • Davies, P. (1999). What is evidence-based education? British Journal of Educational Studies, 47(2), 108-121. https://doi.org/10.1111/1467-8527.00106
  • Dekker, I. y Meeter, M. (2022). Evidence-based education: Objections and future directions. Frontiers in Education, 7:941410. https://doi.org/10.3389/feduc.2022.941410
  • Downes, M.J., Brennan, M.L., Williams, H.C. y Dean, R.S. (2016). Development of a critical appraisal tool to assess the quality of cross-sectional studies (AXIS). British Medical Journal Open, 6, e011458. https://doi.org/10.1136/bmjopen-2016-011458
  • Duval, S. y Tweedie, R. (2000). Trim and fill: A simple funnel-plot-based method of testing and adjusting for publication bias in meta-analysis. Biometrics, 56, 455-463. https://doi.org/10.1111/j.0006-341x.2000.00455.x
  • Egger, M., Higgins, J.P.T. y Smith, G.D. (2022). Systematic reviews in health research: Meta-analysis in context (2ª ed.). Wiley.
  • Egger, M., Smith, G.D., Schneider, M. y Minder, C. (1997). Bias in meta-analysis detected by a simple graphical test. British Medical Journal, 315, 629-634.
  • Ellis, P.D. (2010). The essential guide to effect sizes: Statistical power, meta-analysis and the interpretation of research results. Cambridge University Press.
  • Erion, J. (2006). Parent tutoring: A meta-analysis. Education and Treatment of Children, 29, 79-106.
  • Giustini, D. (2019). Retrieving grey literature, information, and data in the digital age. En Cooper, H., Hedges, L.V. y Valentine, J.C. (Eds.), The handbook of research synthesis and meta-analysis (3ª ed.) (pp. 101-126). Russell Sage Foundation.
  • Glanville, J. (2019). Searching bibliographic databases. En Cooper, H., Hedges, L.V. y Valentine, J.C. (Eds.), The handbook of research synthesis and meta-analysis (3ª ed.) (pp. 73-99). Russell Sage Foundation.
  • Glass, G.V. (1976). Primary, secondary, and meta-analysis of research. Educational Researcher, 5(10), 3-8. https://doi.org/10.3102/0013189X005010003
  • Glass, G.V. y Smith, M.L. (1978). Meta-analysis of research on the relationship of class-size and achievement. Far West Laboratory for Educational Research and Development, San Francisco (CA).
  • Glass, G.V., McGaw, B. y Smith, M.L. (1981). Meta-analysis in social research. Sage.
  • Grissom, R.J. y Kim, J.J. (2012). Effect sizes for research: Univariate and multivariate applications (2ª ed.). Routledge.
  • Hedges, L.V. (1981). Distribution theory for Glass's estimator of effect size and related estimators. Journal of Educational Statistics, 6(2), 107-128. https://doi.org/10.3102/10769986006002107
  • Higgins, J.P.T., Thomas, J., Chandler, J., Cumpston, M., Li, T., Page, M.J. y Welch, V.A. (Eds.) (2022). Cochrane handbook for systematic reviews of interventions (2ª ed.). Wiley. Disponible en: https://training.cochrane.org/handbook/current
  • Hutton, B., Salanti, G., Caldwell, D.M., Chaimani, A., Schmid, C.H. et al. (2015). The PRISMA Extension Statement for Reporting of Systematic Reviews Incorporating Network Meta-analyses of Health Care Interventions: Checklist and Explanations. Annals of Internal Medicine, 162, 777-784. https://doi.org/10.7326/M14-2385
  • Hunt, M. (1997). How science takes stock: The story of meta-analysis. Russell Sage Foundation.
  • IntHout, J., Ioannidis, J.P.A., Rovers, M.M. y Goeman, J.J. (2016). Plea for routinely presenting prediction intervals in meta-analysis. British Medical Journal Open, 6, e010247. https://doi.org/10.1136/bmjopen-2015-010247
  • Kline, R.B. (2019). Becoming a behavioral science researcher: A guide to producing research that matters (2ª ed.). Guilford Press.
  • Light, R.J. y Pillemer, D.B. (1984). Summing up: The science of reviewing research. Harvard University Press.
  • Lindberg, S.M., Hyde, J.S., Petersen, J.L. y Linn, M.C. (2010). New trends in gender and mathematics performance: A meta-analysis. Psychological Bulletin, 136(6), 1123-1135. https://doi.org/10.1037/a0021276
  • Lipsey, M.W. (2019). Identifying potentially interesting variables and analysis opportunities. En Cooper, H., Hedges, L.V. y Valentine, J.C. (Eds.), The handbook of research synthesis and meta-analysis (3ª ed.) (pp. 141-151). Russell Sage Foundation.
  • Lipsey, M.W. y Wilson, D.B. (2001). Practical meta-analysis. Sage.
  • López-López, J.A., Marín-Martínez, F., Sánchez-Meca, J., Van den Noortgate, W. y Viechtbauer, W. (2014). Estimation of the predictive power of the model in mixed-effects meta-regression: A simulation study. British Journal of Mathematical and Statistical Psychology, 67, 30-48. https://doi.org/10.1111/bmsp.12002
  • McKenzie, J.E. y Brennan, S.E. (2022). Chapter 12: Synthesizing and presenting findings using other methods. En Higgins, J.P.T., Thomas, J., Chandler, J., Cumpston, M., Li, T., Page, M.J. y Welch, V.A. (Eds.), Cochrane handbook for systematic reviews of interventions vers. 6.3. Cochrane. Disponible en: www.training.cochrane.org/handbook.
  • Moher, D., Shamseer, L., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., Shekelle, P. y Stewart, L.A. (2015). Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) 2015 statement. Systematic Reviews, 4(1):1. https://doi.org/10.1186/2046-4053-4-1
  • Morris, S.B. (2008). Estimating effect sizes from pretest-posttest-control group designs. Organizational Research Methods, 11, 364-386. https://doi.org/10.1177/1094428106291059
  • Morris, S.B. y DeShon, R.P. (2002). Combining effect size estimates in meta-analysis with repeated measures and independent-group designs. Psychological Methods, 7, 105-125. https://doi.org/10.1037/1082-989x.7.1.105
  • Nosek, B.A., Alter, G., Banks, G.C., Borsboom, D., Bowman, S.D. et al. (2015). Promoting an open research culture: Author guidelines for journals could help to promote transparency, openness, and reproducibility. Science, 348(6242), 1422–1425. https://doi.org/10.1126/science.aab2374
  • Page, M.J., Cumpston, M., Chandler, J. y Lasserson, T. (2021). Reporting the review. En Higgins, J.P.T., Thomas, J., Chandler, J., Cumpston, M., Li, T., Page, M.J. y Welch, V.A. (Eds.), Cochrane Handbook for Systematic Reviews of Interventions version 6.2 (actualizada en febrero de 2021). Cochrane Collaboration. Disponible en: www.training.cochrane.org/handbook
  • Page, M.J., McKenzie, J.E., Bossuyt, P.M., Boutron, I., Hoffmann, T.C., Mulrow, C.D., et al. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. Systematic Reviews, 10(89). https://doi.org/10.1186/s13643-021-01626-4
  • Patros, H.G., Alderson, R.M., Kasper, L.J., Tarle, S.J., Lea, S.E. y Hudec, K.L. (2016). Choice-impulsivity in children and adolescents with attention deficit/hyperactivity disorder (ADHD): A meta-analytic review. Clinical Psychology Review, 43, 162-174. https://doi.org/10.1016/j.cpr.2015.11.001
  • Petrosino, A., Boruch, R.F., Soydan, H., Duggan, L. y Sánchez-Meca, J. (2001). Meeting the challenges of evidence-based policy: The Campbell Collaboration. Annals of the American Academy of Political and Social Science, 578, 14-34. https://doi.org/10.1177/0002716201578001002
  • Petticrew, M. y Roberts, H. (2006). Systematic reviews in the social sciences: A practical guide. Blackwell.
  • Pigott, T. y Polanin, J.R. (2019). Methodological guidance paper: High-quality meta-analysis in a systematic review. Review of Educational Research, 90(1), 24-46. https://doi.org/10.3102/0034654319877153
  • Piñeiro-López, S., Martí-Vilar, M. y González-Sala, F. (2022). Intervenciones educativas en conducta prosocial y empatía en alumnado con altas capacidades: Una revisión sistemática. Bordón, 74(1), 141-157. https://doi.org/10.13042/Bordon.2022.90586
  • Rosa-Alcázar, A.I., Sánchez-Meca, J., Gómez-Conesa, A. y Marín-Martínez, F. (2008). The psychological treatment of obsessive-compulsive disorder: A meta-analysis. Clinical Psychology Review, 28, 1310-1325. https://doi.org/10.1016/j.cpr.2008.07.001
  • Rosenthal, R. (1991). Meta-analytic procedures for social research (ed. rev.). Sage.
  • Rosenthal, R., Rosnow, R.L. y Rubin, D.B. (2000). Contrasts and effect sizes in behavioral research: A correlational approach. Cambridge University Press.
  • Rothstein, H.R., Sutton, A.J. y Borenstein, M. (Eds.) (2005). Publication bias in meta-analysis: Prevention, assessment, and adjustments. Wiley.
  • Rubio-Aparicio, M., López-López, J. A., Viechtbauer, W., Marín-Martínez, F., Botella, J. y Sánchez-Meca, J. (2020). Testing categorical moderators in mixed-effects meta-analysis in presence of heteroscedasticity. Journal of Experimental Education, 88(2), 288-310. https://doi.org/10.1080/00220973.2018.1561404
  • Rubio-Aparicio, M., Sánchez-Meca, J., Marín-Martínez, F. y López-López, J.A. (2018). Guidelines for reporting systematic reviews and meta-analyses. Annals of Psychology, 34(2), 412-420. http://dx.doi.org/10.6018/analesps.34.2.320131
  • Salameh, J.-P., Bossuyt, P.M., McGrath, T.A., Thombs, B.D., Hyde, C.J., Macaskill, P. et al. (2020). Preferred reporting items for systematic review and meta-analysis of diagnostic test accuracy studies (PRISMA-DTA): explanation, elaboration, and checklist. British Medical Journal, 370(m2632).
  • Sánchez Martín, M., Navarro Mateu, F. y Sánchez-Meca, J. (2022). Las revisiones sistemáticas y la educación basada en evidencias. Espiral. Cuadernos del Profesorado, 15(30), 108-120.
  • Sánchez-Meca, J. (2008). Meta-análisis de la investigación. En Verdugo, M.A., Crespo, M., Badía, M. y Arias, B. (Coords.), Metodología en la investigación sobre discapacidad: Introducción al uso de las ecuaciones estructurales (pp. 121-139). Publicaciones del INICO (Colección ACTAS, 5/2008).
  • Sánchez-Meca, J. (2010). Cómo realizar una revisión sistemática y un meta-análisis. Aula Abierta, 38, 53-64.
  • Sánchez-Meca, J., Boruch, R.F., Petrosino, A. y Rosa-Alcázar, A.I. (2002). La Colaboración Campbell y la práctica basada en la evidencia. Papeles del Psicólogo, 22(83), 44-48.
  • Sánchez-Meca, J., López-López, J.A. y López-Pina, J.A. (2013). Some recommended statistical analytic practices when reliability generalization (RG) studies are conducted. British Journal of Mathematical and Statistical Psychology, 66, 402-425. https://doi.org/10.1111/j.2044-8317.2012.02057.x
  • Sánchez-Meca, J. y López-Pina, J.A. (2008). El enfoque meta-analítico de generalización de la fiabilidad. Acción Psicológica, 5, 37-64.
  • Sánchez-Meca, J. y Marín-Martínez, F. (2008). Confidence intervals for the overall effect size in random-effects meta-analysis. Psychological Methods, 13, 31-48. https://doi.org/10.1037/1082-989X.13.1.31
  • Sánchez-Meca, J. y Marín-Martínez, F. (2010). Meta-analysis. En P. Peterson, E. Baker y B. McGaw (Eds.), International Encyclopedia of Education (3ª ed.), Vol. 7 (pp. 274-282). Elsevier.
  • Sánchez-Meca, J., Marín-Martínez, F., López-López, J. A., Núñez-Núñez, R. M., Rubio-Aparicio, M., López-García, J. J., López-Pina, J. A., Blázquez-Rincón, D. M., López-Ibáñez, C. y López-Nicolás, R. (2021). Improving the reporting quality of reliability generalization meta-analyses: The REGEMA checklist. Research Synthesis Methods, 12(4), 516-536. https://doi.org/10.1002/jrsm.1487
  • Sánchez-Serrano, S., Pedraza-Navarro, I. y Donoso-González, M. (2022). ¿Cómo hacer una revisión sistemática siguiendo el protocolo PRISMA? Usos y estrategias fundamentales para su aplicación en el ámbito educativo a través de un caso práctico. Bordón, Revista de Pedagogía, 74(3), 51-66. https://doi.org/10.13042/Bordon.2022.95090
  • Sapp, M. (2017). Primer on effect sizes, simple research designs, and confidence intervals. Charles C. Thomas Pub., Ltd.
  • Saunders, L.D., Soomro, G.M., Buckingham, J., Jamtvedt, G. y Raina, P. (2003). Assessing the methodological quality of nonrandomized intervention studies. Western Journal of Nursing Research, 25, 223-237. https://doi.org/10.1177/0193945902250039
  • Scherer, R. y Shiddiq, F. (2019). The relation between students’ socioeconomic status and ICT literacy: Findings from a meta-analysis. Computers in Education, 138, 13-32. https://doi.org/10.1016/j.compedu.2019.04.011
  • Schmidt, F.L. y Hunter, J.E. (2015). Methods of meta-analysis: Correcting error and bias in research synthesis (3ª ed.). Sage.
  • Shea, B.J., Reeves, B.C., Wells, G., Thuku, M., Hamel, C. et al. (2017). AMSTAR 2: A critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. British Medical Journal, 358(j4008). http://dx.doi.org/10.1136/bmj.j4008
  • Stroup, D.F., Berlin, J.A., Morton, S.C., Olkin, I., Williamson, G.D., et al. (2000). Meta-analysis of observational studies in epidemiology: A proposal for reporting. Meta-analysis Of Observational Studies in Epidemiology (MOOSE) group. Journal of the American Medical Association, 283, 2008-2012. https://doi.org/10.1001/jama.283.15.2008
  • Valentine, J.C., Aloe, A.M. y Wilson, S.J. (2019). Interpreting effect sizes. En Cooper, H., Hedges, L.V. y Valentine, J.C. (Eds.), The handbook of research synthesis and meta-analysis (3ª ed.) (pp. 433-452). Russell Sage Foundation.
  • Verhagen, A.P., de Vet, H.C., de Bie, R.A., Kessels, A.G., Boers, M., Bouter, L.M., Knipschild, P.G. et al. (1998). The Delphi list: A criteria list for quality assessment of randomised clinical trials for conducting systematic reviews developed by Delphi consensus. Journal of Clinical Epidemiology, 51(12), 1235-1241. https://doi.org/10.1016/s0895-4356(98)00131-0
  • Vevea, J.L., Coburn, K. y Sutton, A. (2019). Publication bias. En Cooper, H., Hedges, L.V. y Valentine, J.C. (Eds.), The handbook of research synthesis and meta-analysis (3ª ed.) (pp. 383-429). Russell Sage Foundation.
  • Vevea, J.L., Zelinsky, N.A.M. y Orwin, R.G. (2019). Evaluating coding decisions. En Cooper, H., Hedges, L.V. y Valentine, J.C. (Eds.), The handbook of research synthesis and meta-analysis (3ª ed.) (pp. 173-204). Russell Sage Foundation.
  • Wells, G.A., Shea, B., O'Connell, D., Peterson, J., Welch, V., Losos, M. y Tugwell, P. (2000). The Newcastle–Ottawa Scale (NOS) for assessing the quality of non-randomized studies in meta-analysis. Manuscrito no publicado, Universidad de Ottawa (Canadá).
  • White, I.R., Schmid, C.H. y Stijnen, T. (2021). Choice of effect measure and issues in extracting outcome data. En Schmid, C.H., Stijnen, T. y White, I.R. (Eds.), Handbook of meta-analysis (pp. 27-39). CRC Press.
  • Wilson, D.B. (2019). Systematic coding for research synthesis. En Cooper, H., Hedges, L.V. y Valentine, J.C. (Eds.), The handbook of research synthesis and meta-analysis (3ª ed.) (pp. 153-172). Russell Sage Foundation.