Assessing the Quality of Heritage Education Programs: Construction and Calibration of the Q-Edutage Scale

  1. Fontal Merillas, Olaia (1)
  2. García Ceballos, Silvia (1)
  3. Arias, Benito (1)
  4. Arias, Víctor B. (2)

Affiliations:

  1. Universidad de Valladolid, Valladolid, Spain. ROR: https://ror.org/01fvbaw18
  2. Universidad de Salamanca, Salamanca, Spain. ROR: https://ror.org/02f40zc51

Journal: Revista de psicodidáctica

ISSN: 1136-1034

Year of publication: 2019

Volume: 24

Issue: 1

Type: Article

DOI: 10.1016/J.PSICOD.2018.07.003 (open access)

Abstract

Improving the assessment of the quality of educational programs is one of the main objectives of research in heritage education. However, no instrument is available that is brief, objective, and allows the use of a common standard for unbiased quality comparison between different programs. The objective of this study was to design and develop a tool for the quality assessment of heritage education programs that maintains an appropriate balance between accuracy and brevity and can be used both on its own (e.g., for screening purposes when the number of programs to be evaluated is high) and to support broader assessment systems. Relevant quality indicators were identified on the basis of previous research and evaluations by 17 experts, resulting in 14 quality indicators that were calibrated using Item Response Theory models from the assessment of 330 heritage education programs. The scale discriminated with high precision between levels of quality (i.e., very low, low, medium, high, and very high), provided a good level of information over a wide range of the variable, and produced unbiased scores across different evaluators. The Q-Edutage scale is a relevant addition that contributes to improving the rigour of evaluation and program planning in the field of heritage education.
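
The abstract does not state which IRT model was used, but the reference list cites Samejima's (1997) graded response model and the IRTPRO software (Cai, Thissen, y du Toit, 2015). Assuming the 14 polytomous quality indicators were calibrated with that model, a minimal sketch of its form is:

\[
P^{*}_{ik}(\theta_j) = \frac{1}{1 + \exp\!\bigl[-a_i\,(\theta_j - b_{ik})\bigr]},
\qquad
P_{ik}(\theta_j) = P^{*}_{ik}(\theta_j) - P^{*}_{i,k+1}(\theta_j),
\]

where \(\theta_j\) is the latent quality of program \(j\), \(a_i\) the discrimination of indicator \(i\), and \(b_{ik}\) the threshold for rating category \(k\), with \(P^{*}_{i1} = 1\) and \(P^{*}_{i,K_i+1} = 0\). On this reading, the reported ability to separate very low through very high quality levels reflects the information accumulated across the 14 indicators over a wide range of \(\theta\).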

Funding information

This work was carried out within the framework of project EDU2015-65716-C2-1-R, funded by the Spanish Ministry of Industry, Economy and Competitiveness and the European Regional Development Fund.

Bibliographic references

  • Agrusti, F., Poce, A., y Re, M. R. (2017). MOOC design and heritage education. Developing soft and work-based skills in higher education students. Journal of E-Learning and Knowledge Society, 13(3), 97–107, https://doi.org/10.20368/1971-8829/1385.
  • Baker, F. B. (2001). The basics of item response theory. In ERIC clearinghouse on assessment and evaluation. College Park: University of Maryland.
  • Bangdiwala, K. (1987). Using SAS software graphical procedures for the observer agreement chart. Proceedings of the SAS User’s Group International Conference, 12, 1083–1088.
  • Bolívar, C. R. (2013). Instrumentos y técnicas de investigación educativa: Un enfoque cuantitativo y cualitativo para la recolección y análisis de datos (3rd ed.). Houston, Texas: Danaga.
  • Brady, M. K., y Cronin, J. J., Jr. (2001). Some new thoughts on conceptualizing perceived service quality: A hierarchical approach. Journal of Marketing, 65(3), 34–49.
  • Cai, L., Thissen, D., y du Toit, S. (2011). IRTPRO user’s guide. Lincolnwood, IL: Scientific Software International.
  • Cai, L., Thissen, D., y du Toit, S. (2015). IRTPRO for Windows [Computer software]. Lincolnwood, IL: Scientific Software International.
  • Calaf, R., Gillate, I., y Gutiérrez, S. (2015). Transitando por la evaluación de los Programas Educativos de Museos de Arte del proyecto ECPEME. Educatio Siglo XXI, 33(1), 129–150, https://doi.org/10.6018/j/222531.
  • Calaf, R., San Fabián, J. L., y Gutiérrez, S. (2017). Evaluación de programas educativos en museos: Una nueva perspectiva. Bordón, 69(1), 45–65. http://dx.doi.org/10.13042/Bordon.2016.42686
  • Cobaleda, M. (2016). The «didactic guide»: A tool for the cultural goods and heritage program. Opción, 32(11), 856–872.
  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Corral, Y. (2009). Validez y confiabilidad de los instrumentos de investigación para la recolección de datos. Revista Ciencias de la Educación, 19(33), 228–247.
  • Cozzani, G., Pozzi, F., Dagnino, F. M., Katos, A. V., y Katsouli, E. F. (2017). Innovative technologies for intangible cultural heritage education and preservation: The case of i-Treasures. Personal and Ubiquitous Computing, 21(2), 253–265, https://doi.org/10.1007/s00779-016-0991-z.
  • Deng, L. (2015). Inclusive museum and its impact on learning of special needs children. Proceedings of the Association for Information Science and Technology, 52(1), 1–4, https://doi.org/10.1002/pra2.2015.1450520100110.
  • Domínguez, A., y López, R. (2017). Patrimonios en conflicto, competencias cívicas y formación profesional en educación primaria. Revista de Educación, 375, 86–104, https://doi.org/10.4438/1988-592X-RE-2016-375-336.
  • Ferrando, P. J., y Lorenzo-Seva, U. (2017). Assessing the quality and appropriateness of factor solutions and factor score estimates in exploratory item factor analysis. Educational and Psychological Measurement, https://doi.org/10.1177/0013164417719308.
  • Fontal, O. (2003). La educación patrimonial. Teoría y práctica en el aula, el museo e Internet. Gijón: Trea.
  • Fontal, O. (2016a). The Spanish heritage education observatory / El observatorio de educación patrimonial en España. Culture and Education, 28(1), 254–266, https://doi.org/10.1080/11356405.2015.1110374.
  • Fontal, O. (2016b). Educación patrimonial: retrospectiva y prospectivas para la próxima década. Estudios Pedagógicos, 42(2), 415–436.
  • Fontal, O. (2016c). El patrimonio a través de la educación artística en la etapa de primaria. Arte, Individuo y Sociedad, 28(1), 105–120, https://doi.org/10.5209/rev_ARIS.2016.v28.n1.47683.
  • Fontal, O., y Ibáñez-Etxeberria, A. (2017). La investigación en educación patrimonial: Evolución y estado actual a través del análisis de indicadores de alto impacto. Revista de Educación, 375, 184–214.
  • Fontal, O., Ibáñez-Etxeberria, A., Martínez, M., y Rivero, M. P. (2017). El patrimonio como contenido en la etapa de Primaria: del currículum a la formación de maestros. Revista Electrónica Interuniversitaria de Formación del Profesorado, 20(2), 79–94, https://doi.org/10.6018/reifop/20.2.286321.
  • Fontal, O., y Juanola, R. (2015). La educación patrimonial: Una disciplina útil y rentable en el ámbito de la gestión del patrimonio cultural. Cadmo. International Journal of Educational Research, 23(1), 254–266, https://doi.org/10.3280/CAD2015-001002.
  • Gómez-Redondo, C., Calaf, R., y Fontal, O. (2017). Design of an instrument of analysis for heritage educational resources. Cadmo. International Journal of Educational Research, 1, 63–80, https://doi.org/10.3280/CAD2017-001008.
  • Gürçayir, S. (2013). Customary modes, modern ways: Formal, non-formal education and intangible cultural heritage. Milli Folklor, 12(100), 31–39.
  • Ibáñez-Etxeberria, A., Fontal, O., y Rivero, P. (en prensa). Educación patrimonial y TIC en España: marco normativo, variables estructurantes y programas referentes. Arbor, 195.
  • Kitungulu, L. (2015). Collaborating to enliven heritage collections. Museum International, 65(1–4), 113–122, https://doi.org/10.1111/muse.12043.
  • Klein, S., y van Boxtel, M. G. (2011). 'See, think, feel, ask, talk, listen, and wonder'. Distance and proximity in history teaching and heritage education in the Netherlands. Tijdschrift Voor Geschiedenis, 124(3), 381–395.
  • Lorenzo-Seva, U., y Ferrando, P. J. (2006). FACTOR: A computer program to fit the exploratory factor analysis model. Behavior Research Methods, 38(1), 88–91.
  • Marín-Cepeda, S., García-Ceballos, S., Vicent, N., Gillate, I., y Gómez-Redondo, C. (2017). Educación patrimonial inclusiva en OEPE: un estudio prospectivo. Revista de Educación, 375, 110–135.
  • Martín-Cáceres, M., y Cuenca, J. M. (2011). La enseñanza y el aprendizaje del patrimonio en los museos: la perspectiva de los gestores. Revista de Psicodidáctica, 16(1), 99–122.
  • Martín-Cáceres, M., y Cuenca, J. M. (2016). Communicating heritage in museums: outlook, strategies and challenges through a SWOT analysis. Museum Management and Curatorship, 31(3), 1–18, https://doi.org/10.1080/09647775.2016.1173576.
  • Maydeu-Olivares, A., y Joe, H. (2006). Limited information goodness-of-fit testing in multidimensional contingency tables. Psychometrika, 71(4), 713, https://doi.org/10.1007/s11336-005-1295-9.
  • Meade, A. W. (2010). A taxonomy of effect size measures for the differential functioning of items and scales. Journal of Applied Psychology, 95(4), 728–743, https://doi.org/10.1037/a0018966.
  • Orlando, M., y Thissen, D. (2000). Likelihood-based item-fit indices for dichotomous item response theory models. Applied Psychological Measurement, 24(1), 50–64, https://doi.org/10.1177/01466216000241003.
  • Pérez Juste, R. (2000). La evaluación de programas educativos: Conceptos básicos, planteamientos generales y problemática. Revista de Investigación Educativa, 18(2), 261–287.
  • Popham, W. J. (1983). Evaluación basada en criterios. Madrid: Magisterio Español, S.A.
  • Potočnik, R. (2017). Effective approaches to heritage education: Raising awareness through fine art practice. International Journal of Education Through Art, 13(3), 285–294, https://doi.org/10.1386/eta.13.3.285_1.
  • Rivero, P., Fontal, O., García-Ceballos, S., y Martínez, M. (2018). A model for heritage education through archaeological sites: The case of the roman city of Bilbilis. Curator, the Museum Journal, 61(2), 315–326, https://doi.org/10.1111/cura.12258.
  • Rodríguez, A., Reise, S. P., y Haviland, M. G. (2016). Evaluating bifactor models: Calculating and interpreting statistical indices. Psychological Methods, 21(2), 137–150, https://doi.org/10.1037/met0000045.
  • Samejima, F. (1997). Graded response model. En W. J. van der Linden y R. K. Hambleton (Eds.), Handbook of modern item response theory (pp. 85–100). New York, NY: Springer-Verlag.
  • Simons, H. (2011). El estudio de caso: Teoría y práctica. Madrid: Ed. Morata.
  • Stake, R. (2006). Evaluación comprensiva. Evaluación basada en estándares. Barcelona: Graó.
  • Stake, R. (2010). Investigación con estudios de casos (5.ª ed.). Madrid: Morata.
  • Stake, R., y Munson, A. (2008). Qualitative assessment of arts education. Arts Education Policy Review, 109(6), 13–22, https://doi.org/10.3200/AEPR.109.6.13-22.
  • Tay, L., Meade, A. W., y Cao, M. (2014). An overview and practical guide to IRT measurement equivalence analysis. Organizational Research Methods, 18(1), 3–46, https://doi.org/10.1177/1094428114553062.
  • Timmerman, M. E., y Lorenzo-Seva, U. (2011). Dimensionality assessment of ordered polytomous items with parallel analysis. Psychological Methods, 16, 209–220, https://doi.org/10.1037/a0023353.
  • Tsai, S. C. (2011). Multimedia courseware development for world heritage sites and its trial integration into instruction in higher technical education. Australasian Journal of Educational Technology, 27(7), 1171–1189, https://doi.org/10.14742/ajet.911.
  • Vicent, N., Ibáñez-Etxeberria, A., y Asensio, M. (2015). Evaluation of heritage education technology-based programs. Virtual Archaeology Review, 6(13), 20