A didactic look at explaining the performance differences between Moroccan and Quebec students in international assessments
Abstract
This article presents a study in science education that aimed to better understand and explain the differences in performance between 15-year-old Moroccan and Quebec students on an international assessment. The study adopts a diagnostic perspective on conceptions in science (Kraus and Minstrell, 2002; Thijs and Van Den Berg, 1995; Tsai and Chou, 2002). The theoretical framework draws on Balacheff's (1995) characterization of conceptions in terms of the subject/milieu dynamic. The research design is mixed, predominantly qualitative. Our results attest to a strong influence of culturally rooted conceptions on how students respond to the standardized items used in certain large international surveys. These results offer avenues for reflection on how to put into perspective the explanations of performance differences between countries with different cultural backgrounds, as well as possible ways to minimize these differences. Our study therefore raises a didactic question about how the content and format of so-called standardized items are modeled: what invariants should govern the wording and presentation of a standardized science item, given the contextually valid conceptions that orient students' responses?
Article Details
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work.
References
Allal, L. (2007). Évaluation dans le contexte de l’apprentissage situé : peut-on concevoir l’évaluation comme un acte de participation à une communauté de pratiques ? Dans M. Behrens, La qualité en éducation : pour réfléchir à la formation de demain (p. 39-56). Québec : Presses de l’Université du Québec.
Angell, C. (2004). Exploring Students’ Intuitive Ideas Based on Physics Items in TIMSS 1995. In C. Papanastasiou (Ed.), Proceedings of the IRC-2004 TIMSS. Nicosia: Cyprus University Press, p. 108-123.
Angell, C., Kjærnsli, M., & Lie, S. (2006). Curricular and cultural effects in patterns of students’ responses to TIMSS science items. In S. J. Howie & T. Plomp (Eds.), Contexts of learning mathematics and science: Lessons learned from TIMSS. London: Routledge, p. 277-290.
Bachelard, G. (1938). La formation de l’esprit scientifique. Paris : Vrin.
Balacheff, N. (1995). Conception, propriété du système sujet/milieu. In R. Noirfalise & M.-J. Perrin-Glorian (Eds.), Actes de la VIIe École d'été de didactique des mathématiques (p. 215-229). Clermont-Ferrand : IREM de Clermont-Ferrand.
Bertrand, R., & Blais, J.G. (2004). Modèles de mesure – L’apport de la théorie des réponses aux items. Sainte-Foy, Québec : Presses de l’Université du Québec.
Blum, A., Goldstein, H., & Guerin-Pace, F. (2001). International adult literacy survey (IALS): An analysis of international comparisons of adult literacy. Assessment in Education, 8(2), 225-246. DOI: https://doi.org/10.1080/09695940123977
Boutin, G. (2000). L’entretien de recherche qualitatif. Sainte-Foy : Presses de l’Université du Québec.
Brousseau, G. (1976). Les obstacles épistémologiques et les problèmes en mathématiques. Reprinted in Recherches en didactique des mathématiques, 4(2), 164-198 (1983).
Bulle, N. (2010, mars-avril). L’imaginaire réformateur. PISA et les politiques de l’École. Le Débat, 159, 95-109. DOI: https://doi.org/10.3917/deba.159.0095
Caleon, I., & Subramaniam, R. (2010). Do Students Know What They Know and What They Don’t Know? Using a Four-Tier Diagnostic Test to Assess the Nature of Students’ Alternative Conceptions. Research in Science Education, 40(3), 313-337. DOI: https://doi.org/10.1007/s11165-009-9122-4
Carr, M. (1996). Interviews about instances and interviews about events. New York: Teachers College Press.
Chang, C. Y., Yeh, T. K., & Barufaldi, J. P. (2010). The positive and negative effects of science concept tests on student conceptual understanding. International Journal of Science Education, 32(2), 265-282. DOI: https://doi.org/10.1080/09500690802650055
Chi, M. T. H. (2008). Three types of conceptual change: belief revision, mental model transformation and categorical shift. In S. Vosniadou (Ed.), International Handbook of Research on Conceptual Change (p. 61-82). New York: Routledge.
Clarke, D. (2006). The LPS research design. In D. Clarke, C. Keitel & Y. Shimizu (Eds.), Mathematics classrooms in twelve countries: The insider’s perspective (pp. 15-36). Rotterdam: Sense Publishers. DOI: https://doi.org/10.1163/9789087901622_003
Confrey, J. (1986). Misconceptions across subject matters: Charting the course from a constructivist perspective. Annual meeting of the American Educational Research Association. Document polycopié.
Coppens, N., & Munier, V. (2005). Évaluation d'un outil méthodologique, le double QCM, pour le recueil de conceptions et l'analyse de raisonnements en physique. Didaskalia, 27, 41-77. DOI: https://doi.org/10.4267/2042/23946
Creswell, J. W. (1994). Research design: Qualitative and quantitative approaches. Thousand Oaks/London/New Delhi: SAGE Publications.
Duit, R. (1999). Conceptual Change Approaches in Science Education. In W. Schnotz, S. Vosniadou, & M. Carretero (Eds.), New Perspectives on Conceptual Change (p. 263-282). Amsterdam: Pergamon Press.
Duit, R., & Treagust D.F. (2003). Conceptual change: a powerful framework for improving science teaching and learning. International Journal of Science Education, 25 (6), 671-688. DOI: https://doi.org/10.1080/09500690305016
Finegold, M., & Gorsky, P. (1991). Students' concepts of force as applied to related physical systems: a search for consistency. International Journal of Science Education, 13(1), 97-113. DOI: https://doi.org/10.1080/0950069910130109
Franklin, B. J. (1992). The development and application of a two-tier diagnostic instrument to detect misconceptions in the area of force, heat, light and electricity. Dissertation Abstracts International, 53(12), 41-86.
Gilles, J.-L. (2002). Qualité spectrale des tests standardisés universitaires – Mise au point d’indices édumétriques d’analyse de la qualité spectrale des évaluations des acquis des étudiants universitaires et application aux épreuves MOHICAN check up 99. Thèse de doctorat inédite en Sciences de l’éducation. Liège : Université de Liège, Faculté de psychologie et des sciences de l’éducation de l’université de Liège.
Givry, D. (2003). Le concept de masse en physique : quelques pistes à propos des conceptions et des obstacles. Didaskalia, 22, 41-6. DOI: https://doi.org/10.4267/2042/23920
Goldstein, H. (2004b). International comparative assessment: how far have we really come? Assessment in Education, 11(2), 227-234.
Grønmo, L. S., Kjærnsli, M., & Lie, S. (2004). Looking for cultural and geographical factors in patterns of response to TIMSS items. In C. Papanastasiou (Ed.), Proceedings of the IRC-2004 TIMSS. Nicosia: Cyprus University Press, vol. 1, p. 99-112.
Gustafsson, J.-E., & Rosen, M. (2004). The IEA 10-Year Trend Study of Reading Literacy: A multivariate reanalysis. In C. Papanastasiou (Ed.), Proceedings of the IRC-2004. Nicosia: Cyprus University Press, vol. 1, p. 99-112.
Hallden, O. (1999). Conceptual change and contextualisation. In W. Schnotz & S. Vosniadou & M. Carretero (Eds.), New Perspectives on Conceptual Change (p. 53-65). Amsterdam: Pergamon Press.
Halloun, I. A., & Hestenes, D. (1985). The initial knowledge of college physics students. American Journal of Physics, 53(11), 1043-1055. DOI: https://doi.org/10.1119/1.14030
Harlow, A., & Jones, A. (2004). Why Students Answer TIMSS Science Test Items the Way They Do. Research in Science Education, 34(2), 221-238. DOI: https://doi.org/10.1023/B:RISE.0000033761.79449.56
Hegarty-Hazel, E., & Prosser, M. (1991). Relationship between students’ conceptual knowledge and study strategies. Part 1: Student learning in physics. International Journal of Science Education, 13, 303–312. DOI: https://doi.org/10.1080/0950069910130308
Hutchins, E. (1995). Cognition in the Wild. Cambridge, MA: MIT Press. DOI: https://doi.org/10.7551/mitpress/1881.001.0001
Hutchison, G., & Schagen, I. (2007). Comparisons between PISA and TIMSS – Are We the Man with Two Watches? In Loveless, T. (Ed.), Lessons Learned – What International Assessments Tell Us about Math Achievement. Washington, DC: The Brookings Institution.
Johsua, S., & Dupin, J.-J. (1993). Introduction à la didactique des sciences et des mathématiques. Paris : PUF.
Kaplan, D. (2004). The SAGE Handbook of Quantitative Methodology for the Social Sciences. University of Wisconsin – Madison: SAGE Publications, Inc. DOI: https://doi.org/10.4135/9781412986311
Kjærnsli, M., Angell, C., & Lie, S. (2002). Exploring Population 2 Students’ Ideas about Science. In D. F. Robitaille & A. E. Beaton (Eds.), Secondary Analysis of the TIMSS Data. Dordrecht: Kluwer, p. 127-144. DOI: https://doi.org/10.1007/0-306-47642-8_9
Kraus, P., & Minstrell, J. (2002). Designing Diagnostic Assessments. Proceedings of the Physics Education Research Conference. Boise, ID. DOI: https://doi.org/10.1119/perc.2002.inv.002
Kyriakides, L., & Charalambous, C. (2005). Using educational effectiveness research to design international comparative studies. Research Papers in Education, 20(4), 391-412. DOI: https://doi.org/10.1080/02671520500335816
Lafontaine, D. (2010). Les standards internationaux en éducation : la place des situations et des contextes. In D. Masciotra, F. Medzo et Ph. Jonnaert, Vers une approche située en éducation : Réflexions, pratiques, recherches et standards (p. 159-174). Cahier scientifique 111, Acfas.
Lave, J. (1988). Cognition in Practice: Mind, mathematics, and culture in everyday life. Cambridge, UK: Cambridge University Press. DOI: https://doi.org/10.1017/CBO9780511609268
Leduc, D., Riopel, M., Raîche, G. et Blais, J.-G. (2011). L’influence des définitions des habiletés disciplinaires sur la création et le choix d’items dans le PISA et le TEIMS. Mesure et évaluation en éducation, 34 (1), 97-130. DOI: https://doi.org/10.7202/1024864ar
Leontiev, A. (1974). The problem of activity in psychology. Soviet Psychology, 13(2), 4-33. DOI: https://doi.org/10.2753/RPO1061-040513024
Lee, G., Kwon, J., Park, S.-S., & Kim, J. W. (2003). Development of an instrument for measuring cognitive conflict in secondary-level science classes. Journal of Research in Science Teaching, 40(6), 585-603. DOI: https://doi.org/10.1002/tea.10099
Lundeberg, M. A., Fox, P. W., Brown, A. C., & Elbedour, S. (2000). Cultural influences on confidence: country and gender. Journal of Educational Psychology, 92(1), 152–159. DOI: https://doi.org/10.1037/0022-0663.92.1.152
Malhotra, N. K., Décaudin, J.-M., & Bouguerra, A. (2007). Études marketing avec SPSS (5e éd.). Pearson Education.
Messick, S. (1994). Alternative modes of assessment, uniform standards of validity. Princeton, NJ: Educational Testing Service. DOI: https://doi.org/10.1002/j.2333-8504.1994.tb01634.x
Millar, R., & Hames, V. (2001). Using diagnostic assessment to improve students' learning: some preliminary findings from work to develop and test diagnostic tools. In D. Psillos et al. (Eds.), Proceedings of the Third International Conference on Science Education Research in the Knowledge Based Society (p. 141-143). Thessaloniki, Greece: Art of Text Publications.
Meunier, O. (2005). Standards, compétences de base et socle commun. Dossier de synthèse, veille scientifique et technologique. Lyon : INRP. Récupéré de http://www.inrp.fr/vst/Dossiers/Standards/sommaire.htm le 25-12-2010.
Mons, N. (2008). Évaluation des politiques éducatives et comparaisons internationales : introduction. Revue française de pédagogie, 164, 5-13. DOI: https://doi.org/10.4000/rfp.1985
Nadelson, L. S., & Southerland, S. A. (2010). Development and preliminary evaluation of the Measure of Understanding of Macroevolution: Introducing the MUM. The Journal of Experimental Education, 78, 151–190. DOI: https://doi.org/10.1080/00220970903292983
Nandakumar, R. (1994). Assessing dimensionality of a set of item responses: Comparison of different approaches. Journal of Educational Measurement, 31, 17-35. DOI: https://doi.org/10.1111/j.1745-3984.1994.tb00432.x
Neidorf, T.S., Binkley, M., Gattis, K., & Nohara, D. (2006). Comparing Mathematics Content in the National Assessment of Educational Progress (NAEP), Trends in International Mathematics and Science Study (TIMSS), and Program for International Student Assessment (PISA) 2003 Assessments (NCES 2006-029). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
Palmer, D. (1997). The effect of context on students' reasoning about forces. International Journal of Science Education, 19(6), 681-696. DOI: https://doi.org/10.1080/0950069970190605
Park, C., & Bolt, D. M. (2008). Application of multi-level IRT to investigate cross-national skill profiles on TIMSS 2003. IERI monograph series: Issues and methodologies in large-scale assessments, 1, 71–96.
Potvin, P., Riopel, M., Charland, P., & Mercier, J. (2011). Portrait des différences entre les genres dans le contexte de l’apprentissage de l’électricité en fonction de la certitude exprimée lors de la production de réponses. Canadian Journal of Science, Mathematics and Technology Education, 11(4), 328-347. DOI: https://doi.org/10.1080/14926156.2011.624672
Olsen, R. V., & Lie, S. (2006). Les évaluations internationales et la recherche en éducation : principaux objectifs et perspectives. Revue française de pédagogie, 157, 11-26. DOI: https://doi.org/10.4000/rfp.393
Säljö, R. (1999). Concepts, cognition and discourse: from mental structures to discursive tools. In W. Schnotz, S. Vosniadou & M. Carretero (Eds.), New Perspectives on Conceptual Change (p. 53-65). Amsterdam: Pergamon Press.
Shealy, R. T., & Stout, W. F. (1993). A model-based standardization approach that separates true bias/DIF from group ability differences and detects test bias/DTF as well as item bias/DIF. Psychometrika, 58, 159-194. DOI: https://doi.org/10.1007/BF02294572
Thijs, G. D., & Van Den Berg, E. (1995). Cultural Factors in the Origin and Remediation of Alternative Conceptions in Physics. Science & Education, 4, 317-347. DOI: https://doi.org/10.1007/BF00487756
Tiberghien, A., Delacote, G., & Guesne, E. (1978). Méthodes et résultats concernant l'analyse des conceptions des élèves dans différents domaines de la physique. Deux exemples : les notions de chaleur et lumière. Revue française de pédagogie, 45, 25-32. DOI: https://doi.org/10.3406/rfp.1978.1674
Treagust, D. F., & Duit, R. (2008). Compatibility between cultural studies and conceptual change in science education: there is more to acknowledge than to fight straw men! Cultural Studies of Science Education, 3, 387–395. DOI: https://doi.org/10.1007/s11422-008-9096-y
Tsai, C.-C., & Chou, C. (2002). Diagnosing students' alternative conceptions in science. Journal of Computer Assisted Learning, 18(2), 157-165. DOI: https://doi.org/10.1046/j.0266-4909.2002.00223.x
Tsui, C. Y., & Treagust, D. (2009). Evaluating Secondary Students’ Scientific Reasoning in Genetics Using a Two-Tier Diagnostic Instrument. International Journal of Science Education, 1-26. DOI: https://doi.org/10.1080/09500690902951429
Turmo, A. (2003, août). Understanding a newsletter article on ozone: a cross-national comparison of the scientific literacy of 15-year-olds in a specific context. Communication présentée à la 4e conférence ESERA Research and the Quality of Science Education, Noordwijkerhout [Pays-Bas].
Tüysüz, C. (2009, juin). Development of two-tier diagnostic instrument and assess students’ understanding in chemistry. Scientific Research and Essay, 4(6), 626-631.
Van der Maren, J.-M. (2003). La recherche appliquée en pédagogie : des modèles pour l’enseignement (2e éd.). Bruxelles : De Boeck Université. DOI: https://doi.org/10.3917/dbu.maren.2003.01
Viennot, L. (1977). Le raisonnement spontané en dynamique élémentaire. Thèse de doctorat d'État inédite. Paris : université Paris 7.
Vrignaud, P. (2002). Les biais de mesure : savoir les identifier pour y remédier. Bulletin de psychologie, 55(6), 625-634. DOI: https://doi.org/10.3406/bupsy.2002.15183
Wu, M. L. (2008, September). A Comparison of PISA and TIMSS 2003 achievement results in Mathematics and Science. Paper presented at the Third IEA Research Conference, Taipei.
Wu, M. L. (2008, March). A Comparison of PISA and TIMSS 2003 achievement results in Mathematics. Paper presented at the AERA Annual Meeting, New York. DOI: https://doi.org/10.1007/s11125-009-9109-y
Zhang, J., & Norman, D. A. (1994). Representations in Distributed Cognitive Tasks. Cognitive Science, 18(1), 87-122. DOI: https://doi.org/10.1207/s15516709cog1801_3