APA Reference
Kappeler, G., Monnier, E.-C., Muguerza Bengoechea, S., Gay, P., & Genoud, P. A. (2025, July). Development of a scale to assess teacher education students' perceptions of generative AI tools. Poster presented at the SSRE-SSFE-congrès annuel 2025, Lucerne, Switzerland.
Abstract
The integration of conversational artificial intelligence (AI) systems in higher education is profoundly changing teaching and learning practices, particularly in teacher education. These systems, which can generate diverse textual content and interact conversationally, are increasingly used by students to complete a wide range of academic tasks. Research (Mogavi et al., 2024; Yilmaz & Karaoglan Yilmaz, 2023) highlights several benefits, including increased productivity and efficiency on specific tasks such as writing and revising texts, preparing presentations, or acquiring technical skills. These tools are constantly accessible, allowing students to work independently and flexibly. Furthermore, their use can promote the development of critical thinking by encouraging students to evaluate and adjust the answers they receive, especially in problem-solving (Yilmaz & Karaoglan Yilmaz, 2023).
However, these conversational AI systems also pose significant challenges. Key concerns include excessive user dependency, which could limit intellectual engagement and the ability to solve tasks independently. The responses these tools provide, while often quick and helpful, can be incorrect, incomplete or ambiguous, potentially leading to confusion or misinformation among students (Wang et al., 2024). In addition, the uncritical use of these technologies risks promoting problematic academic practices such as superficial learning, thereby compromising the development of critical thinking and learner autonomy. Furthermore, as Stiegler (2016) points out, the rapid development of technologies, which often outpaces our ability to integrate them harmoniously, can exacerbate inequalities, disrupt social structures and erode critical social networks, creating a sense of isolation and disorientation. These issues raise ethical questions, particularly for teacher education, as future educators will need both to understand these tools and to guide their students towards ethical and thoughtful use.
In this context, existing research argues for the integration of generative AI tools into pedagogical practice. Alnahhal et al. (2024) highlight the need to design pedagogical tasks that encourage critical analysis and rigorous evaluation of the responses these tools provide. Levine et al. (2024) recommend integrating discussions about the use of AI into curricula to make students aware of the risks and opportunities associated with these technologies. These approaches aim to strike a balance between the benefits of these tools and the precautions needed to mitigate their negative effects. To support this integration, data on how students actually use these tools is valuable for aligning instruction as closely as possible with their identified needs.
This poster presents the development and refinement of a scale designed to explore key dimensions of students' use of generative AI tools in teacher education. Inspired by the work of Yilmaz et al. (2023) and Haglund (2023), the scale focuses on five main axes: perceived usefulness of the tools, ease of use, user attitudes towards the tools, credibility, and social influence on adoption. By combining these dimensions, the scale aims to provide a comprehensive overview of the practices and perceptions of students in teacher education.
Data for this study will be collected in the spring of 2024 from students in teacher education programmes at HEP Vaud (primary and secondary levels) and the University of Fribourg (secondary level). These two contexts offer an interesting comparison between different pedagogical approaches and educational levels, while providing cross-referenced insights into students' needs and expectations regarding generative AI tools.
The expected results will provide a deeper understanding of students' perceptions of the benefits and limitations of these tools in their education, while identifying specific trends related to educational levels, disciplinary areas, or prior experience with these technologies. This scale can also serve as a basis for longitudinal research to document the evolution of the use and perceptions of generative AI tools over the coming years.
Bibliography
Alnahhal, M., Alali, H., & Alshamsi, R. (2024). The Effect of ChatGPT on Education in the UAE. International Journal of Emerging Technologies in Learning (iJET), 19(6), 65-78.
Levine, S., Beck, S. W., Mah, C., Phalen, L., & Pittman, J. (2024). How do students use ChatGPT as a writing support? Journal of Adolescent & Adult Literacy.
Mogavi, R. H., Deng, C., Kim, J. J., Zhou, P., Kwon, Y. D., Metwally, A. H. S., ... & Hui, P. (2024). ChatGPT in education: A blessing or a curse? A qualitative study exploring early adopters’ utilization and perceptions. Computers in Human Behavior: Artificial Humans, 2(1), 100027.
Stiegler, B. (2016). Dans la disruption : Comment ne pas devenir fou ? Éditions Les Liens qui libèrent.
Stiegler, B., Béjà, A., & Padis, M.-O. (2014). Le numérique empêche-t-il de penser ? Esprit, (1), 66-78.
Wang, B., Li, S., Dong, Y., & Zhang, H. (2024, April). ChatGPT-aided education teaching. In 2024 6th International Conference on Computer Science and Technologies in Education (CSTE) (pp. 141-145). IEEE.
Yilmaz, R., & Karaoglan Yilmaz, F. G. (2023). Augmented intelligence in programming learning: Examining student views on the use of ChatGPT for programming learning. Computers in Human Behavior: Artificial Humans, 1(2), 100005.
Peer review
yes
Name of the event
SSRE-SSFE-congrès annuel 2025
Date(s) of the event
2-4 July 2025
City of the event
Lucerne
Country of the event
Switzerland
Participation by invitation
yes