Article: Understanding AI guilt: the development, pilot-testing, and validation of an instrument for students

Title: Understanding AI guilt: the development, pilot-testing, and validation of an instrument for students
Authors: Chan, Cecilia Ka Yuk
Keywords: Academic integrity; AI ethics; AI guilt; AI literacy; Cognitive dissonance; Generative AI; Imposter syndrome
Issue Date: 4-Jun-2025
Publisher: Springer
Citation: Education and Information Technologies, 2025
Abstract

This study explores the concept of AI guilt, a psychological phenomenon where individuals feel guilt or moral discomfort when using generative AI tools, fearing negative perceptions from others or feeling disingenuous (Chan, 2024). The phenomenon has become increasingly relevant as AI tools gain prominence in educational contexts. This paper introduces the development, pilot-testing, and validation of an instrument designed to measure AI guilt among students. Data were collected from 121 secondary school participants at an AI teaching and learning expo. The instrument identifies three dimensions of AI guilt: perceived laziness or inauthenticity, fear of judgment, and identity and self-efficacy concerns. Principal Component Analysis (PCA) and Cronbach’s alpha were employed to refine the instrument, ensuring its reliability and validity. By understanding AI guilt, educators and policymakers can mitigate its psychological effects and promote ethical AI usage in education.
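The reliability analysis the abstract mentions — Cronbach's alpha computed over the items of each subscale — can be sketched as below. The response data are hypothetical Likert-scale scores invented for illustration, not the study's data; only the formula (alpha = k/(k-1) * (1 - sum of item variances / variance of totals)) reflects the standard method.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for one subscale.

    `items` is a list of k columns, one per item, each holding the
    same n respondents' scores in the same order.
    """
    k = len(items)
    item_vars = sum(pvariance(col) for col in items)          # sum of per-item variances
    totals = [sum(scores) for scores in zip(*items)]          # each respondent's total score
    return (k / (k - 1)) * (1 - item_vars / pvariance(totals))

# Hypothetical 5-point Likert responses from 7 respondents to a 3-item subscale
responses = [
    [4, 5, 3, 4, 5, 2, 4],   # item 1
    [4, 4, 3, 5, 5, 2, 3],   # item 2
    [5, 4, 2, 4, 5, 3, 4],   # item 3
]
print(round(cronbach_alpha(responses), 3))  # → 0.887
```

Values around 0.7 or higher are conventionally read as acceptable internal consistency, which is the criterion an instrument-refinement step like the one described would apply per dimension.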


Persistent Identifier: http://hdl.handle.net/10722/357696
ISSN: 1360-2357
2023 Impact Factor: 4.8
2023 SCImago Journal Rankings: 1.301
ISI Accession Number ID: WOS:001502035400001

 

DC Field: Value
dc.contributor.author: Chan, Cecilia Ka Yuk
dc.date.accessioned: 2025-07-22T03:14:21Z
dc.date.available: 2025-07-22T03:14:21Z
dc.date.issued: 2025-06-04
dc.identifier.citation: Education and Information Technologies, 2025
dc.identifier.issn: 1360-2357
dc.identifier.uri: http://hdl.handle.net/10722/357696
dc.language: eng
dc.publisher: Springer
dc.relation.ispartof: Education and Information Technologies
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: Academic integrity
dc.subject: AI ethics
dc.subject: AI guilt
dc.subject: AI literacy
dc.subject: Cognitive dissonance
dc.subject: Generative AI
dc.subject: Imposter syndrome
dc.title: Understanding AI guilt: the development, pilot-testing, and validation of an instrument for students
dc.type: Article
dc.identifier.doi: 10.1007/s10639-025-13629-y
dc.identifier.scopus: eid_2-s2.0-105007244775
dc.identifier.eissn: 1573-7608
dc.identifier.isi: WOS:001502035400001
dc.identifier.issnl: 1360-2357
