There’s No Ghost in the Machine: How AI Changes Our Views of Ourselves
Aimed at integrating cutting-edge psychological science into the classroom, the Teaching Current Directions in Psychological Science column offers advice and how-to guidance about teaching a particular area of research or topic that has been the focus of an article in the APS journal Current Directions in Psychological Science.
Have you ever wondered whether someone you were communicating with was real or an artificial intelligence (AI)? ChatGPT’s launch in November 2022 threw higher education into a tizzy. Naysayers bemoaned how students would use artificial intelligence to cheat. Many faculty kicked off the new year by modifying their assignments to make them AI-proof—a Sisyphean task, as ChatGPT evolved every week. Even those optimistic about generative AI’s potential to improve learning sometimes saw it as the spectral rise of the machine.
Amid the excitement, the concern, and the rush to let AI make work easier, many people have missed that the advent of AI involves dehumanization (Bender, 2024). Large language models such as ChatGPT are trained to respond with sequences of words when fed sequences of words. The output makes sense. The output even seems to come from a sentient being. But in reality, AI acts like a “stochastic parrot”—a term coined to capture exactly this—stitching together words according to patterns in its training data, without any reference to meaning (Bender et al., 2021).
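To give students a concrete feel for what “stitching together words from patterns, without meaning” looks like, instructors comfortable with a little code could demo a toy word-level model. The sketch below is a deliberately simplified bigram sampler on an invented mini-corpus—nothing like a real large language model in scale or architecture—but it makes the “stochastic parrot” point vividly: the program produces plausible-looking word sequences while having no concept of what any word means.

```python
import random
from collections import defaultdict

# An invented toy corpus; any short text would do.
corpus = (
    "the model predicts the next word the model repeats patterns "
    "the parrot repeats words without meaning"
).split()

# Record which words follow each word in the training text.
transitions = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word].append(next_word)

def parrot(start, length=8, seed=0):
    """Emit a word sequence by repeatedly sampling an observed continuation.

    The sampler only knows word-to-word co-occurrence counts; it has
    no representation of meaning whatsoever.
    """
    random.seed(seed)
    words = [start]
    for _ in range(length - 1):
        followers = transitions.get(words[-1])
        if not followers:  # dead end: this word never had a successor
            break
        words.append(random.choice(followers))
    return " ".join(words)

print(parrot("the"))
```

Running `parrot("the")` yields fluent-seeming fragments built purely from observed word patterns—an intuition pump, under the stated simplifications, for why fluent output is not evidence of understanding.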
Because AI seems human, it paradoxically makes actual humans seem less so. Bender (2024) outlines six ways AI contributes to dehumanization, providing fodder for a variety of classes and for discussions of racism, sexism, White privilege, transphobia, emotion, and many other topics. For example, Bender discusses how metaphors comparing the brain to a computer can be reversed to view the computer as a brain. Giving AI humanlike qualities makes the “rational” computer seem superior to the emotional human.
There is a lot to be wary of when using AI. Bender (2024) alerts the reader to issues such as digital physiognomy—the use of AI to attempt to predict sexual orientation or political affiliation from photos, voice samples, or videos. The author also describes the shortcomings of the data used to train most AIs and how AI programming reinforces a White worldview (e.g., AI voice assistants speak like White people do).
The student activities described below will help students critically analyze claims about AI’s capabilities while becoming more familiar with some of its problems.
- Divide students into two groups for a 10-minute conversation. Have one group start a text conversation with their friends and the other group interact with an AI chatbot such as Bing or ChatGPT. (You can find each with a quick search, and signing up is quick and easy.) You can provide a set of questions to guide each group (e.g., What should I do this weekend? What show should I stream? Can you cheer me up?). Have each group rate their mood, the quality of the conversation, and how they felt about their conversation partner. Watch for humanization of the AI.
- To better prepare students for the influx of AI, you can tap into a range of activities on Google Arts and Culture that make fun icebreakers for any class and also give students a feel for AI’s power. You can extend the ideas below by having students discuss AI’s creative ability or its potential to help us be more creative ourselves. Some individuals already believe mind-altering substances can enhance creativity. Do students feel the same about AI?
- Have them try to distinguish real artwork from AI-generated art in this exercise.
- Provide them with a short exercise in prompt engineering—the skill of writing prompts to get more specific details from AI.
- Instructors can modify the following two writing assignments (E. Bender, personal communication) to create engaging student activities.
- Critical reading of popular press articles on AI. This writing assignment for undergraduate students can be modified for a classroom discussion or used as is to develop critical thinking about technology in general. Have students find an article about AI published within the last year on social media or in the popular press. Instruct students to summarize the AI described and identify the ideas the author wants the reader to understand. Students can then try the AI tool discussed. The challenge is for students to identify the beliefs they formed about the tool, its limitations, and the possible implications of using it for learning.
- Societal impacts of language technology. In this short graduate-level assignment, students write either a “letter to the editor,” an op-ed, a tech explanation/tutorial, an ethics lesson plan, or a tweetorial (a long explanatory thread on X, formerly known as Twitter). Have students describe what a specific AI tool (e.g., ChatGPT) does and what institutional or classroom policies related to the AI might be needed.
References
Bender, E. M. (2024). Resisting dehumanization in the age of “AI.” Current Directions in Psychological Science, 33(2), 114–120.
Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021). On the dangers of stochastic parrots: Can language models be too big? In FAccT ’21: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (pp. 610–623). Association for Computing Machinery. https://doi.org/10.1145/3442188.3445922