If our goal is to prep students for high-stakes tests, they won't stand a chance against AI. Changing the outcomes of education becomes the imperative, says Christopher Dede.
“All I know about AI I learned in my book club” was one notion that popped into my head when I sat down to write this piece.
That’s how the brain works: deciphering information and discerning meaning through articulation, a process shaped by experience, culture, and values, in this case, mine. (More on the book club to come.)
It’s a key element of human cognition lacking in AI. Nevertheless, media coverage—largely focused on the large language model (LLM) chatbot ChatGPT—has misrepresented its capacity. When you start with a technology, warns Christopher Dede, “it’s a solution looking for a problem, and that’s never a good thing.”
A senior research fellow and professor in learning technologies at Harvard Graduate School of Education, Dede spoke at the AI x Education conference last month. He’s excited about the possibilities of AI in teaching and learning but addressed some important misconceptions.
Akin to a digital parrot, LLMs perceive a prompt as a set of symbols and respond accordingly, word by word. “Humans think in bigger chunks than that,” says Dede. “An LLM is like a brain without a mind—no consciousness, metacognition, agency, senses, experiences, or implicit knowledge of what it is like to have a biological body, a family, friends, a culture, and an ethical system with moral values.”
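For the technically inclined, here is a minimal sketch of that word-by-word process. It assumes Python, the open-source Hugging Face transformers library, and the small GPT-2 model, none of which are named in this piece; the point is simply that the machine repeatedly scores possible next tokens and appends the most likely one, with no grasp of what the words mean.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Load a small, publicly available language model (GPT-2) and its tokenizer.
    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    # To the model, the prompt is just a sequence of token IDs, not a question it "understands."
    input_ids = tokenizer("The library is a place where", return_tensors="pt").input_ids

    for _ in range(20):  # extend the text by 20 tokens, one at a time
        with torch.no_grad():
            logits = model(input_ids).logits  # a score for every token in the vocabulary
        next_id = logits[0, -1].argmax()      # greedily pick the single most likely next token
        input_ids = torch.cat([input_ids, next_id.view(1, 1)], dim=1)

    print(tokenizer.decode(input_ids[0]))

A production chatbot samples from those scores rather than always taking the top choice, but the underlying mechanism is the same prediction loop.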
These limits trouble Dede because in education, “human teachers remember what it was like to be a learner. AI has no sense of that at all.”
Moreover, LLMs are subject to bias and drift—veering off topic—as well as hallucinations, in which the machine simply makes stuff up, including sources.
AI does not discriminate when it draws content from the web, training freely on material without regard to veracity or intellectual property status. And students, among other users, might be none the wiser.
That said, LLMs are skilled at “reckoning,” or calculative prediction, which can complement human strengths of judgment and practical wisdom. What Data was to Picard, quips Dede, in a Star Trek analogy. The college professor foresees having AI assistants, each focused on reckoning tasks: formative assessments, for example, or content tutoring.
In the library, “AI can give us a diversity audit, tell us where we have holes in our collection, and look at usage data to find emerging topics,” says Joni Gilman. “It can help us with weeding and resource allocation,” adds the media specialist at Seckinger High School in Gwinnett County, GA, interviewed for our cover story, “Mission Critical” by Andrew Bauld.
When it comes to acing high-stakes tests, it’s hard to beat AI. And if that’s the task we’re preparing students for, we’re setting them up for failure. Changing the outcomes of education becomes the imperative, “otherwise we are training people to think in ways that are not necessary anymore,” says Dede.
Facilitating deeper understanding and emphasizing the learning journey over the artifact requires a significant shift. Take eighth-grade teacher Sarah Cooper, of Flintridge Preparatory School in La Cañada, CA. She recast a current events assignment to be “more relevant and less able to drive through using ChatGPT,” asking students, for example, to provide a personal, verbal explanation of why an idea is important to them, their communities, and so on. Cooper and Dede both emphasize the value of discussion to expand understanding as well as perspective (as in my book club).
Whether we, as a culture, will take the time to devise the most thoughtful approach to adapting AI for learning environments remains to be seen. And it will make all the difference.