Educators once worried about the dangers of CliffsNotes, study guides that boil great works of literature down to a series of bullet points, which many students have used as a substitute for actually doing the reading.
These days, that concern seems almost quaint.
Suddenly, new consumer AI tools have hit the market that can take any piece of text, audio, or video and deliver that same kind of simplified overview. And those summaries aren't just fancy bullet points. These days students can use tools like Google's NotebookLM to turn their lecture notes into a podcast, in which cheerful-sounding AI bots banter and riff on the key points. Most of the tools are free, and they do the job in seconds with the click of a button.
Understandably, all of this has raised concerns among some educators that students are letting AI do the heavy lifting of synthesizing information for them, at a pace never before possible.
But the picture is more complicated than that, especially as these tools become more mainstream and their use starts to become standard practice in business and other settings beyond the classroom.
And the tools can serve as an extra lifeline for neurodivergent students, who suddenly have access to services that can help them stay organized and support their reading comprehension, education experts say.
“There’s no one-size-fits-all answer,” says Alexis Peirce Cordell, an information studies lecturer at Indiana University Bloomington who recently gave an assignment in which many students shared their experiences with and concerns about AI tools. “Biology students are going to use it one way. Chemistry students are going to use it another way. All of my students are using it in different ways.”
But it’s not as simple as assuming that all students are cheaters, the instructor stresses.

“Some students expressed concern about feeling pressure to use the tools. If all their classmates were doing it, they felt they had to, even if they sensed it was getting in the way of authentically learning,” she says. Others, she adds, are weighing questions like, “Will this get me through this particular assignment or this particular test, given that I’m juggling five classes and an internship? And what are the costs?”
All of this poses new challenges for schools and universities seeking to set boundaries and policies for the use of AI in the classroom.
The need for “friction”
It seems like hardly a week, sometimes hardly a day, goes by without a tech company announcing a new feature that students are adopting in their studies.
Just last week, for instance, Apple released Apple Intelligence features for iPhones, one of which can recast any text in a different tone, such as casual or professional. And last month ChatGPT-maker OpenAI released a feature called Canvas that includes a slider allowing users to instantly change the reading level of a text.
Mark Watkins, a lecturer of writing and rhetoric at the University of Mississippi, says he worries that students are lured in by the time-saving promises of these tools and may not realize that using them can mean skipping the actual work it takes to internalize and remember the material.
“From a teaching and learning standpoint, that’s pretty alarming to me,” he says. “Because we want our students to have a little bit of a struggle, a little bit of friction, because that’s important for their learning.”
And new features are making it harder for teachers to encourage students to use AI in ways that support learning, he says, such as teaching them how to craft prompts that change the writing level of a text: “It eliminates that last desirable level of difficulty, when they can just mash a button to get a final draft and get feedback on the final draft, too.”
Even professors and universities that have adopted AI policies may need to rethink them in light of these new types of capabilities.
As two professors put it in a recent op-ed: “Your AI policy is already out of date.”
“If a student reads an article that you uploaded but can’t remember a key point, and uses the AI assistant to summarize it or remind them where they read it, have they broken your class’s AI rules?” ask the authors, Zach Justus, director of faculty development at California State University, Chico, and Nick Janos, a sociology professor at the school. They note that popular tools such as Adobe Acrobat now include “AI assistant” features that can summarize documents at the push of a button. “Even when we are evaluating colleagues in tenure and promotion files,” the professors write, “do we have to promise not to hit the button when we are reading through hundreds of pages of student teaching evaluations?”
Instead of drafting and redrafting AI policies, the professors argue, educators should work out broad frameworks for what counts as acceptable help from chatbots.
For his part, Watkins urges the makers of AI tools to do more to curb misuse of their systems in educational settings. As he told EdSurge, it’s about “making sure that this tool that is being used so prominently by students [is] actually effective for their learning, and not just a way to offload the work.”
Uneven accuracy
These new AI tools bring a host of challenges that go well beyond anything from the days when printed CliffsNotes were the study aid of the moment.
One is that AI summarization tools do not always provide accurate information, due to a phenomenon of large language models known as “hallucination,” in which a chatbot guesses at facts but presents them to the user as sure things.
When Bonnie Stachowiak first tried the podcast feature of Google’s NotebookLM, for example, she says she was struck by how lifelike the robot voices sounded and how well they summarized the documents she fed into the tool. Stachowiak, the host of the long-running podcast Teaching in Higher Ed and dean of the College of Teaching and Learning at Vanguard University of Southern California, regularly experiments with new AI tools in education.
But as she experimented further, feeding the tool documents on complex subjects she knew well, she noticed occasional errors and misunderstandings. “It just flattens it out, and you miss all of this nuance,” she says. “It sounds very intimate, because it’s a voice and audio is a very intimate medium. But as soon as it’s something you know well, it’s going to fall short.”
Still, she has found NotebookLM’s podcasting feature useful for understanding and communicating bureaucratic matters at her university, such as by turning part of the faculty handbook into a podcast summary. When she checked it with colleagues who knew the policies well, she says, they felt it did a “perfectly good job.” “It is really good at making two-dimensional bureaucracy more approachable,” she says.
Peirce Cordell, of Indiana University, says students are also raising ethical questions about the use of AI tools.
“Some were very concerned about the environmental costs of generative AI and its use,” she says, noting that ChatGPT and other AI models require vast amounts of computing power and electricity.
Others worry about how much data users end up giving away to AI companies, especially when students use free versions of the tools, she adds.
“We’re not having those conversations,” she says. “We’re not talking about what it means to actively resist the use of generative AI.”
Still, the instructor says she has seen the tools have a positive impact on students, such as when they use them to create flashcards to study from.
And she heard from one student with ADHD who had always found reading long texts “overwhelming,” but who used ChatGPT “to get over the hurdle of that initial engagement with the reading” and then checked their comprehension using the tool.
Stachowiak says she has also heard of students with intellectual disabilities using other AI tools, including one that helps users break large tasks into smaller, more manageable subtasks.
“This is not cheating,” she insists. “It’s about breaking things down and estimating how long something will take. That doesn’t come naturally to a lot of people.”