Many teachers and professors have been spending their time this summer experimenting with AI tools to help them prepare slide presentations, write test and homework questions, etc. This is thanks in large part to the plethora of new tools and updated features incorporating ChatGPT that companies have released in recent weeks.

As more teachers experiment with using generative AI to create instructional materials, an important question arises: Should teachers disclose it to their students?

It’s a natural question, given widespread concerns about students using AI to write their essays or chatbots to do their homework for them: If students are expected to be transparent about how and when they use AI tools, shouldn’t educators be as well?

When Marc Watkins returns to the classroom this fall to teach a digital media studies course, he plans to show his students how he uses AI behind the scenes to prepare for his classes. Watkins is a lecturer in composition and rhetoric at the University of Mississippi and director of the university’s AI Summer Institute for Teachers of Composition, an optional program for faculty.

“When you use AI, you have to be open and honest and transparent,” Watkins says. “I think it’s important to show how you use it and how you’re going to model this behavior going forward.”

It may seem logical for teachers and professors to openly disclose their use of AI to develop instructional materials, just as they ask students to disclose AI use in their assignments, but Watkins points out that it’s not as simple as it seems. There’s a culture in universities of professors picking up materials from the web without necessarily citing them. And K-12 teachers frequently draw on materials from a variety of sources, including school and district curriculum and textbooks, resources shared by colleagues or found on websites, and materials purchased from marketplaces like Teachers Pay Teachers, he says. Yet teachers rarely tell students where those materials come from.

Watkins said that a few months ago, when he saw a demo of a new feature in a popular learning management system that uses AI to create learning materials with one click, he asked a company representative whether it could add a button that would automatically watermark such materials, so students would know when AI had been used.

But the company wasn’t receptive, he says: “The impression I got from developers, and this is the most infuriating thing about this whole situation, was that they were basically like, ‘Who cares?'”

Many educators seem to agree. In a recent survey conducted by Education Week, about 80 percent of K-12 teachers said they don’t need to tell students or parents when they use AI for lesson planning, and most said the same about designing assessments and tracking behavior. In open-ended responses, some educators said they see AI as a tool akin to a calculator, or no different from drawing content from a textbook.

But many experts say the answer depends on what a teacher is doing with the AI. A teacher might reasonably forgo disclosure when using a chatbot for a task like polishing text or drafting slides, for example, but should be upfront when using AI for something like grading assignments.

So while teachers are learning how to use generative AI tools themselves, they are also struggling with how and when to communicate what they are trying to do.

Leading by example

For Alana Winick, director of instructional technology at Pocantico Hills Central School District in Sleepy Hollow, New York, using generative AI in a new way, one that colleagues might not realize is possible, means it’s important to say so clearly.

For example, when she started using the technology to compose emails to her staff, she included the line “created in collaboration with artificial intelligence” at the end. She had relied on an AI chatbot for ideas on how to make her messages “more creative and engaging,” then “tweaked” the results into her own words, she explains. She imagines teachers might use AI in a similar way to create assignments and lesson plans. “Whatever it is, the idea has to start and end with the human user,” she stresses.

Winick, who wrote a book on AI in education, “The Age of Becoming: Artificial Intelligence and the Future of Education,” and hosts a podcast on the topic, sees the disclosure line as a temporary measure rather than a fundamental ethical requirement, because she believes this kind of AI use will soon be commonplace. “I don’t think I’ll have to do that in 10 years,” she said. “I did it to raise awareness and normalize [it] and encourage it and say, ‘It’s OK.’”

Jane Rosenzweig, director of the Harvard Writing Center, said adding disclosures will depend on how teachers are using AI.

“If instructors are using ChatGPT to generate writing feedback, I think they should absolutely tell students that,” she says. After all, she points out, the purpose of writing instruction is “two people communicating with each other.” When Rosenzweig grades students’ papers, she assumes that the writing is student-written unless otherwise noted, and she thinks students will expect the feedback to come from a human instructor unless otherwise noted.

When EdSurge posed the question in its higher education newsletter of whether teachers and professors should disclose when they use AI to create instructional materials, several readers responded that doing so was important because it provides a learning opportunity for both students and themselves.

“If you’re just using it to help brainstorm, you may not need to disclose it,” says Katie Datko, director of distance learning and instructional technology at Mount San Antonio College. “But if you’re using AI as a co-creator of content, the evolving standards for citing AI-generated content should apply.”

Searching for policy guidance

Since the release of ChatGPT, many schools and universities have rushed to create policies on the appropriate use of AI.

But most of these policies don’t address whether educators should tell students how they’re using new generative AI tools, says Pat Yongpradit, chief academic officer at Code.org and leader of TeachAI, a consortium of education organizations that develops and shares guidance for educators about AI. (EdSurge is an independent newsroom that shares a parent organization with ISTE, which is also part of the consortium. You can learn more about EdSurge’s ethics and policies here and about its supporters here.)

A toolkit for schools released by TeachAI recommends that “if teachers and students use AI systems, they should disclose and explain their use.”

But Yongpradit said his personal view is that “it depends” on what the AI is being used for. If AI is simply helping draft emails or even parts of lesson plans, it may not need to be disclosed, he explained. But core educational activities, such as grading with AI tools, should be disclosed, he said.

But even when educators decide to disclose their use of AI chatbots, doing so can be complicated, Yongpradit said. While the Modern Language Association and the American Psychological Association have issued guidelines for citing generative AI, the approaches remain clunky, he says.

“It’s like pouring new wine into old wineskins,” he says, “because you’re taking a paradigm from the past for obtaining and citing source material and fitting it to a tool that doesn’t work in the same way. What came before was human and static. It’s weird to fit AI into that model, because AI is a tool, not a source.”

For example, the output of an AI chatbot depends heavily on how the prompt is worded, and most chatbots will give slightly different answers each time, even when the exact same prompt is used.

Yongpradit recently participated in a panel discussion where students cheered when a speaker urged teachers to disclose their AI use, since teachers ask students to do the same. But to Yongpradit, the two situations are not quite equivalent.

“They’re two very different things,” he says. “As a student, you turn your work in to be graded. A teacher already knows the material and is just using the tool to do the job more efficiently.”

That said, he added, if teachers share or sell their materials, say on Teachers Pay Teachers, “then of course they should disclose it.”

The key, he said, is for states, school districts and other education agencies to develop their own policies so the rules are clear.

“Without guidance, expectations become lawless.”


