As ChatGPT and other generative AI tools emerged in late 2022, cheating was a top concern for educators. After all, word spread quickly among students on TikTok and other social media platforms that with a few simple prompts, a chatbot could write essays or answer homework questions in ways that would be hard for teachers to detect.

But recently, another concern about AI has come into focus: that the technology could reduce human interaction in schools and universities, and that school administrators may one day try to use AI to replace teachers.

Educators aren’t the only ones worried: this is becoming an education policy issue.

For example, last week a bill passed both houses of the California Legislature that aims to ensure that classes at the state’s community colleges are taught by qualified humans rather than AI bots.

“This is a fundamental right,” said California Assemblywoman Sabrina Cervantes, a Democrat who introduced the bill. In a statement, she said the bill’s purpose is to “create guardrails for the introduction of AI in the classroom while ensuring that community college students can receive instruction from human instructors.”

To be clear, no one seems to have actually proposed replacing professors at the state’s community colleges with ChatGPT or other generative AI tools. Even the bill’s authors say they can imagine beneficial uses for AI in education, and the bill would not stop colleges from using generative AI to help with tasks like grading and course material creation.

But supporters of the bill also say they have reason to worry about the possibility of AI replacing professors in the future. For example, earlier this year, a Boston University dean raised concerns among striking graduate student workers seeking higher wages when the dean cited AI as one possible strategy for handling class discussions and other classroom activities affected by the strike. University officials later clarified that they had no intention of replacing graduate students with AI software.

California is the furthest along, but it is not the only state considering such a measure. In Minnesota, Rep. Dan Wolgamott of the Democratic-Farmer-Labor Party proposed a bill that would ban campuses in the Minnesota State University System from using AI “as the primary instructor of a credit-bearing course.” That bill has stalled for now.

K-12 teachers are also starting to call for similar protections against AI replacing them. The National Education Association, the nation’s largest teachers union, recently issued a policy statement on the use of AI in education that stresses human educators “must remain at the heart of education.”

This reflects a complicated and tense mood among many educators, who see both promise and potential threat in generative AI technology.

Careful wording

Even education leaders pushing for measures to keep AI from replacing educators are careful to point out that the technology could have beneficial applications in education, and to choose language that stops short of outright bans on its use.

For example, California’s bill initially faced pushback even from supporters of the concept, who worried that it was too early to codify fast-changing generative AI technology, said Wendy Brill-Wynkoop, president of the California Community College Faculty Association, which led the bill’s drafting effort.

An earlier version of the bill stated that AI “cannot be used to replace teachers for the purpose of instructing or regularly interacting with students in class, but only as a peripheral tool.”

Internal arguments nearly led leaders to abandon the effort, she said, before Brill-Wynkoop proposed a compromise: removing all explicit references to artificial intelligence from the bill’s language.

“We don’t need the word AI in the bill; we just need to make sure that humans are at the center,” she said. So the final, very brief, bill language reads: “This bill would explicitly require that faculty members who teach classes be individuals who meet the minimum qualifications set forth above for credit-based faculty positions.”

“Our goal is not to build a giant brick wall in front of AI,” Brill-Wynkoop said. “That’s crazy. AI is like a fast-moving train. We’re not against technology, but the question is, ‘How do we use it sensibly?’”

She acknowledges that she doesn’t believe there’s some evil mastermind in Sacramento saying, “I want to get rid of all those mean teachers.” But she adds that in California, “education has been underfunded for years, and budgets are tight, so you have some technology companies saying, ‘How can we help you with your limited budget by driving efficiency?’”

Ethan Mollick, a University of Pennsylvania professor known for his work on AI in education, wrote in his newsletter last month that he worries many companies and organizations are rushing to adopt AI with too much emphasis on efficiency and cutting headcount. Instead, he argues, leaders should focus on rethinking how they do things so they can take advantage of the tasks AI does best.

He noted in the newsletter that even the companies building these new large-scale language models don’t yet know which real-world tasks they’re best suited to perform.

“I worry that the lessons of the Industrial Revolution are being lost in the adoption of AI in the enterprise,” he wrote. “Efficiency gains must translate into cost savings before anyone in the organization can see the benefits of AI. It’s as if manufacturers in the 1700s, after getting the steam engine, had kept production volume and quality the same, fired workers in response to the newfound efficiency, and never tried to scale up production and build global companies.”

The professor’s new Generative AI Lab is trying to model the approach he would like to see: researchers will explore evidence-based ways to use AI and strive to avoid what he calls “downside risks,” the concern that organizations might lay off skilled employees in the name of cutting costs without using AI effectively. He said the lab is committed to sharing what it learns.

Putting people at the center

The AI Education Project, a nonprofit focused on AI literacy, surveyed more than 1,000 U.S. educators in 2023 about how they feel AI is affecting the world, and education in particular. Asked to choose from a list of their top concerns about AI, respondents most frequently cited the possibility that AI could lead to a “lack of human interaction.”

This may be a response to recent announcements from major AI developers, including ChatGPT maker OpenAI, of new versions of their tools that can respond to voice commands and see and react to what students are typing on the screen. Khan Academy founder Sal Khan recently posted a video demo in which he uses a prototype of the organization’s chatbot, Khanmigo, with these capabilities to tutor his teenage son. Khan said the technology shown in the demo is not yet available and is at least six months to a year away. Still, the video went viral and sparked debate about whether machines can replace humans in something as personal as one-on-one tutoring.

Meanwhile, many of the new features and products released in recent weeks are focused on helping educators with administrative tasks and responsibilities like creating lesson plans and other classroom materials — and these are behind-the-scenes uses of AI that students may never know about.

That was evident in the exhibit hall at the ISTE Live conference in Denver last week, which brought together more than 15,000 educators and education technology leaders. (EdSurge is an independent newsroom with the same parent organization as ISTE.)

Companies ranging from small startups to major tech companies touted new features that use generative AI to help educators with their various responsibilities, with some offering tools that act as virtual classroom assistants.

Many of the teachers at the event weren’t actively worried about being replaced by bots.

“What I bring to the classroom can’t be replicated with AI, so that’s not my concern,” said Lauren Reynolds, a third-grade teacher at Riverwood Elementary School in Oklahoma City. “I have a human connection. I try to understand each kid as an individual. I read beyond what they tell me.”

Christina Matasavage, a STEM teacher at Belton Preparatory School in South Carolina, believes the COVID-19 closures and the emergency shift to distance learning proved that gadgets are no substitute for human teachers. “When COVID hit and we went online, I think we realized how badly we needed teachers. People figured out very [quickly] that we can’t be replaced by technology.”


