The presence of generative AI in schools continues to evolve, and its relationship with educational settings remains unsettled, with some school districts banning generative AI only to later reverse course. While automating tasks can save teachers time, the same tools can also create headaches by becoming accomplices to student misbehavior.

So how much effort does it take to create guidelines that help educators address the challenges of using generative AI tools in their work? In Michigan, it was a team effort.

A coalition of 14 educational institutions led by the nonprofit Michigan Virtual Learning Institute released sample guidelines earlier this month to educate teachers and administrators on the potential pitfalls to consider before using AI tools in the classroom and for other tasks. These include verifying the accuracy of AI-generated content, citing AI-generated content, and determining what kind of data is safe to feed into AI programs.

Ken Durkin, senior director of the Michigan Virtual Learning Institute, said the group wanted to create a document that was easy to understand, but “could have probably included 40,000 important things.”

“What we experience when working with school districts is a general lack of knowledge, interest, and awareness of generative AI,” Durkin said. “They think they are doing something wrong because there is no strong guidance on what they should explore and do.”

Durkin said the organization hopes the document will help school districts and educators think through their use of generative AI without resorting to the extremes of banning it outright or allowing unrestricted use.

“This was really how we operated: How can we enable exploration without disabling access?” he says. “Otherwise, you have people saying, ‘This is the latest trend; it’s going to go away.’”

Mark Smith, executive director of the Michigan Association for Computer Users in Learning, said generative AI is evolving so quickly that now is a critical time to create guidelines for educators and school districts on how and when to use it.

“AI is everywhere. It’s doing everything for everyone who’s interested,” he says. “By the time we figure out a one-year, three-year, five-year plan, it’s changing before our eyes. Even as we address the problem, things will continue to change.”

Student data protection

According to Paul Liabenow, school principals want to know how AI can be used in the classroom beyond having students copy and paste from it, and, of course, they are concerned about students using AI to cheat.

But many of the questions he receives as executive director of the Michigan Elementary and Secondary School Principals Association revolve around AI programs’ legal compliance with student privacy laws, and how they align with laws like FERPA and the Individuals with Disabilities Education Act, Liabenow explains.

“We receive countless questions every week, and the number of questions continues to grow,” Liabenow said. Principals want guidance from organizations like Michigan Virtual so they can “not only avoid walking into a black hole as leaders, but also effectively leverage it to improve student achievement.”

The AI guidance document asks educators to always assume that any data they input will become publicly available unless the company that owns the generative AI tool has an agreement with the school district.

Liabenow said one confidentiality concern is that teachers, counselors, and administrators may try to use AI programs to manage student data related to mental health and discipline, which could open districts to litigation.

“People think they can use AI tools to enter individual student names and build a master schedule, but this leads to ethical and legal challenges,” Liabenow says. “I love this guidance tool because it reminds us of areas we need to be sensitive to and guard diligently.”

The privacy pitfall lies not in the everyday use of generative AI, but in the growing number of apps with potentially weak data protection policies, said Smith of the Michigan Association for Computer Users in Learning. Few people read the terms they agree to when signing up for online services, he added, so it could be easy to violate privacy laws, including the Children’s Online Privacy Protection Act, which regulators have proposed strengthening.

“How many people accepted the latest iPhone agreement without reading it?” Smith said. “If you scale this up to 10,000 students in a district, you can imagine how many end users would have to read user agreements.”

Is AI your co-writer?

It’s not just students’ use of AI that needs to be considered. Teachers can use generative AI to create lesson plans, and any district employee can use it to create working documents.

As such, the new guidelines include examples of how to cite the use of generative AI in educational materials, research, or business documents.

“The more transparent you are about how you’re using AI and what it’s for, the better off everyone in the conversation will be,” Durkin says. “I don’t think people will be publicly announcing their use of AI in two or three years; it will be integrated into our workflows. But for now, being transparent lets us learn from each other and keep humans involved in the process. That need will fade over time.”

When AI is embedded in everything

Generative AI is increasingly integrated into software that is already widely used. Consider a writing-assistance program like Grammarly: a Georgia student says he was accused of cheating after AI detection software flagged a paper he wrote using the program.

With increased adoption, Durkin says, AI-powered educational tools become easier to access, which makes using them safely more complicated. One saving grace of the current generative AI landscape, he notes, is that a user still has to copy and paste content into an AI program to use it, which at least forces a brief pause.

“It’s often the Wild West in terms of access to tools. Everyone has a Google Account, and people can use their Google Account to log into a number of free services,” Durkin says. “We wanted to make sure we gave people the tools to reflect on whether they were using it in a legal and ethical [way], or whether it violates any policy, before they act. So stop and think for a moment.”

Smith points to a section of the new guidelines that asks educators to consider how AI-generated content may be inaccurate or biased. Even as generative AI improves, “all AI, no matter how good, has risks and limitations,” he says.

“Sometimes the best dataset for educators is not an AI tool, but teachers in the field with over 10 years of experience,” Smith says. “There’s still a human element to this, and I think the guidance document’s discussion of these risks and limitations is a kind of friendly nudge. It’s a polite way of saying, ‘Hey, don’t forget this.’”
