When Satya Nitta worked at IBM, he and a team of colleagues undertook a bold mission to use the latest in artificial intelligence to build a new kind of personal digital tutor.
This was before ChatGPT existed, and few people were yet talking about how powerful AI had become. But Nitta was working with IBM’s Watson, perhaps the hottest AI system of the era, a tool that had scored major wins, including beating human champions on the quiz show Jeopardy in 2011.
Nitta says he was optimistic that Watson could power a personal tutor, but he knew the job would be extremely difficult. “I remember telling IBM’s upper management that this was going to be a 25-year journey,” he recently told EdSurge.
His team spent about five years experimenting, he says, and along the way made smaller attempts at learning products, including helping build a pilot chatbot assistant that became part of Pearson’s online psychology courseware system in 2018.
Ultimately, though, Nitta concluded that even as the generative AI technology driving the recent excitement brings new capabilities that will change education and other fields, it simply isn’t up to the job of serving as a generalized personal tutor, and won’t be for decades at least, if ever.
“We’ll have flying cars before we have AI tutors,” he says. “It is a deeply human process that AI is hopelessly incapable of meeting in a meaningful way. It’s like being a therapist or being a nurse.”
Instead, he co-founded a new AI company called Merlyn Mind to build other types of AI-powered tools for educators.
Meanwhile, plenty of companies and education leaders these days are hard at work chasing that dream of building AI tutors. Even a recent White House executive order seeks to help the cause.
Earlier this month, Sal Khan, leader of the nonprofit Khan Academy, told the New York Times: “We’re at the cusp of using AI for probably the biggest positive transformation that education has ever seen. And the way we’re going to do that is by giving every student on the planet an artificially intelligent but amazing personal tutor.”
Khan’s organization was among the first to try to build such a tutor using ChatGPT, with a tool it calls Khanmigo that is currently in pilot phases with a range of schools.
Khan’s tool, though, comes with an off-putting caveat: it “makes mistakes sometimes.” The warning is necessary because all of today’s leading AI chatbots suffer from so-called “hallucinations,” the term used to describe situations when a chatbot simply fabricates details when it doesn’t know the answer to a user’s question.
AI experts are hard at work trying to offset the hallucination problem, and one of the most promising approaches so far is to bring in a separate AI chatbot to check the results of a system like ChatGPT for fabricated details. That’s what researchers at the Georgia Institute of Technology are trying, for instance, in the hope that their multi-chatbot system can strip false information out of an answer before it is ever shown to a student. It’s not yet clear, though, whether the approach can reach a level of accuracy that educators will accept.
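To make the idea concrete, here is a minimal sketch of what such a check-before-showing pipeline might look like in code. Everything here is a hypothetical illustration rather than the Georgia Tech team’s actual system: the `ask_tutor` and `ask_verifier` callables stand in for whatever chatbot API a developer actually uses, and the audit prompt is invented for the example.

```python
from typing import Callable, Optional

def vetted_answer(
    question: str,
    ask_tutor: Callable[[str], str],     # placeholder: first chatbot (a ChatGPT-style model)
    ask_verifier: Callable[[str], str],  # placeholder: second chatbot that audits the first
) -> Optional[str]:
    """Ask the tutor model, then have a second model audit its answer.

    Returns the answer only if the verifier flags no fabricated details;
    otherwise returns None so the caller can retry or route to a human.
    """
    answer = ask_tutor(question)

    # The verifier sees both the question and the proposed answer and is
    # asked for a strict one-word verdict on factual support.
    audit_prompt = (
        "You are a strict fact-checker.\n"
        f"Question:\n{question}\n\nProposed answer:\n{answer}\n\n"
        "Does the answer contain any fabricated or unsupported claims? "
        "Reply with exactly one word: FABRICATED or SUPPORTED."
    )
    verdict = ask_verifier(audit_prompt).strip().upper()

    return answer if verdict.startswith("SUPPORTED") else None
```

One design choice worth noting: the sketch fails closed. When the verifier cannot vouch for the answer, the student sees nothing rather than something possibly false, which matches the stated goal of removing bad information before it reaches a learner.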
At this key moment in the development of new AI tools, though, it’s worth asking whether a chatbot tutor is the right goal for developers to aim for. Or is there a better metaphor than “tutor” for what generative AI can do to help students and teachers?
An “Always-On Helper”
Michael Feldstein has been spending a lot of time experimenting with chatbots lately. He has worked as an edtech consultant and blogger for many years, and in the past he hasn’t been shy about calling out what he sees as overhype by companies selling edtech tools.
In 2015, he famously criticized the promises made about a tool from a company called Knewton, then billed as cutting-edge AI for education. Knewton’s CEO, Jose Ferreira, said his company’s product would be “like a robot tutor in the sky that can semi-read your mind and figure out what your strengths and weaknesses are, down to the percentile.” That led Feldstein to respond that the CEO was “selling snake oil,” because, Feldstein argued, the tool was nowhere near living up to that promise. (Knewton quietly sold off its assets a few years later.)
So what does Feldstein think about the latest promises from AI experts that effective tutors could be on the horizon?
“ChatGPT is definitely not snake oil, far from it,” he tells EdSurge. “It is also not a robot tutor in the sky that can semi-read your mind. It’s something new, and we need to think about what kinds of tutoring functions today’s technology can deliver that would be useful to students.”
He does believe, though, that tutoring is a useful way to think about what ChatGPT and other new chatbots can do. And he says that view comes from personal experience.
Feldstein has a relative who is fighting a brain hemorrhage, and so Feldstein has been turning to ChatGPT for personal lessons in understanding the medical condition and his loved one’s prognosis. As he gets updates from friends and family on Facebook, he says, he asks follow-up questions in an ongoing ChatGPT thread to better understand what’s happening.
“When I ask it in the right way, it can give me the right amount of detail about, ‘What do we know today about her chances of being OK again?’” Feldstein says. “It’s not the same as talking to a doctor, but it has tutored me in meaningful ways about a serious subject and helped me become more knowledgeable about my relative’s condition.”
While Feldstein says he would call that tutoring, he argues it’s still important that companies not oversell their AI tools. “We’ve done a disservice to say this is an all-knowing box, or that it will be in a few months,” he says. “They are tools. They’re strange tools. They misbehave in strange ways, as do people.”
He points out that human tutors can make mistakes too, but most students have a sense of what they’re getting into when they book a session with one.
“Even if you go to a college tutoring center, they don’t know everything. You don’t know how trained they are. There’s a chance they might say something wrong to you. But you go in and get all the help you can.”
Whatever you call these new AI tools, it’s helpful to have “an always-on helper you can ask questions of,” he says, even if the results are just a starting point for further learning.
“Boring” but important support work
If tutoring ultimately isn’t the right fit, how else might generative AI tools be used in education?
For Nitta, the better role for the technology is to serve as an assistant to experts rather than a substitute for an expert tutor. Instead of replacing, say, a therapist, he imagines that chatbots could help a human therapist summarize and organize notes from a session with a patient.
“That’s a very helpful tool rather than an AI pretending to be a therapist,” he says. Even though some may find that “boring,” he argues that the technology’s superpower is “automating things that humans don’t want to do.”
In education, his company is building AI tools designed to help teachers and human tutors do their jobs better. To that end, Merlyn Mind has taken the unusual step of building its own so-called large language model from scratch, designed for education.
Even then, he argues, the best results come when a model is tuned to support specific education domains by training it on vetted datasets, rather than relying on ChatGPT and other mainstream tools that draw on vast amounts of information from the internet.
“What does a human tutor do well? They know the student, and they provide human motivation,” he adds. “We’re all about AI that augments the tutor.”