Credit: CC0 Public Domain
When researchers asked hundreds of people to watch others shake a box, it took almost everyone just a few seconds to understand the purpose of the shaking.
A seemingly simple study by perception researchers at Johns Hopkins University has demonstrated for the first time that people can tell when others are trying to learn something just by watching their actions. Published in the journal Proceedings of the National Academy of Sciences, the research reveals an important yet neglected aspect of human cognition, one with implications for artificial intelligence.
“Just by looking at someone’s body movements, you can tell what they are trying to learn about their environment,” said author Chaz Firestone, an assistant professor who studies how vision and thought interact. “We do this all the time, but there’s very little research on it.”
Recognizing other people’s actions is something we do every day, whether guessing which way someone is heading or figuring out which object they’re reaching for. These are known as “pragmatic actions,” and many studies have shown that people can quickly and accurately identify them just by watching. The new Johns Hopkins study examines a different type of behavior: epistemic actions, performed when someone is trying to learn something.
For example, someone may put their feet in a pool to start swimming, or they may put their feet in a pool to test the water. The movements look similar, but they differ subtly, and the Johns Hopkins team hypothesized that observers could detect another person’s “epistemic goals” simply by watching their behavior.
Across several experiments, the researchers asked a total of 500 participants to watch two videos of someone picking up a box full of objects and shaking it around. One video shows someone shaking a box to figure out how many objects are inside; the other shows someone shaking a box to figure out the shape of the object inside. Almost all participants could tell who was shaking for number and who was shaking for shape.
“What was surprising to me was how intuitive this was,” said lead author Shorey Croom, a graduate student at Johns Hopkins University. “People can really sense what other people are trying to figure out.”
“When you think about all the mental calculations someone has to do to understand what another person is trying to learn, it’s a remarkably complex process. But our findings show that people do it with ease,” Firestone added.
The discovery could also help develop artificial intelligence systems designed to interact with humans. Imagine, for example, a retail robot assistant that can look at a customer and infer what they are looking for.
“It’s one thing to know where someone is headed or what product they’re reaching for,” Firestone said. “It’s another thing entirely to infer whether someone is lost or what kind of information they’re seeking.”
In the future, the researchers hope to explore whether people can distinguish epistemic from pragmatic intentions – for instance, whether someone dipping their feet in a pool is testing the water or starting a swim. They are also interested in when these observational skills emerge in human development, and in whether computational models can be built that show in precise detail how observed physical actions reveal cognitive intentions.
The Johns Hopkins team also included Zhou Hanbei, a second-year neuroscience student.