A lot goes on in our brains when we analyse another person’s decision. Watching someone order a coffee suggests that the person prefers coffee to the other drinks on offer. If the same person orders tea later in the day, we may further conclude that they only drink coffee in the morning, or only like the coffee from a particular cafe.
In one study, researchers showed infants two animations. In the first, a character jumped over a low barrier to reach an object but refused to jump over a medium-height barrier. The character then jumped over the medium-height barrier to reach a second object but refused to jump over a higher barrier.
The infants then watched the character choose between the two objects with no barriers in the way. They looked at the screen longer when the character approached the object it had refused to jump the barrier for, indicating they were surprised by the character choosing the object it had been less willing to work to reach.
“This makes intuitive sense to us as adults. If you see your friend travel across the street to get a cup of coffee, and decline the same action for a cup of tea, you might later be surprised when your friend reaches for tea over coffee when both are right there in front of them,” said Harvard graduate student and lead author Shari Liu.
“One way that [infants] explore the world is by looking at it and by making decisions about what to pay attention to, and what to look away from so that they can learn about something else.”
Engineers at the university commented that understanding how human cognition develops in the early years could hold the key to building artificial intelligence models that ‘learn’.
“If we can understand in engineering terms the intuitive theories that even these young infants seem to have, that hopefully would be the basis for building machines that have more human-like intelligence,” said Josh Tenenbaum, a professor in MIT’s Department of Brain and Cognitive Sciences and a researcher at MIT’s Center for Brains, Minds and Machines.