Poster B60 in Poster Session B - Thursday, August 8, 2024, 1:30 – 3:30 pm, Johnson Ice Rink
What common error patterns can tell us about human problem solving
Caroline Ahn1, Quan Do1, Leah Bakst1, Michael Pascale1, Jingxuan Guo1, Joseph McGuire1, Michael Hasselmo1, Chantal Stern1; 1Boston University
This study examines human abstract reasoning using the Cognitive Abstraction and Reasoning Corpus (CogARC), a visuospatial task inspired by an AI competition and adapted here to assess human problem-solving strategies. We analyzed online behavioral data from 233 participants who engaged in few-shot learning, inferring input-output transformation rules from limited examples and applying them to novel problems. Our human participants (M = 78.9% accuracy) significantly outperformed competing AI programs on the task. While performance showed considerable subject- and task-level variability, DBSCAN clustering of first-attempt solutions revealed that on certain tasks a substantial proportion of participants made similar errors. These findings point to shared cognitive biases in human abstract reasoning and suggest directions for future research exploring the representational space of problem solving.
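The clustering approach described above can be illustrated with a minimal sketch. This is a hypothetical toy example, not the study's actual pipeline: the grid size, distance metric, and DBSCAN parameters (eps, min_samples) are illustrative assumptions. The idea is that flattened answer grids from different participants can be compared cell-by-cell, so identical wrong answers collapse into a dense cluster while idiosyncratic errors remain unclustered noise.

```python
# Hypothetical sketch of clustering first-attempt solutions with DBSCAN.
# All data and parameters here are illustrative, not from the study.
import numpy as np
from sklearn.cluster import DBSCAN

# Toy data: six participants' 3x3 binary output grids, flattened to vectors.
# Four participants make the same error; one is correct; one is idiosyncratic.
shared_error = np.array([1, 0, 1, 0, 1, 0, 1, 0, 1])
solutions = np.vstack([
    shared_error, shared_error, shared_error, shared_error,  # common error
    np.array([0, 1, 0, 1, 0, 1, 0, 1, 0]),                   # correct answer
    np.array([1, 1, 1, 1, 1, 1, 0, 0, 0]),                   # one-off error
])

# Hamming distance treats each cell as categorical; a small eps groups only
# near-identical grids, and min_samples=3 requires a "substantial" group.
labels = DBSCAN(eps=0.1, min_samples=3, metric="hamming").fit_predict(solutions)
print(labels)  # the four identical error grids share one cluster label;
               # the other two grids are labeled -1 (noise)
```

In this toy setup, a cluster of non-correct answers corresponds to a common error pattern shared across participants, which is the kind of structure the abstract reports finding on certain tasks.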
Keywords: Abstraction, Reasoning, Problem Solving, Decision Making