Visuospatial Re-Representation in Analogical Reasoning
Jim Davies¹·*, Ashok K. Goel²
Identifiers and Pagination:
Year: 2008
First Page: 11
Last Page: 20
Publisher Id: TOAIJ-2-11
Article History:
Electronic publication date: 10/4/2008
Collection year: 2008
open-access license: This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 International Public License (CC-BY 4.0), a copy of which is available at: (https://creativecommons.org/licenses/by/4.0/legalcode). This license permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Visual and spatial representations seem to play a significant role in analogy. In this paper, we describe a specific role of visual representations: two situations that appear dissimilar non-visuospatially may appear similar when re-represented visuospatially. We present a computational theory of analogy in which visuospatial re-representation enables analogical transfer in cases where there are ontological mismatches in the non-visuospatial representation. Realizing this theory in a computational model with specific data structures and algorithms first requires a computational model of visuospatial analogy, i.e., a model of analogy that uses only visuospatial knowledge. We have developed a computer program, called Galatea, which implements a core part of this model: it transfers problem-solving procedures between analogs that contain only visual and spatial knowledge. In this paper, we describe both how Galatea accomplishes analogical transfer using only visuospatial knowledge, and how it might be extended to support visuospatial re-representation of situations represented non-visually.
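To make the central claim concrete, the following is a minimal, hypothetical sketch (not Galatea's actual code or knowledge representation): two situations whose non-visuospatial descriptions use mismatched predicates, and a re-representation step that maps each onto generic spatial relations, under which the two situations become structurally identical. All predicate and entity names here are invented for illustration.

```python
# Two analogs described non-visuospatially with different ontologies:
# the predicate vocabularies ("flows-*" vs "pours-*") do not match directly.
heat_flow = [("flows-from", "heat", "coffee"),
             ("flows-to", "heat", "ice-cube")]
water_flow = [("pours-from", "water", "full-beaker"),
              ("pours-to", "water", "empty-vial")]

def re_represent(facts, source_pred, sink_pred):
    """Re-represent domain-specific facts as generic visuospatial relations.

    Each domain predicate is mapped to a spatial-motion relation over
    abstract locations, discarding the domain-specific ontology.
    """
    spatial = set()
    for pred, _mover, _place in facts:
        if pred == source_pred:
            spatial.add(("moves-away-from", "entity", "location-A"))
        elif pred == sink_pred:
            spatial.add(("moves-toward", "entity", "location-B"))
    return spatial

a = re_represent(heat_flow, "flows-from", "flows-to")
b = re_represent(water_flow, "pours-from", "pours-to")

# After re-representation the two situations are visuospatially identical,
# so an analogical mapping (and procedure transfer) can proceed.
print(a == b)
```

The point of the sketch is only that the ontological mismatch lives in the non-visuospatial vocabulary; once both analogs are expressed as motion between locations, a structure matcher sees them as the same situation.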