Visuospatial Re-Representation in Analogical Reasoning

Jim Davies 1,*, Ashok K. Goel 2
1 Institute of Cognitive Science, Carleton University, Canada
2 College of Computing, Georgia Institute of Technology, USA

© 2017 Davies and Goel.

Open-access license: This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 International Public License (CC-BY 4.0). This license permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

* Address correspondence to this author at the Institute of Cognitive Science, Carleton University, 22nd Floor, Dunton Tower, 1125 Colonel By Dr., Ottawa, Ontario, K1S 5B6, Canada; Tel: 613-520-2600, Ext. 1109; E-mail:


Visual and spatial representations seem to play a significant role in analogy. In this paper, we describe a specific role of visual representations: two situations that appear dissimilar non-visuospatially may appear similar when re-represented visuospatially. We present a computational theory of analogy in which visuospatial re-representation enables analogical transfer in cases where there are ontological mismatches in the non-visuospatial representation. Realizing this theory in a computational model with specific data structures and algorithms first requires a computational model of visuospatial analogy, i.e., a model of analogy that uses only visuospatial knowledge. We have developed a computer program, called Galatea, which implements a core part of this model: it transfers problem-solving procedures between analogs that contain only visual and spatial knowledge. In this paper, we describe both how Galatea accomplishes analogical transfer using only visuospatial knowledge, and how it might be extended to support visuospatial re-representation of situations represented non-visually.
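To make the idea concrete, the following is a minimal sketch (not Galatea's actual code) of analogical transfer over purely visuospatial propositions: a source analog with a known solution step is mapped onto a target analog that shares its spatial structure, and the step is then transferred through the object mapping. The relation names, objects, and the brute-force matcher are all illustrative assumptions.

```python
# Hypothetical sketch of visuospatial analogical transfer:
# analogs are lists of spatial propositions like ("above", "disk", "slot").
from itertools import permutations

def objects(analog):
    """Collect every object mentioned in an analog's propositions."""
    return sorted({arg for _, *args in analog for arg in args})

def find_mapping(source, target):
    """Brute-force search for an object mapping under which every
    source proposition has a matching target proposition."""
    src_objs, tgt_objs = objects(source), objects(target)
    for perm in permutations(tgt_objs, len(src_objs)):
        m = dict(zip(src_objs, perm))
        if all((rel, *[m[a] for a in args]) in target
               for rel, *args in source):
            return m
    return None

def transfer(step, mapping):
    """Re-express a source solution step in the target's vocabulary."""
    op, *args = step
    return (op, *[mapping[a] for a in args])

# Source analog: a disk sits above and aligned with a slot;
# the known solution step moves the disk into the slot.
source = [("above", "disk", "slot"), ("aligned", "disk", "slot")]
step = ("move-into", "disk", "slot")

# Target analog: the same spatial structure over different objects.
target = [("above", "peg", "hole"), ("aligned", "peg", "hole")]

mapping = find_mapping(source, target)
print(transfer(step, mapping))  # ('move-into', 'peg', 'hole')
```

The point of the sketch is that the match succeeds even though "disk" and "peg" share nothing non-visuospatially: only the spatial relations among objects drive the mapping, which is the sense in which visuospatial re-representation can bridge ontological mismatches.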