Document Type
Article
Publication Date
11-30-2007
Abstract
In his path-breaking work on the foundations of visual perception, the MIT neuroscientist David Marr distinguished three levels at which any information-processing task can be understood and emphasized the first of these: "Although algorithms and mechanisms are empirically more accessible, it is the top level, the level of computational theory, which is critically important from an information-processing point of view. The reason for this is that the nature of the computations that underlie perception depends more upon the nature of the computational problems that have to be solved than upon the particular hardware in which their solutions are implemented."
In this comment on Joshua Greene's essay, The Secret Joke of Kant's Soul, I argue that a notable weakness of Greene's approach to moral psychology is its neglect of computational theory. A central problem moral cognition must solve is to recognize (i.e., compute representations of) the deontic status of human acts and omissions. How do people actually do this? What is the theory which explains their practice?
Greene claims that emotional response "predicts" deontological judgment, but his own explanation of a subset of the simplest and most extensively studied of these judgments - trolley problem intuitions - in terms of a personal/impersonal distinction is neither complete nor descriptively adequate. In a series of influential papers, Greene argues that people rely on three features to distinguish the well-known Bystander and Footbridge problems: "whether the action in question (a) could reasonably be expected to lead to serious bodily harm, (b) to a particular person or a member or members of a particular group of people (c) where this harm is not the result of deflecting an existing threat onto a different party." Greene claims to predict trolley intuitions and patterns of brain activity on this basis. However, this explanation is incomplete, because we are not told how people manage to interpret the stimulus in terms of these features; surprisingly, Greene leaves this crucial first step in the perceptual process unanalyzed. Additionally, Greene's account is descriptively inadequate, because it cannot explain even simple counterexamples, let alone countless real-life examples which can be found in any casebook of torts or criminal law. Hence Greene has not shown that emotional response predicts these moral intuitions in any significant sense. Rather, his studies suggest that some perceived deontological violations are associated with strong emotional responses, something few would doubt or deny. Moreover, a better explanation of these intuitions is available, one that grows out of the computational approach to cognitive science that Marr helped to pioneer.
Scholarly Commons Citation
Mikhail, John, "Moral Cognition and Computational Theory" (2007). Georgetown Law Faculty Working Papers. 44.
https://scholarship.law.georgetown.edu/fwps_papers/44