How Can We Construct a Science of Consciousness?
Abstract
In recent years there has been an explosion of scientific work on consciousness in cognitive neuroscience, psychology, and other fields. It has become possible to think that we are moving toward a genuine scientific understanding of conscious experience. But what is the science of consciousness all about, and what form should such a science take? This chapter gives an overview of the agenda.
1. First-Person Data and Third-Person Data
The task of a science of consciousness, as I see it, is to systematically integrate two key classes of data into a scientific framework: third-person data, or data about behavior and brain processes, and first-person data, or data about subjective experience. When a conscious system is observed from the third-person point of view, a range of specific behavioral and neural phenomena present themselves. When a conscious system is observed from the first-person point of view, a range of specific subjective phenomena present themselves. Both sorts of phenomena have the status of data for a science of consciousness.1
Third-person data concern the behavior and the brain processes of conscious systems. These behavioral and neurophysiological data provide the traditional material of interest for cognitive psychology and cognitive neuroscience. Where the science of consciousness is concerned, some particularly relevant third-person data are those concerning perceptual discrimination and those involving verbal reports. Direct measurements of brain processes also play a crucial role in cognitive neuroscience, of course.
First-person data concern the subjective experiences of conscious systems. It is a datum for each of us that such experiences exist: we can gather information about them both by attending to our own experiences and by monitoring subjective verbal reports about the experiences of others. These phenomenological data provide the distinctive subject for the science of consciousness. Some central sorts of first-person data include those having to do with the following:
• visual experiences (e.g., the experience of color and depth)
• other perceptual experiences (e.g., auditory and tactile experience)
• bodily experiences (e.g., pain and hunger)
• mental imagery (e.g., recalled visual images)
• emotional experience (e.g., happiness and anger)
• occurrent thought (e.g., the experience of reflecting and deciding)
Both third-person data and first-person data need explanation. An example is provided by the case of musical processing. If we observe someone listening to music, relevant third-person data include those concerning the nature of the auditory stimulus, its effects on the ear and the auditory cortex of the subject, various behavioral responses by the subject, and any verbal reports the subject might produce. All of these third-person data need explanation, but they are not all that needs explanation. As anyone who has listened to music knows, there is also a distinctive quality of subjective experience associated with listening to music. A science of music that explained the various third-person data just listed but that did not explain the first-person data of musical experience would be a seriously incomplete science of music. A complete science of musical experience must explain both sorts of phenomena, preferably within an integrated framework.
2. Explaining the Data
The problems of explaining third-person data associated with consciousness are among the “easy” problems of consciousness from the last chapter. The problem of explaining first-person data is the hard problem. In chapter 1 we saw that, to explain third-person data, we need to explain the objective functioning of a system and can do so in principle by specifying a mechanism. When it comes to first-person data, however, this model breaks down. The reason is that first-person data—the data of subjective experience—are not data about objective functioning. Merely explaining the objective functions does not explain subjective experience.
The lesson is that as data, first-person data are irreducible to third-person data and vice versa. That is, third-person data alone provide an incomplete catalogue of the data that need explaining: if we explain only third-person data, we have not explained everything. Likewise, first-person data alone are incomplete. A satisfactory science of consciousness must admit both sorts of data and must build an explanatory connection between them.
What form might this connection take? A common position holds that, although there are two sorts of data, we can explain first-person data wholly in terms of material provided by third-person data. For example, many think that we might wholly explain the phenomena of subjective experience in terms of processes in the brain. This position is very attractive, but in chapter 1 I give reasons to be skeptical about it. Here I present a simple argument that encapsulates some reasons for doubt:
1. Third-person data are data about the objective structure and dynamics of physical systems.
2. (Low-level) structure and dynamics explain only facts about (high-level) structure and dynamics.
3. Explaining structure and dynamics does not suffice to explain the first-person data.
4. First-person data cannot be wholly explained in terms of third-person data.
Premise 1 captures something about the character of third-person data: these data always concern certain physical structures and their dynamics. Premise 2 says that explanations in terms of processes of this sort only explain further processes of that sort. There can be big differences between the processes, as when simple low-level structure and dynamics give rise to highly complex high-level structure and dynamics (in complex systems theory, for example), but there is no escaping the structural/dynamical circle. Premise 3 encapsulates the points, discussed earlier, that explaining structure and dynamics is only to explain objective functions and that to explain objective functions does not suffice to explain first-person data about subjective experience. From these three premises, the conclusion, 4, follows.2
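The logical shape of the argument can be displayed schematically (with SD abbreviating “facts about structure and dynamics”; the notation here is just shorthand for the premises above):

$$
\begin{aligned}
&(1)\ \text{third-person data} \subseteq \mathrm{SD}\\
&(2)\ \mathrm{SD}\ \text{explains only}\ \mathrm{SD}\\
&(3)\ \text{explaining}\ \mathrm{SD} \not\Rightarrow \text{explaining first-person data}\\
&\therefore\ (4)\ \text{third-person data cannot wholly explain first-person data}
\end{aligned}
$$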
Of course, it does not follow that first-person data and third-person data have nothing to do with one another; there is obviously a systematic association between them. We have good reason to believe that subjective experiences are systematically correlated with brain processes and behavior. It remains plausible that whenever subjects have an appropriate sort of brain process, they will have an associated sort of subjective experience. We simply need to distinguish correlation from explanation. Even if first-person data cannot be wholly explained in terms of third-person data, the two sorts of data are still strongly correlated.
It follows that a science of consciousness remains entirely possible. It is just that we should expect this science to take a nonreductive form. A science of consciousness will not reduce first-person data to third-person data, but it will articulate the systematic connections between them. Where there is systematic covariation between two classes of data, we can expect systematic principles to underlie and explain the covariation. In the case of consciousness, we can expect systematic bridging principles to underlie and explain the covariation between third-person data and first-person data. A theory of consciousness will ultimately be a theory of these principles.
Of course, these foundational issues are controversial, and there are various alternative views. One class of views (e.g., Dennett 1991) holds that the only phenomena that need explaining are those that concern objective functioning. The most extreme version of this view says that there are no first-person data about consciousness at all. A less extreme version says that all first-person data are equivalent to third-person data (e.g., about verbal reports), so that explaining these third-person data explains everything. Another class of views (e.g., P. S. Churchland 1997) accepts that first-person data need further explanation but holds that they might be reductively explained by future neuroscience. One version of this view holds that future neuroscience could go beyond structure and dynamics in ways we cannot currently imagine. Another version holds that if we can find sufficient correlations between brain states and consciousness, that will qualify as a reductive explanation. I argue against views of this sort in chapter 5.
In this chapter I focus on constructive projects for a science of consciousness. I will sometimes presuppose the reasoning sketched earlier, but much of what I say has application even to alternative views.
3. Projects for a Science of Consciousness
If what I have said so far is correct, then a science of consciousness should take first-person data seriously and should proceed by studying the association between first-person data and third-person data without attempting a reduction. In fact, this is exactly what one finds in practice. The central work in the science of consciousness has always taken first-person data seriously. For example, much central work in psychophysics and perceptual psychology has been concerned with the first-person data of subjective perceptual experience. In research on unconscious perception, the first-person distinction between the presence and absence of subjective experience is crucial. In recent years, a growing body of research has focused on the correlations between first-person data about subjective experience and third-person data about brain processes and behavior.
In what follows I articulate what I see as some of the core projects for a science of consciousness, with illustrations drawn from existing research.
Project 1: Explain the Third-Person Data
One important project for a science of consciousness is that of explaining the third-person data in the vicinity: explaining the difference between functioning in sleep and wakefulness, for example, and explaining the voluntary control of behavior. This sort of project need not engage the difficult issues relating to first-person data, but it may still provide an important component of a final theory.
One example of this sort of project is that of explaining binding in terms of neural synchrony, as discussed in the last chapter. It is not yet clear whether this hypothesis is correct, but if it is correct, it will provide an important component in explaining the integration of perceptual information, which in turn is closely tied to questions about consciousness. Of course, explaining binding will not on its own explain the first-person data of consciousness, but it may help us to understand the associated processes in the brain.
Research on the “global workspace” hypothesis also falls into this class. Baars (1988) has postulated such a workspace as a mechanism by which shared information can be made available to many different cognitive processes. More recently, other researchers (e.g., Dehaene and Changeux 2004) have investigated the potential neural basis for this mechanism and postulated a neuronal global workspace. If this hypothesis is correct, it will help to explain third-person data concerning access to information within the cognitive system, as well as data about the information made available to verbal report. Again, explaining these processes will not in itself explain the first-person data of consciousness, but it may well contribute to the project (project 4 later in this chapter) of finding neural correlates of consciousness.
Project 2: Contrast Conscious and Unconscious Processes
Many cognitive capacities can be exercised both consciously and unconsciously, that is, in the presence or absence of associated subjective experience. For example, the most familiar sort of perceptual processing is conscious, but there is also strong evidence of unconscious perceptual processing (Merikle and Daneman 2000). One finds a similar contrast in the case of memory, where the now common distinction between explicit and implicit memory (Schacter and Curran 2000) can equally be seen as a distinction between conscious and unconscious memory. Explicit memory is essentially memory associated with a subjective experience of the remembered information; implicit memory is essentially memory in the absence of such a subjective experience. The same goes for the distinction between explicit and implicit learning (Reber 1996), which is in effect a distinction between learning in the presence or absence of relevant subjective experience.
Conscious and unconscious processes provide pairs of processes that are similar in some respects from the third-person point of view (e.g., both involve registration of perceptual stimuli) but differ from the first-person point of view (one involves subjective experience of the stimulus; one does not). Of course, there are also differences from the third-person point of view. For a start, a researcher’s evidence for conscious processes usually involves a verbal report of a relevant experience, and evidence for unconscious processes usually involves a verbal report of the absence of a relevant experience. There are also less obvious differences between the behavioral capacities that go along with conscious and unconscious processes, as well as between the associated neural processes. These differences make for the beginning of a link between the first-person and third-person domains.
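One standard way to quantify such third-person differences is signal-detection theory: a subject’s objective discrimination sensitivity d′ is computed from hit and false-alarm rates, independently of how liberally the subject reports awareness. A minimal sketch (the specific rates are invented for illustration):

```python
from statistics import NormalDist


def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)


# A subject who detects 80% of present stimuli but false-alarms on 20%
# of absent trials shows above-chance objective sensitivity:
print(round(d_prime(0.80, 0.20), 2))  # -> 1.68
```

Dissociations of this kind, such as above-chance d′ combined with reports of no experience, are one operational signature of unconscious perception.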
For example, evidence suggests that, while unconscious perception of visually presented linguistic stimuli is possible, semantic processing of these stimuli seems limited to the level of the single word rather than complex expressions (see Greenwald 1992). By contrast, conscious perception allows for semantic processing of very complex expressions. Here, experimental results suggest a strong association between the presence or absence of subjective experience and the presence or absence of an associated functional capacity—a systematic link between first-person and third-person data. Many other links of the same sort can be found in the literature on unconscious perception, implicit memory, and implicit learning.
Likewise, there is evidence for distinct neural bases for conscious and unconscious processes in perception. Appealing to an extensive body of research on visuomotor processing, Milner and Goodale (1995; see also Goodale 2004) have hypothesized that the ventral stream of visual processing subserves conscious perception of visual stimuli for the purpose of the cognitive identification of stimuli, while the dorsal stream subserves unconscious processes involved in fine-grained motor capacities. If this hypothesis is correct, one can again draw a systematic link between a distinction in first-person data (presence or absence of conscious perception) and a distinction in third-person data (visual processing in the ventral or dorsal stream). A number of related proposals have been made in research on memory and learning.
Project 3: Investigate the Contents of Consciousness
Consciousness is not simply an on-off switch. Conscious experiences have a complex structure with complex contents. A conscious subject usually has a manifold of perceptual experiences, bodily sensations, emotional experiences, and conscious thoughts, among other things. Each of these elements may itself be complex. For example, a typical visual experience has an internal structure representing objects with many different colors and shapes in varying degrees of detail. We can think of all of this complexity as composing the contents of consciousness.
The contents of consciousness have been studied throughout the history of psychology. Weber’s and Fechner’s pioneering work in psychophysics concentrated on specific aspects of these contents, such as the subjective brightness associated with a visual experience, and correlated it with properties of the associated stimulus. This provided a basic link between first-person data about sensory experience and third-person data about external stimuli. Later work in psychophysics and gestalt psychology took an approach of the same general sort, investigating specific features of perceptual experience and analyzing how these covary with the properties of a stimulus.
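These pioneering results can be stated compactly. Weber’s law says that the just-noticeable difference $\Delta I$ for a stimulus of intensity $I$ is a constant fraction of that intensity; Fechner’s law, derived from it, says that sensation magnitude $S$ grows logarithmically with intensity above the absolute threshold $I_0$ (the constants $k$ and $c$ are empirically determined for each sensory modality):

$$
\frac{\Delta I}{I} = k, \qquad S = c \,\log\frac{I}{I_0}
$$

Principles of this form are early examples of bridging laws linking first-person data to third-person data about stimuli.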
This tradition continues in a significant body of contemporary research. For example, research on visual illusions often uses subjects’ first-person reports (and even scientists’ first-person experiences) to characterize the structure of perceptual experiences. Research on attention (Mack and Rock 1998; Treisman 2003) aims to characterize the structure of perceptual experience inside and outside the focus of attention. Other researchers investigate the contents of consciousness in the domains of mental imagery (Baars 1996), emotional experience (Kaszniak 1998), and stream of conscious thought (Pope 1978; Hurlburt 1990).
An important line of research investigates the contents of consciousness in abnormal subjects. For example, subjects with synesthesia have unusually rich sensory experiences. In a common version, letters and numbers trigger reports of extra color experiences, in addition to the standard perceived color of the stimulus. Recent research strongly suggests that these reports reflect the subjects’ perceptual experiences and not just cognitive associations. For example, Ramachandran and Hubbard (2001) find that certain visual patterns produce a perceptual “pop-out” effect in synesthetic subjects that is not present in normal subjects. When first-person data about the experiences of abnormal subjects are combined with third-person data about brain abnormalities in those subjects, this yields a new source of information about the association between brain and conscious experience.
Project 4: Find the Neural Correlates of Consciousness
This leads us to what is perhaps the core project of current scientific research on consciousness: the search for neural correlates of consciousness (Metzinger 2000; Crick and Koch 2004). A neural correlate of consciousness (NCC) can be characterized as a minimal neural system that is directly associated with states of consciousness (see chapter 3 for much more on this). Presumably the brain as a whole is a neural system associated with states of consciousness, but not every part of the brain is associated equally with consciousness. The NCC project aims to isolate relatively limited parts of the brain (or relatively specific features of neural processing) that correlate directly with subjective experience.
It may be that there will be many different NCCs for different aspects of conscious experience. For example, it might be that one neural system is associated with being conscious as opposed to being unconscious (perhaps in the thalamus or brainstem; see e.g., Schiff 2004), while another neural system is associated with the specific contents of visual consciousness (perhaps in some part of the visual cortex), and other systems are associated with the contents of consciousness in different modalities. Any such proposal can be seen as articulating a link between third-person data on brain processes and first-person data on subjective experience.
In recent years, by far the greatest progress has been made in the study of NCCs for visual consciousness. Milner and Goodale’s (1995) work on the ventral stream provides an example of this sort of research. Another example is the research of Nikos Logothetis and colleagues on binocular rivalry in monkeys (e.g., Logothetis 1998; Leopold, Maier, and Logothetis 2003). When different stimuli are presented to the left and the right eyes, subjects usually undergo alternating subjective experiences. Logothetis trained monkeys to signal such changes in their visual experience and correlated these changes with changes in underlying neural processes. The results indicate that changes in visual experience are only weakly correlated with changes in patterns of neural firing in the primary visual cortex: in this area, neural firing was more strongly correlated with the stimulus than with the experience. In contrast, changes in visual experience were strongly correlated with changes in patterns of neural firing in later visual areas, such as the inferior temporal cortex. These results tend to suggest that the inferior temporal cortex is a better candidate than the primary visual cortex as an NCC for visual consciousness.
Of course, no single experimental result can provide conclusive evidence concerning the location of an NCC, but much evidence concerning the location of NCCs for vision has accumulated in the last few years (Koch 2004), and one can expect much more to come. If successful, this project will provide some highly specific connections between brain processes and conscious experiences.
Project 5: Systematize the Connection
To date, links between first-person data and third-person data have been studied in a somewhat piecemeal fashion. Researchers isolate correlations between specific aspects of subjective experience and specific brain processes or behavioral capacities in a relatively unsystematic way. This is to be expected at the current stage of development. We can hope that as the science develops, however, more systematic links will be forthcoming. In particular, we can hope to develop principles of increasing generality that link a wide range of first-person data with a correspondingly wide range of third-person data. For example, there might eventually be an account of the neural correlates of visual consciousness that will not only tell us which neural systems are associated with visual consciousness but will also yield systematic principles that tell us how the specific content of a visual experience covaries with the character of neural processes in these systems.
A few principles of this sort have been suggested to date in limited domains. For example, Hobson (1997) has suggested a general principle linking certain levels of neurochemical activity with different states of consciousness in wakefulness, sleep, and dreaming. It is likely that any such proposals will be heavily revised as new evidence comes in, but one can expect that in the coming decades increasingly well-supported principles of this sort will emerge. The possibility of such principles holds out the tantalizing prospect that eventually we might use them to predict features of an organism’s subjective experience based on knowledge of its neurophysiology.
Project 6: Infer Fundamental Principles
If the previous project succeeds, we will have general principles that connect third-person data and first-person data, but these general principles will not yet be fundamental. The principles might still be quite complex, and limited both to specific aspects of consciousness and to specific species. A science of consciousness consisting of wholly different principles for different aspects of consciousness and different species would not be entirely satisfactory. It is reasonable to hope that eventually some unity will be discovered behind this diversity. We should at least aim to maximize the generality and the simplicity of the relevant principles wherever possible. In the ideal situation, we might hope for principles that are maximally general in their scope, applying to any conscious system whatsoever and to all aspects of conscious experience. In addition, we might hope for principles that are relatively simple in their form in the way that the basic laws of physics appear to be simple.
It is unreasonable to expect that we will discover principles of this sort anytime soon, and it is an open question whether we will be able to discover them at all. Currently, we have little idea what form such principles might take (notwithstanding the speculation at the end of chapter 1). Still, if we can discover them, principles of this sort would be candidates for fundamental principles: the building blocks of a fundamental theory of consciousness. If what I said earlier is correct, then something about the connection between first-person data and third-person data must be taken as primitive, just as we take fundamental principles in physical theories as primitive. But we can at least hope that the primitive element in our theories will be as simple and as general as possible. If, eventually, we can formulate simple and general principles of this sort, based on an inference from accumulated first-person and third-person data, we could be said to have an adequate scientific theory of consciousness.
What would this entail about the relationship between physical processes and consciousness? The existence of such principles is compatible with different philosophical views. One might regard the principles as laws that connect two fundamentally different domains. One might regard them as laws that connect two aspects of the same thing. Or one might regard them as grounding an identification between properties of consciousness and physical properties. Such principles could also be combined with different views of the causal relation between physical processes and consciousness. These matters are all discussed further in chapter 5, but for many purposes, the science of consciousness can remain neutral on these philosophical questions. One can simply regard the principles as principles of correlation while staying neutral on their underlying causal and ontological status. This makes it possible to have a robust science of consciousness even without having a widely accepted solution to the philosophical mind/body problem.
4. Obstacles to a Science of Consciousness
The development of a science of consciousness as I have presented it thus far may sound remarkably straightforward. We simultaneously gather first-person data about subjective experience and third-person data about behavior and brain processes, isolate specific correlations between them, formulate general principles governing these correlations, and infer the underlying fundamental laws. But of course it is not as simple as this in practice. There are a number of serious obstacles to this research agenda. The most serious obstacles concern the availability of the relevant data in both the third-person and first-person domains. In what follows I discuss a few of these obstacles.
Obstacles Involving Third-Person Data
The third-person data relevant to the science of consciousness include both behavioral and neural data. The availability of behavioral data is reasonably straightforward: one is constrained only by the ingenuity of the experimenter and the limitations of experimental contexts. In practice, researchers have accumulated a rich body of behavioral data relevant to consciousness. In contrast, the availability of neural data is much more constrained by technological and ethical limitations, and the body of neural data that has been accumulated to date is correspondingly much more limited.
In practice, the most relevant neurophysiological data come from three main sources: brain imaging via functional magnetic resonance imaging (fMRI) and positron emission tomography (PET), single-cell recording through insertion of electrodes, and surface recording through electroencephalography (EEG) and magnetoencephalography (MEG). Each of these technologies is useful, but each has serious limitations for the science of consciousness. Both EEG and MEG have well-known limitations in spatial localization. Brain imaging through fMRI and PET is better in this regard, but these methods are still spatially quite coarse grained. Single-cell recording is spatially fine grained but is largely limited to experimentation on nonhuman animals.
These limitations apply to all areas of cognitive neuroscience, but they are particularly pressing for the science of consciousness because the science of consciousness relies on gathering third-person and first-person data simultaneously. By far the most straightforward method for gathering first-person data is verbal report, but verbal report is limited to human subjects. By far the most useful third-person data are data at the level of single neurons, where one can monitor representational content that correlates with the content of consciousness (e.g., when one monitors a neuron with a specific receptive field, as discussed in chapter 3), but these experiments are largely limited to nonhuman subjects. As a result, it is extremely difficult to discover strong associations between first-person data and corresponding neural data with current techniques.
There have been numerous ingenious attempts to circumvent these limitations. The best known include Logothetis’s experiments on monkeys, in which the animals are trained extensively to provide a substitute for a verbal report of visual consciousness by pressing a bar. Research on blindsight in monkeys by Cowey and Stoerig (1995) has done something similar. Still, the very fact that researchers have to go to such great lengths in order to gather relevant neural data illustrates the problem. Others have performed neuron-level measurements on human surgical patients (e.g., Kreiman, Fried, and Koch 2002), but there are obvious practical limitations here. Many others (e.g., Rees 2004) have tried to get as much relevant information as they can from the limited resources of brain imaging and surface recording; nevertheless, fewer deep correlations have emerged from this sort of work than from neuron-level studies.
One can hope that this is a temporary limitation imposed by current technology. If a technology is eventually developed that allows for noninvasive monitoring of neuron-level processes in human subjects, we might expect a golden age for the science of consciousness to follow.
Obstacles Involving First-Person Data
Where the availability of first-person data is concerned, a number of related obstacles run quite deep. I discuss three of these here.
1. Privacy
The most obvious obstacle to the gathering of first-person data concerns the privacy of such data. In most areas of science, data are intersubjectively available: they are equally accessible to a wide range of observers. But in the case of consciousness, first-person data concerning subjective experiences are directly available only to the subject having those experiences. To others, these first-person data are only indirectly available, mediated by observation of the subject’s behavior or brain processes. Things would be straightforward if we had a “consciousness meter” (discussed in the next chapter) that could be pointed at a subject and reveal his or her subjective experiences to all. But in the absence of a theory of consciousness, no such consciousness meter is available. This imposes a deep limitation on the science of consciousness, but it is not a paralyzing limitation. For a start, subjects have direct access to first-person data concerning their own conscious experiences. We could imagine that Robinson Crusoe on a desert island, equipped with the latest brain-imaging technology, could make considerable progress toward a science of consciousness by combining this technology with first-person observation. More practically, each of us has indirect access to first-person data concerning others’ experiences by relying on behavioral indicators of these data.
In practice, by far the most common way of gathering data on the conscious experiences of other subjects is to rely on their verbal reports. Here, one does not treat the verbal reports just as third-person data (as a behaviorist might, limiting the datum to the fact that a subject made a certain noise). Rather, one treats the report as a report of first-person data that are available to the subject. Just as a scientist can accumulate third-person data by accepting reports of third-person data gathered by others (rather than simply treating those reports as noises), a scientist can also gather first-person data by accepting reports of first-person data gathered by others. This is the typical attitude that researchers adopt toward experimental subjects. If there is positive reason to believe that a subject’s report might be unreliable, a researcher will suspend judgment about it. But in the absence of any such reason, researchers will take a subject’s report of a conscious experience as good reason to believe that the subject is having a conscious experience of the sort that the subject is reporting.
In this way, researchers have access to a rich trove of first-person data that is made intersubjectively available. Of course, our access to these data depends on our making certain assumptions: in particular, the assumption that other subjects really are having conscious experiences and that by and large their verbal reports reflect these conscious experiences. We cannot directly test this assumption; instead, it serves as a sort of background assumption for research in the field. But this situation is present throughout other areas of science. When physicists use perception to gather information about the external world, for example, they rely on the assumption that the external world exists and that perception reflects the state of the external world. They cannot directly test this assumption; instead, it serves as a sort of background assumption for the whole field. Still, it seems a reasonable assumption to make, and it makes the science of physics possible. The same goes for our assumptions about the conscious experiences and verbal reports of others. These seem to be reasonable assumptions to make, and they make the science of consciousness possible.
Of course, verbal reports have some limits. Some aspects of conscious experience (e.g., the experience of music or of emotion) are very difficult to describe; in these cases we may need to develop a more refined language. What is more, verbal reports cannot be used at all in subjects without language, such as infants and nonhuman animals. In these cases, one needs to rely on other behavioral indicators, as when Logothetis relies on a monkey’s bar pressing. These indicators require further assumptions. For example, Logothetis’s work requires both the assumption that monkeys are conscious and the assumption that visual stimuli that the monkey can exploit in the voluntary control of behavior are consciously perceived.
These assumptions appear reasonable to most people, but they go beyond those required in the case of verbal report. The farther we move away from the human case, the more questionable the required assumptions become. For example, it would be very difficult to draw conclusions about consciousness from experiments on insects. Still, verbal reports in humans, combined with behavioral indicators in primates, give researchers enough access to first-person data to enable a serious body of ongoing research.
A second obstacle is posed by the fact that our methods for gathering first-person data are quite primitive compared with our methods for gathering third-person data. The latter have been refined by years of scientific practice, but the former have not received nearly as much attention. Where simple first-person data are concerned, this problem is not too pressing. There is usually no great difficulty in determining whether one is having an experience of a certain color in the center of one’s visual field, for example. Where more subtle aspects of subjective experience are concerned, however, the obstacle arises quickly.
Even with a phenomenon as tangible as visual experience, the issue arises in a number of ways. Visual experiences as a whole usually have a rich and detailed structure, for example, but how can subjects investigate and characterize that detail? Most subjects have great difficulty in introspecting and reporting this detail more than superficially. Particular difficulties arise in investigating the character of consciousness outside attention. To introspect and report this structure would seem to require attending to the relevant sort of experience, which may well change the character of the experience.
Here we can expect that at least some progress can be made by developing better methods for the gathering of first-person data. It may be reasonable to pay attention to traditions where the study of experience has been explored in detail. These traditions include those of Western phenomenology, introspectionist psychology, and even Eastern meditative traditions. Even if one is skeptical of the theories put forward by proponents of these traditions, one might still benefit from attending to their data-gathering methods. This research strategy has been pursued most notably in the “neurophenomenology” of Francisco Varela and colleagues (Varela 1997; Lutz et al. 2002), in which neurophysiological investigation is combined with phenomenological investigation in the tradition of Husserl. A number of other attempts at refining first-person methods are discussed in the papers collected in Varela and Shear (2001).
Of course, any method has limitations. Subjects’ judgments about their subjective experiences are not infallible, and although training may help, it also introduces the danger that observations may be corrupted by theory. The introspectionist program of experimental psychology in the nineteenth century famously fell apart when different schools could not agree on the introspective data (Boring 1929). Still, our ambitions need not be as grand as those of the introspectionists. For now, we are not aiming for a perfect characterization of the structure of consciousness but simply for a better characterization. Furthermore, we are now in a position where we can use third-person data as a check on first-person investigation. Experimental investigation has helped us to distinguish circumstances in which first-person reports are unreliable from those in which they are more reliable (Schooler and Fiore 1997), and there is room for much more investigation of this sort in the future. So it is reasonable to hope that at least a modest refinement of our methods for the reliable investigation of first-person data will be possible.
A final obstacle is posed by the absence of general formalisms with which first-person data can be expressed. Formalisms are important for two purposes. First, they are needed for data gathering: it is not enough to simply know what one is experiencing; one also has to write it down. Second, they are needed for theory construction: to formulate principles that connect first-person data with third-person data, we need to represent the data in a way that such principles can exploit.
The main existing formalisms for representing first-person data are quite primitive. Researchers typically rely either on simple qualitative characterizations of data (as in “an experience of red in the center of the visual field”) or on simple parameterization of data (as when color experiences are parameterized by hue, saturation, and brightness). These simple formalisms suffice for some purposes, but they are unlikely to suffice for the formulation of systematic theories.
It is not at all clear what form a proper formalism for the expression of first-person data about consciousness should take. The candidates include (1) parametric formalisms, in which various specific features of conscious experience are isolated and parameterized (as in the case of color experience); (2) geometric and topological formalisms, in which the overall structure of an experience (such as a visual experience) is formalized in geometric or topological terms; (3) informational formalisms, in which one characterizes the informational structure of an experience, specifying it as a sort of pixel-by-pixel state that falls into a larger space of informational states; and (4) representational formalisms, in which one characterizes an experience by using language for the states of the world that the experience represents (one might characterize an experience as an experience as of a yellow cup, for example). While each of these formalisms may have limitations, a detailed study of various alternative formalisms is likely to have significant benefits.
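By way of illustration only (this sketch is not from the text, and all names in it are hypothetical), a parametric formalism along the lines of option (1) can be rendered concretely: a color experience is represented as a point in a hue–saturation–brightness space, and a crude similarity metric over that space then licenses structural claims such as "this experience of red is more similar to that experience of orange than to that experience of violet."

```python
from dataclasses import dataclass
import math

@dataclass(frozen=True)
class ColorDatum:
    """A first-person datum about a color experience, parameterized
    by hue, saturation, and brightness (hypothetical representation)."""
    hue: float         # position on the hue circle, in degrees (0-360)
    saturation: float  # 0 (gray) to 1 (fully saturated)
    brightness: float  # 0 (dark) to 1 (maximally bright)

def similarity_distance(a: ColorDatum, b: ColorDatum) -> float:
    """A naive distance in the parameter space, treating hue as circular.
    A serious formalism would need a metric validated against subjects'
    actual similarity judgments; this one is purely illustrative."""
    dh = min(abs(a.hue - b.hue), 360.0 - abs(a.hue - b.hue)) / 180.0
    ds = abs(a.saturation - b.saturation)
    db = abs(a.brightness - b.brightness)
    return math.sqrt(dh * dh + ds * ds + db * db)

# Three reported experiences (illustrative parameter values only).
red = ColorDatum(hue=0.0, saturation=0.9, brightness=0.8)
orange = ColorDatum(hue=30.0, saturation=0.9, brightness=0.8)
violet = ColorDatum(hue=280.0, saturation=0.9, brightness=0.8)

# The parameterization supports comparative structural claims:
print(similarity_distance(red, orange) < similarity_distance(red, violet))  # prints True
```

Even this toy case makes the limits of simple parameterization visible: three parameters capture only a narrow slice of the structure of a visual experience, which is why the geometric, informational, and representational alternatives above remain live options.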
Overall, the prospects for a science of consciousness are reasonably bright. There are numerous clear projects for a science of consciousness that take first-person data seriously. One can recognize the distinctive problems that consciousness poses and still do science. Of course, there are many obstacles, and it is an open question how far we can progress. But the last ten years have seen many advances, and the next fifty years will see many more. For now, it is reasonable to hope that we may eventually have a theory of the fundamental principles that connect physical processes to conscious experience.
Afterword: First-Person Data and First-Person Science
The very idea of first-person data and of first-person methods in science makes some philosophers and scientists queasy. A common reaction is to say that first-person data are private, and science is public, so first-person data cannot be part of science.3 Now, demarcation disputes over what counts as “science” are usually verbal and sterile. What matters is whether we have good reasons to accept certain conclusions, not whether those reasons count as scientific reasons. Still, it is worth noting that first-person data are private only in a very limited sense, one that is compatible with the most important elements of the publicity of science.
The sense in which first-person data are private is that a given datum that concerns the experience of a given subject is directly accessible only to one person—that subject. Still, that datum will be indirectly accessible to many others, for example when it is expressed through verbal report: “I experienced a small cross inside a large circle.” So first-person data are certainly communicable in the way that scientific data are required to be.
One might worry that, for scientific purposes, an individual datum should be jointly observable by many parties, but this requirement is largely irrelevant to the ordinary practice of science. One does not standardly require five scientists to crowd around the same oscilloscope. Rather, what matters for publicity is that data be replicable. If someone reports a datum, from observing an oscilloscope or an animal for example, we expect that others will be able to observe other data of the same sort, using their own oscilloscopes or looking at different animals of the same species. That is publicity enough for the purposes of science.
First-person data can straightforwardly have this sort of publicity. Experiments on consciousness typically have many subjects reporting data of the same sort. In many cases, scientists can replicate the first-person data in their own case by direct observation. Indeed, this is standard practice in the field of psychophysics. To be convinced of the reality of a putative illusion, for example, scientists just need to experience it for themselves. Even when this sort of direct validation is not possible (say, because the data pertain to a group to which the scientist does not belong), it is usually straightforward to cross-validate observations with reports from many subjects. As in any science, occasional one-off observations involve unique or hard-to-replicate conditions. In these cases, as in any science, the putative observation need not be ignored, but one will be more cautious about its use. So first-person data appear to be quite compatible with scientific methods.
Of course, there are cases in which the privacy of first-person data poses a much more serious obstacle. In many cases involving infants, machines, and nonhuman animals, first-person data cannot easily be communicated to others. Because of this, theorists cannot be said to have access to these data. This situation is akin to the situation in the historical sciences (such as evolutionary biology), in which there were data accessible in the past to which current scientists do not have access. This means that the availability of data is limited, making the science more difficult. Still, in all of these cases, we have access to enough data (data concerning the present, in the historical case; data concerning human experience, in the case of consciousness) that one can build a reasonably robust science all the same.
Daniel Dennett responds to the ideas in an early version of this chapter (and to related claims by Alvin Goldman) in an article called “The Fantasy of First-person Science” (2001). Where Goldman and I think a science of consciousness needs to explain both third-person and first-person data, Dennett thinks that explaining the third-person data is enough: once we have explained these—including verbal reports—we have explained everything. This is Dennett’s method of “heterophenomenology.” A key idea is that taking verbal reports to be the explananda turns apparently irreducible first-person data into straightforwardly explicable third-person data.
The obvious objection to Dennett’s method is that the science of consciousness is not primarily about verbal reports or even about introspective judgments. It is about the experiences that the reports and judgments are reports and judgments of. Explaining our reports and judgments may be useful for many purposes, but to explain our reports and judgments is not to explain experience.
Dennett’s central argument for heterophenomenology proceeds by making a case that heterophenomenology is guaranteed to be at least as good a guide to first-person data as first-person phenomenology. In cases where reports about consciousness go wrong, introspection will also go wrong, so heterophenomenology misses nothing that first-person phenomenology covers. This way of proceeding mislocates the debate, however. The central problem with heterophenomenology has nothing to do with the fallibility of reports. Even if reports were infallible, the same problem would arise: to explain reports is not to explain the experiences that the reports are reports of.
Dennett says that my view requires a nonneutral attitude toward a subject’s reports, taking them uncritically as a guide to their subject matter. And he says that scientists have to take a neutral attitude (taking reports themselves as data but making no claims about their truth) because reports can go wrong. But this misses the natural intermediate option that Max Velmans (2007) has called critical phenomenology: accept verbal reports as a prima facie guide to a subject’s conscious experience except where there are specific reasons to doubt their reliability.
Critical phenomenology seems to capture most scientists’ attitudes toward verbal reports and consciousness. The attitude is not “uncritical acceptance,” but it is also far from the “neutrality” of heterophenomenology. On this view, we do not treat reports merely as data to be explained in their own right (though we might regard them in this way for some purposes). Rather, we are interested in them as a (fallible) guide to the first-person data about consciousness that we really want to explain. This way of thinking about the phenomena accommodates a major role for reports while preserving the distinction between explaining the reports (and third-person data in general) and explaining the experiences (first-person data) that they are reports of.
The issue is clouded just a little by Dennett’s occasional blurring of the line between heterophenomenology and critical phenomenology. At one point in his reply to Goldman, he appears to allow that researchers on blindsight can give prima facie credence to subjects’ claims that they have no experience in a given area as long as they do not give the subjects infallible authority. Clearly, to adopt this attitude would be to give up on the strict “neutrality” of heterophenomenology and to move beyond treating verbal reports and introspective judgments as mere third-person data. But I think Dennett’s attitude is best captured by the suggestion that researchers are analogous to anthropologists listening to subjects’ folk tales: they play along to obtain the subject’s judgments without ever accepting them, unless those judgments are later validated by an independent investigation.
It is clear that this anthropological attitude is not the attitude of most researchers toward blindsight subjects. Even in the absence of a good theory of blindsight, the results are generally taken to be very strong (although not infallible) evidence for a conclusion concerning visual experience. The conclusion is not just that the subjects judge that they do not visually experience objects in a part of their visual field; it is that subjects do not visually experience objects in a part of their visual field. Likewise, in psychophysics, experiments are usually taken to be strong evidence not just that subjects judge that they visually experience a certain illusion but also that they visually experience a certain illusion. Dennett may be right that the anthropological attitude is the typical attitude of researchers toward subjects’ reports about their cognitive mechanisms: about whether they rotated a mental image using mechanisms of symbolic representations or analog imagery, for example. But when it comes to reports about experience, by far the most common attitude is nonneutral. Reports are given prima facie credence as a guide to conscious experience.
Another relevant illustration is provided by the debate about unconscious perception among cognitive psychologists. A central part of this debate has concerned precisely which third-person measures (direct report, discrimination, etc.) are the best guide to the presence and absence of conscious perception. Here, third-person data are being used as a (fallible) guide to first-person data about consciousness, which are of primary interest. Given the heterophenomenological view, this debate is without much content: some states subserve report, some subserve discrimination, and that is about all there is to say. I think that something like this is Dennett’s attitude toward those debates, but it is not the attitude of most of the scientists working in the field.
Dennett makes something of the fact that his method appeals to a subject’s beliefs rather than just to reports, but the same issues arise. On Dennett’s account of beliefs, to believe that such-and-such is roughly to be disposed to report that such-and-such and to behave in appropriate ways, and to explain such beliefs is in large part to explain the associated dispositions. If one takes this view of beliefs, everything I have said earlier applies to beliefs, as well as to reports: explaining the beliefs does not suffice to explain the experiences. If one does not take this view, it is no longer obvious that third-person methods can explain our specific beliefs about consciousness, as these beliefs may themselves have a phenomenal element: see chapter 9 for much more on this.
Dennett “challenges” me to name an experiment that “transcends” the heterophenomenological method, but of course both views can accommodate experiments equally. For any experiment in which I say that we are using a verbal report or an introspective judgment as a guide to first-person data, Dennett can say that we are using it as third-person data and vice versa. So the difference between the views does not lie in the range of experiments “compatible” with them. Rather, it lies in a philosophical difference concerning the way that experimental results are interpreted: should verbal reports be seen as reports of data about the subject’s experience or just as data points to be explained in their own right?
In any case, the fundamental reasons for rejecting the heterophenomenological view lie prior to these questions about experiments. The fundamental question is, does explaining behavior and other third-person data suffice on its own to explain conscious experience? I think there are overwhelming grounds to say no (as argued in the previous chapter and in the response to Dennett there). The question is then, given that this is so, can there be a science of consciousness anyway? The answer provided by this chapter is yes, as long as we accept that science can deal with first-person data and as long as we allow that verbal reports and the like can be used as an indirect guide to the first-person data about consciousness.
Given this basic orientation, we can see the role of verbal reports in the science of consciousness as analogous to the role of perceptual experiences and perceptual judgments in ordinary third-person science. Roughly speaking, verbal reports stand to others’ experiences as our own perceptual experiences stand to the external world. Suppose a physicist sees an object in a certain location and judges it to be in that location. One could take the “anthropological” view that the datum to be explained here is always just the perceptual experience or the perceptual judgment. Then one’s science would be a science of explaining experiences and judgments, and the external-world hypothesis would be one hypothesis among many that are competing with various skeptical hypotheses to explain the data. Alternatively, one could take the attitude that the datum to be explained here is that the object is in the location. Then skeptical hypotheses will not really be on the table when doing the science, and instead the competition will be among different physical hypotheses for explaining the external datum.
Whatever the merits of the two views, the second corresponds to the way that science is usually practiced. On this view, perceptual states themselves do not exhaust the data. Rather, the states of affairs that they report (or represent) are taken as data. It is not unquestioned that perception is accurate. If there are conflicts among the apparent data or specific reasons to believe that perception may be unreliable, then one might retreat to taking the experiences and judgments as data in a given case and look for alternative explanations. But in the absence of positive reasons to question these data, one takes for granted that perception functions as a guide to the external world. Doing this enables science to proceed, even in the absence of decisive rejoinders to skeptical worries about the external world.
Likewise, in the science of consciousness, verbal reports do not exhaust the data. Rather, the states of affairs that they report are taken as data. It is not unquestioned that introspection is accurate. If there are conflicts among the apparent data or specific reasons to believe that introspection may be unreliable, then one might retreat to taking the reports and judgments as data in a given case and look for alternative explanations. But in the absence of positive reasons to question these data, one takes for granted that introspection and report function as a guide to consciousness. Doing this enables the science of consciousness to proceed, even in the absence of decisive rejoinders to skeptical worries about other minds.
One might say that just as a certain degree of trust in perception is a precondition for the science of physics, a certain degree of trust in introspection is a precondition for the science of consciousness. Of course, in both cases one would also like to have some reason to believe that this trust is reasonable, over and above the pragmatic justification that the science does not work as well without it. Here we enter familiar philosophical territory. For example, one might try to discredit skeptical hypotheses concerning the external world on the grounds that they are more complex than alternative explanations, or in some other way (see chapter 13 for my own take on this issue), and one might try to discredit skeptical hypotheses concerning other minds in analogous ways (I think that the best justification here may come from abduction from regularities observed in one’s own case). Such philosophical justifications for ordinary practices are almost invariably partial and questionable. Still, the science seems to go on successfully all the same.
From this perspective, the most worrying obstacle for the science of consciousness comes from specific reasons to doubt the accuracy of introspection even in one’s own case. It is well known that the introspective approach to psychology a century ago collapsed in part because of disagreement about the putative data: most famously, some theorists discerned an experience of imageless thought where others did not. A number of disagreements of this sort can still be found among scientists and philosophers today, and there is good reason to believe that introspection is very far from infallible. Schwitzgebel (2008), among others, has used considerations of this sort to cast doubt on the possibility of a science of consciousness.
Still, we can distinguish hard cases from easy cases. Just as we can isolate conditions under which we have specific reason not to trust perception, we can also isolate conditions under which we have specific reason not to trust introspection. There are good reasons for worrying about introspection of nonsensory experience, of consciousness outside attention, and so on, but these reasons do not obviously extend to core sensory experiences within the focus of attention. When a normal subject claims to experience a vivid red square, for example, there is usually little reason to disbelieve the subject. And it is experiences of this sort that are the bread and butter of the current science of consciousness. So for now, the science of consciousness is making considerable progress by focusing largely on these easy cases, and there is reason to believe that there will be much more progress ahead. Progress of that sort could well yield the core of a fruitful theory of consciousness. Eventually the science of consciousness will have to say something about the hard cases, too. New methods might well be required then, and it is difficult to know how far things will go. But in the meantime, one can predict that the science of consciousness will continue to thrive.
(1.) What is a datum—a phenomenon, a proposition, a judgment, a record? The issue is to some extent verbal, but for present purposes it is best to take data to be observed states of affairs (in a broad enough sense of “observed”): for example, the state of affairs of a specific subject having a certain sort of experience (first-person data) or exhibiting a certain sort of behavior or having a certain sort of brain process (third-person data). Here, states of affairs should be individuated in a fine-grained way that is sensitive to the mode of presentation, as befits the epistemological role of data. So, for example, “The glass contains water” (observed by tasting) and “The glass contains H2O” (observed by chemical test) may count as different data.