11 Feb Introspection as Empirical Science
Heightened awareness – inner vision – focus. I have looked to psychology, neuroscience, and philosophy for clues about how to get computers to pay attention to all the right cues so they can engage in meaningful dialog with me. Much of what I encounter in the literature mixes experimental results with empirical observations. And many paths lead to discussions of phenomena, such as meta-awareness, that seem difficult to trace to a specific area of the brain or a specific pattern of excitation and inhibition between neurons. In the absence of hard facts, I am content to draw from whatever observations I can, to get as close as possible to a pragmatic description of the inner workings of thought.
Many machines have built-in sensors and meters that show what’s going on inside. Some are even “self-healing” with the ability, through introspection, to adjust their own operating parameters to correct failure conditions. Today I want to talk a bit more about seeing within through introspection, and about consciousness phenomena, and tie them together as foundations for modeling similar processes in machines.
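To make the idea concrete, here is a minimal sketch in Python of what such introspective self-healing might look like. The class, its threshold parameter, and the healing rule are all hypothetical illustrations, not any real system's API: the component watches its own error count and, when failures accumulate, adjusts its own operating parameter.

```python
# A minimal, hypothetical sketch of a "self-healing" component:
# it introspects its own internal state and adjusts an operating
# parameter when it detects a failure condition.

class SelfHealingComponent:
    def __init__(self, threshold=100.0):
        self.threshold = threshold      # hypothetical operating parameter
        self.error_count = 0

    def read_sensor(self, value):
        """Process one sensor reading; count out-of-range values as errors."""
        if value > self.threshold:
            self.error_count += 1
        return value

    def introspect(self):
        """Look 'within': report the component's current internal state."""
        return {"threshold": self.threshold, "errors": self.error_count}

    def self_heal(self):
        """If errors accumulate, widen the threshold and reset the count."""
        if self.error_count > 3:
            self.threshold *= 1.5
            self.error_count = 0
            return True
        return False

comp = SelfHealingComponent()
for reading in [120, 130, 90, 150, 160]:
    comp.read_sensor(reading)
healed = comp.self_heal()   # four out-of-range readings trigger healing
```

The point is not the arithmetic but the loop: observe the self, compare against expectations, correct. That loop is the mechanical skeleton of introspection.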
Introspective psychology deals with the moment. Given a current stream of sensory input, it seeks to discover what experience the person – or the brain or the mind – is undergoing. The current-stream notion suggests that we may be able to segregate a snapshot of input, then describe a snapshot of response. The complexity of the stream creates significant challenges. If you consider only the input from a single sense, such as smell or hearing alone, the stream may be like a still river contained by a dam: complex, but possible to analyze before it moves on. By the way, this spot on a backwater of the Rum River is a place I go frequently to seek a little solitude and, while looking out, try to see within.
Unfortunately, the combination of sight, sound, taste, smell, touch, and a raft of kinesthetic sensory input seems more like a raging torrent than a becalmed stream. This is a picture of Gooseberry Falls, where I bring the family for fun and frolic: less solitude, more intensity, completely conscious. How can we test James’ understanding of the “stream of consciousness” (yesterday’s post)? Remember, he suggested in Principles of Psychology that human consciousness flows like a stream. James’ five characteristics of the stream are (quoting):
- Every thought tends to be part of a personal consciousness.
- Within each personal consciousness thought is always changing.
- Within each personal consciousness thought is sensibly continuous.
- It always appears to deal with objects independent of itself.
- It is interested in some parts of these objects to the exclusion of others.
The continuity of consciousness and the imperfect resolution of our current monitoring capabilities make any attempt to segregate such a snapshot nebulous at best. Still, we can see enough to better understand the respective roles that the major parts of the brain play in our intellectual and emotional lives, and we are justified in describing those lives as continuous streams of experiences.
We may, in the course of our torrential experiences, become aware of our own cognitive processes. We may come to see, as if from a perch outside ourselves, what is going on in our heads. This is meta-awareness. Perhaps we are sitting in a lecture hall watching ourselves learn, or perhaps, as we look inward, we see that our mind is wandering. Perhaps we even become aware of how our brain can process the lecture and the wandering and the meta-awareness all at once. Brains are capable of such feats, but it is not really clear how. Computers can multi-task as well, and I recently mentioned how many computing devices have an array of capabilities that enable introspection.
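As a playful analogy, a program can exhibit a crude kind of meta-awareness by examining its own call stack while it works. This sketch uses Python's standard inspect module; the function names (lecture, observe_self) are invented for illustration, not drawn from any real system.

```python
# A program "watching itself work": while executing one task,
# it introspects the call stack that brought it there.
import inspect

def observe_self():
    """Return the names of the functions currently on the call stack."""
    return [frame.function for frame in inspect.stack()]

def lecture():
    # While "attending the lecture", the program also watches itself.
    return observe_self()

stack = lecture()
# Both 'lecture' and 'observe_self' appear in the introspected stack,
# much as the attending and the noticing coexist in meta-awareness.
```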
The word “experience” is fuzzy in that it may be difficult to conceive of a concrete scientific definition of an experience. It may also be difficult to define discrete boundaries within which an experience cleanly fits. The process of introspection is useful, however, so this study shall rely to some extent on empirical observations of experiences and anecdotes.
Joseph LeDoux (1996, p. 85) makes a strong case for his assertion that conscious emotional experiences are a consequence of prior emotional processes (evaluations or appraisals) that occur outside of conscious awareness, which is to say, unconsciously.
Introspection is more subjective than a CT scan, but it represents another way to seek greater knowledge and understanding about human knowledge and understanding. In “The View from Nowhere” (1986), Nagel adopts a phenomenological point of view in examining perception and consciousness. He turns to empirical observations that are patently subjective to support his research:
“I make no pretense of knowing any more than I have been able to empirically observe in the mind with which I am most intimately associated, and which, for the moment, I am amused to call my own. This is a subjective point of view!” (p. 7).
Nagel goes on to point out that attempts to describe the world in purely objective terms, with no consideration of whose point of view is providing the filter, “inevitably leads to false reductions or to outright denial that certain patently real phenomena exist at all” (ibid.). In short, a great deal of what we know, particularly empirical observations, is connected to the observer’s point of view. What you see depends on where you stand. The most objective thing I know of is a photo taken at random from a satellite. Absent objective evidence, we must rely on introspection and tainted conjecture and hope for the best.
Again, will computers ever be self-aware to the point of introspection? In some ways they already are. It is difficult for us to trace the lineage of a thought, but a computer can log the entire history of a file or a data record and play its lineage back to you. That’s different, you may rejoin. I agree. The subjectivity remains, and since human competence is laced with subjectivity, matching human competence need not attempt to eliminate it. To make the self-awareness of a computer meaningful, it needs much more complete context than it has today. I think this is in the realm of the “doable”. I’ll show you why and how soon.
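A minimal sketch of that lineage idea, assuming a hypothetical Record class of my own invention: every change is appended to a history that can be played back in full, oldest first, something no human can do for a thought.

```python
# A hypothetical record that logs its entire history of changes
# so its full lineage can be replayed on demand.

class Record:
    def __init__(self, value):
        self.value = value
        self.history = [("created", value)]

    def update(self, new_value):
        """Apply a change and log it to the record's history."""
        self.history.append(("updated", new_value))
        self.value = new_value

    def lineage(self):
        """Play back the record's full history, oldest change first."""
        return list(self.history)

rec = Record("draft")
rec.update("reviewed")
rec.update("published")
# lineage() replays: created "draft" -> updated "reviewed" -> updated "published"
```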