29 Jan
Brain Signal Variations
I’ve focused, in the last few posts, on the structural and functional variety in neurons and synapses. Brain signal variations include local potentials and post-synaptic potentials. A local potential differs from an action potential in that the latter is generally characterized by a brief electrical spike and a return to resting potential. This characterization does not always apply: the case of rapid successions of potentials, described later, is one example. Local potentials, in contrast, are graded and less transient. The standard example of a local potential is the receptor potential. Local potentials in sensory nerve endings, such as those in the skin, permit analog responses to stimuli. Because of local potentials in the thumb, for instance, we can differentiate between the sensation when we turn a page in a book and when we strike our thumb with a hammer.
Less extreme examples of the utility of graded potentials are our sensitivities to pressure, texture, and temperature. The magnitude of the stimulus determines the magnitude of our response. Such receptor potentials are found throughout the peripheral nervous system (PNS) and in the parts of the central nervous system (CNS) dedicated to perception.
Regarding receptor or local potentials, Kuffler et al. say that the job of such receptors is to “transduce” or change the physical stimulus “into a receptor potential which can then be processed further by the nerve cell” [1984, p. 101].
Another type of local potential, and one more relevant to cybernetics, is the postsynaptic potential. These are passive, chemically mediated voltages that can be excitatory or inhibitory. As Kuffler et al. explain, the “size of a postsynaptic potential also can be graded and is a reflection of the number and rate of activity of excitatory or inhibitory presynaptic nerve terminals giving rise to it” [1984, p. 101]. “Reflection” in this sense means that the local potential is an indirect result of the flow of excitation and inhibition (E/I) in other neurons in the environment of the synapse, not in the synapse itself.
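Kuffler’s description of a graded postsynaptic potential, reflecting the number and rate of activity of the presynaptic terminals, can be sketched as a simple weighted sum. This is only a toy model under my own assumptions, not anything from Kuffler; the function name, rates, and weights are all illustrative.

```python
# Toy model of a graded postsynaptic potential (PSP). Simplifying
# assumption: each presynaptic terminal contributes in proportion to
# its firing rate, with excitatory terminals weighted positive and
# inhibitory terminals weighted negative.

def postsynaptic_potential(terminals):
    """terminals: list of (firing_rate_hz, weight_mv) pairs.
    Returns the graded PSP in millivolts relative to rest."""
    return sum(rate * weight for rate, weight in terminals)

# Three excitatory terminals and one inhibitory terminal (made-up values):
inputs = [(20, 0.05), (35, 0.05), (10, 0.05), (30, -0.08)]
print(postsynaptic_potential(inputs))  # a graded value, not all-or-none
```

The point of the sketch is that the result varies continuously with both the number of active terminals and their rates, which is exactly what makes the PSP graded rather than binary.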
Identifying more types of electrical activity in the nervous system further complicates the physiological circuitry. But this phenomenon is too important to ignore in examining the functions of nerves and the flow of potential within and between them. Excitatory local potentials, being inherently less transient than action potentials, can prolong or strengthen action potentials. Inhibitory local potentials, on the other hand, can deepen the negative potential at the membrane, thus weakening or even nullifying incoming excitatory action potentials.
If Neurons Represented Discrete Objects or Concepts…
For the purposes of cybernetic modeling, consider the possibility of neurons representing discrete objects or concepts. If this were the case, and if related things or concepts occupied contiguous places in the brain’s geography, then postsynaptic potentials could represent exceptions to general rules. In other words, assume a general rule, and several of the types of objects to which it applies, were represented in area Y. Now suppose a new exception to the rule is learned. Whenever the exception is encountered, its local inhibitory effects could block application of the general rule.
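This exception mechanism can be sketched in a few lines. Everything here is hypothetical and illustrative: the node names, the threshold, and the inhibition strength are my own stand-ins, not measurements.

```python
# Hypothetical sketch: a "rule" node in area Y whose activation can be
# vetoed by an inhibitory local potential from an "exception" node.

THRESHOLD = 0.5  # activation needed for the rule to apply (illustrative)

def apply_rule(rule_activation, exception_active, inhibition=0.7):
    """The exception's local inhibitory effect subtracts from the
    rule's activation; a strong enough exception nullifies the rule."""
    net = rule_activation - (inhibition if exception_active else 0.0)
    return net >= THRESHOLD

print(apply_rule(0.9, exception_active=False))  # True: the rule fires
print(apply_rule(0.9, exception_active=True))   # False: the exception blocks it
```

The design choice worth noting is that the exception does not rewrite the rule; it only inhibits it locally, which matches the idea of a graded, chemically mediated veto rather than a structural change.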
Notice in the illustration of a prolonged potential below (green line) that the time the membrane spends in a state of excitation is roughly double that of a typical spike. This is one of the simplest varieties of impulse curves.
The chemically induced nature of local potentials suggests that a prolonging effect may frequently result from interaction between local and action potentials. This results from the expanded range of depolarization and reversal. Over time, however, as the balance of chemicals in the immediate environment of the potentials shifts, refraction, or a reversal of the potential, enables the membranes to return to resting potential more quickly. Thus, as local and action potentials interact, the entire curve of an impulse can become more rounded and diffuse. Local potentials can also partially account for general inhibition in the region of a depolarized or reversed potential.
The graphs at right show the difference between typical spikes and prolonged potentials. The labels on the left side of the graphs mark voltage levels: maximum depolarization (Max), threshold potential (Thresh), resting potential (Rest), and minimum (hyperpolarized) potential (Min). Time is shown in milliseconds. Since the time represented is extremely short, it is difficult to describe how prolonged potentials contribute to complex cognitive activities, but the implications for neural network modeling are profound, especially when prolonged potentials last dozens or hundreds of milliseconds.
Polarization is Analog
There are analog and digital components to the processes within and between neurons. Polarization (depolarization and hyperpolarization) raises or lowers the electrical potential of nerve fibers to any of a number of levels, over a range from roughly -100 millivolts to +70 millivolts. This constitutes an analog scale; thus the processes of polarization can correctly be described as analog, i.e. taking values on a continuous range.
Threshold is Digital
The binary element of impulse propagation is the threshold phenomenon. The threshold of action potential in neurons is about -40 millivolts. Potentials that reach that level trigger impulses that propagate through the system; subthreshold potentials do not. Thus there are only two possible outcomes: 1) high enough to propagate and 2) not high enough to propagate. Because of this effect, the term “fired” is applied to neurons whose potentials exceed threshold. The binary distinction between ON (fired) and OFF in a system of neurons has appealed to computer scientists for decades because of the ease of modeling binary systems on digital computers.
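The analog/digital pairing described in the last two sections can be put in a few lines of code: a continuous membrane potential (analog) feeding a binary threshold test (digital). The numbers are the approximate figures from the text; the function names are mine.

```python
# Analog component: membrane potential varies continuously over a range.
# Digital component: firing is a binary function of crossing threshold.

MIN_MV, MAX_MV = -100.0, 70.0    # approximate analog range from the text
THRESHOLD_MV = -40.0             # approximate firing threshold

def depolarize(potential_mv, delta_mv):
    """Analog step: raise or lower the potential, clamped to the range."""
    return max(MIN_MV, min(MAX_MV, potential_mv + delta_mv))

def fired(potential_mv):
    """Digital step: ON if the potential reaches threshold, OFF otherwise."""
    return potential_mv >= THRESHOLD_MV

v = depolarize(-70.0, 25.0)   # -45 mV: still subthreshold
print(fired(v))               # False
v = depolarize(v, 10.0)       # -35 mV: above threshold
print(fired(v))               # True
```

Note that all the graded detail lives in `depolarize`; `fired` throws that detail away, which is exactly why binary neuron models are easy to compute with but lose the analog information the earlier sections describe.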
The decay of acetylcholine-driven (ACh-driven) impulses is catalyzed by hydrolysis. Cholinesterase is the primary agent in ACh hydrolysis, but anticholinesterase agents, such as eserine and neostigmine, inhibit cholinesterase. Inhibiting cholinesterase slows the decay of excitation responses in nerve fibers because the neurotransmitter is not hydrolyzed. The normal decay time of impulses is roughly one millisecond. When decay is retarded by anticholinesterase, however, residual depolarization can last from 10 up to 100 milliseconds or more [Stevens, 1989, pp. 11-12]. This permits the accumulation of additional impulses to strengthen or reinforce the original impulse.
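A rough way to see why slow decay permits accumulation is to model decay as a simple exponential. This is a sketch under my own assumptions: the exponential form, the 50 ms retarded time constant (within the 10-100 ms range cited from Stevens), and the impulse sizes are all illustrative.

```python
import math

# Sketch of impulse decay as an exponential. Time constants: ~1 ms for
# normal hydrolysis vs. ~50 ms when anticholinesterase retards decay.
# A hypothetical second impulse 5 ms later shows why slow decay permits
# accumulation (temporal summation).

def residual(initial_mv, t_ms, tau_ms):
    """Depolarization remaining t_ms after an impulse that decays with
    time constant tau_ms."""
    return initial_mv * math.exp(-t_ms / tau_ms)

first_impulse = 15.0  # mV of depolarization, illustrative

normal = residual(first_impulse, 5.0, tau_ms=1.0)     # almost nothing left
retarded = residual(first_impulse, 5.0, tau_ms=50.0)  # most of it remains

# A second 15 mV impulse arriving at t = 5 ms sums with whatever remains:
print(round(normal + first_impulse, 1))    # little reinforcement
print(round(retarded + first_impulse, 1))  # strong summation
```

With normal decay the second impulse effectively starts from scratch; with retarded decay it stacks on top of most of the first, which is the “strengthen or reinforce” effect described above.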
After the neurotransmitter is liberated, it is destroyed by enzymes. Thus, the membrane channels or gates quickly return to their resting state. The intensity of E/I response decays rapidly, and the locus of the response is isolated to the immediate synaptic link. E/I often occurs as multiple pulses over a short duration instead of as a single spike. The frequency of impulses, as mentioned above, has some impact on the level of excitation or inhibition reached in the neuron and the speed with which it returns to resting potential.
Event-Related Potentials
The concentration and distribution of chemicals at synaptic junctions has a profound influence both on the strength of individual impulses and on their decay. Event-related potentials (ERPs) in the central nervous system normally last 500 milliseconds or more (this may be partly due to repetitive stimuli). A typical brain area where ERPs are common is the receptive speech (Wernicke’s) area, where we receive and interpret language based on auditory (speech) or visual (text) input. The following images are from:
National Institutes of Health http://pubs.niaaa.nih.gov/publications/arh313/238-242.htm
The Journal Nature.com http://www.nature.com/neuro/journal/v7/n1/suppinfo/nn1160_S1.html
These variations in electrical signaling may help explain how the brain responds to signals of different durations. When we see a familiar image, we may recognize it in less than 100 milliseconds (one tenth of a second). Research shows that our responses to spoken language can differ profoundly: individual words in ordinary speech elicit responses of roughly 100 milliseconds, while keywords elicit responses of up to 300 milliseconds. This temporal element in physiological processing of sensory input is an essential component of cognition: the lengthened impulse likely contributes to recognition and understanding by providing more intense or more lasting activation to conceptually related knowledge. A simulation of spreading activation in a neural network should account for both the temporal and spatial elements.
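A minimal spreading-activation sketch can show both elements at once: spatial spread over weighted links between concepts, and temporal decay of activation between time steps. The network, node names, and all the numbers here are illustrative assumptions, not anything from the research cited above.

```python
# Minimal spreading-activation sketch with the two elements the text
# calls for: spatial spread over weighted links and temporal decay of
# activation between steps.

def spread(activations, links, decay=0.5, steps=2):
    """activations: {node: level}; links: {node: [(neighbor, weight)]}.
    Each step, every node's activation decays, and each node also
    passes a weighted share of its pre-decay activation to its
    neighbors."""
    for _ in range(steps):
        new = {n: a * decay for n, a in activations.items()}
        for node, level in activations.items():
            for neighbor, weight in links.get(node, []):
                new[neighbor] = new.get(neighbor, 0.0) + level * weight
        activations = new
    return activations

links = {"hammer": [("thumb", 0.4), ("nail", 0.6)]}
result = spread({"hammer": 1.0, "thumb": 0.0, "nail": 0.0}, links, steps=1)
print(result)  # activation has spread from "hammer" to related concepts
```

Raising `decay` toward 1.0 plays the role of a prolonged potential: activation lingers across steps and can sum with later input, while a low `decay` behaves like the brief, quickly hydrolyzed impulse.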