Under the hood of awareness

Behind what is subjectively perceived as "the unity of the self" (or "the unity of awareness") there lies a "bubbling surface" of parallel processes which, although they operate autonomously, also interact with one another in numerous ways.

  • for example, imagine driving a car while having a conversation with a passenger: the proper handling of this compound task involves numerous autonomous, yet inter-connected and coordinated, processes, each dealing with its own activity, e.g. the subconscious steering of the car such that it stays on course, the subconscious articulation of grammatically correct sentences, the subconscious parsing of the passenger's replies, etc (a minimal sketch of this kind of parallelism is given below).
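The following is a minimal, purely illustrative sketch in Python of what "autonomous but interacting" processes could look like: two coroutines run in parallel, each handling its own task, while sharing a small piece of state through which they modulate one another. The task names and the shared_attention variable are assumptions made for the example, not part of the model described here.

    import asyncio

    # how much attention the road currently demands (shared between the two tasks)
    shared_attention = {"road_demand": 0.2}

    async def steer():
        # autonomous steering loop: keeps the car on course
        for _ in range(3):
            shared_attention["road_demand"] = 0.8   # e.g. a curve appears
            await asyncio.sleep(0.1)
            shared_attention["road_demand"] = 0.2   # straight road again
            await asyncio.sleep(0.1)

    async def converse():
        # autonomous conversation loop: pauses when steering demands attention
        for _ in range(6):
            if shared_attention["road_demand"] > 0.5:
                print("(pause in the conversation)")
            else:
                print("reply to the passenger")
            await asyncio.sleep(0.1)

    async def main():
        # both tasks run concurrently, autonomously, yet coordinated
        await asyncio.gather(steer(), converse())

    asyncio.run(main())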


Mind, mental state, and the cognitive processes

In the representation above, the cylinder-shaped container represents one's mind, the "bubbling surface" represents the entire state of mind at a certain moment in time, and each individual bubble represents one individual process that occurs in one's mind (each such process is not only correlated with the rest of the processes, but may also be connected to various body sensors and actuators).
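For concreteness, the mapping above can be given a minimal data-structure sketch (in Python); the class and field names below are illustrative assumptions, not part of the original description:

    from dataclasses import dataclass, field

    @dataclass
    class Process:                  # one "bubble" on the bubbling surface
        name: str
        sensors: list[str] = field(default_factory=list)       # body inputs it reads
        actuators: list[str] = field(default_factory=list)     # body outputs it drives
        linked: list["Process"] = field(default_factory=list)  # correlated processes

    @dataclass
    class MentalState:              # the whole surface at one moment in time
        processes: list[Process] = field(default_factory=list)

    @dataclass
    class Mind:                     # the cylinder-shaped container
        state: MentalState

    steering = Process("steer car", sensors=["vision"], actuators=["arm muscles"])
    speaking = Process("articulate sentence", sensors=["hearing"], actuators=["speech"])
    steering.linked.append(speaking)            # the two processes are correlated
    mind = Mind(MentalState([steering, speaking]))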


Awareness and non-awareness
Let us now posit that any sensation that we can ever be aware of is directly related to the activity of a specific area of one's mind (this thesis seems to be supported by experimental results); if so, we can identify areas of the mental state whose activity would correspond to the various kinds of sensations that we are capable of feeling:


Sensation mapping on the mental state

In the sensation map representation above, the areas that correspond to possible sensations (e.g. the sensation of "seeing a mental image", or "hearing a mental sound", or "feeling a touch", etc) have been deliberately limited such that they do not cover the entire area of one's mental state, thus allowing for the possibility that some mental processes occur outside the reach of awareness. In other words, we posit that only a limited area of one's mental state is within the realm of awareness (i.e. the processes that occur in that area have a direct correspondent in sensations), while there may well exist many more "hidden processes" in one's mind whose existence can only be inferred by observing the "sampling points" that are accessible through our sensations. A rough sketch of this sensation map is given after the note below.
  • note: if we consider a "mental state" as depicted in the above diagrams to correspond to the overall state of a complete (mammalian) brain, then the hypothesis of non-sensible (hidden) processes is experimentally confirmed by the fact that there is no specific (i.e. "direct") sensory representation of the activity that occurs on the surface of, e.g., the cerebellum.
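A rough, purely illustrative sketch of the sensation map in Python, under the assumption that each process occupies a position on the mental-state surface and that only some regions of that surface map to sensations; the regions and coordinates below are invented for the example:

    sensation_regions = {
        "mental image": lambda x, y: x < 0.3 and y < 0.3,
        "mental sound": lambda x, y: x > 0.7 and y < 0.3,
        "touch":        lambda x, y: y > 0.7,
    }

    def sensation_of(x, y):
        """Return the sensation a process at (x, y) produces, or None if it is a
        'hidden process' outside the reach of awareness."""
        for name, inside in sensation_regions.items():
            if inside(x, y):
                return name
        return None

    print(sensation_of(0.1, 0.1))   # 'mental image' -> within awareness
    print(sensation_of(0.5, 0.5))   # None -> hidden process (cf. cerebellar activity)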

Process interconnectivity
The "bubble processes" in the diagrams above are autonomous but not independent; specifically, various levels of interconnectivity exist between separate processes, and indeed each single process may be composed of a multitude of autonomous, locally interconnected sub-processes. In fact, based on what is known about the structure of the brain, we can posit an elaborate, and biologically consistent, pattern of connectivity between mental processes based on an underlying 'connectivity grid of the mind' which connects any two different regions, large or small, neighboring or not, according to a certain 'connectivity map':

Multi-layer inter-process connectivity grid

The connectivity grid can be regarded as a many-layer construction, with each layer being dedicated to a certain connectivity range. Furthermore, the implication of multiple physically identifiable layers may (but need not) be dropped, allowing instead for a "continuum of layers" image in the sense that the distinction between layers is not physical in nature, but rather based on the connectivity range of their "building bricks" (i.e. we would call a set of elements that are interconnected within a certain range of distances a "layer"). In other words, we can draw a connectivity map for the short-range connections (the short-range "layer"), another map for longer-range connections (the longer-range "layer"), and so on, until we reach an 'inter-regional connectivity grid' that represents the connections between macro-regions (such as the sensation-specific areas), much in the way various function-specific areas of the brain are interconnected at the macro level.
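The "continuum of layers" reading can be made concrete with a small Python sketch: elements are laid out on a surface, and a "layer" is simply the set of connections whose length falls in a given range. The positions, ranges, and layer names below are arbitrary illustrative values, not claims about the brain:

    import math, itertools

    # 5x5 grid of "building bricks", each with a position on the surface
    elements = {i: (i % 5, i // 5) for i in range(25)}

    layer_ranges = {                      # each layer = a band of connection distances
        "short-range layer":   (0.0, 1.5),
        "mid-range layer":     (1.5, 3.0),
        "inter-regional grid": (3.0, 10.0),
    }

    def build_layers():
        layers = {name: [] for name in layer_ranges}
        for a, b in itertools.combinations(elements, 2):
            d = math.dist(elements[a], elements[b])
            for name, (lo, hi) in layer_ranges.items():
                if lo < d <= hi:
                    layers[name].append((a, b))
        return layers

    for name, links in build_layers().items():
        print(name, "->", len(links), "connections")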


Levels of organization
The presence of the multi-layer connectivity grid reveals the "bubbling surface" model as a recursive architecture at various levels of detail (i.e. at different "zoom levels"). In this architectural scenario, a mental process actually consists of several "sub-processes", with each sub-process consisting of other "sub-sub-processes", and so on, until we reach a very low level of 'atomic processes' that are effectively supported in hardware by 'architectural atoms'. In other words, each individual autonomous process is composed of a number of parallel autonomous sub-processes, each such parallel sub-process may in turn be composed of other parallel sub-sub-processes, etc, resulting in a hierarchy of processes that all contribute to the overall activity of their common 'super-process'.
  • for example, the process of turning the steering wheel while driving is in fact composed of a myriad of networked sub-processes that control the activity of all the muscles involved in steering, etc, with the extra remark that these individual sub-processes (or even their sub-sub-process components) may, or may not, be within the realm of attention at a given time (a recursive sketch of this hierarchy is given below).
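A recursive Python sketch of the hierarchy just described: a process is either an 'atomic process' effectively running on an 'architectural atom', or a super-process composed of parallel sub-processes. The particular decomposition of the steering example below is an assumption made purely for illustration:

    from dataclasses import dataclass, field

    @dataclass
    class Proc:
        name: str
        subs: list["Proc"] = field(default_factory=list)   # empty => atomic process

    def atoms(p: Proc) -> list[str]:
        """Collect the atomic processes that ultimately support a super-process."""
        if not p.subs:
            return [p.name]
        return [a for s in p.subs for a in atoms(s)]

    steering = Proc("steer wheel", [
        Proc("left arm",  [Proc("left biceps unit"),  Proc("left triceps unit")]),
        Proc("right arm", [Proc("right biceps unit"), Proc("right triceps unit")]),
    ])
    print(atoms(steering))   # the low-level 'atomic processes' doing the actual work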


Recursive "bubbling surface" model

At first sight, one might be tempted to identify a hierarchical architecture in the diagram above, but this is not the case: what the "bubbling surface" model provides is in fact a distributed processing architecture of interconnected architectural atoms (these would roughly correspond to the cortical columns of the biological cortex) that allows the generation of hierarchical patterns of processes. Unlike in a fixed hierarchical architecture, the hierarchical organization of processes is not static: it involves the dynamic creation, merger, migration, destruction, etc, of processes, all supported by the common ground of the underlying architecture's distributed structures.
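The distinction drawn here can be illustrated with a small Python sketch in which the architecture itself is a fixed set of interconnected atoms, while processes are transient groupings of those atoms that can be created, merged, and destroyed at run time; everything below (names, sizes, operations) is an illustrative assumption, not the actual mechanism:

    atoms = set(range(100))               # fixed "architectural atoms" (cf. cortical columns)
    processes: dict[str, set[int]] = {}   # dynamic processes = named groups of atoms

    def create(name, members):
        # a new process pattern forms over some of the existing atoms
        processes[name] = set(members) & atoms

    def merge(a, b, new_name):
        # two processes fuse into a common super-process
        processes[new_name] = processes.pop(a) | processes.pop(b)

    def destroy(name):
        # the atoms remain; only the process pattern disappears
        processes.pop(name, None)

    create("steering", range(0, 10))
    create("conversing", range(40, 55))
    merge("steering", "conversing", "drive-and-talk")
    destroy("drive-and-talk")
    print(len(atoms), processes)          # grid unchanged, process pattern gone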


Competing super-processes
For any given sub-process there may be occasional competition between several higher-level super-processes, each trying to "incorporate" said sub-process as one of its own components. For example, if we reach for a glass of water with one hand and meanwhile one of our cheeks starts feeling itchy, our hand may be automatically (i.e. subconsciously) diverted from the glass of water and directed towards our cheek. The fact that this diversion is often subconscious is illustrated by the fact that it usually takes a fraction of a second after the hand changes direction for us to become aware of its new direction; indeed, we usually become aware that the hand has been "hijacked" for scratching our cheek only when we actually feel the cheek being scratched, i.e. the action was controlled by a mechanism outside the realm of our awareness. What happens in this scenario is a competition between two super-processes, both requiring the control of a single physical resource by means of a dedicated sub-process: eventually, in a normal (undamaged) brain, the usage of the unique resource will be arbitrated (unconsciously and/or consciously, depending on the situation) such that only one of the competing super-processes effectively controls the resource by "coordinating" the dedicated sub-process that ultimately drives it (a minimal arbitration sketch is given after the note below).
  • note: if the mechanisms that arbitrate the competition between super-processes for the "coordination" of a sub-process do not function properly, this condition will lead to incoordination of various localizations and degrees. Some evident cases of severe incoordination are those involving motor behavior, as they can lead to erratic, and even self-destructive, bodily behaviors (e.g. a specific genetic mutation has been shown to disable the coordination of limbs and trunk in mice), but the lack of coordination can manifest itself even in strictly mental activities, e.g. a chronic inability to focus (as in attention-deficit disorder), or the inability to follow a logical argument, etc.
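A minimal Python sketch of the arbitration just described: two super-processes request control of the same sub-process (the hand), and a simple arbiter grants it to the one with the stronger current "urgency". The urgency values and the winner-take-all rule are assumptions made for the example, not a claim about how the brain actually arbitrates:

    # competing super-processes and their current urgency to control the hand
    requests = {
        "reach for glass of water": 0.6,
        "scratch itchy cheek":      0.8,
    }

    def arbitrate(reqs):
        """Grant the single shared resource to exactly one competing super-process."""
        return max(reqs, key=reqs.get)

    hand_controller = arbitrate(requests)
    print("hand is now coordinated by:", hand_controller)   # 'scratch itchy cheek'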

Specialization
