Cybercognition
Cybernetics is the science of regulation, whether of industrial plants or of biological organisms. Cybernetic theory has both difficulties and advantages when applied to cognition, recursively defined as the control of behavior and of cognition itself. The application of cybernetics to cognition is generically referred to as cybernetic cognition, or cybercognition.[1][citation needed]
Definition
Cybercognition is a theoretical model of mind that explains all human cognition as consisting of real-time cybernetic interactions between perceptual ('bottom-up', feedback, sensor-oriented) and conceptual ('top-down', feedforward, motor-oriented) information flows and hierarchies. The theory is inspired and informed by the cybernetics research of W. Ross Ashby. The percept and concept hierarchies are the data representations that most closely resemble those described by Antonio Damasio.[2] Consciousness and volition are both treated as 'dual' or 'dyadic' components of subjective experience. The prefix cyber- applies to the theory because it treats perceptual, feedback information flows as primary, as in the common coding theory originating in the work of psychologist William James.[3]
Introduction
In 'An Introduction to Cybernetics', cybernetics pioneer W. Ross Ashby describes the basic principles of feedback control, applies them to the analysis of real situations and systems, and includes some techniques for the synthesis and design of control circuits. In 'Design for a Brain', Ashby (1960)[4] extends these basic principles and applies them to more dynamic situations and complex systems. For reasons that are probably as historical as they are technical, it was the name of Norbert Wiener,[5] and not Ashby's, that became popularly associated with cybernetics. Although Ashby's ideas have been largely forgotten, some of his key concepts are similar to those of grounded cognition, as described by Lawrence Barsalou.[6]
Comparison to similar paradigms
In 'Design for a Brain', Ashby suggests that conventional cybernetics is an insufficient explanation of the uncanny ability that animals (and humans) have to adapt to sudden, unexpected disturbances (shocks, surprises), and goes on to describe a generic design for a complex system that uses both feedback and feedforward information circuits to adapt rapidly to sudden changes in ambient conditions. Since it is successful dynamic adaptation to novelty and challenge that guides the evolution of all higher animal nervous systems, he suggests that his new system prototype be used as a design for an artificial brain. Ashby built and successfully tested a basic prototype, the Homeostat.
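Ashby's adaptive mechanism can be illustrated in miniature. The following sketch is not Ashby's actual circuit; it is an assumed, simplified rendering of his 'ultrastability' idea: an ordinary feedback loop regulates a variable, and whenever a disturbance drives an essential variable outside its viable bounds, a step-change mechanism selects new random parameters until stability is regained.

```python
import random

def ultrastable_run(disturbance, steps=200, bounds=(-10.0, 10.0), seed=0):
    """Illustrative sketch of Ashby-style ultrastability (names assumed).

    disturbance: function of the time step, returning an external shock.
    Returns the final state, the final feedback gain, and the number of
    random parameter step-changes that were triggered.
    """
    rng = random.Random(seed)
    gain = rng.uniform(-2.0, 2.0)   # feedback parameter, subject to step-changes
    x = 0.0                         # essential variable (viable only within bounds)
    resets = 0
    for t in range(steps):
        # Ordinary feedback loop: negative feedback whenever gain > 0.
        x += disturbance(t) - gain * x
        if not bounds[0] <= x <= bounds[1]:
            # Essential variable out of bounds: random step-change of the
            # parameters, as in Ashby's Homeostat, then clip back to viability.
            gain = rng.uniform(-2.0, 2.0)
            x = max(bounds[0], min(bounds[1], x))
            resets += 1
    return x, gain, resets
```

Because the dynamics here reduce to x ← (1 − gain)·x + disturbance, roughly half of the random gains are destabilizing; the system survives only because the step-change mechanism keeps re-drawing parameters until a stable configuration is found, which is the essence of the adaptation Ashby describes.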
Ashby (p. 9, S.1/13) avoids any latent teleology in his design, commenting that "No teleological explanation for behaviour will be used ... Never will we use the explanation that the action is performed because it will later be advantageous to the animal". Teleology, or purpose, is a property of feedforward construction, and Ashby restricted himself to the use of feedback (cyber-) architectures. Conventional feedforward (i.e. compiled-program) cybernetic models are included in the category of computational cybernetics, and are considered distinct from those models included within the cybercognition category. Implicit in this division is the belief that real living minds (examples of animal and human cognition) are primarily cybernetic, feedback-driven mechanisms, while digital computers employ feedforward mechanisms as their primary architectural paradigm.
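The feedback/feedforward distinction drawn above can be made concrete with a minimal sketch. The function names, gains, and dynamics below are illustrative assumptions, not taken from Ashby's text: the feedback controller reacts to sensed error, while the feedforward controller computes its action in advance from an internal model, like a compiled program.

```python
def feedback_control(target, readings, kp=0.5):
    """Reactive (cybernetic) control: accumulate corrections from sensed error."""
    output = 0.0
    for y in readings:
        error = target - y      # compare goal against what is actually sensed
        output += kp * error    # adjust in proportion to the feedback signal
    return output

def feedforward_control(target, model_gain):
    """Anticipatory (computational) control: act from a model, with no sensing."""
    return target / model_gain  # open-loop: correct only if the model is correct
```

The contrast makes the article's point visible: the feedback controller needs no accurate model but does need a sensor stream, while the feedforward controller needs no sensors but fails silently whenever its model (`model_gain`) diverges from reality.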
The cybercognition category posits a hybrid viewpoint: (1) a conscious mind consisting of cybernetic (controlled) processes and (2) an unconscious mind consisting of computational (automatic) processes. This hybrid model was first described by Schneider & Shiffrin (1977).[7] Note that, at the time (the 1970s), the authors found it necessary to use the word 'controlled' for 'conscious' and 'automatic' for 'non-conscious'. Arguably, they were influenced by the prevailing negative attitude toward the concept of machine consciousness, an idea that, though entirely compatible with physicalism, is still debated.
References
- ↑ Heylighen, F. (2010). Lecture Notes 2009–2010. ECCO: Evolution, Complexity and Cognition, Vrije Universiteit Brussel.
- ↑ Damasio, A., & Meyer, K. (2009). Convergence and divergence in a neural architecture for recognition and memory. Trends in Neurosciences, 32, 376–382.
- ↑ James, W. (1890). The Principles of Psychology, Vols. I & II.
- ↑ Ashby, W. R. (1960). Design for a Brain. Wiley: New York.
- ↑ Wiener, N. (1948). Cybernetics; or, Control and Communication in the Animal and the Machine. Hermann & Cie, Paris; Technology Press, Cambridge, MA.
- ↑ Barsalou, L. W. (1999). Perceptual symbol systems. Behavioral and Brain Sciences, 22, 577–660.
- ↑ Schneider, W., & Shiffrin, R. M. (1977). Controlled and automatic human information processing: I. Detection, search, and attention. Psychological Review, 84, 1–66.
This article "Cybercognition" is from Wikipedia; the list of its authors can be seen in its page history. Articles copied from the Draft namespace on Wikipedia appear in Wikipedia's Draft namespace, not the main one.