
Expected float entropy

From EverybodyWiki Bios & Wiki


Example of an expected float entropy histogram generated by selecting weighted relations uniformly at random. The long left tail shows that certain choices of weighted relations are isolated from other choices in the sense that they give much lower expected float entropy values.

Expected float entropy (efe) stems from information theory but differs from Shannon entropy by including relationships as parameters. It is a measure of the expected amount of information required to specify the state of a system (such as an artificial or biological neural network) beyond what is already known about the system from the relationship parameters. For non-random systems, certain choices of the relationship parameters are isolated from other choices in the sense that they give much lower expected float entropy values; in this sense, the system defines relationships. Expected float entropy is a central definition in a recently developed mathematical theory of consciousness in which, through a variety of learning paradigms, the brain builds a model of the world by building relationships, and in the context of these relationships a brain state acquires meaning in the form of the relational content of the corresponding experience. The principal article on this mathematical theory (Quasi-Conscious Multivariate Systems[1]) was published in 2015, following a more preliminary publication in 2012.[2]

The nomenclature "float entropy" comes from the notion of floating a choice of relationship parameters over a state of a system, similar to the idiom "to float an idea". Optimisation methods are used to obtain the relationship parameters that minimise expected float entropy. A process that performs this minimisation is itself a type of learning paradigm.

Overview

Relationships are ubiquitous among mathematical structures. In particular, weighted relations (also called weighted graphs and weighted networks) are very general mathematical objects and, in the finite case, are often handled in matrix form. They are a generalisation of graphs and include all functions, since functions are a rather constrained type of graph. It is also the case that consciousness is awash with relationships: red has a stronger relationship to orange than to green, relationships between points in our field of view give rise to geometry, some smells are similar whilst others are very different, and there are many other relationships involving multiple senses, such as between the sound of someone's name, their visual appearance and the timbre of their voice. Expected float entropy includes weighted relations as parameters and, for non-random systems, certain choices of weighted relations are isolated from other choices in the sense that they give much lower expected float entropy values. Therefore, systems such as the brain define relationships and, according to the theory, in the context of these relationships a brain state acquires meaning in the form of the relational content of the corresponding experience.

Expected float entropy minimisation is very general in scope. For example, the theory has been successfully applied to image processing[1] and also applies to waveform recovery from audio data.[2]

Definitions and connection with Shannon entropy

Definitions

For a nonempty set $S$, a weighted relation on $S$ is a function of the form

$R : S \times S \to [0, 1]$.

Such a weighted relation is called reflexive if $R(a, a) = 1$ for all $a \in S$, and symmetric if $R(a, b) = R(b, a)$ for all $a, b \in S$. The set of all reflexive, symmetric weighted relations on $S$ is denoted $\Psi_S$.
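On a finite set, a weighted relation can be stored as a square matrix with entries in $[0, 1]$, which makes the reflexivity and symmetry conditions easy to check. The following Python sketch illustrates this with NumPy (the function names are illustrative, not from the literature):

```python
import numpy as np

def is_weighted_relation(R):
    """A weighted relation on S = {0, ..., n-1}: an n x n matrix in [0, 1]."""
    R = np.asarray(R)
    return R.ndim == 2 and R.shape[0] == R.shape[1] and bool(np.all((R >= 0) & (R <= 1)))

def is_reflexive(R):
    return np.allclose(np.diag(R), 1.0)  # R(a, a) = 1 for all a in S

def is_symmetric(R):
    return np.allclose(R, np.transpose(R))  # R(a, b) = R(b, a) for all a, b in S

R = np.array([[1.0, 0.8, 0.1],
              [0.8, 1.0, 0.3],
              [0.1, 0.3, 1.0]])
print(is_weighted_relation(R), is_reflexive(R), is_symmetric(R))  # True True True
```

A matrix passing all three checks represents an element of $\Psi_S$ for $S = \{0, 1, 2\}$.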

If $S$ is the set of nodes of a system, such as a neural network, then a state of the system is given by the aggregate of the states of the nodes over some range $V$ of node states. Therefore each state of the system is determined by a corresponding function $f : S \to V$. The set of all possible states of the system is denoted $\Omega$.

Given an element $U \in \Psi_V$, the above definitions give rise to a canonical map from $\Omega$ to $\Psi_S$. That is, for $f \in \Omega$, the function $U_f$ defined by

$U_f(a, b) := U(f(a), f(b))$, for all $a, b \in S$,

is an element of $\Psi_S$.
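The canonical map can be computed directly: each state $f$ pulls the weighted relation on node states back to a weighted relation on the nodes. A minimal sketch, assuming nodes and node states are indexed by integers (names such as `induced_relation` are illustrative):

```python
import numpy as np

def induced_relation(U, f):
    """Given U in Psi_V (as a matrix) and a state f : S -> V (as an array of
    node-state indices), return the matrix of U_f(a, b) = U(f(a), f(b))."""
    f = np.asarray(f)
    return U[np.ix_(f, f)]

U = np.array([[1.0, 0.5],
              [0.5, 1.0]])      # a weighted relation on node states V = {0, 1}
f = np.array([0, 1, 1])         # a state of a three-node system
print(induced_relation(U, f))
```

If $U$ is reflexive and symmetric then so is $U_f$, since $U_f(a, a) = U(f(a), f(a)) = 1$ and $U_f(a, b) = U(f(a), f(b)) = U(f(b), f(a)) = U_f(b, a)$.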

For $R \in \Psi_S$ and $U \in \Psi_V$, the float entropy of a state $f \in \Omega$ of the system, relative to $R$ and $U$, is defined as

$\mathrm{fe}(R, U, f) := \log_2 \#\{g \in \Omega : d(R, U_g) \le d(R, U_f)\}$,

where $d$ is a metric given by a matrix norm on the elements of $\Psi_S$ in matrix form. In the article Quasi-Conscious Multivariate Systems[1] a particular matrix norm is fixed for this purpose. The article also includes a more general definition of float entropy, called multirelational float entropy, and allows the nodes of the system to be larger structures than individual neurons.
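For small systems the float entropy can be computed by exhaustive enumeration of states. The sketch below uses the entrywise Euclidean (Frobenius) norm as one concrete choice of the metric $d$ — an assumption for illustration, since the definition leaves the matrix norm as a free parameter:

```python
import numpy as np
from itertools import product

def induced_relation(U, f):
    return U[np.ix_(np.asarray(f), np.asarray(f))]

def float_entropy(R, U, f, states):
    """fe(R, U, f) = log2 of the number of states g whose induced relation
    U_g is at least as close to R as U_f is, under the Frobenius norm."""
    d = lambda A, B: np.linalg.norm(A - B)
    d_f = d(R, induced_relation(U, f))
    count = sum(1 for g in states if d(R, induced_relation(U, g)) <= d_f)
    return np.log2(count)

U = np.array([[1.0, 0.2], [0.2, 1.0]])
states = list(product([0, 1], repeat=3))   # all states of 3 binary nodes
R = induced_relation(U, (0, 0, 1))         # a candidate relation on S
print(float_entropy(R, U, (0, 0, 1), states))  # → 1.0
```

Here only $(0, 0, 1)$ and its complement $(1, 1, 0)$ induce a relation at distance zero from $R$, so the count is 2 and the float entropy is $\log_2 2 = 1$ bit.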

The expected float entropy (efe) of a system, relative to $R$ and $U$, is defined as

$\mathrm{efe}(R, U) := \sum_{f \in \Omega} P(f)\, \mathrm{fe}(R, U, f)$,

where $P$ is the probability distribution on $\Omega$ determined by the bias of the system due to the long-term effect of the system's inherent learning paradigms in response to external stimuli.
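Given a state distribution, the efe is simply the $P$-weighted average of the float entropies. A toy computation with a hypothetical uniform distribution (real systems of interest are biased, i.e. non-uniform; uniform is used here only to keep the example short):

```python
import numpy as np
from itertools import product

def induced_relation(U, f):
    return U[np.ix_(np.asarray(f), np.asarray(f))]

def float_entropy(R, U, f, states):
    d_f = np.linalg.norm(R - induced_relation(U, f))
    count = sum(1 for g in states
                if np.linalg.norm(R - induced_relation(U, g)) <= d_f)
    return np.log2(count)

def efe(R, U, states, P):
    """efe(R, U) = sum over states f of P(f) * fe(R, U, f)."""
    return sum(P[f] * float_entropy(R, U, f, states) for f in states)

U = np.array([[1.0, 0.2], [0.2, 1.0]])
states = list(product([0, 1], repeat=2))   # all states of 2 binary nodes
P = {s: 1 / len(states) for s in states}   # uniform, for illustration only
R = np.ones((2, 2))                        # the all-ones candidate relation
print(efe(R, U, states, P))                # → 1.5
```

States $(0,0)$ and $(1,1)$ induce the all-ones relation (float entropy 1 bit each), while $(0,1)$ and $(1,0)$ are maximally far from $R$ (2 bits each), giving an average of 1.5 bits.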

According to the theory, a system (such as the brain and its subregions) defines a particular choice of $R$ and $U$ (up to a certain resolution) under the requirement that the efe is minimised. Therefore, for a given system (i.e., for a fixed $P$), solutions $R'$ and $U'$ to the equation

$\mathrm{efe}(R', U') = \min\{\mathrm{efe}(R, U) : R \in \Psi_S,\ U \in \Psi_V\}$

are the weighted relations of interest.
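For tiny systems the minimisation can be carried out by brute force. The sketch below searches a coarse grid of candidate $U$ and restricts $R$ to relations of the induced form $U_f$ — both simplifications made purely for illustration; practical applications require proper optimisation methods. With a hypothetical distribution biased towards two complementary states, the minimising choices stand out from the rest:

```python
import numpy as np
from itertools import product

def induced_relation(U, f):
    return U[np.ix_(np.asarray(f), np.asarray(f))]

def efe(R, U, states, P):
    total = 0.0
    for f in states:
        d_f = np.linalg.norm(R - induced_relation(U, f))
        count = sum(1 for g in states
                    if np.linalg.norm(R - induced_relation(U, g)) <= d_f)
        total += P[f] * np.log2(count)
    return total

# A biased toy distribution: two "preferred" states carry most of the mass.
states = list(product([0, 1], repeat=3))
P = {s: 0.4 if s in [(0, 0, 1), (1, 1, 0)] else 0.2 / 6 for s in states}

# Coarse grid search: U is reflexive and symmetric, so for binary node
# states it has one free off-diagonal entry u; R ranges over induced U_f.
best = None
for u in np.linspace(0.0, 1.0, 11):
    U = np.array([[1.0, u], [u, 1.0]])
    for f in states:
        R = induced_relation(U, f)
        value = efe(R, U, states, P)
        if best is None or value < best[0]:
            best = (value, u, f)
print(best)
```

The minimum is attained when $R$ is the relation induced by a preferred state, reflecting the bias of $P$: the minimising choice of weighted relations encodes which node groupings the system's statistics favour.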

Connection with Shannon entropy

The Shannon entropy of a system is defined as

$H := -\sum_{f \in \Omega} P(f) \log_2 P(f)$.

For the probability distribution $P$ on $\Omega$, writing $n_f := \#\{g \in \Omega : P(g) \ge P(f)\}$, the following equalities hold:

$\sum_{f \in \Omega} P(f) \log_2 n_f \;=\; \sum_{f \in \Omega} P(f) \log_2 \dfrac{P(f)\, n_f}{P(f)} \;=\; H + \sum_{f \in \Omega} P(f) \log_2 (P(f)\, n_f)$.          (1)

The expression on the left is similar in form to the definition of Shannon entropy. The middle expression reveals the value to be similar to that of $H$ when the probabilities in the argument of the logarithm are comparable, since then $P(f)\, n_f \approx \sum_{g : P(g) \ge P(f)} P(g) \le 1$. Indeed, $\mathrm{efe}(R, U)$ is an approximation of (1) when $R$ and $U$ minimise the efe, because the states $g$ whose induced relations $U_g$ lie close to $R$ are then typically the states of high probability. The expression on the right of (1) shows the mathematical connection to Shannon entropy: the first term is the Shannon entropy of the system and, with consideration of the log function, the second term has a negative value between $-H$ and $0$, since $P(f) \le P(f)\, n_f \le 1$.
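Identity (1) and the stated bounds on its second term can be checked numerically for an arbitrary distribution:

```python
import numpy as np

# Numerical check of identity (1): for a distribution P over states,
#   sum_f P(f) log2 n_f  =  H + sum_f P(f) log2(P(f) * n_f),
# where n_f = #{g : P(g) >= P(f)} and H is the Shannon entropy.
rng = np.random.default_rng(0)
p = rng.random(8)
p /= p.sum()                                  # a random distribution P

n = np.array([(p >= pf).sum() for pf in p])   # n_f for each state f
H = -(p * np.log2(p)).sum()                   # Shannon entropy
left = (p * np.log2(n)).sum()
second_term = (p * np.log2(p * n)).sum()
right = H + second_term
print(np.isclose(left, right))                # True
print(-H <= second_term <= 0)                 # True: between -H and 0
```

The bounds follow from $P(f) \le P(f)\, n_f \le \sum_{g : P(g) \ge P(f)} P(g) \le 1$, so each logarithm in the second term lies between $\log_2 P(f)$ and $0$.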

Connection to other mathematical theories of consciousness

There are some similarities between the minimisation of expected float entropy and the minimisation of surprise in Karl J. Friston's Free energy principle. The theory is also somewhat complementary to Giulio Tononi's Integrated information theory (IIT) which was initially developed to quantify consciousness but gave little priority to how systems may define relationships.


References

  1. Mason, Jonathan. "Quasi-Conscious Multivariate Systems". Complexity. 21 (S1): 125–147. doi:10.1002/cplx.21720.
  2. Mason, Jonathan. "Consciousness and the structuring property of typical data". Complexity. 18 (3): 28–37. doi:10.1002/cplx.21431.

