Algorithmic information theory (nonfiction)

Algorithmic information theory (AIT) is a "merger of information theory and computer science" that concerns itself with the relationship between computation and information of computably generated objects (as opposed to stochastically generated ones), such as strings or any other data structure. In other words, algorithmic information theory shows that computational incompressibility "mimics" (except for a constant that depends only on the chosen universal programming language) the relations or inequalities found in information theory. According to Gregory Chaitin, it is "the result of putting Claude Shannon's information theory and Alan Turing's computability theory into a cocktail shaker and shaking vigorously."

Algorithmic information theory principally studies measures of the irreducible information content of strings (or other data structures). Because most mathematical objects can be described in terms of strings, or as the limit of a sequence of strings, it can be used to study a wide variety of mathematical objects, including integers. Indeed, one of the main motivations behind AIT is the study of the information carried by mathematical objects, as in the field of metamathematics. Other main motivations include: surpassing the limitations of classical information theory for single, fixed objects; formalizing the concept of randomness; and finding accurate probabilistic inference without prior knowledge of the probability distribution (e.g., whether it is independent and identically distributed, Markovian, or even stationary).

In this way, AIT is founded upon three main mathematical concepts and the relations between them:

* algorithmic complexity
* algorithmic randomness
* algorithmic probability

Besides formalizing a universal measure for the irreducible information content of computably generated objects, the main achievements of AIT include showing that: algorithmic complexity satisfies (in the self-delimited case) the same inequalities, up to a constant, that entropy does in classical information theory; randomness is incompressibility, as the short sketch below illustrates; and, within the realm of randomly generated software, the probability of occurrence of any data structure is of the order of the shortest program that generates it when running on a universal machine.
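
As a concrete (and deliberately crude) illustration of "randomness is incompressibility", the following Python sketch uses zlib compression as a computable stand-in for Kolmogorov complexity, which is itself uncomputable. The choice of zlib and the particular test strings are assumptions made for illustration only; any real compressor yields merely an upper bound on the true complexity.

<pre>
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed form of `data`: a crude,
    computable upper bound on its Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

structured = b"ab" * 50_000       # highly regular: a short program prints it
random_ish = os.urandom(100_000)  # high-entropy: no short description expected

for name, data in [("structured", structured), ("random", random_ish)]:
    print(f"{name}: {len(data)} bytes -> {compressed_size(data)} bytes compressed")
</pre>

The regular string compresses to a tiny fraction of its length, while the random bytes do not compress appreciably; algorithmically random sequences are, informally, exactly those admitting no description shorter than themselves.
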
See also

* [[Algorithmically random sequence (nonfiction)]] - an infinite sequence of binary digits that appears random to any algorithm. The notion can be applied analogously to sequences on any finite alphabet (e.g. decimal digits). Random sequences are key objects of study in algorithmic information theory.
* [[Chaitin's constant (nonfiction)]] - a [[Real number (nonfiction)|real number]] representing the probability that a randomly constructed program will halt. Although there are infinitely many halting probabilities, one for each method of encoding programs, it is common to use the letter Ω to refer to them as if there were only one. Each halting probability is a normal and [[Transcendental (nonfiction)|transcendental]] real number that is not computable, which means that there is no algorithm to compute its digits. Indeed, each halting probability is Martin-Löf random, meaning there is not even any algorithm which can reliably guess its digits.
* [[Chaitin–Kolmogorov randomness (nonfiction)]] - see [[Kolmogorov complexity (nonfiction)]]
* [[Computational indistinguishability (nonfiction)]] - two [[Distribution ensemble (nonfiction)|distribution ensembles]] are computationally indistinguishable if no efficient algorithm can tell the difference between them except with small probability (see the sketch after this list).
* [[Distribution ensemble (nonfiction)]] - in [[Cryptography (nonfiction)|cryptography]], a distribution ensemble or probability ensemble is a family of distributions or random variables.
* [[Epistemology (nonfiction)]] - the branch of philosophy concerned with the theory of knowledge.
* [[Inductive probability (nonfiction)]] - the probability of future events based on past events. It is the basis for inductive reasoning, and gives the mathematical basis for learning and the perception of patterns. It is a source of knowledge about the world.
* [[Inductive reasoning (nonfiction)]] - a method of reasoning in which the premises are viewed as supplying some evidence for the truth of the conclusion; this is in contrast to deductive reasoning. While the conclusion of a deductive argument is certain, the truth of the conclusion of an inductive argument may be probable, based upon the evidence given. Many dictionaries define inductive reasoning as the derivation of general principles from specific observations, though there are many inductive arguments that do not have that form.
* [[Invariance theorem (nonfiction)]] may refer to:
** Chou's invariance theorem, a result in multivariate statistics
** Invariance of domain, a theorem in topology
** A theorem pertaining to Kolmogorov complexity
** A result in classical mechanics for adiabatic invariants
** A theorem of algorithmic probability
* [[Kolmogorov complexity (nonfiction)]] - given an object, such as a piece of text, Kolmogorov complexity is the length of the shortest computer program (in a predetermined programming language) that produces the object as output. It is a measure of the computational resources needed to specify the object.
* [[Limits of knowledge (nonfiction)]] - see [[Epistemology (nonfiction)]]
* [[Minimum description length (nonfiction)]] - a formalization of [[Occam's razor (nonfiction)|Occam's razor]] in which the best hypothesis (a model and its parameters) for a given set of data is the one that leads to the best compression of the data.
* [[Minimum message length (nonfiction)]] - a formal information theory restatement of [[Occam's razor (nonfiction)|Occam's Razor]]: even when models are equal in their goodness of fit to the observed data, the one generating the shortest overall message is more likely to be correct (where the message consists of a statement of the model, followed by a statement of data encoded concisely using that model).
* [[Pseudorandom ensemble (nonfiction)]] - in [[Cryptography (nonfiction)|cryptography]], a [[Distribution ensemble (nonfiction)|distribution ensemble]] that is computationally indistinguishable from the uniform ensemble.
* [[Pseudorandom generator (nonfiction)]] - for a given class of statistical tests, a deterministic procedure that maps a random seed to a longer pseudorandom string such that no statistical test in the class can distinguish between the output of the generator and the uniform distribution. The random seed is typically a short binary string drawn from the uniform distribution (a toy sketch appears after this list).
* [[Claude Shannon (nonfiction)]]
* [[Simplicity theory (nonfiction)]] - a cognitive theory that seeks to explain the attractiveness of situations or events to human minds. It asserts that interesting situations appear simpler than expected to the observer.
* [[Solomonoff's theory of inductive inference (nonfiction)]] - a theory of prediction based on logical observations, such as predicting the next symbol based upon a given series of symbols. The only assumption that the theory makes is that the environment follows some unknown but computable probability distribution. It is a mathematical formalization of [[Occam's razor (nonfiction)|Occam's razor]] and the Principle of Multiple Explanations.
* [[Stochastic chains with memory of variable length (nonfiction)]] - a family of [[Stochastic process (nonfiction)|stochastic chains]] of finite order in a finite alphabet such that, at each time step, only a finite suffix of the past, called the context, is needed to predict the next symbol.
* [[Alan Turing (nonfiction)]]
* [[Uniform ensemble (nonfiction)]] - see [[Distribution ensemble (nonfiction)]]
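
To make the pseudorandomness entries above concrete, here is a toy Python sketch of a pseudorandom generator together with one deliberately weak statistical test. SHA-256 in counter mode as the expander, and the monobit test as the distinguisher, are illustrative assumptions only; actual pseudorandomness is a claim about every efficient test in a stated class, not about any single one.

<pre>
import hashlib
import os

def toy_prg(seed: bytes, n_bytes: int) -> bytes:
    """Deterministically expand a short seed into a longer string by
    hashing seed || counter (SHA-256 in counter mode). Sketch only."""
    out = bytearray()
    counter = 0
    while len(out) < n_bytes:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n_bytes])

def monobit_bias(data: bytes) -> float:
    """One very weak statistical test: fraction of 1-bits minus 1/2.
    A distinguisher succeeds if some efficient test behaves measurably
    differently on generator output than on uniform bits."""
    ones = sum(bin(b).count("1") for b in data)
    return ones / (8 * len(data)) - 0.5

seed = os.urandom(16)               # short truly random seed
stretched = toy_prg(seed, 1 << 16)  # much longer pseudorandom string
uniform = os.urandom(1 << 16)       # reference sample of uniform bytes

print(f"generator bias: {monobit_bias(stretched):+.5f}")
print(f"uniform bias:   {monobit_bias(uniform):+.5f}")
</pre>

Both biases come out near zero, so this particular test fails to distinguish the stretched output from true randomness; computational indistinguishability demands the same failure for every efficient test, up to small probability.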


* [https://en.wikipedia.org/wiki/Algorithmic_information_theory Algorithmic information theory] @ Wikipedia
