martes, 21 de julio de 2009

Los Precursores

Disclaimer: this is for quasi-nerds.
After having read about Shannon over at U's, I mentioned Nyquist to him; to clear that up, here is a bit of history on Shannon's precursors.

Apart from that, the unit: the ban.

and, for the timeline (not the Hawking kind):

http://en.wikipedia.org/wiki/Timeline_of_information_theory

Quantitative ideas of information

The most direct antecedents of Shannon's work were two papers published in the 1920s by Harry Nyquist and Ralph Hartley, who were both still research leaders at Bell Labs when Shannon arrived in the early 1940s.

Nyquist's 1924 paper, Certain Factors Affecting Telegraph Speed, is mostly concerned with some detailed engineering aspects of telegraph signals. But a more theoretical section discusses quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system, giving the relation

W = K log m,

where W is the speed of transmission of intelligence, m is the number of different voltage levels to choose from at each time step, and K is a constant.
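As a quick illustration (mine, not Nyquist's), the relation says that line speed grows only logarithmically with the number of voltage levels. A minimal sketch in Python, with the log base fixed at 2 just for concreteness:

    import math

    def nyquist_line_speed(m: int, K: float = 1.0) -> float:
        """Nyquist's relation W = K log m: intelligence transmitted per
        unit time with m distinguishable voltage levels per time step.
        K is a constant fixed by the line; the log base is left open,
        so base 2 here is only an illustrative choice."""
        return K * math.log2(m)

    # Going from 2 to 4 levels doubles W; going to 8 only triples it:
    print(nyquist_line_speed(2), nyquist_line_speed(4), nyquist_line_speed(8))
    # -> 1.0 2.0 3.0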

Hartley's 1928 paper, called simply Transmission of Information, went further by using the word information (in a technical sense), and making explicitly clear that information in this context was a measurable quantity, reflecting only the receiver's ability to distinguish that one sequence of symbols had been intended by the sender rather than any other -- quite regardless of any associated meaning or other psychological or semantic aspect the symbols might represent. This amount of information he quantified as

H = log S^n,

where S was the number of possible symbols, and n the number of symbols in a transmission.
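A small worked example (my numbers, not Hartley's), using base-10 logs so the unit comes out in decimal digits:

    import math

    def hartley_information(S: int, n: int) -> float:
        """Hartley's H = log S^n = n log S. With base-10 logarithms the
        unit is the decimal digit, later called the hartley."""
        return n * math.log10(S)

    # A 5-symbol message over a 26-letter alphabet:
    print(hartley_information(26, 5))   # ~7.07 hartleys
    print(math.log10(26 ** 5))          # same value, computed directly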

The natural unit of information was therefore the decimal digit, much later renamed the hartley in his honour as a unit of information. The Hartley information, H0, is still used as a quantity for the logarithm of the total number of possibilities.

A similar unit of log10 probability, the ban, and its derived unit the deciban (one tenth of a ban), were introduced by Alan Turing in 1940 as part of the statistical analysis of the breaking of the German Second World War Enigma ciphers. The decibannage represented the reduction in (the logarithm of) the total number of possibilities (similar to the change in the Hartley information); and also the log-likelihood ratio (or change in the weight of evidence) that could be inferred for one hypothesis over another from a set of observations. The expected change in the weight of evidence is equivalent to what was later called the Kullback discrimination information.
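To make the arithmetic concrete (the probabilities below are invented for illustration): one ban is a factor of 10 in the likelihood ratio, a deciban a factor of 10^(1/10), and the decibannage from independent observations simply adds:

    import math

    def weight_of_evidence_db(likelihood_h1: float, likelihood_h2: float) -> float:
        """Evidence for hypothesis H1 over H2 from one observation, in
        decibans: ten times the base-10 log of the likelihood ratio."""
        return 10 * math.log10(likelihood_h1 / likelihood_h2)

    # An observation 1.25x as likely under H1 as under H2 scores ~0.97 db;
    # a second, independent observation just adds its own decibannage:
    one = weight_of_evidence_db(0.5, 0.4)
    print(one)                                     # ~0.97 db
    print(one + weight_of_evidence_db(0.3, 0.2))   # ~2.73 db combined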

But underlying this notion was still the idea of equal a priori probabilities, rather than the information content of events of unequal probability; nor was there yet any underlying picture of the questions raised by communicating such varied outcomes.

7 comments:

El Gato con Bolas said...

Have you read anything on algorithmic information theory? The one Chaitin developed at almost the same time as Kolmogorov (I think that's how the name is spelled)?

ayjblog said...

No, honestly, not under that name. Kolmogorov yes, that's how it's spelled, but if I remember right it was in control theory.

El Gato con Bolas said...

http://es.wikibooks.org/wiki/Chaitin,_Omega_y_otras_curiosidades_matem%C3%A1ticas

"Biografía

Gregory J. Chaitin Nacido en 1947 en New York hijo de inmigrantes argentinos. Con sólo trece años, estando en bachillerato, se interesó por el teorema de incompletitud de Gödel. En 1965 regresó a Argentina donde empezó a estudiar matemáticas. Una vez concluidos sus estudios empezó a trabajar para IBM compaginándolo como profesor en la facultad de Matemáticas de Buenos Aires.

A principios de los años 60 desarrolló su Teoría de la información algorítmica que relaciona la computación con la información. Sus primeros trabajos están muy relacionados con los estudios de Kolmogorov y su tratamiento estadístico de la información. La aportación más importante de Chaitin es la La constante Omega constante de Chaitin Ω: un número real entre 0 y 1, de dígitos equidistribuidos que expresa la probabilidad de que un programa aleatorio realice alguna tarea (no se cuelgue) al ser ejecutado en una máquina de Turing. Chaitin también escribe sobre metafísica, filosofía matemática, metamatemática, neurología (el problema de la conciencia y el estudio de la mente), biología (definiendo formalmente el concepto de 'vida'), filosofía digital (El universo es un autómata celular Turing-completo).

En 1995 obtuvo el título de doctor honoris causa por la Universidad de Maine. Es profesor honorario de la Universidad de Buenos Aires desde 2002, profesor visitante del Departamento de Informática de la Universidad de Auckland y miembro del comité internacional del Instituto de Sistemas Complejos de Valparaíso. Hoy en día es investigador en el Watson Research Center de New York."

El Gato con Bolas said...
This comment has been removed by the author.
El Gato con Bolas said...

Kolmogorov complexity
From Wikipedia, the free encyclopedia


In algorithmic information theory (a subfield of computer science), the Kolmogorov complexity (also known as descriptive complexity, Kolmogorov-Chaitin complexity, stochastic complexity, algorithmic entropy, or program-size complexity) of an object such as a piece of text is a measure of the computational resources needed to specify the object. For example, consider the following two strings of length 64, each containing only lowercase letters, numbers, and spaces:

abababababababababababababababababababababababababababababababab
4c1j5b2p0cv4w1x8rx2y39umgw5q85s7uraqbjfdppa0q7nieieqe9noc4cvafzf

The first string admits a short English language description, namely "ab 32 times", which consists of 11 characters. The second one has no obvious simple description (using the same character set) other than writing down the string itself, which has 64 characters.
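A hands-on way to see the contrast (a sketch; a real compressor only gives an upper bound on Kolmogorov complexity, it does not compute it):

    import zlib

    regular = "ab" * 32   # the string with the short description "ab 32 times"
    random_ = "4c1j5b2p0cv4w1x8rx2y39umgw5q85s7uraqbjfdppa0q7nieieqe9noc4cvafzf"

    # The length of a compressed encoding upper-bounds the complexity and
    # exposes the difference between the two strings:
    print(len(zlib.compress(regular.encode())))  # small: the repetition is exploited
    print(len(zlib.compress(random_.encode())))  # close to 64 plus overhead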

El Gato con Bolas said...

And this article is terrific, it formalizes Occam's razor

http://www.scholarpedia.org/article/Algorithmic_probability

El Gato con Bolas said...

Scholarpedia is terrific!

http://www.scholarpedia.org/article/Algorithmic_information_theory