Algorithmic randomness and physical entropy

Reference

Zurek, W.H.: Algorithmic randomness and physical entropy. Phys. Rev. A 40(8), 4731 (1989)

DOI

http://dx.doi.org/10.1103/PhysRevA.40.4731

Abstract

Algorithmic randomness provides a rigorous, entropylike measure of disorder of an individual, microscopic, definite state of a physical system. It is defined by the size (in binary digits) of the shortest message specifying the microstate uniquely up to the assumed resolution. Equivalently, algorithmic randomness can be expressed as the number of bits in the smallest program for a universal computer that can reproduce the state in question (for instance, by plotting it with the assumed accuracy). In contrast to the traditional definitions of entropy, algorithmic randomness can be used to measure disorder without any recourse to probabilities. Algorithmic randomness is typically very difficult to calculate exactly but relatively easy to estimate. In large systems, probabilistic ensemble definitions of entropy (e.g., coarse-grained entropy of Gibbs and Boltzmann's entropy H = ln W, as well as Shannon's information-theoretic entropy) provide accurate estimates of the algorithmic entropy of an individual system or its average value for an ensemble. One is thus able to rederive much of thermodynamics and statistical mechanics in a setting very different from the usual. Physical entropy, I suggest, is a sum of (i) the missing information measured by Shannon's formula and (ii) the algorithmic information content (algorithmic randomness) present in the available data about the system. This definition of entropy is essential in describing the operation of thermodynamic engines from the viewpoint of information-gathering and information-using systems. These Maxwell-demon-type entities are capable of acquiring and processing information and can therefore "decide", on the basis of the results of their measurements and computations, on the best strategy for extracting energy from their surroundings. From their internal point of view the outcome of each measurement is definite. The limits on the thermodynamic efficiency arise not from ensemble considerations, but rather reflect basic laws of computation. Thus the inclusion of algorithmic randomness in the definition of physical entropy allows one to formulate thermodynamics from the Maxwell demon's point of view.

Extended Abstract

Bibtex

@article{PhysRevA.40.4731,
 title = {Algorithmic randomness and physical entropy},
 author = {Zurek, W. H.},
 journal = {Phys. Rev. A},
 volume = {40},
 issue = {8},
 pages = {4731--4751},
 numpages = {21},
 year = {1989},
 month = {Oct},
 publisher = {American Physical Society},
 doi = {10.1103/PhysRevA.40.4731},
 url = {http://link.aps.org/doi/10.1103/PhysRevA.40.4731}
}

Used References

1. L. Boltzmann, Wien. Ber. 76, 373 (1877). See S. G. Brush, Kinetic Theory (Pergamon, Oxford, 1965), Vol. 1; (1966), Vol. 2; and (1972), Vol. 3, for translations of this and other relevant papers.

2. J. W. Gibbs, Elementary Principles in Statistical Mechanics (Yale University Press, London, 1928).

3. J. von Neumann, Mathematical Foundations of Quantum Mechanics (Princeton University Press, Princeton, 1955).

4. H. Grad, Comm. Pure Appl. Math. 14, 323 (1961).

5. K. G. Denbigh and J. S. Denbigh, Entropy in Relation to Incomplete Knowledge (Cambridge University Press, Cambridge, England, 1985).

6. P. C. W. Davies, The Physics of Time Asymmetry (University of California Press, Berkeley, 1977).

7. S. G. Brush, Ref. 1, Vols. 1-3.

8. A. Wehrl, Rev. Mod. Phys. 50, 221 (1978).

9. I. Prigogine, From Being to Becoming (Freeman, San Francisco, 1980).

10. J. Ford, Phys. Today 36(4), 40 (1983), and references therein.

11. Quantum Theory and Measurement, edited by J. A. Wheeler and W. H. Zurek (Princeton University Press, Princeton, 1983); H. Everett, in The Many-Worlds Interpretation of Quantum Mechanics, edited by B. S. DeWitt and N. Graham (Princeton University Press, Princeton, 1973); W. H. Zurek, Information Transfer in Quantum Measurements: Irreversibility and Amplification, in Quantum Optics, Experimental Gravitation and Measurement Theory, Vol. 94 of NATO Advanced Study Institute Series, edited by P. Meystre and M. O. Scully (Plenum, New York, 1983), pp. 87-106.

12. C. E. Shannon and W. Weaver, The Mathematical Theory of Communication (University of Illinois Press, Urbana, 1949).

13. A. I. Khintchin, Information Theory (Dover, New York, 1957); R. W. Hamming, Coding and Information Theory (Prentice-Hall, Englewood Cliffs, NJ, 1986).

14. S. W. Hawking, Phys. Rev. D 13, 191 (1976).

15. J. B. Hartle and S. W. Hawking, Phys. Rev. D 28, 2960 (1983).

16. R. J. Solomonoff, Inf. Control 7, 1 (1964).

17. A. N. Kolmogorov, Inf. Transmission 1, 3 (1965).

18. G. J. Chaitin, J. Assoc. Comput. Mach. 13, 547 (1966).

19. A. N. Kolmogorov, IEEE Trans. Inf. Theory 14, 662 (1968).

20. A. N. Kolmogorov, Usp. Mat. Nauk 23, 201 (1968).

21. G. J. Chaitin, J. Assoc. Comput. Mach. 22, 329 (1975), and references therein.

22. G. J. Chaitin, Sci. Am. 232(5), 47 (1975).

23. G. J. Chaitin, IBM J. Res. Dev. 21, 350 (1977).

24. A. K. Zvonkin and L. A. Levin, Usp. Mat. Nauk 25, 602 (1970).

25. P. Martin-Löf, Inf. Control 9, 602 (1966).

26. G. J. Chaitin, Algorithmic Information Theory (Cambridge University Press, Cambridge, England, 1987), and references therein.

27. C. H. Bennett, Int. J. Theor. Phys. 21, 905 (1982).

28. P. Gács, Dokl. Akad. Nauk SSSR 218, 1265 (1974) [Sov. Phys. Dokl. 15, 1477 (1974)].

29. L. A. Levin, Dokl. Akad. Nauk SSSR 227, 1293 (1976) [Sov. Phys. Dokl. 17, 522 (1976)].

30. L. Szilard, Z. Phys. 53, 840 (1929); English translation in Behav. Sci. 9, 301 (1964), reprinted in Quantum Theory and Measurement, Ref. 11.

31. E. T. Jaynes, Phys. Rev. 106, 620 (1957); 108, 171 (1957).

32. L. Brillouin, Science and Information Theory, 2nd ed. (Academic, London, 1962).

33. D. Hofstadter, Gödel, Escher, Bach (Vintage, New York, 1980).

34. C. H. Bennett, IBM J. Res. Dev. 17, 525 (1973).

35. R. Landauer, IBM J. Res. Dev. 5, 183 (1961).

36. C. H. Bennett, IBM J. Res. Dev. 32, 16 (1988).

37. R. Landauer, Found. Phys. 16, 551 (1986).

38. C. H. Bennett, Sci. Am. 255(11), 108 (1987).

39. W. H. Zurek, in Frontiers of Nonequilibrium Statistical Mechanics, edited by G. T. Moore and M. O. Scully (Plenum, New York, 1986), pp. 151-161.

40. W. H. Zurek, Nature (to be published); and unpublished.

41. W. H. Zurek, Phys. Rev. D 24, 1516 (1981); 26, 1862 (1982); in Ref. 39, pp. 145-149.

42. P. Ehrenfest and T. Ehrenfest, The Conceptual Foundations of the Statistical Approach in Mechanics, English translation (Cornell University Press, Ithaca, 1956).

43. I. Prigogine, C. George, F. Henin, and L. Rosenfeld, Chem. Scr. 4, 5 (1974).

44. Such a "reversal" of the approach was advocated in the context of quantum measurements by J. A. Wheeler; see, e.g., IBM J. Res. Dev. 32, 4 (1988), and references therein. See Ref. 38 for comments on the relevance of computation to Wheeler's ideas.

45. J. M. Hammersley and D. C. Handscomb, Monte Carlo Methods (Wiley, New York, 1964).

46. A. G. Konheim, Cryptography, a Primer (Wiley, New York, 1981).


Links

Full Text

internal file


Other Links