Ref: Modeling Rent's Rule and Kolmogorov Complexity

Rent's rule describes a power-law scaling relation between the number of external connections to a subsystem and the number of functional subunits within that subsystem; it applies to both engineered and biological systems, with an exponent typically between one half and one. Physical constants also impose limits (Gershenfeld ref., p. 162). Kolmogorov complexity, put simply, holds that a good measure of the complexity of an object is the length of the shortest computer program that constructs the object.
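As a loose numerical illustration (not taken from the cited reference), here is a minimal Python sketch of Rent's rule, T = t * g^p, where T is the number of external connections (terminals) of a block containing g functional subunits; the values t = 4 and p = 0.75 are arbitrary placeholders chosen only to show the sub-linear growth.

```python
def rent_terminals(g, t=4.0, p=0.75):
    """Rent's rule T = t * g**p: expected number of external connections
    for a block of g functional subunits.  t and p are illustrative
    placeholders; the exponent p typically lies between 0.5 and 1."""
    return t * g ** p

# Doubling the block size multiplies the terminal count by 2**p (< 2),
# so connections grow more slowly than the number of subunits.
for g in (64, 128, 256):
    print(g, round(rent_terminals(g), 1))
```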

Kolmogorov complexity is a versatile mathematical tool for studying logical depth as well as the time and space complexity of computations. Kolmogorov complexity, also called algorithmic information or algorithmic entropy, quantifies the absolute information content of an individual object, point-wise or discretely, rather than as the distributed classical average randomness produced by a random source. Traditionally we think that the better a model compresses the data from a phenomenon, the better we can learn, generalize, and predict. Making these ideas rigorous means finding the length of the shortest effective description of the object, that is, its Kolmogorov complexity. Even though the best hypothesis does not necessarily yield the best prediction, compression appears to be one 'best strategy' for prediction. A comprehensive yet introductory treatment at the graduate-student level is the book by Li, M. and P. M. B. Vitanyi (1997), An Introduction to Kolmogorov Complexity and Its Applications, Springer-Verlag, New York. (For those with some background in computer science, the writing is very clear.)
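Kolmogorov complexity itself is uncomputable, but the compression-as-learning idea above can be made concrete with an ordinary compressor, which only yields an upper bound on the shortest description length. Below is a minimal sketch using Python's standard zlib module; the example strings are mine and assumed only for illustration.

```python
import os
import zlib

def description_bound(data: bytes) -> int:
    """Upper bound on the Kolmogorov complexity of `data`: the length of
    its zlib-compressed form.  The true shortest program length is
    uncomputable, so any real compressor can only over-estimate it."""
    return len(zlib.compress(data, 9))

structured = b"ab" * 500           # a tiny program ("repeat 'ab' 500 times") describes this
incompressible = os.urandom(1000)  # random bytes admit no much shorter description, on average

print(description_bound(structured))      # far smaller than 1000
print(description_bound(incompressible))  # close to, or slightly above, 1000
```

The gap between the two bounds mirrors the intuition in the paragraph above: the more regularity a model can exploit to compress the data, the more it has effectively learned about the source.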
