All posts filed under: entropy

On Learning

abstract machine / code / Cognition / diagram / difference / energy / entropy / identity / knowledge / learning / memory / problem / structure / Thought / unconscious / wittgenstein

One way of approaching the difference between knowledge and learning (so profound in our opinion that, despite their entanglement, there can be postulated neither a material nor conceptual ground which could ever serve to unify them) is by considering that even while wholly disparate, they are not in the least opposed for that reason. To learn and to know are two divergent operations, contrapositive dynamisms, which are nevertheless always both active simultaneously, as the “cutting […]

Translation: Michel Serres and the Mathematization of Empiricism

empiricism / entropy / error / French Translation / Hermes / information / information theory / mathematics / negentropy / philosophy of science / sensation / Serres / Untranslated Theory

The following is a translation of Michel Serres’s essay “Mathematization of Empiricism,” from Hermes II: L’Interférence (Paris: Editions de Minuit, 1972), 195-200. Original translation by Taylor Adkins, 11/03/07. The law known as the Fechner-Weber law can be written S = K log I, and read: sensation grows like the logarithm of the stimulus[1]. The definition of information, in the contemporary sense, can be written I = K log P, and read: information grows like the logarithm […]
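
As a minimal sketch of the formal parallel the excerpt sets up (both laws share the same logarithmic form), the following Python fragment computes the two quantities side by side. The constant K and the sample inputs are arbitrary values assumed purely for illustration; they are not taken from Serres’s text.

```python
import math

# K is a placeholder constant, assumed only for illustration.
K = 1.0

def fechner_weber_sensation(stimulus: float, k: float = K) -> float:
    """Fechner-Weber law: S = K log I (sensation vs. stimulus intensity)."""
    return k * math.log(stimulus)

def information(possibilities: float, k: float = K) -> float:
    """Information in the contemporary sense: I = K log P."""
    return k * math.log(possibilities)

# Doubling the stimulus, or the number of possibilities, adds the same
# constant increment (K log 2) in both cases -- the shared logarithmic
# structure that the essay's comparison turns on.
for x in (10, 20, 40, 80):
    print(x, round(fechner_weber_sensation(x), 3), round(information(x), 3))
```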