
Translation: Michel Serres and the Mathematization of Empiricism


The following is a translation of Michel Serres’s essay “Mathematization of Empiricism,” from Hermes II: L’Interférence (Paris: Editions de Minuit, 1972), pp. 195-200. Original translation by Taylor Adkins, 11/03/07.

The law known as the Fechner-Weber law can be written S = K log I, and reads: sensation grows as the logarithm of the stimulus[1]. The definition of information, in its contemporary sense, can be written I = K log P, and reads: information grows as the logarithm of the number of equally probable states[2]. Can the analogy of formulation lead one to think an analogy between the alleged phenomena?
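
To make the formal parallel concrete, here is a minimal sketch (my own, not Serres’s; the constant K and the sample values are arbitrary, and natural logarithms are used since neither law fixes a base): the two formulas are one and the same function applied to different arguments, the stimulus intensity in one case and the number of equally probable states in the other.

```python
import math

def sensation(stimulus, k=1.0):
    """Fechner-Weber law, S = K log I, with the stimulus intensity I measured
    in units of the threshold (so that S = 0 at threshold)."""
    return k * math.log(stimulus)

def information(num_states, k=1.0):
    """Information of a choice among P equally probable states, I = K log P."""
    return k * math.log(num_states)

# The two laws are one and the same function applied to different arguments:
print(sensation(1000.0))   # sensation for a stimulus 1000 times the threshold
print(information(1000))   # information of one state picked among 1000
```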

1. The notion of information is used in physics and in communication theory in a way that is independent of the sense of the message that carries it. A succession of letters forming a word devoid of sense for anyone nevertheless contains an easily calculable quantity of information. This magnitude has no relation to knowledge in its traditional meaning. It is independent of the sense of the message; it is therefore also independent of the observer.

The notion of sensation is used in psychology in a way that is independent of the value and the sense of the very scale on which it is placed, and even of the stimulus that produces it. A succession of stimuli forming a sequence devoid of sense for anyone gives rise to sensations[3] about whose significance psychophysiology avoids asking. And here again, the calculation is independent of the gnoseological index of the stimulus and of the sensation; it is therefore also independent of any perceiving subject.

In both cases, mathematization is possible on the sole condition of bracketing the question of sense and of the ego. The philosophical tradition would readily hold the contrary view: that one could formalize only the absolute transparency of sense, and that nonsense, on the contrary, resists abstraction. Yet it is precisely because information is not knowledge, is not knowledge-of-something, and precisely because sensation is brute (at least at the zero degree of its development), that it is easy to formalize the conditions of its emergence (at least provisionally).

Hence the parallel: just as the mathematical approach bears on chance and probability (the nonsense of nature, that is, the complement of the whole of our knowledge, taken globally[4]), so too it bears on the stimulus and the sensation, insofar as they are nonsense, at the level of the animal receiver.

As the automatic integrator of nonsense, the sensory organ (the body) is the receptor of a calculable quantity of information. Information and sensation are absolute (independent) magnitudes, on this side of sense[5]: they are formalizable and abstract.

2. The comparison of the two laws eventually imposes the idea that the growth of information is the general (structural?) expression of which Fechner’s formula is only a singular model, the translation of a structure into a singular region. It may be, more generally still, that what is at stake is a law of diminishing returns, of which marginalist economics gives models in other regions. Conversely, the singular case of sensation involves the manipulation of numbers high enough (in “natural life,” and not only in the laboratory) that the integrated information could carry weight on an entropic scale[6].
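
A small numerical illustration of the diminishing-returns reading (my own, with K = 1 chosen arbitrarily): under S = K log I, equal additive increments of the stimulus add less and less to the sensation, while each doubling of the stimulus adds the same fixed amount, K log 2.

```python
import math

K = 1.0  # arbitrary constant of the law S = K log I

def sensation(intensity):
    return K * math.log(intensity)

# Equal additive increments of the stimulus yield ever-smaller gains in sensation...
for i in (20, 30, 40, 50, 60):
    gain = sensation(i) - sensation(i - 10)
    print(f"I = {i}: gain from the last 10 units of stimulus = {gain:.3f}")

# ...while each doubling of the stimulus yields the same constant gain, K * log 2.
print(sensation(80) - sensation(40), K * math.log(2))
```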

3. The introduction of the notion of probability into psychophysiology is not new. As early as 1932, Houston arrived, in his research on the law connecting sensation to stimulus, at a formula corresponding to Gauss’s law of errors, from which he derives the traditional probability integral[7]. Working on the phenomena of brightness, Houston does not even hesitate to affirm that sensation is strictly proportional to the number of nerve fibers involved[8]. In the same way, Piéron compares the physiological mechanism set off by the stimulus to the recruitment (in the aleatory sense) of aesthesions[9]. Lastly, the intensity of the stimulus would correspond, as it grows, to an increase in the number of cortical elements sending their orders[10].
Everything would then happen as if the rise in the intensity of the stimulus and the increase in sensation brought into play a greater number of phenomena or states; to connect them, it suffices to suppose, in addition, that the distribution of the stimulating phenomena, or of the information flow in the neuronal circuits, obeys the simplest law of random states.
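
The recruitment reading lends itself to a rough simulation (my own sketch, not drawn from Serres, Houston, or Piéron; all numbers are invented): give each nerve fiber a threshold distributed according to Gauss’s law of errors on a log-intensity scale, count how many fibers a given intensity recruits, and compare the count with the “traditional probability integral,” i.e. the Gaussian cumulative distribution.

```python
import math
import random

random.seed(0)

N_FIBERS = 10_000
MU, SIGMA = 2.0, 1.0   # thresholds normally distributed on a log-intensity scale

# Each fiber fires once log(intensity) exceeds its own randomly drawn threshold.
thresholds = [random.gauss(MU, SIGMA) for _ in range(N_FIBERS)]

def recruited(intensity):
    """Number of fibers whose threshold lies below log(intensity)."""
    x = math.log(intensity)
    return sum(1 for t in thresholds if t <= x)

def gaussian_prediction(intensity):
    """Expected count given by the Gaussian cumulative distribution
    (the 'traditional probability integral')."""
    z = (math.log(intensity) - MU) / (SIGMA * math.sqrt(2))
    return N_FIBERS * 0.5 * (1 + math.erf(z))

for intensity in (5, 10, 20, 40, 80, 160):
    print(f"I = {intensity:3d}  recruited = {recruited(intensity):5d}  "
          f"predicted = {gaussian_prediction(intensity):7.1f}")
```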

The sensory receiver would then indeed function, as a whole, like an automatic calculator of the quantity of information. Each sensory apparatus could be defined by this quantity.

4. Information theory permits a new approach to the problem of error and approximation in physical experimentation[11]. For example, Brillouin discusses, in chapter XV of his work, the traditional problem of the error in measuring a length divided into intervals. It would then be interesting to translate Fechner’s law[12] on perceived intervals, first in terms of error analysis, and then in terms of this same calculation interpreted informationally: statistical errors in the measurement, through the sensory organ, of a measurable scale of intensities. That would make it possible to calculate the precision[13] of such an organ, and to account for the perceived interval itself.
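
A minimal sketch of the kind of calculation gestured at here (my own illustration, following the general scheme of Brillouin’s account rather than his chapter XV verbatim; lengths and errors are arbitrary): measuring a length L to within an interval of width Δx selects one interval among L/Δx equally probable ones, so the information gained, and with it the minimal negentropy cost, grows as the logarithm of the relative precision L/Δx.

```python
import math

K_BOLTZMANN = 1.380649e-23  # J/K

def information_bits(total_length, error):
    """Bits gained by locating a point to within an interval of width `error`
    on a length `total_length`: one interval chosen among total_length / error
    equally probable ones."""
    return math.log2(total_length / error)

def negentropy_cost(total_length, error):
    """Minimal thermodynamic cost of that gain, k * ln(number of intervals), in J/K."""
    return K_BOLTZMANN * math.log(total_length / error)

L = 1.0  # total length being measured, in metres (arbitrary)
for dx in (1e-1, 1e-2, 1e-3, 1e-6):
    print(f"error {dx:.0e} m: {information_bits(L, dx):6.2f} bits, "
          f"negentropy >= {negentropy_cost(L, dx):.2e} J/K")
```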

One would no doubt find, renewed or deepened, the discussions of Favard[14] on sensible space; it would then be necessary to compare the imperceptible limits of the psychophysiologists, Favard’s sensory indetermination, and Brillouin’s indetermination of measurement. The differential threshold would be bordered on the left and on the right by two intervals interpretable as errors, and formalizable as such. One could calculate the negentropy cost of discernibility, and how it grows if the receiving body is asked to refine its intervals[15].

It is known that information theory can calculate the precision that an applied science cannot exceed under given conditions: beyond it, the negentropy cost of an observation grows without limit. The analogy proposed here would make it possible to obtain a (calculable) reason for the threshold of appearance of a new sensation: the negentropy cost that the organism would have to spend in order to lower this threshold, that is, to refine the interval, would be incompatible with the functioning of the sensory apparatus, or even with the overall balance of the organism. Even at the stage of a mere project, this calculation appears intuitive and seems to correspond to commonplace experience. One would then obtain a definition of each apparatus (in the sense in which one defines a radar) according to the negentropy it can spend for its optimal discrimination: this definition would give the reason for the perceptible intervals.
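
A toy calculation in the spirit of this passage (my own sketch; the scale, the “budget,” and the use of Boltzmann’s constant as the unit are all illustrative): if discriminating intervals of width Δ on a scale of extent L costs at least k log(L/Δ) in negentropy, then a fixed budget fixes the finest affordable interval, and the cost of refining the interval any further grows without bound.

```python
import math

K = 1.380649e-23  # Boltzmann constant, J/K

def cost(scale, delta):
    """Minimal negentropy (J/K) needed to discriminate intervals of width delta
    on a stimulus scale of extent `scale`."""
    return K * math.log(scale / delta)

def finest_interval(scale, budget):
    """Finest interval width affordable with a given negentropy budget (J/K)."""
    return scale * math.exp(-budget / K)

SCALE = 100.0        # extent of the stimulus scale, arbitrary units
BUDGET = 30 * K      # hypothetical negentropy budget of the apparatus

print("finest affordable interval:", finest_interval(SCALE, BUDGET))
for delta in (1.0, 0.1, 0.01, 1e-6):
    print(f"delta = {delta:.0e} -> cost = {cost(SCALE, delta) / K:6.2f} k")
```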

Hence two definitions of each apparatus: by the quantity of information it is able to integrate (positive definition), and by the negentropy it is able to spend to reach a given limit of discernibility, that is, to make its choices within the uncertainty of stimulative dispersion (definition by precision). By uniting, in a suitable formula, the definition by the result and the definition by the conditional capacity, one would obtain a good calculation of the yield of a given sensory apparatus. Here again, each apparatus would be defined by this yield.
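
One way (my own, purely illustrative, and not a formula from Serres’s text) to make that “suitable formula” concrete is to express both definitions in comparable units, the bits of information actually integrated against the bits’ worth of negentropy spent, counting k log 2 of negentropy per bit, and to take their ratio as the yield of the apparatus.

```python
import math

K = 1.380649e-23        # Boltzmann constant, J/K
BIT = K * math.log(2)   # negentropy equivalent of one bit, J/K

def apparatus_yield(bits_integrated, negentropy_spent):
    """Ratio of the information actually integrated (in bits) to the maximum
    number of bits the spent negentropy could in principle have bought."""
    return bits_integrated / (negentropy_spent / BIT)

# Hypothetical apparatus: integrates 12 bits per observation while spending
# the negentropy equivalent of 40 bits.
print(f"yield = {apparatus_yield(12, 40 * BIT):.2f}")   # -> 0.30
```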

One would thereby obtain a new philosophical approach to the old problem of the error of the senses, which would then abandon its Eastern context of illusion (already absorbed by optics, acoustics, etc.) for its true context, namely the random one. Conversely, one could mathematize the degree of truth (in quantity and in precision) offered by an empirical theory, even at the level of its abstract beginnings. If empiricism is probabilist, then it is quantifiable: and that would hold, at one and the same time, for sensible experience and for scientific experiment.

5. There is thus a positive solution to the problem of the continuity between sensory experience and scientific experiment, a problem which Bachelard, among others, had resolved in the negative. It seems that the assertion of discontinuity rests on a superficial analysis of the operating conditions of the sensory apparatuses: the analysis of scientific observation was scientistic, while the analysis of sensory observation was merely common sense; consequently, the discontinuity already lay in the conditions of the epistemological explanation. If, on the contrary, the latter is rebalanced by an equally scientific analysis of the two types of observation, continuity is easily found: the sensory apparatus functions in a way similar to a reading device. In the same way, one used to compare the brute results of direct sensible knowledge with the whole strategic pageantry of knowledge obtained by methodically regulated experimentation: the discontinuity again lay in the conditions of the epistemological explanation. If it is rebalanced, by comparing results with results and operations with operations, one finds only continuity and gradation, not rupture.

6. The flow of information in the nervous circuits would then be as calculable as the circulation of messages in any communication network whatsoever. Sensation would then yield results from which one could infer, by way of models, the very arrangement of the neuronal systems. By this means, one could introduce topology and graph theory into the psychophysiology of the nervous system, now understood simply as a network for the circulation of messages carrying information.

Note: If the magnitude of a star is m, I its luminous intensity, and I0 the intensity of a reference source, one obtains the formula I/I0 = 10^(-0.4(m - m0)), i.e. 0.4(m - m0) = log(I0/I). In other words, m = K log I, up to an additive constant. It is another example of the same formulation. It was inevitable that information theory should become, in the long run, the general and exact epistemology of what was formerly, and vaguely, called observation.
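
As a quick numerical check of the formula as reconstructed above (my addition; the factor 2.5 is the standard astronomical convention, not something given in Serres’s note): a source one hundred times more intense than the reference differs from it by exactly five magnitudes.

```python
import math

def magnitude_difference(intensity, intensity_ref):
    """Pogson's relation, equivalent to I/I0 = 10**(-0.4*(m - m0)):
    returns m - m0 = -2.5 * log10(I / I0)."""
    return -2.5 * math.log10(intensity / intensity_ref)

print(magnitude_difference(100.0, 1.0))  # -5.0: the brighter source has the lower magnitude
```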


[1] H. Piéron, La Sensation, guide de vie, Gallimard, 1945, p. 331.

[2] L. Brillouin, La Science et la théorie de l’information, ch. 1.

[3] Or rather to units of sensation; cf. the notions of the neural quantum (Stevens, Morgan and Volkmann, 1941) or of the neuroquantum (Piéron, 1932). Another remark: the property of additivity is brought out in both cases (Brillouin, p. 1; Piéron, p. 375).

[4] It is worth noting that methodical strategy is all the more global the more it bears on non-knowledge, and that it is situated in the measure of the growth of our information. The knowledge of non-knowledge is global (from philosophy to the calculus of probabilities); the only sense that nonsense can present is molar, unifying, totalizing. The knowledge of knowledge is regional, molecular, local: strategic or tactical methods.

[5] Would the body be initially the global integrator of a global non-knowledge?

[6] Brillouin’s remark, p. 280. Also p. 250: the law of degradation compatible with Carnot’s generalized principle may have a sense at the level of sensory information: perhaps the process of habituation.

[7] Piéron, p. 335.

[8] Ibid., 371, 376.

[9] From aesthesis, meaning the perception of the external world by the senses [Translator’s note].

[10] Ibid., 370, 372.

[11] Brillouin, p. 59.

[12] Our first translation was of an aleatory or probabilistic type. One may think that discussions of the metaphysical characteristics of this law have been dazzled by the traditional infinitesimal formulation given to it.

[13] The term precision is given the operational definition of the inverse of the compared error.

[14] Favard, Espace et dimension, Albin-Michel, 1950; the physical continuum, p. 17 (impossibility of recovering the principle of identity).

[15] One cannot fail to be struck by the analogy of formalization. ΔS (sensation) is proportional to ΔI/I, the relative increase of the intensity; on the other hand, if the error is Δx, the relative error is Δx/x, and the compared error Δx/L, L being the overall length of the segment being measured. Fechner’s law is readily translatable into error analysis.
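
For the record, here is the step the note leaves implicit (my addition, standard textbook material rather than anything in Serres’s text): the difference form passes to a differential form and integrates into the logarithmic law quoted at the opening,

\[
\Delta S = K\,\frac{\Delta I}{I}
\;\;\longrightarrow\;\;
dS = K\,\frac{dI}{I}
\;\;\Longrightarrow\;\;
S = K \log \frac{I}{I_0},
\]

where I0 is the threshold intensity at which S = 0.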

 
