Tuesday, September 18, 2012

Probabilities as Analogies

All that follows is just crude jotting, quite incomplete, and barely a start at anything.

C. S. Peirce remarks somewhere that every probability is in fact the ratio of a species to its genus. Since we usually think of probabilities as numbers, we have to say that we have a probability when the relation of a species to its genus corresponds in some way to the ratio of one number to another. Or, to put it in other words, probabilities are analogies, in the sense of 'A is to B as C is to D'.

If you think about it, mathematical analogies of this sort are not uncommon. To say that there are twelve inches to a foot, for instance, is to say:

inch : foot :: 12 : 1

Probabilities are of exactly this sort, except that to be probabilities they have to be analogies of a certain kind. To say that the probability of rolling a 1 on a six-sided die is 1/6 is to say something like:

rolling a one : dice rolls :: 1 : 6

If we take 's' to be whatever number we associate with species S and 'g' to be whatever number we associate with genus G, then, all probabilities have the following analogical form:

S : G :: s : g

We take the relation of a species of event to its genus and say that this relation corresponds to the relation between one number and another. Note that I say 'all probabilities have this analogical form', not 'all things of this analogical form are probabilities'; the analogy is constrained by the axioms of probability. My point is just that every probability can be put in the form of an analogy.
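Just as a toy sanity check (my own sketch, not part of the argument), the die example can be written out with Python's fractions module, reading the probability as the number-side of the species-to-genus analogy; the variable names are my own illustration:

```python
from fractions import Fraction

# Species: outcomes counting as "rolling a one"; genus: all die rolls.
species_count = 1   # ways of rolling a one
genus_count = 6     # faces of the die

# "rolling a one : dice rolls :: 1 : 6" -- the probability is the
# numerical ratio corresponding to the species-genus relation.
p_one = Fraction(species_count, genus_count)
print(p_one)  # 1/6
```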

If A and B are species, and MRG is the 'maximal relevant genus' including them both (in what follows I am treating each line as simply separate from the others):

P(A) = [A : MRG :: a : mrg]
P(B) = [B : MRG :: b : mrg]
P(A&B) = [AB : MRG :: a&b : mrg]
P(A|B) = [(AB : MRG :: a&b : mrg) ::: (B : MRG :: b : mrg)]

(Having both the expanding colons ::: and the parentheses is redundant; they're just ways to group which analogies are being analogized to which.)
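The 'analogy between analogies' reading of P(A|B) can also be checked with a small sketch; the particular genus and species here are a toy example of my own choosing, not anything from the argument above:

```python
from fractions import Fraction

# A toy maximal relevant genus: the six outcomes of one die roll.
MRG = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}        # species "even roll"
B = {1, 2, 3, 4}     # species "roll of four or less"

mrg = len(MRG)
a, b = len(A), len(B)
ab = len(A & B)      # the species AB: outcomes falling under both A and B

P_A = Fraction(a, mrg)     # A : MRG :: a : mrg
P_B = Fraction(b, mrg)     # B : MRG :: b : mrg
P_AB = Fraction(ab, mrg)   # AB : MRG :: a&b : mrg

# P(A|B) as a relation between analogies: the AB-to-MRG ratio
# taken relative to the B-to-MRG ratio.
P_A_given_B = P_AB / P_B
print(P_A_given_B)  # 1/2
```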

Bayes Theorem is then the following analogy (assuming I haven't made any errors, which I may have):

(((AB : MRG :: a&b : mrg) ::: (B : MRG :: b : mrg)) :::: ((AB : MRG :: a&b : mrg) ::: (A : MRG :: a : mrg))) ::::: ((A : MRG :: a : mrg) :::: (B : MRG :: b : mrg))

Or, as an alternative (again assuming that I haven't made any stupid mistakes, for which I will not vouch):

(((AB : MRG :: B : MRG) ::: (AB : MRG :: A : MRG)) :::: (A : MRG :: B : MRG)) ::::: (((a&b : mrg :: b : mrg) ::: (a&b : mrg :: a : mrg)) :::: (a : mrg :: b : mrg))

By taking the MRG as always given, with mrg thus = 1, this particular analogy could of course be simplified to something like

((AB : B :: AB : A) ::: (A : B)) :::: ((a&b : b :: a&b : a) ::: (a : b))
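If I have transcribed the analogy correctly, this simplified form amounts to the claim that P(A|B) stands to P(B|A) as P(A) stands to P(B), which is one standard way of stating Bayes's Theorem. A quick numerical check, with a toy genus of my own choosing:

```python
from fractions import Fraction

# A toy maximal relevant genus (my own example): one die roll.
MRG = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}        # species "even roll"
B = {1, 2, 3, 4}     # species "roll of four or less"
mrg, a, b, ab = len(MRG), len(A), len(B), len(A & B)

P_A, P_B, P_AB = Fraction(a, mrg), Fraction(b, mrg), Fraction(ab, mrg)
P_A_given_B = P_AB / P_B   # AB : B on the species side
P_B_given_A = P_AB / P_A   # AB : A on the species side

# Bayes's Theorem as an analogy: P(A|B) : P(B|A) :: P(A) : P(B),
# i.e. the two ratios are one and the same number.
assert P_A_given_B / P_B_given_A == P_A / P_B
```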

None of this is particularly interesting in itself. But if probabilities are analogies, then probabilistic reasoning can be treated as a form of analogical reasoning, which would have implications for how we understand analogical reasoning.