Razziel: When many people think of statistics, they think of estimators and chi-square tests and so on. Bayesian probability (as advocated by Jaynes and many, many others) is a very different foundation for probability theory. In that viewpoint there are no estimators or anything like that; instead you start from the problem of inferring things you can't be 100% certain of, so you want to quantify the uncertainty as a number (a probability) and then see how these numbers interact. The only thing you really know about them is that in the limit where you are 100% certain of something, the rules should reduce to the rules of ordinary logic. If you take that as input and do a lot of mathematics, you arrive at probabilities that behave exactly as Bayes' law says. The difference is that in this viewpoint it's all subjective - no 'estimators' that give 'universally true' estimates and probabilities - but as Jaynes shows through a large number of examples, this is really what you want.
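To make the "update a degree of belief with Bayes' law" idea concrete, here is a minimal sketch in Python. The numbers (a 1% prior, 95% and 10% likelihoods) are made up purely for illustration and are not from the thread:

```python
# Bayesian updating: a probability is a degree of belief revised by evidence
# via Bayes' law, P(H|D) = P(D|H) P(H) / P(D).
# All numbers below are illustrative assumptions, not data.

prior_h = 0.01              # P(H): prior belief that the hypothesis is true
p_data_given_h = 0.95       # P(D|H): chance of seeing the data if H is true
p_data_given_not_h = 0.10   # P(D|~H): chance of seeing the data if H is false

# Total probability of the data, P(D), by the sum rule
p_data = p_data_given_h * prior_h + p_data_given_not_h * (1 - prior_h)

# Posterior belief after seeing the data
posterior_h = p_data_given_h * prior_h / p_data
print(f"P(H|D) = {posterior_h:.3f}")  # ~0.088: still unlikely, but ~9x the prior
```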
The second part of Jaynes' project was to re-derive statistical mechanics so that it rests on information theory, specifically on the concept of entropy (also a subjective quantity). He does that and re-derives the laws of thermodynamics, etc. He also introduces the principle of maximum entropy (MaxEnt). It's really exciting stuff and something that is still debated; for example, the second law of thermodynamics can be seen as a problem of conservation of information (or rather, the problem that a system cannot hold enough information to describe itself), rather than the usual interpretation. I would say Jaynes' book is a must for anyone who works with inference or statistical mechanics, and it's really fun - he has several appendices that are there only to rip on mathematicians.
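The MaxEnt idea is easiest to see in a concrete case, so here is a small sketch along the lines of Jaynes' dice problem: among all distributions over the faces 1..6 with a prescribed mean, pick the one of largest entropy. The target mean of 4.5 is an arbitrary illustrative value, not something from the thread:

```python
# MaxEnt with a mean constraint: Lagrange multipliers give p_i proportional to
# exp(lam * i); we solve for the multiplier lam that matches the target mean.
import math

faces = [1, 2, 3, 4, 5, 6]
target_mean = 4.5  # illustrative constraint

def mean_for(lam):
    # Mean implied by the exponentially tilted distribution p_i ~ exp(lam * i)
    weights = [math.exp(lam * i) for i in faces]
    z = sum(weights)
    return sum(i * w for i, w in zip(faces, weights)) / z

# mean_for is increasing in lam, so simple bisection finds the multiplier
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid

lam = (lo + hi) / 2
weights = [math.exp(lam * i) for i in faces]
z = sum(weights)
probs = [w / z for w in weights]
print([round(p, 3) for p in probs])  # tilted toward the high faces, as MaxEnt demands
```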
****
This was a good thread - too bad I never got to hear about this exciting law DD threw around earlier ;-).