Even on the topics included, a few important papers are missing (I think primarily of his (1987) and (1992)).

But the book does present a valuable opportunity for people like me, who should have read these essays long ago (only one of the papers dates from after 1996), to reflect on the aspects of Skyrms' work included here.

What are the coherence principles that must be satisfied for degrees of belief to be probabilities, and how do these principles generalize to probability change?

What is the relation between coherent degrees of belief, beliefs about chances, and inductive inference?

Consider the set of all trial probability distributions that would encode the prior data.

According to the maximum entropy principle, the proper choice among them is the distribution with maximal information entropy.
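As an illustrative sketch of this selection rule (my example, not one from the essays): when the prior data impose no constraint beyond normalization, maximizing entropy over the trial distributions recovers the uniform distribution.

```python
import numpy as np
from scipy.optimize import minimize

# Among all trial distributions over 4 outcomes consistent only with
# normalization (no further prior data), the maximum entropy principle
# selects the uniform distribution.

def neg_entropy(p):
    p = np.clip(p, 1e-12, None)  # avoid log(0) at the boundary
    return np.sum(p * np.log(p))  # minimize -H(p)

constraints = [{"type": "eq", "fun": lambda p: p.sum() - 1.0}]
bounds = [(0.0, 1.0)] * 4
p0 = np.array([0.7, 0.1, 0.1, 0.1])  # an arbitrary trial distribution

res = minimize(neg_entropy, p0, bounds=bounds, constraints=constraints)
print(res.x)  # close to [0.25, 0.25, 0.25, 0.25]
```

With additional constraints the optimum shifts away from uniformity, which is the situation discussed below.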

He argued that the entropy of statistical mechanics and the information entropy of information theory are essentially the same concept.

Consequently, statistical mechanics should be seen simply as a particular application of a general tool of logical inference and information theory.

The six papers in the second section all address issues related to coherence of degrees of belief.

Brian Skyrms is Distinguished Professor of Logic and Philosophy of Science and Economics at the University of California, Irvine.


In most practical cases, the stated prior data or testable information is given by a set of conserved quantities (average values of some moment functions), associated with the probability distribution in question.

This is the way the maximum entropy principle is most often used in statistical thermodynamics.
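The moment-constrained case can be sketched with Jaynes' well-known dice example (my illustration, assuming faces 1..6 and a stipulated mean of 4.5): the maximum entropy distribution subject to a mean constraint takes the Gibbs/exponential form p_i ∝ exp(λ·i), so it suffices to solve a one-dimensional equation for the multiplier λ.

```python
import numpy as np
from scipy.optimize import brentq

x = np.arange(1, 7)       # faces of the die
target_mean = 4.5         # the stated prior data: a conserved average

def mean_gap(lam):
    # Mean of the exponential-family distribution p_i ∝ exp(lam * i),
    # minus the target; the max-entropy solution is the root in lam.
    w = np.exp(lam * x)
    p = w / w.sum()
    return p @ x - target_mean

lam = brentq(mean_gap, -5.0, 5.0)  # bracket chosen wide enough
w = np.exp(lam * x)
p = w / w.sum()
print(p, p @ x)  # probabilities tilt toward high faces; mean is 4.5
```

The exponential form of the solution is exactly the Boltzmann-Gibbs distribution of statistical thermodynamics, which is why the principle is most often used there.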