Jan Reimann
13 Apr 2012, 2pm-4pm, Class of 1947 Room
In mathematical logic, one tries to classify objects by their descriptive complexity, for example, by how many quantifier alternations are needed to define a given subset of the natural numbers. On the other hand, concepts like entropy allow for a measure-theoretic classification of complex, i.e., random, behavior.
Both approaches can be combined to define a notion of randomness for individual objects such as infinite binary sequences. I will discuss the resulting interplay between probability and definability. I will argue that the view from logic opens up new and perhaps unexpected perspectives on the concept of randomness, for example, concerning the role of infinity.
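(A rough computational illustration of the idea, not part of the talk: by the Levin–Schnorr theorem, randomness for an individual binary sequence is equivalent to the incompressibility of its finite prefixes. Since Kolmogorov complexity is uncomputable, the sketch below substitutes an off-the-shelf compressor as a crude, heuristic stand-in.)

```python
import random
import zlib

def compressibility(bits: str) -> float:
    """Ratio of compressed length to raw length for a string of '0'/'1'
    characters. A ratio near 1 means the compressor found no structure,
    a crude empirical stand-in for algorithmic randomness; a small ratio
    reveals regularity. (Heuristic only: true Kolmogorov complexity is
    uncomputable.) Assumes len(bits) is a multiple of 8."""
    # Pack the 0/1 characters into bytes so zlib sees the raw bit content,
    # not an 8-fold redundant ASCII encoding of it.
    packed = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return len(zlib.compress(packed, 9)) / len(packed)

# A highly regular sequence compresses drastically...
print(compressibility("01" * 5000))
# ...while (pseudo)random coin flips do not compress at all.
flips = "".join(random.choice("01") for _ in range(10000))
print(compressibility(flips))
```

Of course, any finite prefix can only hint at the behavior of the infinite sequence; making this precise is exactly where the definability machinery of logic enters.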