27 June

|08:30 - 09:00||Registration|
|09:00 - 09:15||Welcome Address|
|09:15 - 10:30||Teddy Seidenfeld (Carnegie Mellon University): Eliciting Imprecise Probabilities|
|10:30 - 11:00||Coffee Break|
|11:00 - 11:45||Carl Wagner (Tennessee): Strassen Capacities as Constraints on Probability Revision|
|11:45 - 12:30||Frank Coolen (Durham): Nonparametric Predictive Inference with Imperfect Data|
|12:30 - 13:45||Lunch|
|13:45 - 14:30||Catrin Campbell-Moore (MCMP): Imprecise Probabilities and Supervaluational Logic|
|14:30 - 15:15||Brian Hill (GREGHEC, HEC Paris and CNRS): Dynamic Choice: A Problem for Imprecise Probabilities or Imprecise Probabilists?|
|15:15 - 15:45||Coffee Break|
|15:45 - 16:30||Arthur Paul Pedersen and Gregory Wheeler (Max Planck Institute for Human Development and MCMP): Demystifying Dilation|
|16:30 - 17:15||Frederik Herzberg (Bielefeld University): Aggregating Infinitely Many Probability Measures|
|17:15 - 18:00||Arthur Van Camp, Gert de Cooman and Erik Quaeghebeur (Ghent University, SYSTeMS Research Group, Centrum Wiskunde and Informatica): Connecting Choice Functions and Sets of Desirable Gambles|
There will be a joint workshop dinner, starting 7pm at "Georgenhof" (Friedrichstr. 1, 80801 München, within walking distance from the workshop venue). More info online: www.georgenhof-muenchen.de.
28 June

|09:00 - 10:15||Fabio Cozman (University of São Paulo): Graph-theoretical Models for Imprecise Probabilities: Independence Assumptions, Logical Constructs|
|10:15 - 10:45||Coffee Break|
|10:45 - 11:30||Yann Benetreau-Dupin (University of Western Ontario): Blurring Out Cosmic Puzzles|
|11:30 - 12:15||Hannes Leitgeb and Jan-Willem Romeijn (MCMP and RUG): Statistics and Full Belief|
|12:15 - 13:30||Lunch|
|13:30 - 14:15||Anthony Peressini (Marquette University): Imprecise Probability and the Temporal Evolution of Chance|
|14:15 - 15:00||Marco Cattaneo (LMU): Unreliable Probabilities and Statistical Learning|
|15:00 - 15:30||Coffee Break|
|15:30 - 16:15||Seamus Bradley (MCMP): Learning from Dilation and Belief Inertia|
|16:15 - 17:00||Namjoong Kim (Research Institute of Philosophy and Liberal Arts Education): A Dilemma for the Imprecise Bayesian|
|17:00 - 17:15||Short Break|
|17:15 - 18:30||James M. Joyce (Michigan): Imprecise Priors as Expressions of Epistemic Value|
Yann Benetreau-Dupin (University of Western Ontario), 28 June, 10:45 - 11:30.
The Doomsday argument and anthropic arguments are illustrations of a paradox. In both cases, a lack of knowledge apparently yields surprising conclusions. Since these arguments are formulated within a Bayesian framework, the paradox constitutes a challenge to Bayesianism. Several attempts, some successful, have been made to avoid these conclusions, but some versions of the paradox cannot be dissolved within the framework of orthodox Bayesianism. I show that adopting an imprecise framework of probabilistic reasoning allows for a more adequate representation of ignorance in Bayesian reasoning, and explains away these puzzles.
Seamus Bradley (MCMP), 28 June, 15:30 - 16:15.
Imprecise probabilities (IP) is an attractive model of belief that avoids some of the problems with the spurious accuracy of the orthodox Bayesian treatment of belief. The main idea is to represent an agent's credal state with a set of probability functions (a credal set) rather than a single such function. There are two problems with IP related to how it models change in belief: dilation (where belief seems to change too much in response to apparently irrelevant evidence) and belief inertia (where belief doesn't seem to change despite apparently relevant evidence). In this talk, I outline these two problems and argue that neither is a damning criticism of IP once one has an appropriately nuanced understanding of what a credal set represents.
Catrin Campbell-Moore (MCMP), 27 June, 13:45 - 14:30.
We note a connection between supervaluational logic and imprecise probabilities. First, we argue that the appropriate notion of probability against a background of supervaluational logic is imprecise probability. Second, we argue that the best logic of imprecise probabilities is supervaluational logic. This connection is exploited by developing a possible-worlds-style semantics. We further note that the combination of imprecise probabilities with supervaluational logic can avoid liar-like paradoxes which arise in the combination of precise probabilities with classical logic. This construction should be interesting both to proponents of supervaluational logic and to proponents of imprecise probabilities.
Marco Cattaneo (LMU), 28 June, 14:15 - 15:00.
The importance of a (second-order) measure of reliability for (first-order) probability values -- discussed for example by Gärdenfors and Sahlin (1982, Synthese) -- is confirmed by statistical considerations. Statistics also offers a precise formulation of (relative) reliability in terms of the likelihood function. The resulting (two-level) hierarchical model will be briefly compared with the usual (one-level) imprecise probability model.
Frank Coolen (Durham), 27 June, 11:45 - 12:30.
We show how nonparametric predictive inference (NPI) can deal with imperfect data, with the main focus on Bernoulli data. Here, the data considered will consist of given numbers of successes, failures, and missing observations. For the latter, no additional assumptions are included. We will explain why peculiar aspects of some different statistical frameworks, e.g. the possibility of dilation, do not appear in NPI. The purely frequentist nature of NPI is likely to lead to a range of discussion topics, particularly when compared to generalized Bayesian methods which, thus far, appear to dominate statistical methods based on imprecise probabilities.
Fabio Cozman (University of São Paulo), 28 June, 09:00 - 10:15.
Research in artificial intelligence systems has often employed graphs to encode multivariate probability distributions. Such graph-theoretical formalisms heavily employ independence assumptions so as to simplify model construction and manipulation. Another line of research has focused on the combination of logical and probabilistic formalisms for knowledge representation, often without any explicit discussion of independence assumptions. In this talk we examine (1) graph-theoretical models, called credal networks, that represent sets of probability distributions and various independence assumptions; and (2) languages that combine logical constructs with graph-theoretical models, so as to provide tractability and flexibility. The challenges in combining these various formalisms are discussed, together with insights on how to make them work together.
Frederik Herzberg (Bielefeld University), 27 June, 16:30 - 17:15.
The problem of rationally aggregating probability measures in the context of decision making is studied, motivated by current interest in formal and social epistemology as well as psychology. Negative results from preference and judgment aggregation theory which show that the aggregate of several probability measures should not be conceived as the probability measure induced by the aggregate of the corresponding expected utility preferences are recalled. A generalisation of McConway's (Journal of the American Statistical Association, vol. 76, no. 374, pp. 410-414, 1981) theory of probabilistic opinion pooling which covers the case of the aggregation of infinite profiles of finitely-additive probability measures is proposed. The existence of non-trivial systematic aggregation functionals is established for electorates of arbitrary cardinality. Our aggregation functionals for the case of infinite electorates are neither oligarchic nor integral-based and satisfy (at least) a weak anonymity condition. The delicate set-theoretic status of integral-based aggregation functionals for infinite electorates is briefly discussed, too.
Brian Hill (GREGHEC, HEC Paris and CNRS), 27 June, 14:30 - 15:15.
One common dynamic-choice-based argument against decision rules diverging from expected utility (and hence against many of those employing imprecise probabilities) purports to show that such rules are incompatible with the conjunction of two prima facie plausible principles: dynamic consistency and consequentialism. Dynamic consistency demands that a decision maker's preferences over contingent plans agree with his preferences in the planned-for contingency. However, what counts are the contingencies the decision maker envisages and plans for, rather than contingencies selected by a theorist, as is standard in discussions of the principle. We show how this simple point resolves the purported incompatibility.
Moreover, it provides a reconceptualization of dynamic choice under non-expected utility that neutralizes many other dynamic-choice-based arguments against imprecise probabilities proposed in philosophy and economics. The perspective provides a principled justification for the restriction to certain families of beliefs in the analysis of dynamic choice problems, which blocks several standard dynamic-choice-based arguments. Furthermore, the issue of value of information under imprecise probability is revealed to have been mis-analyzed in standard treatments; proper analysis shows that it is non-negative as long as the information offered does not compromise information that the decision maker had otherwise expected to receive.
James M. Joyce (Michigan), 28 June, 17:15 - 18:30.
As is well known, imprecise prior probabilities can help us model beliefs in contexts where evidence is sparse, equivocal or vague. It is less well known that they can also provide a useful way of representing certain kinds of indecision or uncertainty about epistemic values and inductive policies. If we use the apparatus of proper scoring rules to model a believer's epistemic values, then we can see her 'choice' of a prior as, partly, an articulation of her values. In contexts where epistemic values and inductive policies are less than fully definite, or where there is unresolved conflict among values, the imprecise prior will reflect this indefiniteness in theoretically interesting ways.
Namjoong Kim (Research Institute of Philosophy and Liberal Arts Education), 28 June, 16:15 - 17:00.
The standard theory of subjective probability assumes that a rational agent assigns a single real number to every proposition as a degree of belief. Surely, this assumption simplifies, but is it realistic? Many philosophers will say "No." A real-life baseball fan may say that the probability of his team winning this year's world game is somewhere between .6 and .8 but not that it is, say, precisely .67895. This does not mean that they will stop using precise credence functions any time soon. However, it does mean that when they are serious about the reality of their model, philosophers will have to adopt a more complex framework. A popular suggestion is to use a credal state or a set of coherent credence functions (Jeffrey, 1983; Bradley, 2007; Joyce, 2010). This framework is often called "the imprecise credence framework."
The imprecise credence framework offers a more realistic model of degrees of belief. However, it needs to be complemented with an updating rule, a rule governing how to update one's credal state. No problem. If the agent receives certain evidence E, then she can apply strict conditionalization to each member of her prior credal state. If she receives uncertain evidence, i.e. a credence distribution over a partition {E, ¬E}, then she can apply Jeffrey conditionalization to each credence function in her prior credal state. Either way, the set of resulting credence functions will become her new credal state.
This is a very natural generalization of traditional updating rules. However, one aspect of it does not sit well with the original motivation of imprecise credence distributions: it presupposes that the agent's new evidence is either fully believed or believed to a precise degree. Of course, a more realistic updating rule will allow the agent to update her credal state with imprecise evidence, i.e. an imprecise credence distribution over a partition {E, ¬E}. For example, if an agent comes to believe E to the interval degree of [.7, .8] as a result of her experience, then there must be a reasonable method by which the agent can update her credal state.
In this paper, I shall however argue that it is impossible or at least very difficult to find such an updating rule. For my argument, I will consider two particular updating rules. The first one is a pretty natural generalization of Jeffrey conditionalization for imprecise evidence (Jeffrey, 1983). Unfortunately, in a certain situation, this rule forces the agent to change her credal state although she receives no new evidence. The second rule is a version of Weatherson's (2007) "dynamic Keynesian model," modified for updating with imprecise evidence. According to Joyce (2010), an updating rule similar to Weatherson's leads to a wrong kind of "sharpening": in some situations, the agent comes to assign a precise credence to a proposition although it is intuitively irrational to do so. Arguably, any reasonable rule of updating with imprecise evidence will be similar to one of these two. Hence, these two problems create a dilemma.
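For readers who think in code, the pointwise updating rules described above (apply the chosen rule to every member of the prior credal set) can be sketched as follows. This is a minimal illustration of my own, not from the paper; the function names and the dict-based representation of credence functions are assumptions, and each member is assumed to assign positive prior probability to every evidence cell.

```python
# Sketch of pointwise credal-set updating over a finite space of atoms,
# with credence functions represented as dicts atom -> probability.

def strict_conditionalize(credal_set, evidence):
    """Condition every member on the event `evidence` (a set of atoms)."""
    updated = []
    for p in credal_set:
        z = sum(p[a] for a in evidence)
        if z > 0:  # drop members that rule the evidence out entirely
            updated.append({a: (p[a] / z if a in evidence else 0.0)
                            for a in p})
    return updated

def jeffrey_conditionalize(credal_set, new_marginals):
    """Jeffrey-update every member; `new_marginals` maps each cell of a
    partition (a frozenset of atoms) to its new precise probability."""
    updated = []
    for p in credal_set:
        q = {}
        for cell, weight in new_marginals.items():
            z = sum(p[a] for a in cell)   # prior probability of the cell
            for a in cell:
                q[a] = weight * p[a] / z  # rescale within the cell
        updated.append(q)
    return updated
```

Applied to a singleton credal set, either function recovers the familiar precise rule; the dilemma discussed in the paper concerns evidence that fixes a cell's new probability only up to an interval, which neither function accommodates.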
Hannes Leitgeb and Jan-Willem Romeijn (MCMP and RUG), 28 June, 11:30 - 12:15.
This paper is concerned with the translation of statistical results into qualitative beliefs. Our approach to this problem relies on recent work concerning the relation between probabilistic and qualitative belief, based on a notion of stability or robustness. We consider a number of statistical methods and show how each of these can be connected to specific advice on what claims to commit to. Our reliance on the stability view of full belief brings numerous attractive consequences for the logic of statistical claims, and it provides a natural context for appreciating the dependence of statistical methods on prior convictions, sampling plans, and the like.
Arthur Paul Pedersen and Gregory Wheeler (Max Planck Institute for Human Development and MCMP), 27 June, 15:45 - 16:30.
Dilation occurs when an interval probability estimate of some event E is properly included in the interval probability estimate of E conditional on every event F of some partition, which means that one's initial estimate of E becomes less precise no matter which F occurs. In this talk we present new results which provide a simple characterization of dilation formulated in terms of deviation from stochastic independence, propose a measure of dilation in these terms, and introduce a distinction between proper and improper dilation. With these tools in hand, we then turn to consider two sorts of arguments where dilating sets of probabilities have appeared. For some critics dilation is a pathological feature of imprecise probability models tout court, while others have thought that dilation reveals a narrower problem with how imprecise probabilities are updated. We shall argue that proper dilation is neither pathological in the broad sense nor in the narrow: among the issues bedeviling imprecise probability updating, dilation is not one of them.
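The definition just given can be made concrete with the standard two-coin toy example (my own illustration, with hypothetical function names, not drawn from the talk): E = "coin 1 lands heads" and F = "coin 2 lands heads", where both coins are known to be fair but their correlation is unknown. Sweeping the resulting credal set shows P(E) pinned at 1/2 while the estimate conditional on either cell of the partition {F, not-F} dilates to the whole unit interval:

```python
# The credal set of joint distributions is parameterised by
# t = P(E & F), which ranges over [0, 1/2] since both marginals are 1/2.

def conditional_bounds(n_steps=100):
    """Sweep the credal set; return (lo, hi) bounds for P(E), P(E|F), P(E|~F)."""
    p_e, p_e_f, p_e_nf = [], [], []
    for i in range(n_steps + 1):
        t = 0.5 * i / n_steps
        p1, p2 = t, 0.5 - t          # P(E & F),  P(E & ~F)
        p3, p4 = 0.5 - t, t          # P(~E & F), P(~E & ~F)
        p_e.append(p1 + p2)          # marginal of E: 1/2 for every member
        p_e_f.append(p1 / (p1 + p3))     # P(E | F)
        p_e_nf.append(p2 / (p2 + p4))    # P(E | ~F)
    return ((min(p_e), max(p_e)),
            (min(p_e_f), max(p_e_f)),
            (min(p_e_nf), max(p_e_nf)))

unconditional, given_f, given_not_f = conditional_bounds()
# unconditional is (0.5, 0.5) up to rounding, while given_f and
# given_not_f each span (0.0, 1.0): conditioning dilates a point
# estimate to the full interval, whichever cell occurs.
```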
Anthony Peressini (Marquette University), 28 June, 13:30 - 14:15.
Understanding probabilities as something other than points, perhaps as intervals (Kyburg, 1999) or more generally clouds (Neumaier, 2003), has often been motivated by the need to find more realistic models for degree of belief, and in particular the idea that degree of belief should have an objective basis in statistical knowledge of the world (Kyburg, 1999, 2). Here I will offer another motivation growing out of efforts to understand how chance evolves as a function of time. If the world is chancy, then the chance of an event e that happens at a given time t is less than 1 until it happens, but an issue remains as to whether Pe(t) continuously approaches 1 or whether it jumps discontinuously to 1 at t. It turns out that understanding Pe(t) as discontinuous has surprising and troubling consequences. I will sketch why this is so and argue that a viable option for circumventing the discontinuity problem is to understand the probabilities "imprecisely," that is, as something other than point values in the interval [0, 1]. Finally, I explore the general viability of imprecise probabilities in the context of the evolution of chance.
Teddy Seidenfeld (Carnegie Mellon University), 27 June, 09:15 - 10:30. (Joint work with Mark J. Schervish and Joseph B. Kadane)
I review de Finetti's two coherence criteria for determinate probabilities: coherence1, which is defined in terms of previsions (fair prices) for a set of random variables that are undominated by the status quo (previsions immune to a sure loss), and coherence2, which is defined in terms of forecasts for random variables that are undominated in Brier score by a rival set of forecasts. I review issues of elicitation associated with these two criteria that differentiate them, particularly when generalizing from eliciting determinate to eliciting imprecise probabilities.
Arthur Van Camp, Gert de Cooman and Erik Quaeghebeur (Ghent University, SYSTeMS Research Group, Centrum Wiskunde and Informatica), 27 June, 17:15 - 18:00.
We study Seidenfeld, Schervish, and Kadane's notion of choice functions, and we want to make them accessible to people who are familiar with sets of desirable gambles. We relate both theories explicitly, using their derived strict partial orderings, in an enlightening fashion. We give an expression for the most conservative extension of a set of desirable gambles to a choice function. Because it is important for inference purposes, we also make a link with belief structures.
Carl Wagner (Tennessee), 27 June, 11:00 - 11:45.
In his seminal 1964 paper, "Messfehler und Information", Volker Strassen (anticipating Dempster's work by several years) studied the lower and upper probabilities b and a induced by a multivalued mapping T from a probability space to a finite set Y, and gave an elegant characterization of the probability distributions q on Y that dominate b and are dominated by a. In this talk, Strassen's work is used to generalize Jeffrey conditioning to a rule for updating a prior p to a posterior q on Y bounded below and above by b and a, respectively, subject to the preservation of certain conditional probabilities, and the circumstances under which this rule reduces to Jeffrey conditioning are characterized.
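As a concrete reference point for the setup Wagner starts from, here is a small sketch (my own illustration, with hypothetical names) of the lower and upper probabilities b and a induced by a multivalued mapping T from a finite probability space to a finite set Y: b(A) is the probability that T(x) is contained in A, and a(A) the probability that T(x) meets A.

```python
from itertools import chain, combinations

def lower_upper(px, T, Y):
    """px: dict x -> P(x); T: dict x -> nonempty subset of Y; Y: finite frame.
    Returns dicts b, a mapping each subset of Y (as a frozenset) to
    b(A) = P(T(x) subset of A) and a(A) = P(T(x) intersects A)."""
    subsets = [frozenset(s) for s in chain.from_iterable(
        combinations(sorted(Y), r) for r in range(len(Y) + 1))]
    b, a = {}, {}
    for A in subsets:
        b[A] = sum(p for x, p in px.items() if T[x] <= A)  # T(x) inside A
        a[A] = sum(p for x, p in px.items() if T[x] & A)   # T(x) meets A
    return b, a
```

Since each T(x) is assumed nonempty, the two bounds are conjugate: a(A) = 1 - b(Y \ A); the distributions q on Y with b <= q <= a are the ones Strassen's characterization concerns.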