New Ideas In Mathematical Philosophy

"Deliberation, Aggregation, Consensus" - 21-22 October 2013

Practical information
22 October 2013
IJN

Franz Dietrich (CNRS & University of East Anglia, joint work with Christian List, LSE): “Opinion Pooling Generalized”
How can different individuals' probability assignments to some events be aggregated into a collective probability assignment? Although there are several classic results on this problem, they all assume that the 'agenda' of relevant events forms a σ-algebra and is hence closed under taking disjunctions (unions) and conjunctions (intersections) of events. This assumption is overly demanding: in practice, the group might care about the probability of 'rain' and that of 'heat' while ignoring that of 'rain or heat'. We drop this assumption and explore probabilistic opinion pooling on general agendas. We characterize linear pooling and neutral pooling for general agendas, with classic results emerging as special cases when the agenda is taken to be a σ-algebra. We apply our results to probabilistic preference aggregation (re-interpretable as fuzzy or vague preference aggregation). This work is inspired by judgment aggregation theory.
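Linear pooling, the rule characterized in the talk, takes the collective probability of each event on the agenda to be a weighted average of the individuals' probabilities. A minimal sketch under illustrative assumptions (the agenda, agent names, and weights below are invented for the example, not taken from the paper):

```python
# Linear opinion pooling: the collective probability of each agenda event
# is a weighted average of the individuals' probabilities for that event.

def linear_pool(assignments, weights):
    """Combine individual probability assignments (dicts over a common
    agenda of events) into one collective assignment."""
    events = assignments[0].keys()
    return {e: sum(w * a[e] for w, a in zip(weights, assignments))
            for e in events}

# Two individuals' probabilities for the events 'rain' and 'heat'.
# Note the agenda need not contain the disjunction 'rain or heat',
# so it need not be a sigma-algebra.
alice = {"rain": 0.8, "heat": 0.3}
bob = {"rain": 0.4, "heat": 0.5}
pooled = linear_pool([alice, bob], weights=[0.5, 0.5])
# pooled == {"rain": 0.6, "heat": 0.4}
```

With equal weights this is straight averaging; unequal weights let the group defer more to some members than others.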

Jan Sprenger (Tilburg University, joint work with Dominik Klein, Tilburg University): “Modeling Individual Expertise in Group Judgments”
Group judgments are often implicitly or explicitly influenced by their members' individual expertise. However, given that expertise is seldom recognized fully and that some distortions may occur (bias, correlation, etc.), it is not clear that differential weighting is an epistemically advantageous strategy with respect to straight averaging. Our paper characterizes a wide set of conditions under which differential weighting outperforms straight averaging and embeds the results into the multidisciplinary group decision-making literature. 
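The core comparison can be illustrated with a toy calculation. Assuming independent, unbiased expert estimates with known error variances (a much simpler setting than the paper's, which also covers bias and correlation), the expected squared error of a weighted average can be computed directly, and inverse-variance weighting beats straight averaging:

```python
# Expected squared error of a weighted average of independent, unbiased
# expert estimates with known error variances. Illustrative numbers only.

def mse(weights, variances):
    """E[(sum_i w_i X_i - truth)^2] for independent unbiased X_i
    with Var(X_i) = variances[i] and weights summing to one."""
    return sum(w * w * v for w, v in zip(weights, variances))

variances = [1.0, 4.0, 9.0]               # expert 1 is the most accurate
n = len(variances)
straight = [1.0 / n] * n                  # straight averaging
inv = [1.0 / v for v in variances]
total = sum(inv)
differential = [x / total for x in inv]   # inverse-variance weighting

mse(straight, variances)       # ~1.556
mse(differential, variances)   # ~0.735: differential weighting wins here
```

When expertise is misjudged or estimates are correlated, the ranking can flip, which is why characterizing the conditions under which differential weighting helps is non-trivial.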

Carl Wagner (University of Tennessee): “An Impossibility Theorem for Allocation Aggregation”
Among the many sorts of problems encountered in decision theory, allocation problems occupy a central position. Such problems call for the assignment of a nonnegative real number to each member of a finite (more generally, countable) set of entities, in such a way that the values so assigned sum to some fixed positive real number. Familiar cases include the problem of specifying a probability mass function on a countable set of possible states of the world and the distribution of a certain sum of money, or other resource, among various enterprises. In determining an allocation it is common to solicit the opinions of more than one individual, which leads immediately to the question of how to aggregate their typically differing allocations into a single “consensual” allocation. Guided by the traditions of social choice theory (in which the aggregation of preferential orderings, or of utilities, is at issue), decision theorists have taken an axiomatic approach to determining acceptable methods of allocation aggregation. In such approaches so-called “independence” conditions have been ubiquitous. Such conditions dictate that the consensual allocation assigned to each entity should depend only on the allocations assigned by individuals to that entity, taking no account of the allocations that they assign to any other entities. While there are reasons beyond mere simplicity for subjecting allocation aggregation to independence, this radically anti-holistic stricture has frequently proved to severely limit the set of acceptable aggregation methods. The limitations are particularly acute in the case of three or more entities which must be assigned nonnegative values summing to some fixed positive number. For if the set of values that may be assigned to these entities satisfies some simple closure conditions and (as is always the case in practice) is finite, then independence allows only for dictatorial or imposed (i.e., constant) aggregation.
This result yields as a corollary an impossibility theorem of Franz Dietrich on judgment aggregation. (This talk is partly based on joint work with my former doctoral student, Mark Shattuck.)

Denis Bonnay (Université Paris-Ouest) & Mikael Cozic (Université Paris-Est): “Weighted Averaging as a Revision Rule”
Weighted averaging has been proposed as a handy manner to combine probability distributions (e.g., Lehrer & Wagner, 1981). We discuss several issues regarding the rule of weighted averaging which arise when it is construed as a compact revision rule for testimony (rather than, say, as a mere aggregation procedure meant to sum up the priors of a group of agents): (1) compatibility with Bayesianism: if we take the agent to be Bayesian, what does it mean exactly for her to revise her prior probabilities using the rule of weighted averaging? In order to address Bayesian worries voiced by Bradley (2006), we provide a complete characterization of the conditions under which weighted averaging is compatible with the Bayesian picture. (2) partial revision: when the other agent only reveals part of her priors, the rule of weighted averaging is itself partial, saying nothing about the agent’s posteriors for events with unrevealed probabilities. Is it possible to extend the rule of weighted averaging to a full rule even in the partial case? Following Steele (2012), we suggest Jeffrey conditioning as a supplement to weighted averaging for the partial case and discuss its connection with the standard axiomatic characterization of weighted averaging in the non-partial case.
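The proposed supplement for the partial case can be sketched as follows: use weighted averaging to fix new probabilities for the revealed events, then propagate them to the rest of the algebra by Jeffrey conditioning, which shifts the probability of each cell of a partition while keeping probabilities conditional on each cell fixed. The worlds, partition, and numbers below are illustrative, not from the paper:

```python
# Jeffrey conditioning on a partition: assign each cell its new probability
# (here, one obtained by weighted averaging on the revealed events) while
# preserving the probabilities conditional on each cell.

def jeffrey_condition(prior, partition, new_cell_probs):
    """prior: dict world -> probability; partition: list of disjoint sets of
    worlds covering the space; new_cell_probs: revised probability of each
    cell. Returns the full revised distribution."""
    posterior = {}
    for cell, q in zip(partition, new_cell_probs):
        cell_prob = sum(prior[w] for w in cell)
        for w in cell:
            posterior[w] = q * prior[w] / cell_prob  # keeps P(w | cell) fixed
    return posterior

prior = {"rain&cold": 0.3, "rain&warm": 0.1, "dry&cold": 0.2, "dry&warm": 0.4}
partition = [{"rain&cold", "rain&warm"}, {"dry&cold", "dry&warm"}]
# Suppose weighted averaging of the revealed opinions yields P'(rain) = 0.6:
revised = jeffrey_condition(prior, partition, [0.6, 0.4])
# revised["rain&cold"] == 0.45  (i.e., 0.6 * 0.3/0.4)
```

Posteriors for unrevealed events (here the conjunctions with 'cold' and 'warm') are thereby determined by the agent's own prior conditional probabilities.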

Jan-Willem Romeijn (University of Groningen, joint work with Olivier Roy, Bayreuth University): “All Agreed: Aumann meets De Groot”
This paper shows that an opinion pooling process following the rules laid down by Morris DeGroot, and later developed by Lehrer and Wagner, can be represented in the context of Aumann's famous agreement theorem. For any opinion pooling process there is a common prior such that the approach to agreement, as described by Geanakoplos and Polemarchakis, coincides with the opinion pooling process. This equivalence can then be employed to analyse the notion of trust from DeGroot opinion pooling. It turns out that this notion comes down to a natural constraint on the likelihoods that appear in Geanakoplos and Polemarchakis's dynamic approach to agreement. The paper thereby makes precise Aumann's hunch, expressed in his seminal paper, that "the Harsanyi doctrine is implicit in much of this [DeGroot-based] literature".
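In the DeGroot process each agent repeatedly replaces her opinion by a trust-weighted average of all agents' current opinions; under mild conditions on the trust matrix the opinions converge to a consensus. A minimal sketch with an invented two-agent trust matrix:

```python
# DeGroot opinion pooling: iterated trust-weighted averaging of opinions.
# Trust matrix rows give each agent's weights on everyone's opinions.

def degroot_step(trust, opinions):
    """One round: agent i's new opinion is sum_j trust[i][j] * opinions[j]."""
    return [sum(t * o for t, o in zip(row, opinions)) for row in trust]

trust = [[0.6, 0.4],    # agent 0 puts 60% weight on herself
         [0.3, 0.7]]    # agent 1 puts 70% weight on herself
opinions = [0.9, 0.1]   # initial probabilities assigned to some event
for _ in range(50):
    opinions = degroot_step(trust, opinions)
# Both opinions have converged to the same consensus value, a weighted
# average of the initial opinions determined by the trust matrix.
```

The paper's point is that this iterative dynamics can be matched, for a suitable common prior, by the Geanakoplos–Polemarchakis communication dynamics behind Aumann's agreement theorem.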

Richard Bradley (London School of Economics): “Conditioning vs Averaging”
This paper will explore the relation between Bayesian conditionalisation and linear averaging, arguing that the latter is better regarded as a method of belief formation than as a method of belief revision. 

Philippe Mongin (CNRS, HEC) & Marcus Pivato (Trent University): “Ranking Multidimensional Alternatives and Uncertain Prospects”
We introduce a two-stage ranking of multidimensional alternatives, including uncertain prospects as a particular case, when these objects can be given a suitable matrix form. The first stage defines a ranking of rows and a ranking of columns, and the second stage ranks matrices by applying natural monotonicity conditions to these auxiliary rankings. Owing to the theory of additive separability developed here, this framework is sufficient to generate very precise numerical representations. We apply them to four types of multidimensional objects: (1) streams of commodity baskets through time, (2) uncertain social prospects, (3) uncertain individual prospects, and (4) monetary input-output matrices. Application (1) enters the paper mostly as an illustration, and the main results of the paper concern the other three. In application (2), we prove the strongest existing form of Harsanyi's (1955) Aggregation Theorem and cast light on the comparison between the ex ante and ex post Pareto principle, when expected utility assumptions do not hold. In application (3), we provide a novel derivation of subjective probability similar to Anscombe and Aumann (1963). Lastly, application (4) delivers a numerical measure of economic integration.