Science produces reliable information that can have a direct impact on our daily lives. The current COVID-19 crisis provides some particularly clear examples of this general observation. The machinery of science is our best hope for developing and evaluating vaccines and treatments, and technical experts are supplying important evidence-based advice that shapes crisis response strategies (e.g., confinement policies informed by epidemiology). For all the positives, however, it is no secret that scientific practice, and how we use its insights, is far from perfect and sometimes deeply flawed. This talk examines one particularly salient source of systematic error: the “myside bias.” This term refers to a deep-seated and universal human tendency to preferentially seek out and evaluate information that supports one’s desires or pre-existing beliefs. Across a range of behavioral experiments and computational models, I show how the myside bias systematically influences the production, consumption, and dissemination of scientific claims. One of the surprising lessons of this work is that bias is not always bad: it can also produce benefits when considered at an organizational or societal scale. For example, while groups with a higher degree of myside bias are more error-prone (clearly bad), they are also faster to reach correct decisions. A better understanding of these tradeoffs can help us anticipate and evaluate the likely downstream consequences of changes to the incentive structures or procedures shaping scientific communities, and can contribute to improved public interaction with science.
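The speed–accuracy tradeoff mentioned above can be illustrated with a toy sequential-sampling simulation. This is a minimal sketch, not the computational models presented in the talk: the evidence threshold, signal accuracy, and the mechanism of down-weighting belief-inconsistent evidence are all illustrative assumptions.

```python
import random

def run_trial(bias, rng, threshold=5.0, p_correct=0.6):
    """One agent accumulates noisy evidence about a binary truth (+1).
    `bias` in [0, 1): evidence that contradicts the agent's current
    leaning is down-weighted by (1 - bias) -- a toy myside bias.
    Returns (steps_taken, decided_correctly)."""
    truth = 1
    belief = rng.choice([1, -1])  # initial leaning, right or wrong
    evidence = 0.0
    steps = 0
    while abs(evidence) < threshold:
        steps += 1
        # Noisy signal: points at the truth with probability p_correct.
        signal = truth if rng.random() < p_correct else -truth
        # Myside bias: discount signals that disagree with current belief.
        weight = 1.0 if signal == belief else (1.0 - bias)
        evidence += signal * weight
        belief = 1 if evidence >= 0 else -1
    return steps, (evidence > 0) == (truth > 0)

def summarize(bias, n=2000, seed=0):
    """Mean decision time and error rate over n simulated agents."""
    rng = random.Random(seed)
    results = [run_trial(bias, rng) for _ in range(n)]
    mean_steps = sum(s for s, _ in results) / n
    error_rate = sum(not ok for _, ok in results) / n
    return mean_steps, error_rate
```

Under these assumptions, high-bias agents reach the evidence threshold sooner (decisions are faster) but absorb at the wrong answer more often (error rates rise), mirroring the group-level tradeoff described above.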
To attend, please contact Brent Strickland in advance.