The Journal of Things We Like (Lots)
Instructors: Carl T. Bergstrom and Jevin West

Prerequisites: None, though Professor Sanjay Srivastava’s PSY 607 may provide useful background.

Synopsis and learning objectives: Our world is saturated with bullshit. Learn how to detect it and defuse it. Full details on our about page.

Disclaimer: We have developed this seminar to meet what we see as a major need in higher education nationwide, and we have proposed to teach it at the University of Washington in the near future. At present, however, Calling Bullshit is not an official course, nor is it otherwise endorsed by the University of Washington.

Each lecture will explore one specific facet of bullshit. A set of required readings is assigned for each week; for some weeks, supplementary readings are also provided for those who wish to delve deeper.


  1. Introduction to bullshit
  2. Spotting bullshit
  3. The natural ecology of bullshit
  4. Causality
  5. Statistical traps
  6. Visualization
  7. Big data
  8. Publication bias
  9. Predatory publishing and scientific misconduct
  10. The ethics of calling bullshit
  11. Fake news

Week 1. Introduction to bullshit. What is bullshit? Concepts and categories of bullshit. The art, science, and moral imperative of calling bullshit. Brandolini’s Bullshit Asymmetry Principle.

  • Harry Frankfurt (1986) On Bullshit. Raritan Quarterly Review 6(2)
  • G. A. Cohen (2002) Deeper into Bullshit. In Buss and Overton, eds., Contours of Agency: Themes from the Philosophy of Harry Frankfurt. Cambridge, Massachusetts: MIT Press.

Week 2. Spotting bullshit. Truth, like liberty, requires eternal vigilance. How do you spot bullshit in the wild? Effect sizes, dimensions, Fermi estimation, and checks on plausibility. Claims and the interests of those who make them.

  • Carl Sagan (1996) The Fine Art of Baloney Detection. Chapter 12 in The Demon-Haunted World.

Week 3. The natural ecology of bullshit. Where do we find bullshit? Why news media provide bullshit. TED talks and the marketplace for upscale bullshit. Why social media provide ideal conditions for the growth and spread of bullshit.

Week 4. Causality. One common source of bullshit in data analysis arises when people ignore, deliberately or otherwise, the fact that correlation is not causation. The consequences can be hilarious, but the confusion can also be used to mislead. Regression to the mean pitched as a treatment effect. Selection masked as transformation.

Supplementary reading

  • Karl Pearson (1897) On a Form of Spurious Correlation which may arise when Indices are used in the Measurement of Organs. Proceedings of the Royal Society of London 60: 489–498. For context see also Aldrich (1995).

Week 5. Statistical traps. Base-rate fallacy / prosecutor’s fallacy. Simpson’s paradox. Data censoring. The Will Rogers effect, lead-time bias, and length-time bias. Means versus medians. The importance of higher moments.

  • Simpson’s paradox: an interactive data visualization from VUDlab at UC Berkeley.
  • Alvan Feinstein et al. (1985) The Will Rogers Phenomenon — Stage Migration and New Diagnostic Techniques as a Source of Misleading Statistics for Survival in Cancer. New England Journal of Medicine 312:1604-1608.
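The base-rate fallacy comes down to a single application of Bayes’ rule: even a very accurate test yields mostly false positives when the condition it screens for is rare. A sketch with illustrative numbers (assumed, not drawn from the course readings):

```python
# P(condition), P(positive | condition), P(positive | no condition)
prevalence = 0.001          # 1 in 1,000 people have the condition
sensitivity = 0.99          # test catches 99% of true cases
false_positive_rate = 0.01  # test wrongly flags 1% of healthy people

# Total probability of a positive test result.
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))

# Bayes' rule: probability the condition is present given a positive.
p_condition_given_positive = sensitivity * prevalence / p_positive

print(f"P(condition | positive test) = {p_condition_given_positive:.3f}")
```

Despite 99% sensitivity and a 1% false-positive rate, a positive result here means only about a 9% chance of actually having the condition, because healthy people outnumber affected ones a thousand to one.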

Week 6. Data visualization. Data graphics can be powerful tools for understanding information, but they can also be powerful tools for misleading audiences. We explore the many ways that data graphics can steer viewers toward misleading conclusions.

  • Edward Tufte (1983) The Visual Display of Quantitative Information. Chapter 5, “Chartjunk: Vibrations, Grids, and Ducks.”

Week 7. Big data. When does any old algorithm work given enough data, and when is it garbage in, garbage out? Use and abuse of machine learning. Misleading metrics. Goodhart’s law.

  • Jevin West (2014) How to improve the use of metrics: learn from game theory. Nature 465:871-872

Week 8. Publication bias. Even a community of competent scientists all acting in good faith can generate a misleading scholarly record when — as is the case in the current publishing environment — journals prefer to publish positive results over negative ones. In a provocative and hugely influential 2005 paper, epidemiologist John Ioannidis went so far as to argue that this publication bias has created a situation in which most published scientific results are probably false. As a result, it’s not clear that one can safely trust the results of some random study reported in the scientific literature, let alone on Buzzfeed.

Supplementary reading

  • Erick Turner et al. (2008) Selective Publication of Antidepressant Trials and Its Influence on Apparent Efficacy. New England Journal of Medicine 358:252-260
  • Nissen et al. (2016) Publication bias and the canonization of false facts. eLife 5:e21451
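The core of the Ioannidis argument is a back-of-envelope calculation: if journals publish only statistically significant results, the fraction of published findings that are true depends on how often tested hypotheses are true in the first place. A sketch with illustrative numbers (assumed, not taken from the 2005 paper):

```python
# Fraction of tested hypotheses that are actually true. In fields
# chasing long-shot hypotheses, this prior can be quite low.
prior = 0.05
power = 0.8    # P(significant result | hypothesis is true)
alpha = 0.05   # P(significant result | hypothesis is false)

# Among all studies, the significant ("publishable") ones split into:
true_positives = power * prior
false_positives = alpha * (1 - prior)

# Positive predictive value of a published significant finding.
ppv = true_positives / (true_positives + false_positives)

print(f"fraction of published positives that are true: {ppv:.2f}")
```

With these inputs the majority of published positive results are false, even though every individual study was conducted honestly at the conventional 5% significance level.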

Week 9. Predatory publishing and scientific misconduct. Predatory publishing. Beall’s list and his anti-Open Access agenda. Publishing economics. Pathologies of publish-or-perish culture.

  • New York Times, Dec. 29, 2016.

Week 10. The ethics of calling bullshit. Where is the line between deserved criticism and targeted harassment? Is it, as one prominent scholar argued, “methodological terrorism” to call bullshit on a colleague’s analysis? What if you use social media instead of a peer-reviewed journal to do so? How about calling bullshit on a whole field that you know almost nothing about? Principles for the ethical calling of bullshit. Differences between being a hard-minded skeptic and being a domineering jerk.

Week 11. Fake news. Fifteen years ago, nascent social media platforms offered the promise of a more democratic press through decentralized broadcasting and a decoupling of publishing from advertising revenue. Instead, we got sectarian echo chambers and, lately, a serious assault on the very notion of fact. Not only did fake news play a substantive role in the November 2016 US elections, but recently a fake news story actually provoked nuclear threats issued via Twitter.

  • New York Times, Nov. 25, 2016.

Cite as: A. Michael Froomkin, Calling Bull**** in the Age of Big Data, JOTWELL (January 12, 2017) (reviewing Instructors: Carl T. Bergstrom and Jevin West).