The Physicist and the Shepherd
Ari Ezra Waldman
In April 1986, a nuclear reactor at the Chernobyl Nuclear Power Plant exploded. More than a thousand miles away, in the upland regions of the United Kingdom, the radioactive cloud released by the explosion rained cesium isotopes onto the ground around the Cumbrian Fells. Sheep grazing in these areas soon began to show elevated levels of radioactive cesium, and the government banned the farmers from selling meat from these sheep. But the question remained: When would it be safe again? When would the elevated levels of contamination go down? The nuclear physicists consulted said no more than three weeks. The sheep farmers, who had no training in or even knowledge of nuclear physics, predicted years, if not a decade, of contamination. The sheep farmers were right; the nuclear physicists were wrong.

The story of the Cumbrian sheep is a case study in the role of expertise, so-called “regulatory science,” and the almost reflexive manner in which government policy relies on particular kinds of expertise to solve certain problems. A similar story can be told about the effects of turning almost exclusively to physicists for expertise in setting US nuclear policy in the post-World War II era. This paper explores what, if anything, we can learn from the Cumbrian Fells incident for understanding the role of expertise in the governance of artificial intelligence (AI), machine learning, and algorithmic systems.