http://www.penguinrandomhouse.com/books/533524/the-knowledge-illusion-by-steven-sloman-and-philip-fernbach/9780399184352/, by Steven Sloman and Philip Fernbach, 2017, Riverhead Books. Two cognitive scientists explain why most of what we believe we know, we actually
don't know as individuals; rather, we rely on group knowledge and knowledge embodied in our environment - yet often we are unaware that this is the case. Very relevant to today's widespread mistrust or ignorance of science, even among supposedly "educated" persons.
The authors use many examples of knowledge vs. ignorance drawn from science, technology, and industry, as well as non-science examples related to politics, social issues, and public policies. Here's an interesting passage from the Introduction:
This book is being written at a time of immense polarization on the American political scene. Liberals and conservatives find each other’s views repugnant, and as a result, Democrats and Republicans cannot find common ground or compromise. The U.S. Congress is unable to pass even benign legislation; the Senate is preventing the administration from making important judicial and administrative appointments merely because the appointments are coming from the other side.
One reason for this gridlock is that both politicians and voters don’t realize how little they understand. Whenever an issue is important enough for public debate, it is also complicated enough to be difficult to understand. Reading a newspaper article or two just isn’t enough. Social issues have complex causes and unpredictable consequences. It takes a lot of expertise to really understand the implications of a position, and even expertise may not be enough. Conflicts between, say, police and minorities cannot be reduced to simple fear or racism or even to both. Along with fear and racism, conflicts arise because of individual experiences and expectations, because of the dynamics of a specific situation, because of misguided training and misunderstandings. Complexity abounds. If everybody understood this, our society would likely be less polarized.
Instead of appreciating complexity, people tend to affiliate with one or another social dogma. Because our knowledge is enmeshed with that of others, the community shapes our beliefs and attitudes. It is so hard to reject an opinion shared by our peers that too often we don’t even try to evaluate claims based on their merits. We let our group do our thinking for us. Appreciating the communal nature of knowledge should make us more realistic about what’s determining our beliefs and values.
This would improve how we make decisions. We all make decisions that we’re not proud of. These include mistakes like failing to save for retirement, as well as regrets like giving into temptation when we really should know better. We’ll see that we can deploy the community of knowledge to help people overcome their natural limitations in ways that increase the well-being of the community at large.
I'm only partway through Chapter 1, but even so far it's quite interesting. Many examples are drawn from applied physics, with two related to nuclear weapons. The first, used to lead off the Introduction, is the
Castle Bravo test explosion of the "Shrimp" H-bomb in 1954, whose yield the scientists involved underestimated by nearly a factor of 3; this calculation error led to fallout on two populated atolls, later resulting in thyroid tumors and birth defects. The second example, leading off Chap. 1, is how Louis Slotin, an otherwise experienced and careful physicist, ignored protocols during a 1946 test involving beryllium hemispheres around a plutonium core; when a screwdriver he was holding slipped, he
triggered a burst of fission with enough hard radiation to kill himself (he died some days later) and make others in the room very sick (likely contributing to premature deaths from cancer for three of the men). These examples serve as teasers for the implied question: "How can we humans be so smart, yet also so stupid?" I haven't read far enough to know how the authors will specifically try to explain what went wrong in these two cases.