Catherine Elgin is professor of the philosophy of education at Harvard Graduate School of Education. She earned her B.A. at Vassar College and her Ph.D. at Brandeis University. She is the author of Considered Judgment, Between the Absolute and the Arbitrary, With Reference to Reference, and co-author (with Nelson Goodman) of Reconceptions in Philosophy and Other Arts and Sciences. She is editor of The Philosophy of Nelson Goodman, and co-editor (with Jonathan E. Adler) of Philosophical Inquiry. She has received fellowships from the National Endowment for the Humanities, the American Council of Learned Societies, the John Dewey Foundation, the Spencer Foundation, the Andrew Mellon Foundation, the Bunting Institute at Radcliffe, and the Newhouse Center for the Humanities at Wellesley College.
The pessimistic induction is this: the history of science is a history of failure. Time after time, well-supported theories have been discredited. Therefore, it is overwhelmingly likely that currently accepted theories will be discredited too. The optimistic riposte is that science is not just a history of failures. The advancement of inquiry has enabled scientists to improve the methods and correct the errors of their forebears. This may not vindicate current theories, but it provides reason to think that science is making real epistemic progress. Conclusion: we should neither abandon hope of achieving scientific understanding, nor be confident that we have already got it. Our situation thus calls for intellectual humility. That attitude, I will urge, is not one of passivity in the face of human frailty. It is an active orientation toward a domain of inquiry and our prospects of understanding it.
It might seem that the entire payoff for exercising intellectual humility is that we either uncover previously undetected errors or provide added assurance that our conclusions are correct. This is surely one payoff. But there is another important epistemic benefit. In taking the possibility that we might be wrong seriously, we can treat that possibility as itself worthy of investigation. To conduct such an investigation, we need to identify the potential fault lines of our currently accepted account. If our current understanding of the phenomena is wrong, where is the error likely to be located? What are the weakest links in our argument? If there is an error, what would show it? By attempting to answer such questions, I will urge, we enrich our understanding of the phenomena and our methods for investigating them.
A critical question is how a discipline can exploit its intellectual humility. I will argue that a community of inquiry, like a Kantian moral community, makes the laws that bind it. A scientific community accommodates itself to the permanent possibility of error by setting high standards for the acceptability of findings, integrating mechanisms for detecting and correcting errors, and insisting that evidence be statistically significant, that experiments be reproducible, and that reasons be publicly articulable and assessable. Its members hold themselves and one another to these standards because they think that by satisfying these standards they will maximize their prospects of achieving the sort of understanding they seek. The self-monitoring, self-critical character of the practice is a manifestation of institutional intellectual humility.
Such institutional intellectual humility becomes personal intellectual humility when practitioners recognize that it does not just foster but in part constitutes the understanding their discipline provides. Then living up to those standards is a mark of intellectual integrity. Because moral/epistemic values such as open-mindedness, trustworthiness, and humility are interwoven into the fabric of science, they are components of good science. They should be inculcated in science education.