Sometimes I'm frustrated by how few people understand science, or how few people take a scientific perspective on new information. A little skepticism, some willingness to consider alternative hypotheses, some willingness to reconsider one’s position in the light of new evidence — these are skills that seem even more important as the world fills up with conspiracy theories, deepfakes, and other assorted misinformation and disinformation.
But on thinking a little more carefully, I realize that the situation isn’t particularly surprising. A scientific attitude is not in any sense “natural,” and it is difficult both to acquire and to maintain. Indeed, I recognize that I didn't really understand science myself until I was in graduate school and started doing research. Most of my science classes before that were not really teaching me scientific thinking or the scientific method, although they used much of the same vocabulary.
Instead, science was just another subject like history or geography, and the way I learned science was mostly indistinguishable from how I learned those other subjects. There were some facts, and I worked to learn and remember the facts, and I did my best to recall the facts appropriately when asked. The facts in science were perhaps a little more interesting than the facts in history or geography, which probably explains why I started to think of myself as having a scientific inclination. But there was really no opportunity to approach the unknown the way science requires: open to new information, yet also skeptical of it and ready to test it. Scientific experiments in high school and college now look to me like an odd form of cargo cult science (to reuse Richard Feynman's memorable metaphor). Yes, we were carrying out scientific procedures. Yes, we were recording data and using it to reach conclusions. But everyone knew what phenomena we were studying, or else they wouldn't have been in the curriculum. Everyone knew, or could figure out, what the result of each experiment should be. It was really a form of theater more than a form of science: a kind of vivid, interactive, real-world role-playing of “this is what it might have been like to be a scientist discovering this phenomenon.”
I was fortunate to work with brilliant people in graduate school, solving problems that no one had solved before or improving on previous solutions. Developing a scientific attitude was not part of the curriculum so much as part of the atmosphere, learned by apprenticeship and cultural participation rather than as explicit lessons. Part of what I learned was the importance of advocating for one’s discoveries and one's theories, but also the importance of testing them, considering contrary evidence, and at least acknowledging competing theories.
Is it realistic to think that more people might learn to have a scientific attitude? I think so, but it would require some major reforms to education. I take inspiration from the year, many years ago, when we taught our oldest child at home for 7th grade. That time allowed us all to engage his interests in a relatively free-form way: his curiosity would motivate the learning and, potentially, the experiments we would do.
This is not an argument for homeschooling for its own sake. There is nothing magical about having parents as teachers that inherently improves the scientific aspects of education; homeschooling can all too easily just replicate the cargo-cult pretend experiments that so often happen in conventional schools now. But I do think that in our specific case, the freedom from institutional constraints and the opportunity to focus on one student’s interests made it much easier to do genuine investigations, experiments, and discoveries. The drawback was that it demanded a lot from all of us, which is part of the reason we only did it for one year.
Looking back, I also see times in my high school experience and in my children's high school experiences when I can discern some genuine scientific work happening. In my high school, the chemistry teacher acquired a cast-off infrared spectrophotometer that sat like a shipwreck against a back wall of the lab. I was ultimately unsuccessful in my efforts to restore it to its original function, but in retrospect I benefited greatly from the opportunity to understand, explore, and troubleshoot the device, which came with no user manual. (In the late 1970s, there was no web… so you couldn’t simply find the manual online.)
Likewise, I think about my youngest child’s high school metallurgy project, which grew out of his work building and operating a backyard forge and teaching himself blacksmithing. (In contrast to my high school experience, he learned many of these things from YouTube.) The experience and investigation clearly left him knowing more about the subject than his teacher did… which might be another way of gauging success in this space.
These anecdotes suggest that individual, self-directed learning is an important ingredient, perhaps even an essential one, in learning “real” science. Unfortunately, most education is currently organized around mass-production principles. Accordingly, a pessimistic conclusion might be that we are unlikely to improve in this dimension any time soon.
That said, there are already many forces prompting various reconsiderations and reinventions of the traditional classroom model. Digital technologies both prompt a reconsideration of what students should learn and enable different approaches to how it’s learned. Much of what happened in the pandemic was improvised and unpleasant for all, but it had the salutary effect of dramatically changing classroom behaviors overnight. As a side effect, many people became aware of the somewhat arbitrary nature of longstanding habits in education. Accordingly, an optimistic conclusion might be that we will have new opportunities in this dimension before too long.