Serious concerns about the general public's lack of technological know-how were highlighted by a National Academy of Engineering report earlier this year.1 It began: "Although the United States is increasingly defined by and dependent on technology and is adopting new technologies at a breathtaking pace, its citizens are not equipped to make well-considered decisions or to think critically about technology. As a society, we are not even fully aware of or conversant with the technologies we use every day. In short, we are not 'technology literate.'"
Let's bring this issue closer to home. Replace "the United States" and "citizens" with "biological science" and "researchers," respectively. Now, are the statements still true? Are biologists ignorant of the technology that underpins much of the recent progress?
Unfortunately, we don't know. Anecdotal evidence aside, no analysis exists, as far as I can find, detailing what scientists know, and don't know, about the technologies they employ on a daily basis.
This is shameful, for two reasons. First, if we are going to advocate the indisputably desirable goal of increased technological knowledge among the citizenry, we have a duty to put our own house in order first. And second, if there is a problem with technical literacy among biologists, the consequences are far weightier than for the general public. My suspicion is that many in the research community are technologically challenged. If correct, practical implications abound.
One concern is purchasing decisions. How many instruments and systems are wastefully overspecified or distressingly underspecified because the purchasing process was based on insufficient technical knowledge? Worse, how many white elephants, or at least white boxes, are left to gather dust in the corner? And how much further might lab dollars go by rejigging existing equipment or by purchasing a refurbished older model rather than the latest gizmo?
A second concern is customization. Once adopted, any technology needs to be optimized to fit the particular needs of a lab. This is self-evident with older technologies and techniques, the ones that we feel comfortable tinkering with. Indeed, the accretion of small improvements is a main source of progress in the lab. The same benefits can be won for a mass spectrometer or an automated DNA sequencer, although in these cases virtual tinkering is advised, probably in close liaison with the manufacturer.
Lastly, there's output. In the absence of real understanding, new technologies are likely to generate more heat than light. Data for its own sake isn't really worth having. And if you don't understand how your instrument works, you may discount important anomalous results as system errors.
What's needed is education. Biology students should be taught mathematics, electronics and engineering. And all scientists need to educate themselves on the technologies underpinning the instrumentation they use. Workshops organized jointly with manufacturers provide one practical solution, and perhaps a win-win outcome.
Another thing we need is data. Baseline numbers on, for example, the proportion of researchers who can describe how a UV spectrometer works would be interesting. I'd give four-to-one odds that fewer than half could.
--Richard Gallagher, Editor (firstname.lastname@example.org)