Remember the dire – and categorical – predictions about AIDS voiced in the mid-1980s?
If not, let me give you a refresher course. Within 10 to 20 years tens of millions of Americans would be infected with HIV and hundreds of thousands of us would be dying each year from AIDS. Not only would the virus rapidly spread beyond the groups initially affected by it – gay males, IV drug users, and hemophiliacs – but it would be transmitted as readily through heterosexual intercourse as it had been through bathhouse orgies or the sharing of dirty needles. After all, look at what was happening in Africa, where AIDS was decimating prostitutes and their johns in the slums of Nairobi, Lagos and Harare.
Even lesbians were not immune to the threat. Solemn warnings were issued advising the use of dental dams during oral sex.
Very shortly, these predictions hardened into a litmus test of right-thinking. If you were a liberal or progressive, you accepted them without demur. If not, you were either an ostrich sticking your head in the sand or a twisted neo-fascist homophobe not-so-secretly hoping the disease would rid America of homosexuality once and for all.
I was not then, nor am I now, a medical expert, but even at the time the received wisdom on the subject seemed to overlook some critical factors that made projections about the disease less certain than proponents claimed. Didn’t it seem probable that AIDS’ nearly 100 percent mortality rate in those early years might be traced to the fact that the first groups manifesting the illness had highly compromised immune systems even before HIV arrived on the scene? Once the disease burned through those groups and made its way into “healthier” populations blessed with more income, health insurance, and larger, more effective social support systems, wasn’t it likely the mortality rates would decline? Furthermore, was it possible – even likely – that small but critical genetic differences might make Americans less susceptible to the virus than the inhabitants of Central Africa, many of them also suffering from compromised immune systems caused by malnutrition and repeated bouts of malaria?
Today, AIDS still poses a serious health threat in the United States, though, thankfully, not nearly as big a one as once predicted. Mortality rates have indeed declined and while the virus continues to spread at a rate much higher than we would like, it certainly has not come to infect tens of millions of Americans. Some, but by no means all, of this can be attributed to better drug therapies and public education campaigns about high-risk behaviors. But most of the gap between what was predicted and what has come to pass must be attributed to the fact that the scientific community and the mainstream media alike were relying on linear projections of a highly complex public health phenomenon about which many, many factors were then, even as now, still unquantified and unquantifiable. In such circumstances, linear projections are all but useless.
I bring up the AIDS predictions as a cautionary tale to keep in mind when evaluating predictions about other pending threats. Avian flu, for one, a topic that’s far more complex and fraught with uncertainty than the mainstream media would have you believe.
Global Warming, for another. I know it is virtually heresy at this point for anyone to the left of the American Petroleum Institute to question the categorical statements about Global Warming issuing from the mouths of Al Gore or Robert Kennedy, Jr. But that’s precisely the problem. Any time the word “heresy” springs to mind about any subject, we can be certain that we have entered the precincts of religious faith rather than scientific empiricism.
I fully accept that the earth has, over the past several decades, undergone rapid warming. I accept the possibility that human activity, in particular the burning of fossil fuels since the onset of the Industrial Revolution, is playing some part in this warming process. But when Al Gore comes on the TV, as he did recently, and gravely announces that “2005 was the warmest year on record” and that “the facts are all in” about the primary role played by human agency in global warming, my alarm bells go off. While “true” in some narrow sense, both statements lack what might be termed “nuance.”
After all, there simply are no accurate year-by-year records about the climate dating back much more than 150 years – and none about remote parts of the world until much later than that. Meanwhile, it did not become possible – or practical – to collect comprehensive, global data on year-to-year changes in ocean and atmospheric temperatures or on the composition of the different atmospheric strata until after World War II – and for some of that data, not for another 20 or 30 years. Yes, we can gauge the relative levels of carbon dioxide in the atmosphere above a particular region from core samples taken from glaciers, but that data does _not_ easily translate into correlations between worldwide carbon dioxide levels and global temperatures in any given era.
What we do know for certain is that over the past one million years the earth has undergone long cyclical variations of temperature punctuated by much shorter periods of rapid cooling and warming – often very rapid cooling and warming. We know for certain that 1,000 years ago, Erik the Red – Leif Erikson’s father – was able to raise livestock and grow cereal crops in Greenland and that it isn’t possible to cultivate grain there today – so much for 2005 being the “warmest year on record.” We also know that early in the 14th century, the world plunged into what is known as the Little Ice Age and then underwent a period of rapid warming that began in the first decades of the 19th century – long before the effects of the nascent Industrial Revolution could have had any effect on global temperatures. If we are to be absolutely honest about it, we have to admit that we do not know at this time whether the current warming trend is part of the much longer cycling of temperatures going back to the beginning of the Pleistocene – a fluctuation whose root cause remains a mystery – whether the buildup of carbon dioxide is playing any role in it, or even whether the warming is the result of human activity at all. Clearly, there is a strong argument for playing it safe and greatly reducing our reliance on fossil fuels. But there are many other powerful and absolutely incontrovertible arguments – ranging from national defense to global inequality to pollution – for doing so, without ignoring nuance, without choosing “truthiness” over “truth.”
There are a lot of ideologues and blinkered partisans in this country more than willing to “simplify” – which is to say distort – the truth in pursuit of goals they no doubt consider worthy. Add to them the corporate fronts willing to make any claim or distort any argument if the price is right. Yes, we live in the Age of Spin. But what differentiates progressives and liberals from the aforementioned folk – what _should_ differentiate them anyway – is the steadfast refusal to engage in such tactics, even in the name of a noble cause. An inability to live with uncertainty is the hallmark of fundamentalist thinking. The ability – and willingness – to accept that uncertainty is an inescapable part of the human condition and that the truth is almost always shadowed by a measure of indeterminacy is the hallmark of a true humanist. Ultimately we cannot defeat obscurantism by compromising intellectual integrity.
Even when it’s convenient to do so.