The St. Croix Review speaks for middle America, and brings you essays from patriotic Americans.
Joseph Bast is president of the Heartland Institute. This article is republished with permission of the Heartland Institute. The online original article, posted at http://www.heartland.org/, is much longer and includes many links to reviews, articles, and research on environmental issues that do not appear here.
Michael Crichton's book, State of Fear (HarperCollins, 2004, $27.95), is a surprising book. Tucked inside a lively and entertaining tale of a philanthropist, a scientist, a lawyer, and two remarkable women who travel around the world trying to foil the plots of evil-doers is a detailed exposé of the flawed science and exaggerations at the base of the global warming scare. It is also a devastating critique of mainstream environmentalism today and an eloquent call for change.
Like Crichton's previous blockbusters, The Andromeda Strain and Jurassic Park, this book blends science and fiction in ways that teach as well as entertain readers. Crichton, who earned an M.D. from Harvard University and has written several nonfiction books, backs up his claims with footnotes, an appendix, and an annotated bibliography. Clearly he wants the science in his book to be taken seriously.
Which raises the question: How much of the science in State of Fear is accurate, and how much is fiction?
The answer: Michael Crichton is right! His synthesis of the science on climate change is extremely accurate and the experts he cites are real. The Heartland Institute has been participating in the debate over climate change for more than a decade, and we have worked with many of the experts listed in the book's bibliography.
This feature on The Heartland Institute's web site is dedicated to following the debate over the science in State of Fear. It collects some of the many reviews, op-eds, and letters the book has generated, and also links to research on environmental issues and the environmental movement.
Early in the book, Crichton has one of his characters define global warming as "the heating up of the earth from burning fossil fuels." (p. 80) Not so, says another character, who defines global warming as follows:
. . . global warming is the theory that increased levels of carbon dioxide and certain other gases are causing an increase in the average temperature of the earth's atmosphere because of the so-called "greenhouse effect." (p. 81).
The second definition is correct. "Global warming" really is only a theory, not a fact. Over the course of the book, other characters document the following flaws in the theory of global warming:
* Most of the warming in the past century occurred before 1940, before CO2 emissions could have been a major factor (p. 84).
* Temperatures fell between 1940 and 1970 even as CO2 levels increased (p. 86).
* Temperature readings from reporting stations outside the U.S. are poorly maintained and staffed and probably inaccurate; those in the U.S., which are probably more accurate, show little or no warming trend (pp. 88-89).
* "Full professors from MIT, Harvard, Columbia, Duke, Virginia, Colorado, UC Berkeley, and other prestigious schools . . . the former president of the National Academy of Sciences . . . will argue that global warming is at best unproven, and at worst pure fantasy" (p. 90).
* Temperature sensors on satellites report much less warming in the upper atmosphere (which the theory of global warming predicts should warm first) than is reported by temperature sensors on the ground (p. 99).
* Data from weather balloons agree with the satellites (p. 100).
* "No one can say for sure if global warming will result in more clouds, or fewer clouds," yet cloud cover plays a major role in global temperatures (p. 187).
* Antarctica "as a whole is getting colder, and the ice is getting thicker" (p. 193, sources listed on p. 194).
* The Ross Ice Shelf in Antarctica has been melting for the past 6,000 years (p. 195, pp. 200-201); "Greenland might lose its ice pack in the next thousand years" (p. 363).
* The Intergovernmental Panel on Climate Change (IPCC) is "a huge group of bureaucrats and scientists under the thumb of bureaucrats," and its 1995 report was revised "after the scientists themselves had gone home" (pp. 245-246).
* James Hansen's predictions of global warming during a Congressional committee hearing in 1988, which launched the global warming scare, were wrong by 200 percent (.35 degrees Celsius over the next 10 years versus the actual increase of .11 degrees); in 1998, Hansen said long-term predictions of climate are impossible (pp. 246-247).
* There has been no increase in extreme weather events (e.g., floods, tornadoes, drought) over the past century or in the past 15 years; computer models used to forecast climate change do not predict more extreme weather (pp. 362, 425-426).
* Temperature readings taken by terrestrial reporting stations are rising because they are increasingly surrounded by roads and buildings which hold heat, the "urban heat island" effect (pp. 368-369); methods used to control for this effect fail to reduce temperatures enough to offset it (pp. 369-376).
* Changes in land use and urbanization may contribute more to changes in the average ground temperature than "global warming" caused by human emissions (pp. 383, 388).
* Temperature data are suspect because they have been adjusted and manipulated by scientists who expect to find a warming trend (pp. 385-386).
* Carbon dioxide has increased a mere 60 parts per million since 1957, a tiny change in the composition of the atmosphere (p. 387).
* Increased levels of CO2 act as a fertilizer, promoting plant growth and contributing to the shrinking of the Sahara desert (p. 421).
* The spread of malaria is unaffected by global warming (pp. 421-422, footnotes on p. 422).
* Sufficient data exist to measure changes in mass for only 79 of the 160,000 glaciers in the world (p. 423).
* The icecap on Kilimanjaro has been melting since the 1800s, long before human emissions could have influenced the global climate, and satellites do not detect a warming trend in the region (p. 423); deforestation at the foot of the mountain is the likely explanation for the melting trend (p. 424).
* Sea levels have been rising at the rate of 10 to 20 centimeters (four to eight inches) per hundred years for the past 6,000 years (p. 424).
* El Niños are global weather patterns unrelated to global warming; on balance they tend to be beneficial, extending growing seasons and reducing the use of heating fuels (p. 426).
* The Kyoto Protocol would reduce temperatures by only 0.04 degrees Celsius in the year 2100 (p. 478).
* A report by scientists published in Science concludes "there is no known technology capable of reducing [global] carbon emissions . . . totally new and undiscovered technology is required" (p. 479).
* Change, not stability, is the defining characteristic of the global climate, with naturally occurring events (e.g., volcanic eruptions, earthquakes, tsunamis) much more likely to affect climate than anything humans do (p. 563).
* Computer simulations are not real-world data and cannot be relied on to produce reliable forecasts (p. 566).
One character in State of Fear concludes:
The threat of global warming is essentially nonexistent. Even if it were a real phenomenon, it would probably result in a net benefit to most of the world (p. 407).
The characters in State of Fear also debate deforestation, endangered species, sustainable development, DDT, and many other hot environmental topics. Here are some highlights from those discussions:
* California's forests have continuously changed their composition: "each thousand-year period was different from the one before it"; Native Americans actively managed the changes with fire and agriculture (pp. 404-406).
* Nobody knows how many species there are in the world, and estimates of extinction rates are simply expressions of opinion and not science (p. 422).
* Silicone breast implants did not cause disease and power lines do not cause cancer (p. 456).
* Mankind does not know how to manage ecosystems, as is demonstrated by the gross mismanagement of Yellowstone National Park (pp. 484-486).
* Banning DDT was "arguably the greatest tragedy of the 20th century" since DDT was a proven lifesaver that posed no threat to human health (pp. 487-488).
* The biggest cause of environmental destruction is poverty, not prosperity (p. 564).
Michael Crichton is also very critical of the environmental movement. In the fiction part of the book he has characters say the following:
* PETA, the animal rights group, funds ELF, an eco-terrorist group, and mainstream environmental groups may be funding them as well. "Frankly, it's a disgrace" (p. 182).
* Environmentalists have used "media manipulation" and scare tactics as part of a "global warming sales campaign" to raise money and acquire political influence (p. 245).
* Environmentalists refuse to take into account the possible harms caused by the policies they recommend, with the result that they advocate spending billions of dollars to save a single hypothetical life (pp. 488-489).
* Environmental organizations today "have big buildings, big obligations, big staffs. They may trade on their youthful dreams, but the truth is, they're now part of the establishment. And the establishment works to preserve the status quo" (p. 565).
Crichton is careful not to accuse all environmentalists of being insincere. Only the leaders of environmental organizations, who should know better, are portrayed as deliberately misleading the media and general public in order to advance their careers. As for the rest of us, one character says: "Caring is irrelevant. Desire to do good is irrelevant. All that counts is knowledge and results" (p. 483).
In his "Author's Message" at the end of State of Fear, Crichton summarizes some of his own views on the issues his characters address earlier in the book. He also says:
We need a new environmental movement, with new goals and new organizations. We need more people working in the field, in the actual environment, and fewer people behind computer screens. We need more scientists and many fewer lawyers.
This is right on! Did you know the Sierra Club spends only about 7 percent of its budget on "outdoor activities"? (It said so right on the back of the reply form that accompanied its direct mail letters.) Is it right to call such an organization an "environmental" group when it is actually a direct-mail house connected to a Washington D.C. lobbying shop?
Beyond this, the message of State of Fear has serious public policy consequences:
* Most of the environment and health protection regulations in the U.S. ought to be reformed so they address real rather than imaginary risks, and concentrate on what works instead of the liberal orthodoxy of big government solutions to every problem.
* The U.S. is quite right to stay out of the Kyoto Protocol -- the global warming treaty -- and ought to be doing more to persuade other countries of the world that the protocol is unnecessary, premature, and unworkable.
* Government should stop funding radical environmental groups -- indeed, all environmental groups for that matter -- and should investigate the ties between eco-terrorist organizations, supposedly mainstream environmental advocacy groups, and the foundations that fund them. When homes and businesses are torched by environmental extremists, law enforcement authorities should determine whether tax-exempt foundations helped buy the gasoline and matches those outlaws used to commit their crimes. *
"I'd go out with women my age, but there are no women my age." --George Burns
Norman D. Howard's entire professional career has been involved in offshore engineering, designing and building fixed and floating structures for offshore use.
Reading global warming advocacy literature, and even global warming skeptic material, I am surprised at the scant attention given to the global marine environment. As an engineering professional working within the offshore environment, I find that the perception of most people is not oriented towards the oceans, but is clearly land based. This is not surprising, as people live on land. However, our planet's surface is nearly three-fourths seawater.
Over the past three or four years, the following set of questions has been developed to open discussions with GW enthusiasts and skeptics alike. Curiously, virtually no one answers these questions correctly.
See how you do. Write your answer before reading on:
If you melted all of the North Pole ice cap, how many feet would the oceans rise?
Where does most of the world's photosynthesis take place?
What is the second most photosynthesis source?
Where is most of the earth's carbon dioxide located?
Where is the second largest carbon dioxide reservoir?
Hearing the answers that many give to these questions gives rise to the opinion that common sense is completely lacking in the general population. The simple answers to these questions become quite clear when they are put into perspective.
The North Pole ice cap consists of floating ice, which already displaces its own weight of water; Archimedes' principle still applies. Melting the North Polar ice cap will not cause the oceans to rise at all! Incidentally, when the ice melts in your iced tea, does the level of the fluid rise? Of course not!
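The iced-tea point can be made concrete with a quick back-of-envelope check (a sketch only: the densities are standard round values, and the small fresh-versus-salt-water wrinkle, which the essay also ignores, is ignored here):

```python
# Archimedes check: a floating chunk of ice displaces a volume of water
# whose weight equals the ice's own weight. When the ice melts, the same
# mass becomes liquid water of exactly that volume, so it just fills the
# "hole" the ice was already occupying and the water level is unchanged.
RHO_WATER = 1000.0  # kg/m^3, liquid water (round value)
RHO_ICE = 917.0     # kg/m^3, ice; only affects how high the ice floats

ice_mass = 5.0  # kg, an arbitrary floating chunk

# Floating: displaced water volume = ice weight / water density.
displaced_volume = ice_mass / RHO_WATER
# Melted: the same mass of liquid water occupies the same volume.
meltwater_volume = ice_mass / RHO_WATER

assert abs(displaced_volume - meltwater_volume) < 1e-12
print(displaced_volume)  # 0.005 m^3 either way
```

(Melting sea ice into denser salt water does produce a very slight rise, a second-order effect the essay's argument sets aside.)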
Most of the photosynthesis on the planet does not take place in the rain forests, nor anywhere on land, but in the ocean. Remember, the oceans cover three-fourths of the planet, and major biological activity takes place down to more than 100 meters deep. Terrestrial biological activity is mostly within 30 meters (often much less) of the surface, with few exceptions.
Most of the planet's CO2 is in the rocks! The white cliffs of Dover, for instance, are essentially pure calcium carbonate. Throughout the planet, quantities of carbonate minerals and materials exist in large volumes.
Because free CO2 in the atmosphere is soluble in seawater, and the air-sea interface (again, three-fourths of the planet's surface) is an equilibrium boundary, the oceans themselves contain the second-largest reservoir of CO2. The atmosphere holds only the third-largest.
The transport of CO2 from seawater to diatoms, shellfish, and many creatures' exoskeletons fixes the CO2 into mineral form. Likewise, photosynthesis in sea-borne plants fixes CO2 as well. When the creatures die, they settle to the seafloor, forming great basins of mineral carbonates.
Plate tectonics moves the seafloor into collision with other plates. When this occurs, one plate may subduct under the adjacent plate, plunging the seafloor rocks deep into the planet's crust and heating them until they melt and dissociate the CO2 from the calcium. Volcanoes can then release vast amounts of CO2 when they erupt. CO2 in the air is removed by photosynthesis, and also by dissolving in seawater, which completes the CO2 cycle.
Recognition of the CO2 cycle in the planet's mineralogy, and of the important role the ocean plays in that cycle, is curiously missing from all discussions of GW that I have read.
The net mass of the atmosphere is equivalent to about 33 feet of seawater. This is often expressed as atmospheric pressure (14.7 psi). The oceans, covering three-fourths of the planet, average about 12,000 feet deep over that area. If we were to imagine covering the entire planet uniformly with seawater, the average depth would be about 9,000 feet. Dividing 9,000 feet by 33 feet, we see that the mass ratio of the oceans to the air is about 270 to one. As water has slightly less than six times the heat capacity of air, the heat capacity of the ocean is about 1,600 times the heat capacity of the air.
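The paragraph's arithmetic can be reproduced step by step (a back-of-envelope sketch using the author's round figures; reading his "slightly less than six" as water's cp against air's cv is my assumption, not spelled out in the text):

```python
# Reproducing the essay's back-of-envelope arithmetic; every figure is
# the author's round number, not a precise measurement.
ATMOSPHERE_AS_SEAWATER_FT = 33.0   # atmosphere's mass as an equivalent seawater column
MEAN_OCEAN_DEPTH_FT = 12000.0      # over the three-fourths of the globe the oceans cover
OCEAN_FRACTION = 0.75

# Spread the oceans uniformly over the whole planet: about 9,000 ft.
global_equiv_depth_ft = MEAN_OCEAN_DEPTH_FT * OCEAN_FRACTION

# Mass ratio of ocean to atmosphere: about 270 to one.
mass_ratio = global_equiv_depth_ft / ATMOSPHERE_AS_SEAWATER_FT

# "Slightly less than six" per-unit-mass heat capacity ratio, consistent
# with water's cp (~4186 J/kg.K) against air's cv (~718 J/kg.K) -- an
# assumed reading, since the essay does not give its source values.
specific_heat_ratio = 4186.0 / 718.0  # about 5.8

# Total heat capacity of the oceans relative to the atmosphere: ~1,600x.
total_ratio = mass_ratio * specific_heat_ratio
print(round(mass_ratio), round(total_ratio))
```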
Because weather and ocean currents mix the fluids of the biosphere, heating the atmosphere must necessarily heat the oceans. This temperature feedback process is virtually never discussed in the GW advocacy literature. Given the ocean's vastly larger heat capacity, enormous increases in atmospheric temperature would be required to change the average temperature of the biosphere by even a tenth of a degree.
Nevertheless, it is clear that the earth has been warming! Ice sheets covered the poles of the planet, reaching as far south as 42 degrees north latitude in the Northern hemisphere, approximately 12,000 years ago. So the earth has been warming for at least 12,000 years, an extremely beneficial condition for the development of civilization! Thus a full heating and cooling cycle would appear to be about 25,000 years (perhaps an astrophysical phenomenon).
Sampling theory informs the scientist that, in order to make scientifically valid statements concerning any cyclic phenomenon, at least two full cycles of the oscillation must be observed. Thus, to really describe planetary catastrophic warming or cooling, at least 50,000 years of detailed records must be analyzed. Clearly, no such record of controlled sample data is available.
As an example of this concept, I asked my stockbroker if (given the stock market record of the past sixty some years since the Second World War) he would put all of his money on any given stock with the expectation it would go up by 50 percent by a week from next Tuesday. He, of course, said no, that the record did not support such a gamble. So it is with making short-term (20 to 30 years) predictions of long-term (tens of thousands of years) phenomena: definitive statements are speculative at best.
As a marine engineering specialist, I remain skeptical of claims that human-caused catastrophic global warming is taking place in the immediate future. The common sense and scientific questions that remain unanswered concerning the marine environment are extremely troubling.
It occurs to me that the "politically correct" rent-seeking behavior of those pursuing government-sponsored grants and contracts is a major factor driving the manmade global warming thesis. Aahhhhh! "Grantsmanship!" *
"I was always taught to respect my elders and I've now reached the age when I don't have anybody to respect." --George Burns
Americans should ask themselves why Barack Obama, Nancy Pelosi, and Harry Reid wanted to pass a massive healthcare bill before the August recess. The plan would not have taken effect until 2013. Why the rush?
If there is real urgency, with individuals and families going bankrupt and businesses closing because of spiraling healthcare costs, why postpone the relief until after the next Presidential election in 2012 or the congressional elections in 2010?
It's almost as if the Democrats had something to hide. Why isn't there time to go through the normal process of debate, listening to pros and cons in committees?
There is a pattern to be discerned. The President said just after his inauguration that if we didn't pass the $787 billion stimulus bill instantly (within days, as did happen), the nation would fall into an economic calamity from which we might never recover. The Congressional Budget Office estimates that it will be September 2010 before three-quarters of the stimulus money is spent. Likewise, the massive climate change bill was passed by the House before members could even read it (there was no text of the bill at the time of the vote), and it too is not scheduled to take effect for several years.
Americans should watch the deeds of the administration and its supporters, and be wary of the words they use.
But there are words to note, words spoken to friendly listeners before the fierce controversy occurred, before ordinary people were paying attention. (Thank you to the Heritage Foundation for collecting these quotes. If you would like to see and hear them, use the following link: spreading-disinformation-about-obamacare).
Rep. Jan Schakowsky (D-IL) at a Health Care for America Now rally said:
And next to me was a guy from the insurance company who argued against the public health insurance option, saying it wouldn't let private insurance compete. That a public option will put the private insurance industry out of business and lead to single-payer. My single-payer friends, he was right. The man was right.
Rep. Barney Frank (D-MA) said to Single Payer Action, a national nonprofit organization:
I think that if we get a good public option it could lead to single-payer and that is the best way to reach single-payer. Saying you'll do nothing till you get single-payer is a sure way never to get it. . . . I think the best way we're going to get single-payer, the only way, is to have a public option and demonstrate the strength of its power.
Washington Post blogger Ezra Klein reported from the Democratic National Convention before the Presidential election last year that:
They have a sneaky strategy, the point of which is to put in place something that over time the natural incentives within its own market will move it to single-payer.
Liberal New York Times columnist Paul Krugman wrote:
The only reason not to do [single-payer] is that politically it's hard to do in one step . . . . You'd have to convince people completely to give up the insurance they have, whereas something that lets people keep the insurance they have but then offers the option of a public plan that may evolve into single-payer.
And finally there are the words of the President speaking before the American Medical Association on June 15th:
What are not legitimate concerns are those being put forward claiming a public option is somehow a Trojan horse for a single-payer system. . . . So, when you hear the naysayers claim that I'm trying to bring about government-run health care, know this -- they are not telling the truth.
The President and liberal Democrats are up to no good. *
"This is the sixth book I've written, which isn't bad for a guy who's only read two." --George Burns
These quotes are from the website Quotations by Author at http://www.quotationspage.com/quotes/George_Burns.
The following is a summary of the August 2009 issue of the St. Croix Review:
In the Editorial "Read the Bill Before Voting Congressman!" Barry MacDonald looks at the dreadful results of rushed legislation.
Mark W. Hendrickson, in "Obama's Two Achilles' Heels," sees how Obama could quickly lose his popular appeal; in "Team Obama's Auto Coup," he points out seven reasons why Obama should have stayed out of the car business; in "Opening Pandora's Box: Classifying CO2 as a 'Pollutant'," he demonstrates the folly of regulating CO2; in "A Closer Look at the IPCC," he believes the United Nations panel is a "political" body willing to resort to scientific fraud; in "Economic Strangulation: The Environmentalist/Democrat War Against Energy," he writes that the purpose of green energy is to cripple the economy.
Herbert London, in "Do I Live in America?" borrows a literary device to imagine he has just awakened from a slumber begun in 1965; in "The Ugly American," he writes that in the "age of Obama" we have not the Ugly but the "Apologetic American"; in "The Iranian Election in Historic Terms," he describes the dilemma President Obama faces in deciding how to use "soft" power with the Iranian mullahs; in "Is Iowa at the Cusp of Change," he sees grassroots anger building in Iowa among Democrats and Republicans at what is seen as a power-grab by President Obama; in "Obama on D Day," he shows why "there has never been a force for good more notable than the United States military."
Allan Brownfeld, in "The Sotomayor Nomination -- Hopefully a Last Gasp for Identity Politics," cites speeches in which she talks about her "Latina soul" and "Latina voice"; in "The Ricci Decision and a More Color-blind Society," he writes about the Supreme Court's resolution of the case of the white firefighters who were denied promotion because of their race; in "The Time Has Come to Finally Confront an Unresolved Act of Radical Violence: the 1970 San Francisco Police Station Bombing," he relates the evidence against William Ayers and Bernardine Dohrn and calls for justice.
Paul Kengor, in "Economic Stimulus 101: Reaganomics vs. Obamanomics," compares the current president's out-of-control spending with President Reagan's careful stewardship.
In "A Prescription for American Health Care," John Goodman sees crushing tax increases or broken promises in the future for American taxpayers as the huge bills for Social Security, Medicare and Medicaid come due. He believes that we must free doctors and patients from government control; and he shows how the free market is already working in delivering inexpensive and quality health care.
In "Culture Makes or Breaks an Ordered Free Society," John Howard asserts a vital need for moral guidance for American society.
Robert L. Wichterman, in "The Myth of a 'Wall of Separation' between Government and Religion," looks at the neglected part of the First Amendment to the Constitution, the section prohibiting government interference in the free exercise of religion.
In "Mr. Jefferson," Robert Thornton defends our third president from the politically correct attacks of present-day biographers.
In "Baseball, America, and the 21st Century," Andrew J. Harvey goes to bat for baseball as the quintessentially American game.
Jiggs Gardner, in "The Curious Case of Somerset Maugham," knows why this fluent, polished writer never became more than second rate.
John Ingraham reviews Little Pink House: A True Story of Defiance and Courage, by Jeff Benedict. Jeff Benedict tells the story of Susette Kelo, who owned the pink house, and the infamous Supreme Court decision, Kelo v. New London.
Little Pink House: A True Story of Defiance and Courage, by Jeff Benedict, 2009, 377 pp.
This compelling book tells the story of the fight over eminent domain abuse in Connecticut that resulted in the infamous 2005 Supreme Court decision, Kelo v. New London, and the author has done a remarkable job. He makes the story so interesting that it's hard to put down, shaping the narrative so adroitly that the many strands of its complex web are clearly marked in enough depth for the reader's understanding without going into stultifying detail. We are told enough but not too much. The chronological organization, from 1992 to 2007, seems logical, but it was a deliberate choice by the author; anyone who has tried to explain a complicated story with so many actors will realize how difficult it is to stick to chronological order. The pacing is quickened by the way the author shifts from one set of actors to another, making us feel present at the time, anxiously awaiting the next turn of events. The organization in itself is masterly.
Benedict begins with Susette Kelo, a forty-year-old working-class woman divorced once and about to divorce her second husband, buying her dream house, a somewhat neglected old place in a working-class neighborhood on the Thames River in New London in the spring of 1997. That this is her dream house will turn out to be very important, because of all the homeowners involved in the struggle, she has the simplest and most rooted motivation: she won't give up because her house means everything to her. Later, when a lawyer from the Institute for Justice assesses the litigants, he puts Susette first because he recognizes her single-minded determination.
Just as Susette is buying her house, the Republican governor is planning to make some political hay in heavily Democratic New London by initiating an urban renewal project along the city's waterfront. To keep it out of the city's hands and under his control, he gives the task to the head of the state's Department of Economic and Community Development, who in turn chooses an influential Democrat and lobbyist as consultant to move the project through the political labyrinth. He decides to use the inactive New London Development Corporation, familiar and unthreatening to the city's politicians, as his instrument, and he chooses Claire Gaudiani, president of Connecticut College, to run it. She is ambitious, superficially brainy, lively, provocative, conventionally unconventional, the kind of woman who easily impresses men. Soon it is decided that the NLDC must attract a major corporation to the project, and the early part of the story tells how Mrs. Gaudiani brokers a deal between Pfizer, the pharmaceutical giant with large research labs across the river in Groton, and the state. Pfizer will spend $300 million on a 24-acre site, and the state will spend $100 million improving the area, developing a nearby historic fort as a park, and assembling 90 adjacent acres for a hotel, conference center, office space, and upscale housing and stores, properties that would pay more taxes to the city than their present owners do.
That was the nub of the struggle, the additional land that Pfizer and the NLDC wanted cleared and developed to complement its 24-acre holding, and the book concisely but amply documents the fight, with all its twists and turns. In the end, Susette Kelo lost when the Supreme Court decided against her in 2005. Eventually, the state, embarrassed by all the bad publicity, settled with her and the other litigants, and her pink house (focal point of the resistance) was taken apart and reconstructed elsewhere in the town, to be a memorial to the struggle. She bought a house across the water in Groton and went on with her life, but her ordeal had a positive result: 42 states (not Connecticut) amended their laws to restrain the use of eminent domain to take property for anything but distinctively public purposes.
Although Pfizer built on the 24-acre piece and the old fort was spruced up for a park, the delay caused by the controversy killed the rest of the project, and the disputed area today is, as Benedict says, "a barren wasteland of weeds, litter, and rubble." What an ironic end to all the expensive machinations of the self-important figures so vividly portrayed here! The neighborhood could have been integrated into the project, and an architect provided such a plan, but the NLDC ignored it. None of those supposedly savvy, highly paid politicians and planners and consultants and lawyers and fixers had the sense to see that a lot of money and trouble (and finally, the whole project) would have been saved by leaving the neighborhood in place, and one wonders: were they blinded by power, hypnotized by their own lying verbiage (the hypocrisies in their printed statements and official letters have to be read to be believed, but that is public speech nowadays), or simply stupid?
Whatever you've read or heard about the Kelo case, you don't know the half of it unless you read this fascinating book. *
"His mother should have thrown him away and kept the stork." --Mae West
Andrew Harvey is an associate professor of English at Grove City College, and is a contributing scholar with the Center for Vision & Values. This article is republished from V & V, a website of the Center for Vision & Values.
"Whoever wants to know the heart and mind of America had better learn baseball." --Jacques Barzun
Right off the bat, Jacques Barzun's pitch about baseball strikes us today as coming out of left field. First asserted in the 1950s, his famous assessment of baseball's place in American culture then seemed to cover all the bases and was as uncontroversial as it was incontrovertible. Now, however, many voices would cry foul at Barzun's claim and attempt to controvert him by either dismissing his thesis as off-base or by picking him off as a screwball French critic.
But those bush-league voices can neither de-mythologize baseball nor play hardball with a big-league cultural historian such as Barzun. In truth, Barzun knocked it out of the park by identifying baseball as the heart and mind of America.
First, Barzun's argument does not necessarily require baseball to be the national pastime. For much of the 20th century, and especially when Barzun wrote his God's Country and Mine: A Declaration of Love, Spiced with a Few Harsh Words, baseball was unarguably how America passed its time. No matter how one measured the cultural phenomenon of baseball -- playgrounds and schoolyards, professional attendance records, media attention -- no other sport or entertainment outstripped it. In the 21st century, baseball finds itself in a far less privileged position, its status dimmed by the growth in market share of other professional sports, especially football, and more dramatically by the domination of entertainment media.
If the playground is the test, sure, there may no longer be a backstop, but the kids aren't playing on the gridiron either -- or engaged in a sport at all. They are gaming. It's the Wii console you got them for Christmas, or the "Guitar Hero" birthday present, that occupies their time. Or Facebook.com. Or whatever.com. The cultural shift that has occurred is not that football has replaced baseball as the national pastime but that nothing has become our national pastime. That is, no one sport or activity captures the whole-hearted attention of multiple generations of Americans the way baseball once did.
That accounts for the pastime, but what about baseball as at least our national sport?
Can baseball still reveal "the heart and mind of America" when its Opening Day arrives and no one seems to care? Those who try to brush back baseball's American pedigree love to point out that America did not even compete in the championship round of the recently completed World Baseball Classic (Japan beat Korea). Add that baseball has become more of a focal point of the sporting world in Cuba and the Dominican Republic than in the United States. The question arises: are these countries becoming more American because they have taken to baseball with such ardor? The answer is simply yes. That baseball arose in 19th-century America was no historical accident, and that baseball had acquired such cultural hegemony in such scattered locales across the globe by the end of the 20th century is similarly no accident. Some would call it cultural imperialism and point to the traces of American political and military domination in both post-WWII Asia and the Caribbean. But in all cases, Cuba included, it must be qualified as a glorious imperialism.
Just as England celebrated the bloodless overthrow of James II in 1688 as "the Glorious Revolution," the spread of America's game throughout the world in the 20th century came not by force but by the virtues of the game itself. Baseball as a game depends on distinguishing fair from foul. The countries that have successfully adopted baseball have adopted, at least within the sport, the best and most noble virtues that typify America -- principally, the meritocratic and dynamic yoking of individual excellence with harmonious teamwork. These virtues, the virtues that are borne by the game between the white lines, are what charmed Barzun. And what continue to charm.
The beauties of the game itself remain unchanged, and it is the timelessness of those beauties that makes Barzun's declaration timeless. These beauties are why baseball survived the Black Sox scandal, overcame racism and integrated on its own without government intervention, persevered despite the advent of the designated hitter, and will outlast the consequences of the steroid era. In its complicities, tolerations, repudiations, and expiations, baseball has mirrored both America's vices and its virtues. It can switch-hit.
For some, I realize, Barzun sounds like the ultimate seamhead -- celebrant of baseball's mythic and glorious past -- and his reading of baseball as the ultimate allegory for Americans' "heart and mind" no longer seems to be in the ballpark. But our language at least says it ain't so. I have managed here in these few paragraphs well over a dozen baseball idioms -- baseball expressions that successfully convey meaning beyond the game itself. We may not feel about baseball the way our fathers did, but no other game has so enriched the way we think. Moreover, the histories of baseball and of America are entwined forever and both continue to thrive. Who is to say that Barzun is no longer in the ballpark because he already hit a home run? *
"He is simply a shiver looking for a spine to run up." --Walter Kerr
Robert M. Thornton writes from Fort Mitchell, Kentucky.
Years ago, noted historian Charles Callan Tansill taught a course in American biography and noticed the students would lose interest if he tried "to make our Founding Fathers into plaster saints." They wished him to "humanize them without demeaning them." Their viewpoint reminded him of a couplet from a poem by James Whitcomb Riley:
In fact, to speak in earnest, I believe it adds a charm
To spice the good a trifle with a little dust of harm.
In recent years, unfortunately, historians have gone far beyond sprinkling a "little dust of harm" in their biographies of the Founding Fathers; they desire not to humanize them, but to demean them. These debunkers are becoming a bore.
None of the Founding Fathers has been spared attack, but perhaps our third president, Thomas Jefferson, has been the chief victim. He has taken a pounding for allegedly having children by his slave, Sally Hemings, although DNA testing did not prove this. Jefferson has also been denounced as a racist, sexist, chauvinist, atheist, misogynist, hypocrite, and enthusiastic supporter of the excesses of the French Revolution. One respected writer demands that Jefferson be pulled down off his pedestal and tossed on the ash-heap of history. Is deranged too harsh a description for such a proposal?
How can it be, ask Jefferson's critics, that the author of the Declaration of Independence was himself a slave owner?1 Their answer is that he was a hypocrite and never really believed what he wrote. But why would Jefferson, a member of the landed gentry, pretend he was opposed to slavery if he did not think it was wrong? The hypocrite pretends to believe something he does not in order to gain popularity, but for Jefferson to speak, however mildly, against slavery was hardly the way to win the affections of his fellow Virginians.
Douglas Wilson suggests that a better question would be:
How did a man who was born into a slave-holding society, whose family and admired friends owned slaves, who inherited a fortune that was dependent on slaves and slave labor, decide at an early age that slavery was morally wrong and forcefully declare it ought to be abolished?
During these "politically correct" days, Jefferson's critics engage in what historians call "presentism," which means they are "applying contemporary or otherwise inappropriate standards to the past." They show "the widespread inability to make appropriate allowances for prevailing historical conditions." Or, put another way, "they look at the past in the light of what we know and believe today." They fail to follow one historian's advice that "to understand the past we must look at it always when we can through the eyes of contemporaries." It is hardly proper, declared Walker Percy, "to judge a man's views of the issues of his day by the ideological fashions of another age." Unfortunately, in the "debunking and revisionist spirit of the times," exporting contemporary standards of equity to the past helps make the case against traditional American heroes.
Slavery was the norm in Jefferson's time and place and "complex legal and financial factors were involved, which, while not exculpatory, do suggest less harsh condemnation" of the man. In his early years as a political leader, Jefferson made several attempts to prevent the spread of slavery and move towards ending it completely. He was defeated each time and finally put his energies elsewhere, leaving the task to the next generation. We may agree he should have tried harder but his decision not to do so is hardly "evidence of pathological virulent racism." And if he had forced the issue, his usefulness in other matters might have been diminished. If in 1776 he had refused to compromise and said unless slavery were ended immediately he would withdraw from public life, would we be better off today? I think not.
Jefferson is also denounced for not freeing his own slaves, as did several prominent Virginians, including George Washington. It is distasteful to be reminded that in Jefferson's time money was in short supply and wealth was measured by property, which included buildings, crops, land, animals -- and, sadly, slaves. It would have been a great financial sacrifice for Jefferson to have freed his slaves, partly because he lived beyond his means. Perhaps that is not a good excuse, but how many of his critics today, smug in their feelings of moral superiority, would willingly surrender a large portion of their wealth in cash, savings, and stocks and bonds?
It must be remembered, too, that in 18th century Virginia slaveholders who might have wished freedom for their slaves had to come to terms with a "tangle of legal restrictions and other obstacles." Freeing slaves was not necessarily advantageous for them. Many were without skills and semi-literate at best, so they would have been easy victims of white scoundrels motivated by greed and racism. Where would emancipated slaves have lived and how would they have earned their livelihood? They would have enjoyed little, if any, freedom and likely would have come to grief in a hostile environment unless they were able to pass for white.
So then, writes Wilson, well-meaning slaveholders were caught on the horns of a dilemma. "We have the wolf by the ears," wrote Thomas Jefferson in 1820, "and we can neither hold him, nor safely let him go. Justice is in one scale, and self-preservation in the other." The "presentism" of our times involves us in mistaken assumptions about historical conditions in the 18th century. We err in thinking that any slaveholder wanting to get out from under the moral stigma of slavery and improve the lot of his slaves had only to set them free.
In Jefferson's only book, Notes on the State of Virginia (1781), he stated:
I advance it therefore as a suspicion only, that the blacks, whether originally a distinct race, or made distinct by time and circumstances, are inferior to the whites in the endowments both of body and mind.
But, explains Daniel Boorstin:
. . . he always expressed an open-minded hopefulness that the facts would someday produce unambiguous proof of the equality of the Negro. He never lost his eagerness for an entirely satisfactory demonstration "that the want of talents observed in them is merely the effect of their degraded condition, and not proceeding from any difference in the structure of the parts on which intellect depends."
"Be assured," wrote Jefferson to Henri Gregoire on February 25, 1809:
. . . that no person living wishes more sincerely than I do, to see a complete refutation of the doubts I have myself entertained and expressed on the grade of understanding allotted to them by nature, and to find that in this respect they are also on a par with ourselves. My doubts were the result of personal observation on the limited sphere of my own State, where the opportunities for the development of their genius were not favorable, and those of exercising it still less so. I expressed them therefore with great hesitation; but whatever be their degree of talent it is no measure of their rights. Because Sir Isaac Newton was superior to others in understanding he was not therefore lord of the person or property of others. On this subject they are gaining daily in the opinion of nations, and hopeful advances are making towards their re-establishment on equal footing with the other colors of the human family.
Recently Douglas L. Wilson found a portion of a letter from Jefferson to Robert Pleasants, a Quaker from Henrico County, Virginia. It was known that the two had corresponded in the summer of 1796 regarding the education of slaves, but key pieces were missing. The recovered portion finds Jefferson in mid-thought on page two, responding in agreement to Pleasants' interest in establishing an educational scheme for slaves within the context of emancipation. Lucia C. Stanton, author of Slavery at Monticello, says:
This is one of the few times Jefferson is known to have written regarding the education of slaves. What is surprising about this particular letter is that he seems willing to have slaves educated with free white children, which is not wholly consistent with his known views on the subject. Scholars will have to evaluate this find carefully in the context of his writings on race and slavery.
Speaking at Monticello in 1997, Monticello trustee and former Justice John Charles Thomas said:
Jefferson was not perfect. But Jefferson . . . did do one thing; he gave expression to an idea that has -- through all these hundreds of years -- impelled America towards unity and equality. . . . Even the ideas of a flawed man can spark the creation of a more perfect union. . . .
"All men are created equal" is one of the master ideas of our society, wrote Theodore Roszak, in The Cult of Information (1986). From the power of this familiar idea:
. . . generations of legal and philosophical controversy have arisen, political movements and revolutions have taken their course. It is an idea that has shaped our culture in ways that touch each of us intimately: it is part, perhaps the most important part, of our personal identity.
This master idea:
. . . has nothing to do with measurements or findings, facts or figures of any kind. The idea of human equality is a statement about the essential worth of people in the eyes of their fellows. [It] arose in the minds of a few morally impassioned thinkers as a defiantly compassionate response to conditions of gross injustice that could no longer be accepted as tolerable. [It was born from an] absolute conviction that catches fire in the mind of one, of a few, then of many as the ideas spread to other lives where enough of the same experience can be found waiting to be ignited.
Perfect equality of rights is not likely to be realized, but as Stefan Zweig wrote in Erasmus of Rotterdam (1934):
. . . that which in the concrete world can never be victorious remains in that other as a dynamic force, and unfulfilled ideals often prove most unconquerable. [They] may represent a need which, though its gratification be postponed, is and remains a need. [Such ideals are] neither worn out nor compressed in any way [and continue to] work as a ferment in subsequent generations, urging them to the achievement of a higher morality.
Those ideals which remain unfulfilled are "capable of everlasting resurrection."
*****
Whatever his faults, Jefferson does not deserve the harsh treatment he has received in recent years. Biographers should be forgiving, said Joseph Epstein wisely; he did not mean glossing over a person's shortcomings, but rather not writing in an unfair or derogatory manner calculated to bring forth hatred rather than understanding. The biographer should not be his subject's "conscientious enemy," wrote George F. Will (The Woven Figure: Conservatism and America's Fabric, 1994-1997), and should remember that "a cool appraising eye need not be a jaundiced one." Jay Tolson in Pilgrim in the Ruins: A Life of Walker Percy (1992) wrote that the
. . . examination of an exemplary life does not oblige one to ignore flaws. The interest of an exemplary person depends largely on the complexity and sometimes the enormity of his or her flaw. The problem with modern biography . . . is its lack of tragic sense, and a resulting tendency to see the subject's failings against some implied perfectionist ideal rather than against the limitations of a human life. [Little wonder, then, he continues] that contemporary biography often seems to have no higher goal than, as Elizabeth Hardwick once said, "to diminish the celebrated object and aggrandize the biographer."
It all comes down to how we should remember the leading figures of our history. Shall it be "by their greatest achievements and most important contributions or by their personal failures and peccadilloes"? *
1) In his Declaration of Independence (1922) Carl L. Becker wrote that:
The final form of the Declaration was not the same as Jefferson's first draft; and it seemed to be obvious that a book on the Declaration should contain some account of the changes made in the original text and the reasons for making them. I found that, first and last, a good many changes were made. Some of them were merely verbal, intended to improve the form; others, and these the more drastic, designed to ease the document through Congress -- something added to please, something omitted to avoid giving offense to this or that section of public opinion. The most notable instance was the deletion of Jefferson's famous "philippic" against the slave trade. Jefferson himself thought this long paragraph one of the best parts of the Declaration; and certainly nothing could have been more relevant in an argument based upon the natural rights of man than some reference to slavery -- that "cruel war against human nature itself." But Congress struck it out. There were many slaveholders in Congress (Jefferson being one), and although none of them objected to the abstract doctrine of natural rights, many of them were naturally (human nature being what it is) sensitive to a concrete example of its violation so pointedly relevant as to be invidious.
"The principle of spending money to be paid by posterity, under the name of funding, is but swindling futurity on a large scale." --Thomas Jefferson
John A. Howard is a Senior Fellow at the Howard Center for Family, Religion & Society. This essay was a presentation made to the Philadelphia Society.
When I was President of the Philadelphia Society thirty years ago, we devoted the whole program of our Annual Meeting to the subject of Religion. The various sessions addressed such topics as Religion and Freedom, Religion in Contemporary Culture, and Religion and Contemporary Politics. Since then, religion has been minimized in our deliberations, unwisely, I think.
Unsure that I can adequately address the assigned topic for this panel, I am going to assert a privilege of extreme old age and just say what I believe needs to be said "to deepen the intellectual foundations of our free and ordered society and broaden the understanding of its basic principles and traditions." I trust you all will recognize that phrase as the purpose of the Philadelphia Society enunciated in our By-Laws.
For at least half a century, most conservative scholars have plied their trade in their own little corners of the American reality, either oblivious of, or indifferent to, the paramount requirements of an ordered free society. Whatever breakthroughs they may achieve in their own field of work, it will profit them little if the society is plummeting toward a terminal crash.
Some forty years ago, while I was President of Rockford College, I gave a talk about education at a large national conference. The next speaker, America's all-purpose genius, Buckminster Fuller, lumbered up to the podium and was quiet for a moment. Then he said:
Before I begin my presentation I want to say something to that fellow who just finished. You folks in the colleges are ruining this country. What you do is identify the bright students as they come through and make them experts in something. That isn't all bad, but it leaves a residue of people of mediocre intelligence and the dunderheads to become the generalists needed to serve as college presidents.
When the laughter subsided, he continued, "and the Presidents of the United States!" That little witticism contained a wallop of earthly wisdom greater than any other single sentence I have ever read or heard.
A generalist has a broad and solid understanding of human nature, also a competent knowledge of the primary social institutions of the society, their interdependence, their vulnerabilities, and the principles that govern what they are able to do and unable to do. Such a person has some chance of accurately anticipating the consequences of the decisions to be made in his life and work.
Let's apply this concept to education, certainly a primary institution. It is the field in which I have labored, full-time or part-time, for sixty-one years, twenty-one of them as a college president. My dissertation centered on educational philosophy. Any serious study of the history of education will reveal that, until the middle of the 20th century, the core purpose of schooling in virtually all societies was to train the young how to live responsibly and usefully in their own society.
That first and minimal requirement involves imparting to each new generation the ideals which specify the nature and purpose of the society, why those ideals are of utmost importance, and the obligations the citizens must fulfill, as well as the taboos they must observe, in order for those ideals to prevail. Those ideals must eventually guide the lives of the young people, who, in America, used to absorb them very much as they learned to speak the language. As long as this acculturation takes place, the society is viable.
Speaking of this process as it relates to education, Robert Hutchins, for sixteen years the President of the University of Chicago, stated in a 1956 lecture:
The pedagogical problem is how to use the educational system to form the kind of man that the country wants to produce. But in the absence of a clear ideal, and one that is attainable by education, the pedagogical problem is insoluble; it cannot even be stated. The loss of an intelligible and attainable ideal lies at the root of the troubles of American education.1
With the dearth of generalists, there is today almost no public understanding of the ethos that prevailed in America from 1620 to 1945 and there is an equal shortage of public understanding about the educational philosophy that sustained that ethos. From the arrival of the Pilgrims in New England in 1620, the American experiment in self-government was an embodiment of Christendom. That does not imply that everyone was a Christian. Rather, it designates a society in which the behavior of the people generally accords with the behavioral standards prescribed by Christianity.
That sweeping claim about the enduring regime of Christendom, contradicting what "everybody knows" about our history, is not easy to substantiate in two and a half minutes, but I will present a few mini-quotes which I hope will, at least, provoke some second thoughts.
President James Madison said:
We have staked the whole future of American civilization not upon the power of government: far from it. We have staked the whole future of all our political institutions upon the capacity of mankind for self-government; upon the capacity of each and all of us to govern ourselves according to the Ten Commandments of God.2
President John Quincy Adams said:
The highest glory of the American Revolution was this; it connected in one indissoluble bond the principles of civil government with the principles of Christianity.3
De Tocqueville in his searching appraisal of the American society in the 1830s wrote:
Christianity directs American life. Of all the countries of the world, America is the one in which the marriage tie is the most respected.4
Later he wrote:
By their practice, Americans show they feel the urgent necessity to instill morality into democracy by means of religion. What they think of themselves in this respect enshrines a truth which should penetrate deep into the consciousness of every democratic nation.5
In World War I, the United States Government provided a New Testament to every doughboy sent overseas.
It was Christendom that delivered a society that was basically honest, lawful, conscientious, cooperative, kind, helpful, and productive. People under the age of 75 can't begin to imagine what life in America was like prior to World War II.
As an 8-year-old child I would walk my younger brother at night half a mile across a park and the railroad tracks to the Community House for a children's program. My parents hadn't the slightest concern for our safety. At the public grade school I attended, the day began in an all-school assembly with a prayer, a patriotic song, and a reading of some uplifting message. Occasionally, our family would go into Chicago, customarily leaving the car unlocked. If the driver forgetfully left the key in the ignition, the key, the car, and any packages were there when we returned. In my company of the First Infantry Division in World War II, almost everybody had two parents, or one parent had died. Divorce was rare in those days and a source of embarrassment. It was assumed that people belonged to a church or synagogue.
Since the patterns of behavior that prevailed were rooted in religion, any effort to weaken or revoke any of the prescribed standards had a tough go, because it was simply taken for granted that God was more important than anything else. It wasn't until the early 20th Century when socialism and then Communism spread in America that Christendom began to wane. In the 1912 election, the socialists elected 56 mayors and drew 900,000 votes. In 1919, the Communist Party became the center of revolutionary influence.
In his speech after receiving the Templeton Prize, Solzhenitsyn said:
The world had never before known a godlessness so organized, militarized, and tenaciously malevolent as that of Marx and Lenin, and at the heart of their psychology, hatred of God is the principal guiding force, more fundamental than all their political and economic pretensions.
No dictatorship or godless form of government can tolerate any authority superior to its own, so religion and family are authorities that must be eradicated or, at least, discredited and smothered.
Conservative intellectuals who are partisans of the free and ordered society, whatever may be the focus of their scholarship, must also become active agents working to contain and defuse and discredit the massive assault on Christendom and the family and the rule of law and the core purpose of education. Without a much, much larger contingent of persistent, persuasive, ubiquitous, and humble conservative voices, the now dominant forces of greed, envy, lust for power, and unbridled gratification of the senses will ultimately have total control, and our cherished free and ordered society will expire in a cataclysm of moral chaos. May all of us, prayerfully, do our utmost to prevent annihilation. *
1) Hutchins, Robert M., Some Observations on American Education, 1956, p. 31.
2) McDowell and Beliles, America's Presidential History, pp. 263-4.
3) Federer, William J., America's God and Country: Encyclopedia of Quotations (St. Louis, MO: Amerisearch, 2000), p. 514.
4) de Tocqueville, Democracy in America, Garden City, NY, Doubleday and Company, 1969, p. 547.
5) de Tocqueville, op. cit., p. 542.
"I feel so miserable without you: it's almost like having you here." --Stephen Bishop
John C. Goodman is the president, CEO, and Kellye Wright Fellow at the National Center for Policy Analysis. This article is reprinted by permission from Imprimis, a publication of Hillsdale College. The following is adapted from a speech delivered in Naples, Florida, on February 18, 2009, at a Hillsdale College National Leadership Seminar.
I'll start with the bad news: When we get through the economic time that we're in right now, we're going to be confronted with an even bigger problem. The first of the Baby Boomers started signing up for early retirement under Social Security last year. Two years from now they will start signing up for Medicare. All told, 78 million people are going to stop working, stop paying taxes, stop paying into retirement programs, and start drawing benefits. The problem is, neither Social Security nor Medicare is ready for them. The federal government has made explicit and implicit promises to millions of people, but has put no money aside in order to keep those promises. Some of you may wonder where Bernie Madoff got the idea for his Ponzi scheme. Clearly he was studying federal entitlement policy.
Meanwhile, in the private sector, many employer-sponsored pension plans are not fully funded. Nor is the federal government's insurance scheme behind those plans. We have a potential taxpayer liability of between 500 billion and one trillion dollars for those private pension plans, depending on the markets. And on top of that, roughly one-third of all Baby Boomers work for an employer who has promised post-retirement health care. As with the auto companies, almost none of that is funded either. Nor are most state and local post-retirement health benefit plans. Some California localities have already declared bankruptcy because of their employee retirement plans, and the first of the Baby Boomers is still only 63 years old.
What all this means is that we're looking at a huge gap between what an entire generation thinks is going to happen during its retirement years and the funds that are there -- or, more accurately, are not there -- to make good on all those promises. Somebody is going to be really disappointed. Either the Baby Boomers are not going to have the retirement life that they expect, or taxpayers are going to be hit with a tremendous bill. Or both.
How did this crisis come about? After all, the need to deal with risk is not a new human problem. From the beginning of time, people have faced the risks of growing old and outliving their assets, dying young without having provided for their dependents, becoming disabled and not being able to support themselves and their families, becoming ill and needing health care and not being able to afford it, or discovering that their skills are no longer needed in the job market. These risks are not new. What is new is how we deal with them.
Prior to the 20th century, we handled risks with the help of family and extended family. In the 19th century, by the time a child was nine years old, he was usually paying his own way in the household. In effect, children were their parents' retirement plan. But during the 20th century, families became smaller and more dispersed -- thus less useful as insurance against risk. So people turned to government for help. In fact, the main reason why governments throughout the developed world have undergone such tremendous growth has been to insure middle class families against risks that they could not easily insure against on their own. This is why our government today is a major player in retirement, health care, disability, and unemployment.
Government, however, has performed abysmally. It has spent money it doesn't have and made promises it can't keep, all on the backs of future taxpayers. The Trustees of Social Security estimate a current unfunded liability in excess of $100 trillion in 2009 dollars. This means that the federal government has promised more than $100 trillion over and above any taxes or premiums it expects to receive. In other words, for Social Security to be financially sound, the federal government should have $100 trillion -- a sum of money six-and-a-half times the size of our entire economy -- in the bank and earning interest right now. But it doesn't. And while many believe that Social Security represents our greatest entitlement problem, Medicare is six times larger in terms of unfunded obligations.

These numbers are admittedly based on future projections. But consider the situation in this light: What if we asked the federal government to account for its obligations the same way the private sector is forced to account for its pensions? In other words, if the federal government suddenly closed down Social Security and Medicare, how much would be owed in terms of benefits already earned? The answer is $52 trillion, an amount several times the size of the U.S. economy.

What does this mean for the future? We know that Social Security and Medicare have been spending more than they are taking in for quite some time. As the Baby Boomers start retiring, this deficit is going to grow dramatically. In 2012, only three years from now, Social Security and Medicare will need one out of every ten general income tax dollars to make up for their combined deficits. By 2020 -- just eleven years down the road -- the federal government will need one out of every four income tax dollars to pay for these programs. By 2030, the midpoint of the Baby Boomer retirement years, it will require one of every two income tax dollars.
So it is clear that the federal government will be forced either to scale back everything else it's doing in a drastic way or raise taxes dramatically.
I have not even mentioned Medicaid, but it is almost as large a problem in this regard as Medicare. A recent forecast by the Congressional Budget Office -- an economic forecasting agency that is controlled by the Democrats in Congress, not by some conservative private sector outfit -- shows that Medicare and Medicaid alone are going to crowd out everything else the federal government is doing by mid-century. And that means everything -- national defense, energy, education, the whole works. We'll only have health care. If, on the other hand, the government continues with everything else it is doing today and raises taxes to pay for Medicare and Medicaid, the Congressional Budget Office estimates that, by mid-century, a middle-income family will have to pay two-thirds of its income in taxes!
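The multiples quoted above reduce to simple division. A quick sketch (the GDP value is an assumption -- roughly $15 trillion, a 2009-era figure -- chosen only to reproduce the "six-and-a-half times the economy" comparison in the text):

```python
# Back-of-the-envelope check of the entitlement figures cited above.
# The GDP value is an assumed 2009-era figure, not from the article.
unfunded_liability = 100e12   # $100 trillion promised beyond expected revenues
accrued_benefits = 52e12      # benefits already earned if the programs closed today
gdp = 15.4e12                 # assumed U.S. GDP, 2009 dollars

print(f"Unfunded liability vs. economy: {unfunded_liability / gdp:.1f}x")
print(f"Accrued benefits vs. economy:   {accrued_benefits / gdp:.1f}x")
```

The first ratio comes out to about 6.5, matching the "six-and-a-half times" in the text; the second, about 3.4, matches "several times the size of the U.S. economy."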
The only sensible alternative to relying on a welfare state to solve our health care needs is a renewed reliance on private sector institutions that utilize individual choice and free markets to insure against unforeseen contingencies. In the case of Medicare, our single largest health care problem, such a solution would need to do three things: liberate the patients, liberate the doctors, and pre-fund the system as we move through time.
By liberating the patients I mean giving them more control over their money -- at a minimum, one-third of their Medicare dollars. Designate what the patient is able to pay for with this money, and then give him control over it. Based on our experience with health savings accounts, people who are managing their own money make radically different choices. They find ways to be far more prudent and economical in their consumption.
As for doctors, most people don't realize that they are trapped in a system where they have virtually no ability to re-price or re-package their services the way every other professional does. Medicare dictates what it will pay for, what it won't pay for, and the final price. One example of the many harmful effects of this system is the absence of telephone consultations. Almost no one talks to his or her doctor on the phone. Why? Because Medicare doesn't pay a doctor to talk to you on the phone. And private insurers, who tend to follow Medicare's lead, don't pay for phone consultations either. The same goes for e-mail: Only about two percent of patients and doctors e-mail each other -- something that is normal in every other profession.
What about digitizing medical records? Doctors typically do not do this, which means that they can't make use of software that allows electronic prescriptions and makes it easier to detect dangerous drug interactions or mistaken dosages. Again, this is something that Medicare doesn't pay for. Likewise patient education: A great deal of medical care can be handled in the home without ever seeing a doctor or a nurse -- e.g., the treatment of diabetes. But someone has to give patients the initial instruction, and Medicare doesn't pay for that.
If we want to move medicine into the 21st century, we have to give doctors and hospitals the freedom to re-price and re-package their services in ways that neither increase the cost to government nor decrease the quality of service to the patient.
In terms of quality, another obvious free market idea is to have warranties for surgery such as we have on cars, houses and appliances. Many are surprised to learn that about 17 percent of Medicare patients who enter a hospital re-enter within 30 days -- usually because of a problem connected with the initial surgery -- with the result that the typical hospital makes money on its mistakes. In order for a hospital to make money in a system based on warranties, it must lower its mistake rate. Again, the goal of our policy should be to generate a market in which doctors and hospitals compete with each other to improve quality and cut costs.
We won't be able to make any of this work in the long run, however, unless we pre-fund the system. Today's teenagers are unlikely to receive medical care during retirement if they must rely on future taxpayers, because taxpayers of the future are unlikely to agree to live in poverty in order to pay their elders' medical bills. This means that everyone must start saving now for post-retirement health care. I would propose that everyone in the workforce put a minimum of four percent of his or her income -- perhaps two percent from the employer and two percent from the employee -- into a private account, invested in the marketplace, that would grow through time. These private accumulations would eventually replace taxpayer burdens.
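The pre-funding proposal can be illustrated with a simple compound-growth sketch. The 4 percent contribution rate is the one proposed above; the starting salary, wage growth, market return, and career length are all hypothetical inputs chosen only for illustration:

```python
# Hypothetical illustration of pre-funding: 4% of income (2% employer +
# 2% employee) saved each year into a private account that grows with
# the market. Salary, growth rates, and career length are assumptions.
salary = 50_000           # assumed starting salary
raise_rate = 0.03         # assumed annual wage growth
return_rate = 0.05        # assumed annual market return
contribution_rate = 0.04  # the 4% proposed in the text

balance = 0.0
for year in range(45):    # assumed 45-year working life
    balance = balance * (1 + return_rate) + salary * contribution_rate
    salary *= 1 + raise_rate

print(f"Balance at retirement: ${balance:,.0f}")
```

Under these assumed inputs the account accumulates roughly half a million dollars by retirement, which is the sense in which "private accumulations would eventually replace taxpayer burdens."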
In summary, if health care consumers are allowed to save and spend their own money, and if doctors are allowed to act like entrepreneurs -- in other words, if we allow the market to work -- there is every reason to believe that health care costs can be prevented from rising faster than our incomes.
Let me offer a few examples of how the free market is already working on the fringes of health care. Cosmetic surgery is a market that acts like a real market -- by which I mean that it is not covered by insurance, consumers can compare prices and services, and doctors can act as entrepreneurs. As a result, over the last 15 years, the real price of cosmetic surgery has gone down while that of almost every other kind of surgery has been rising faster than the Consumer Price Index -- even though the number of people getting cosmetic surgery has increased five- or six-fold.
In Dallas there is an entrepreneurial health care provider with two million customers who pay a small fee each month for the ability to talk to a doctor on the telephone. Patients must have an electronic medical record, so that whichever doctor answers the phone can view it while talking with the patient. This company is growing in large part because it provides a service that the traditional health care system can't provide. Likewise, walk-in clinics are becoming more numerous around the country. At most of these clinics a registered nurse sits in front of a computer terminal, the patient describes his symptoms, and the nurse types in the information and follows a computerized protocol. The patient's record is electronic, the nurse can prescribe electronically, and the patient sees the price in advance.
We're also seeing the rise of concierge doctors -- doctors who don't want to deal with third-party insurers. When this idea started out in California, doctors were charging $10,000 to $15,000 per year. But the free market has worked and the price has come down radically. In Dallas, concierge doctors charge only $40 per employee per month. In return, the patient receives access to the doctor by phone and e-mail, and the doctor keeps electronic medical records, competes for business based on lowering time costs as well as money costs, and is willing to help with patient education.
Finally, consider the international market for what has become known as medical tourism. Hospitals in India, Singapore, and Thailand are competing worldwide for patients. Of course, no one is going to get on a plane without some assurances of low cost and high quality -- which means that, in order to attract patients, these hospitals have to publicize their error rates, their mortality rates for certain kinds of surgery, their infection rates, and so on. Their doctors are all board-certified in the United States, and they compete for patients in the same way producers and suppliers compete for clients in any other market. Most of their patients come from Europe, but the long-term threat to the American hospital system can't be denied. Leaving the country means leaving bureaucratic red tape behind and dealing instead with entrepreneurs who provide high-quality, low-cost medicine.
As these examples suggest, liberating the medical market by freeing doctors and patients is the only way to bring health care costs under control without sacrificing quality. Continuing on our current path -- allowing health care costs to rise at twice the rate of income under the aegis of an unworkable government Ponzi scheme -- is by comparison unreasonable. *
"Some cause happiness wherever they go; others, whenever they go." --Oscar Wilde
Paul Kengor is professor of political science and executive director of the Center for Vision & Values at Grove City College. This article is republished from V & V, a website of the Center for Vision & Values. Paul Kengor is author of God and Ronald Reagan: A Spiritual Life (2004) and The Crusader: Ronald Reagan and the Fall of Communism (2007). His latest book is The Judge: William P. Clark, Ronald Reagan's Top Hand (Ignatius Press, 2007).
President Obama says the economy is the worst since the Great Depression. Actually, it is the worst since the Reagan recession of 1982-83. Further, the 2009 market crash is the worst not since 1929 but since 1987 -- a crash that also came on Ronald Reagan's watch.
What did Reagan do -- or, more importantly, what did he not do -- in response to these "crises"? How was Ronald Reagan's response different from what Barack Obama is doing?
In both cases, Reagan did the exact opposite of Obama's massive government spending infusions. In fact, it's worth noting that Bill Clinton -- listen up, Democrats -- didn't invoke Obama's method when he faced recessions at the very start and end of his presidency. (That's another article for another time.)
As for the Reagan recession, the president waited extremely patiently -- to the point where he drove his advisers nearly nuts -- for his huge 1981 tax cuts to take effect. He didn't spend money because he believed spending had been out of control, particularly since FDR's New Deal and LBJ's Great Society, which created systemic deficits. Reagan felt that high spending, high regulation, and high taxes had sapped the American economy of its vitality, and particularly its ability to rebound from recession. The economy needed to be freed in order to perform.
Reagan's prescription rested on four pillars: tax cuts, deregulation, reductions in the rate of government spending, and a stable, carefully managed growth of the money supply. The federal income tax reduction was the centerpiece: Reagan secured a 25 percent across-the-board reduction over a three-year period, beginning in October 1981. The upper income marginal tax rate was dropped from 70 percent, which Reagan believed was punitive and stifling, to 28 percent.
By 1983, America had begun its longest peacetime economic expansion in history, cruising right through the 1987 market plunge.
What did Reagan do about the October 1987 crash? Basically nothing -- certainly nothing like a massive government "stimulus."
"Some people are talking of panic," Reagan calmly confided to his diary. "Chrmn. of Stock Exchange is acting very upset."
Those are Reagan's only diary references to the financial crisis. With the economy freed, he was confident it would bounce back. Reagan let the economy correct itself.
Okay, but Reaganomics created huge deficits, right?
That's the big criticism. It isn't accurate. It needs to be understood -- now more than ever.
First off, know these crucial facts: The deficit under Ronald Reagan increased 35 percent, from an inherited deficit (from President Jimmy Carter) of $104 billion in 1980 to a final deficit of $141 billion in 1989. The deficit peaked at $236 billion in 1983, particularly because of the plummet in tax revenue during the recession. It began dropping steadily in 1986, continuing through the 1987 crash. (Source: Congressional Budget Office figures, "Historical Tables.")
Compare that to what's happening now, where the direct opposite of Reaganomics is being pursued by the liberal Democratic president and Congressional leadership.
President Obama inherited a record Bush deficit of $400 billion, but is generating a far worse $1.8-trillion deficit in his first year. (Source: Congressional Budget Office, March 20, 2009.) We've never seen anything like this. This unthinkable explosion is a direct result of the stunning government spending unleashed by Obama and the Democratic leadership in just eight weeks -- an unheard-of development in 233 years of American history.
Ronald Reagan increased the deficit by 35 percent over eight years, whereas Barack Obama has grown it to four and a half times its size in eight weeks. Reagan created an extra $37 billion in annual deficit. Obama has already created an extra $1.4 trillion in annual deficit.
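The comparison reduces to simple arithmetic on the CBO figures cited above:

```python
# Arithmetic behind the deficit comparison, using the figures cited in
# the text (CBO "Historical Tables" and the March 2009 CBO estimate).
reagan_start, reagan_end = 104e9, 141e9        # 1980 and 1989 deficits
obama_inherited, obama_first = 400e9, 1.8e12   # 2008 deficit and 2009 estimate

reagan_pct = (reagan_end - reagan_start) / reagan_start * 100
print(f"Reagan: +{reagan_pct:.0f}% over eight years, "
      f"an extra ${(reagan_end - reagan_start) / 1e9:.0f} billion per year")

obama_multiple = obama_first / obama_inherited
print(f"Obama: {obama_multiple:.1f}x the inherited deficit, "
      f"an extra ${(obama_first - obama_inherited) / 1e12:.1f} trillion per year")
```

The Reagan increase works out to roughly 35 percent ($37 billion on a $104 billion base); the projected 2009 deficit is four and a half times the inherited one ($1.4 trillion more on a $400 billion base).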
But what, exactly, caused the Reagan deficits? There were several factors: the recession of 1982-83, the Reagan defense spending -- implemented to turn the screws on the Soviets -- the domestic social spending by the Democratic Congress, and more. Some reasons were Reagan's fault; others were Congress' doing -- both share blame in differing degrees.
Importantly, and despite what you've heard, Reagan's tax cuts didn't create the deficit. Tax revenues actually boomed from roughly $600 billion in 1981 to $1 trillion in 1989.
The primary cause of the deficit was recession and spending, mainly spending -- as is always the case. It is especially the case right now under Obama, with the spending component utterly out of control.
The crucial lesson for today is that the best "stimulus" is one that relies on the tried-and-true American way: letting free individuals and entrepreneurs stimulate the economy through their own earnings and economic activity. Wealth confiscation and redistribution by government collectivists and central planners never works; unfortunately, it is that failed, extremely destructive method that Americans elected in November 2008.
For three decades now, the minority of Americans who make up the hard left have been trashing Reaganomics. Well, on November 4, 2008, for the first time in American history, they convinced enough voters to join them in electing the extreme opposite. At long last, they will pay for the economic consequences of their ideology, as will their children and grandchildren. *
"Dependence begets subservience and venality, suffocates the germ of virtue, and prepares fit tools for the designs of ambition." --Thomas Jefferson