Allan C. Brownfeld is the author of five books, the latest of which is The Revolution Lobby (Council for Inter-American Security). He has been a staff aide to a U.S. vice president, members of Congress, and the U.S. Senate Internal Security Subcommittee. He is associate editor of The Lincoln Review, and a contributing editor to Human Events, The St. Croix Review, and The Washington Report on Middle East Affairs.
The death of Trayvon Martin is, of course, a devastating event for his family. That a 17-year-old boy returning from a visit to a nearby store for a snack should have his life taken is difficult to understand and accept. On many levels, the incident was, as President Obama has said, "tragic."
Still, this event has provoked demagoguery that ignores the complex facts of the case itself and has provided an opportunity for provocateurs to proclaim that race relations in America are similar to those of the segregated Old South, as if the notable progress we have made in recent years had never happened. Consider some of the things we have heard.
Jesse Jackson referred to the trial as "Old South Justice." NAACP President Benjamin Jealous declared: "This will confirm for many that the only problem with the New South is it occupies the same time and space as the Old South." He invoked the memory of 14-year-old Emmett Till, who was killed in 1955 after supposedly whistling at a white woman "and whose murderers were acquitted." An article in The Washington Post drew parallels between this case and that of Emmett Till as well as the bombing of the 16th Street Baptist Church in Birmingham, Alabama in 1963 and the 1933 case of the Scottsboro Boys, nine young black men accused of raping two white girls.
"Trayvon Benjamin Martin is dead because he and other black boys and men like him are seen not as a person but a problem," the Rev. Dr. Raphael Warnock, the senior pastor at Ebenezer Baptist Church in Atlanta, told a congregation once led by the Rev. Martin Luther King, Jr.
In Sanford, Florida, the Rev. Valerie J. Houston drew shouts of support and outrage at Allen Chapel A.M.E. as she denounced
. . . the racism and the injustice that pollute the air in America. Lord, I thank you for sending Trayvon to reveal the injustice, God, that lives in Sanford.
One of those organizing demonstrations against the verdict and promoting the idea that our society is little better than it was in the years of segregation is the Rev. Al Sharpton, always ready to pour fuel on a fire, and now provided by MSNBC with a nationwide pulpit. How many today remember Sharpton's history of stirring racial strife? In 1987, he created a media frenzy in the case of Tawana Brawley, a black teenager who claimed she was raped by a group of white police officers. A grand jury found Brawley had lied about the event in Wappingers Falls, New York, and the case was dropped. The event which Sharpton used to indict our society for widespread racism never happened.
In 1991, Sharpton exacerbated tensions between blacks and Orthodox Jews in the Crown Heights neighborhood of Brooklyn. A three-day riot, fueled by Sharpton's inflammatory statements, erupted when a Guyanese boy died after being struck by a car driven by a Jewish man. At the boy's funeral, Sharpton complained about "diamond cutters" in the neighborhood in what a Brandeis University historian described as the most anti-Semitic incident in U.S. history. Two men died and three were critically injured before order was restored. Clearly, Al Sharpton does not come to a discussion of the Trayvon Martin-George Zimmerman case with clean hands.
Few of those urging demonstrations against the alleged "racism" in the jury verdict finding Mr. Zimmerman not guilty have spent very much time examining the law and the trial itself.
Mr. Zimmerman, a neighborhood watch volunteer, claimed that he shot Mr. Martin only after the teenager knocked him to the ground, punched him, straddled him and slammed his head into concrete. The murder charge required a showing that Zimmerman was full of "ill will, hatred, spite, or evil intent" when he shot Mr. Martin. But prosecutors had little evidence to back up that claim, according to most legal experts. They could point only to Zimmerman's words during his call to the police dispatcher the night he spotted Martin walking in the rain with his sweatshirt's hood up and grew suspicious. Zimmerman appeared calm during the call and did not describe Martin's race until he was asked.
Lawyers point to what they said were errors by the prosecution. Officer Chris Serino, the Sanford Police Department's chief investigator on the case, for example, told the jury he believed Zimmerman's account was truthful. Dr. Shiping Bao, the medical examiner who performed the autopsy on Martin, came across as befuddled, legal experts report, shuffling through his notes because he could remember very little. "It was horrific," said Richard Sharpstein, a prominent Miami criminal defense lawyer. "It was a deadly blow to this case because the case depended on forensic evidence to contradict or disprove George Zimmerman's story."
The performance was the opposite of that by Dr. Vincent Di Maio, a nationally recognized forensic pathologist, who took the stand for the defense. Dr. Di Maio said the evidence and injuries to George Zimmerman were consistent with the defense's account, that Trayvon Martin was leaning over the defendant when he was shot. The evidence of Zimmerman's injuries may have helped his case, but it was not legally necessary. He needed to show only that he feared great bodily harm or death when he pulled out his gun, which he was carrying legally. "Classic self-defense," said his attorney.
It is one thing to have sympathy for the Martin family, to regret the incident, or to be critical of Florida's laws on concealed weapons or its "Stand Your Ground" law, which never entered the legal proceeding; it is quite another to argue that the law was not properly applied in this case. The prosecution failed to prove Zimmerman guilty beyond a reasonable doubt; hence, the not-guilty verdict.
Many black commentators regret that Al Sharpton, Jesse Jackson, Ben Jealous, and others have made this case about race. Columnist Armstrong Williams declares that:
. . . the Zimmerman case was not about race. Mr. Zimmerman is Hispanic, normally one of the protected minorities in America. In order to make the story about race, The New York Times and some other media outlets, called him a "white Hispanic" (his father is white and his mother of Peruvian heritage). When was the last time anybody in America heard a Hispanic called a "white Hispanic?" Calling Mr. Zimmerman a "white Hispanic" is like calling Adam Clayton Powell or Barack Obama a "white black." But the media needed to create hysterics and so injected race into the equation to make it more salable to the American people as a political circus. After all, who cares about two white men or two black men in a fight that results in death?
In Williams's view:
A young man was killed by another young man under circumstances where there is so much racial static in the background that it's difficult for many to be remotely objective. . . . Compare the reaction of the O.J. Simpson verdict by many American blacks to the reaction to the Zimmerman acquittal. In both cases the prosecution did not make its case beyond a reasonable doubt to convict the defendant. Yet blacks generally cheered the result in the Simpson case, while viewing the Zimmerman verdict as a travesty of justice. In our court system of trial by jury, you can't have it both ways. There cannot be a different standard for a white man killing a black man than for a black man killing a white man and a white woman.
Liberal columnist Richard Cohen writes that:
I don't like what George Zimmerman did, and I hate that Trayvon Martin is dead. But I also can understand why Zimmerman was suspicious and why he thought Martin was wearing a uniform we all recognize. I don't know whether Zimmerman is a racist. But I'm tired of politicians and others who have donned hoodies in solidarity with Martin and who essentially suggest that, for recognizing the reality of urban crime in the U.S. I am a racist.
Cohen argues that:
What Zimmerman did was wrong. It was not, by a verdict of his peers, a crime. Where is the politician who will own up to the painful complexity of the problem, and acknowledge the widespread fear of crime committed by young black males? This does not mean that racism has disappeared, and some judgments are not the product of individual stereotyping. It does mean, though, that the public knows young black males commit a disproportionate amount of crime. In New York City, blacks make up a quarter of the population yet they represent 78 percent of the shooting suspects - almost all of them young men. We know them from the nightly news.
Those statistics represent the justification for New York's controversial stop-and-frisk program, which amounts to a kind of racial profiling. "After all," writes Cohen:
. . . if young black males are your shooters, then it ought to be young black males whom the police stop and frisk. Still, common sense and common decency, not to mention the law, insist on other variables, such as suspicious behavior. Even still, race is a factor without a doubt. It would be senseless for the police to be stopping Danish tourists in Times Square just to make the statistics look good.
Last year, the New York City Police Department recorded 419 homicides, nearly a 20 percent decrease from the year before and the lowest rate per 100,000 residents since the department began keeping statistics. If New York had the same homicide rate as Washington, D.C., it would be investigating 800 more murder cases for the year. If it had Detroit's statistics, nearly 4,000 more New Yorkers would be murdered every year.
Editorially, The Washington Post states that, "Without question, the Big Apple is doing something right." Mayor Michael Bloomberg and Police Commissioner Raymond Kelly say the stop-and-frisk policy has saved 5,000 lives in the past ten years. "New York has never been safer in its modern era," the mayor says.
The policy, of course, is controversial and is the subject of a federal class-action lawsuit because the vast majority of those stopped are young men of color. Mayor Bloomberg responds:
They keep saying, "Oh, it's a disproportionate percentage of a particular ethnic group." That may be, but it's not a disproportionate percentage of those who witnesses and victims describe as committing the murders. In that case, incidentally, I think we disproportionately stop whites too much and minorities too little.
Expressing the anguish of many who hate all forms of racism, but are not prepared to turn a blind eye to the reality of urban crime, Richard Cohen concludes:
I wish I had a solution to this problem. If I were a young black male and were stopped just on account of my appearance, I would feel violated. If the police are abusing their authority and using race as the only reason, that has got to stop. But if they ignore race, then they are fools and ought to go into another line of work.
Another liberal commentator, columnist Ruth Marcus, was particularly critical of those who compared Trayvon Martin with Emmett Till:
The comparison is unfair. No doubt race played a part in Martin's death. . . . But there is no evidence that race played a role in Zimmerman's acquittal. If anything, the racial undertones worked against Zimmerman, increasing public pressure on prosecutors to bring the most serious - and, in hindsight the most difficult to support - charges against him. Contrast the Zimmerman trial with that of Till's murderers. The courtroom was segregated. No hotel would rent rooms to black observers. The local sheriff welcomed black spectators to the courtroom with what was described as a cheerful use of the vilest racial epithets. The New South is not perfect, but it is not the Old.
What is rarely noted is the fact that the vast majority of the victims of young black men who kill are other young black men and women. Those engaged in calling for marches and vigils to express outrage over the verdict in the Zimmerman case say hardly a word about the black-on-black crime which plagues the nation's inner cities. In an interview with black journalist Juan Williams, comedian Bill Cosby noted that the NAACP's headquarters is in Baltimore, a city with one of the highest murder rates in the nation. "I've never once heard the NAACP say, 'Let's do something about this,'" said Cosby. "They never marched or organized or even criticized the criminals."
The over-heated declarations that our current society is similar to that in which Emmett Till was murdered in 1955 - or in which the Scottsboro Boys were convicted in 1933 - turn reality on its head. Al Sharpton doesn't really believe it. Jesse Jackson knows it's untrue. Ben Jealous is unwilling to give up the public spotlight he receives by portraying such a false picture.
Those of us old enough to have lived through the years of segregation remember an era of segregated schools, segregated bus and train stations, "white" and "black" restrooms (visit the Pentagon and see the proliferation of rest rooms which were constructed in the years when it was illegal in Virginia for men and women of different races to use the same facilities), water fountains reserved for "whites" and "colored." In many parts of the country blacks could not vote or sit on juries. Black travelers never knew when they would be able to stop for a meal. There was no pretense that racial equality of any kind existed.
Today, we live in an imperfect society, but one in which all citizens, regardless of race, have equal rights. It is against the law to discriminate on the basis of race. Men and women can go as far as their individual abilities can take them. Black Americans hold every conceivable position in our society - from CEO of major corporations, to chief of police in major cities, to university president, to governor - to President of the United States.
None of this would be true if ours were indeed a "racist" society. This is not to say that in a society of more than 300 million people, examples of racism cannot sometimes be found. Using the trial of George Zimmerman to say that it is still 1933 or 1955, as some are now doing, is to paint a picture of contemporary society that cannot be recognized. When it comes to the status of race relations in America today, who are we going to believe, shrill voices such as Al Sharpton's, or our own eyes? The Trayvon Martin-George Zimmerman case has brought out the worst in some. The rest of us must move resolutely forward, continuing on the path of creating a genuinely color-blind society, which has long been the goal of men and women of good will of all races.
The revelation that the U.S. Government is openly operating a massive surveillance program - with less oversight than previously thought - raises many questions. U.S. officials say that the program, known as Prism, which was revealed in the leaks by Edward Snowden, an employee of Booz Allen Hamilton, was legal and authorized under the Foreign Intelligence Surveillance Act (FISA). This gives the National Security Agency (NSA) the power to obtain e-mails and phone records relating to non-U.S. nationals, but details about the individuals targeted under the act remain secret.
Documents leaked to The Washington Post and The Guardian newspapers claimed the government had direct access to the servers of major technology firms such as Apple and Google. According to Snowden, individual operatives had the power to tap into anyone's e-mails at any time.
Senator Dianne Feinstein (D-CA), the chairwoman of the Senate Intelligence Committee, accused the 29-year-old Snowden of "an act of treason." House Speaker John Boehner labeled Snowden a "traitor." He said: "The disclosure of this information puts Americans at risk. It shows our adversaries what our capabilities are. And it's a giant violation of the law."
Others - on both the right and left - have hailed Snowden as an idealistic "whistleblower." This is the position taken by, among others, Senator Rand Paul (R-KY) and former Rep. Dennis Kucinich (D-Ohio). The conservative Washington Times, expressing sympathy for Snowden, declared:
In a democracy, matters of widespread public interest are meant to be discussed and decided on by our elected representatives, and done in the open. . . . The latest revelations will have no pernicious effect because our enemies assume Uncle Sam has been listening. Al Qaeda operatives use codes, dead drops, and encryption to carry out attacks, such as the Boston bombings, under the nose of the mass surveillance. That's what spies and terrorists do. Google, Facebook, and the other companies play along, denying that the government is directly tapping into their servers. . . . Such extreme secrecy isn't about making sure that China or the Taliban never learn about U.S. surveillance capabilities, but about keeping ordinary Americans in the dark. . . . The Founding Fathers never would have entrusted power over such information to a handful of men. Neither should we.
Placing the merits of the government surveillance program aside, Snowden himself is hardly a hero. By deciding to unilaterally leak secret NSA documents, Snowden violated his explicit and implicit oaths to respect the secrecy of the information with which he was entrusted. He betrayed oaths he had voluntarily entered into. As New York Times columnist David Brooks pointed out:
He betrayed the Constitution. The Founders did not create the United States so that some solitary 29-year-old could make unilateral decisions about what should be exposed. Snowden unilaterally short-circuited the democratic structures of accountability, putting his own preferences above everything else.
Beyond this, the question arises of why, in the span of three years, leakers at the lowest levels of the nation's intelligence ranks gained access to large caches of classified material. The similarities between Snowden and Bradley Manning, a U.S. Army private on trial for sending hundreds of thousands of secret files to the WikiLeaks website, are clear.
In the case of Snowden, the fact that he was not a U.S. Government employee, but was an employee of a private company, focuses national attention on whether or not a company such as Booz Allen Hamilton should have access to the nation's top secret information. Is not the gathering and handling of intelligence an inherently governmental function?
Booz Allen Hamilton, which hired Snowden, a high school dropout, to work at the NSA, is a leader among more than 1,900 private firms that have supplied tens of thousands of intelligence analysts in recent years. According to The Washington Post:
. . . in the rush to fill jobs, the government has relied on faulty procedures to vet intelligence workers. . . . Intelligence officials, government auditors, and contracting specialists have warned for years that the vulnerability to spies and breaches was rising, along with contracting fraud and abuses.
"When you increase the volume of contractors exponentially but you don't invest in the personnel necessary to manage and oversee that workforce, your exposure increases," said Steven Schooner, co-director of the government procurement law program at George Washington University. "This is what happens when you have staggering numbers of people with access to this kind of information."
The reliance on contractors reflects a major shift toward outsourcing intelligence in the past 15 years. . . . Private contractors for the CIA recruited spies, protected CIA directors, helped snatch suspected extremists off the streets of Italy and interrogated suspected terrorists in secret prisons abroad.
Booz Allen Hamilton had $5.8 billion in revenue last year. Almost all of its work was for the government; nearly a quarter of that was for intelligence agencies.
By 2011, more than 4.2 million government and contract workers had security clearances and more than a third of them had top-secret access. A review by the Government Accountability Office found that of 3,500 security clearance reviews, almost 9 in 10 lacked documentation. Of those, nearly a quarter were still approved.
Glenn Voelz, an Army Intelligence officer previously assigned to the Joint Chiefs of Staff at the Pentagon, warned in 2009 that "the rapid and largely unplanned integration of many non-governmental employees into the workforce presents new liabilities that have been largely ignored to this point."
Some say that outsourcing intelligence to private companies saves the government money. But Edward Snowden, despite not having a college degree, made $200,000 a year. Booz Allen Hamilton Chairman Ralph W. Shrader was paid $1.2 million in base salary and a total of $3.1 million in fiscal 2012. Four named executive vice presidents had total pay packages in the range of $2 million to $3 million. Already, many are speaking of an "Intelligence Industrial Complex," echoing President Eisenhower's warning about the "Military Industrial Complex."
The revelations about the government surveillance programs present us with the opportunity for a free and open debate about how much secrecy is healthy in a democratic society - as well as whether the "inherently governmental" intelligence function should be performed by for-profit private companies, whose incentive structure is quite different from that of the CIA or the NSA.
Those of us concerned about the growth of government power - and the right to privacy - have every reason to be concerned. Those who seek to expand power and diminish freedom always have a variety of good reasons to set forth for their purposes. In the case of Olmstead v. United States (1928), Justice Louis Brandeis warned that:
Experience should teach us to be most on our guard to protect liberty when the government's purposes are beneficent. Men born to freedom are naturally alert to repel invasion of their liberty by evil-minded rulers. The greatest dangers to liberty lurk in the insidious encroachment of men of zeal, well meaning but without understanding.
Limiting our freedom in the interest of "national security" may at times be necessary. But doing so - in secrecy - and using private, profit-making companies to implement such a program seems inconsistent with our larger values. And, in theory, we are at peace. Congress has not declared war. Recently, a top Pentagon official said that the evolving war against al Qaeda was likely to continue "at least 10 to 20 years." Can our free society be on a war footing for decades, increasing government power to pursue it, without eroding our freedom? Before the recent leaks of classified material, few were asking such questions. Hopefully, a much needed national debate will now begin. *
The national media, as we all know, has a penchant for scandal, the more gruesome the better.
Consider the case of Jodi Arias, the young woman on trial for murdering her boyfriend - allegedly inflicting 29 stab wounds, a slit throat and a shot to the head. She claims it was self-defense.
This case has saturated the media. Anderson Cooper seems to discuss it almost nightly on his CNN program. The cable network HLN airs a daily show entitled, "HLN After Dark: The Jodi Arias Trial." ABC News conducted a jailhouse interview with Arias, and the case was featured on a 2008 episode of "48 Hours Mystery: Picture Perfect." Inside Edition also interviewed Arias at the Maricopa County, Arizona jail.
While the media has devoted itself to promoting the Arias trial, another trial was taking place in Philadelphia, with Dr. Kermit Gosnell, an abortion doctor, charged not with a single murder - as in the Arias case - but with killing seven newborn babies as well as a 41-year-old refugee from Nepal who was getting an illegal late-term abortion.
Dr. Gosnell is black and his clinic is in a minority neighborhood. Most of his victims were African-Americans. The facts of the case are shocking by any standard. Gosnell is accused of using unfathomable abortion procedures on his inner-city patients who were well into their third trimester at an unsanitary, bloody clinic called the Woman's Medical Society.
While he is only charged with killing seven live babies, prosecutors believe Gosnell killed hundreds of infants and destroyed related records, according to a grand jury report. During the trial, ex-clinic employee Steven Massof testified that he "snipped" babies' spinal cords to kill them after delivering them live. "It would rain fetuses," Massof said, according to NBC 10 Philadelphia. "Fetuses and blood all over the place."
Gosnell wasn't licensed to practice obstetrics and gynecology and illegally peddled painkillers during the day and murdered babies at night, according to the grand jury. In Feb. 2010, the FBI and other law enforcement agencies finally raided his facility, following reports that he had been writing illegal prescriptions. Here's what they found:
There was blood on the floor. A stench of urine in the air. A flea-infested cat was wandering through the facility, and there were cat feces on the stairs. Semi-conscious women scheduled for abortions were moaning in the waiting room or the recovery room, where they sat on dirty recliners with blood-stained blankets.
On Jan. 31, 1998, then-15-year-old Robyn Reid, in the company of her grandmother, sought an abortion at Gosnell's clinic. Once in the clinic, Reid, an 87-pound teenager, changed her mind. Gosnell ripped off her clothes and restrained the girl. When she regained consciousness 12 hours later at her aunt's home, she discovered that an abortion had been performed against her will.
According to the Media Research Center, there has been no network coverage of the Gosnell trial on ABC, CBS, NBC, MSNBC or PBS and only one brief mention on CNN [as of April 10].
It's unbelievable that Dr. Gosnell's trial for his actions inside his 'house of horrors' hasn't drawn one network story. . .
said the Media Research Center president Brent Bozell.
Forbes columnist Mike Ozanian said that the controversy surrounding Rutgers University basketball coach Mike Rice, who was shown in a video abusing players and using vulgar language during practice, had received far more attention than the Gosnell trial.
"What troubles me is why Rice and Rutgers deserve more attention from the media than the trial of doctor Kermit Gosnell," he said.
. . . How much of the story have you seen on the evening news? I bet not nearly as much as you have seen about Rice. Gosnell apparently made a fortune running a slaughterhouse.
The Washington Times notes that:
Not every murder trial receives prominent national coverage, but the Gosnell case would seem to contain all the ingredients of must-see television: a formerly respected community leader accused of unspeakable acts; the death of a young immigrant woman; a parade of former employees offering graphic testimony on the gruesome deaths of more than 100 just-born infants; and even the implication by the doctor's lawyers that the charges have been motivated by racism.
Perhaps a strange form of political correctness is at work here. Is there fear that publicizing this story would paint pro-choice advocates in a poor light? Is there a level of indifference to a story when both the victims and the perpetrator are black?
The trial is being covered by the Associated Press, and AP wire stories have appeared on network websites. The proceedings are also being covered by some religious websites, such as LifeNews.com, as well as newspapers and television in the Philadelphia and Delaware markets. Yet the story has been totally ignored by the national media [as of April 10].
Are Jodi Arias and Mike Rice really more interesting - and compelling - than the trial of Kermit Gosnell? Or is a strange manifestation of political correctness - and media bias - the real reason? The public is ill served if this is the case, as it certainly seems to be.
It is now ten years since the U.S. invaded Iraq. Based on false information about alleged weapons of mass destruction, the U.S. embarked upon a war with a country which had never attacked us, and which had nothing to do with 9/11. It was as if, some pointed out, after Pearl Harbor we launched an attack upon Mexico.
Both Democrats and Republicans in Congress acted irresponsibly. They passed a vague Authorization for Use of Military Force instead of the congressional declaration of war the Constitution requires. The media - liberals and conservatives alike - displayed willful credulity, never seeking independently to discover the truth.
Now, Iraq is in chaos. In 2010, Prime Minister Nouri al-Maliki, a Shiite, formed a coalition government with parties representing Kurds and secular Sunnis. Since then, he has driven the Sunni Vice President into exile and the Sunni finance minister and Kurdish foreign minister no longer visit Baghdad. Iran's influence is growing. Iraq has been allowing Iran to fly weapons through its airspace to the Syrian regime of Bashar al-Assad.
Philip Carter, an Iraq veteran and senior fellow at the Center for a New American Security, notes that:
We now know that Iraq had no weapons of mass destruction on March 19, 2003, when the U.S. troops invaded. . . . The Bush administration compounded that error with its failure to admit the existence of the insurgency, let alone plan for it, and its failure to provide adequate resources. . . . Senior administration officials made matters worse with their arrogant statements about the war and the troops' plight - such as when then-Deputy Defense Secretary Paul Wolfowitz casually dismissed then-General Eric Shinseki's troop predictions as "wildly off the mark," or when Defense Secretary Donald Rumsfeld glibly told troops scavenging for vehicle armor in Kuwait that "you go to war with the army you have."
Finally, voices are being heard questioning the aggressive use of American power abroad in the post-Cold War world, when who is an enemy is less than clear, and who is a friend is also uncertain. Republicans, who took the country to war in Iraq with the acquiescence of Democrats, seem particularly torn.
"A real challenge for the Republicans as they approach 2016 is what will be their brand?" said Richard N. Haass, the president of the Council on Foreign Relations and a former aide to the first President Bush. "The reason Rand Paul is gaining traction is overreaching in Iraq. What he is articulating. . . is an alternative."
The growing split in the Republican Party could be seen at the Conservative Political Action Conference in March. Sen. Rand Paul (R-KY) told the conference that the filibuster he conducted earlier in the month over the Obama administration's drone policy was aimed at the limits on presidential power and American power abroad. "No one person gets to decide the law," he said.
Neo-conservatives - the ones who led the country to war in Iraq and promoted the false notion of weapons of mass destruction in Baghdad - are concerned about voices such as Rand Paul. Dan Senor, the spokesman in Iraq for the Bush administration and a prominent neo-conservative voice, who now urges an attack on Iran, warned of a push to reorient the party toward a "neo-isolationist foreign policy." That policy, he said:
. . . is sparking discussion among conservative donors, activists, and policy wonks about creating a political network to support internationalist Republicans.
Sen. John McCain (R-AZ), another strong supporter of both the invasion of Iraq and a strike against Iran, another country that has not attacked us, has dismissed Sen. Paul and those who agree with him as "wacko birds." Other Republicans, however, have praised Paul and his filibuster. Senators Ted Cruz (R-TX) and Mike Lee (R-UT) joined the filibuster. Reince Priebus, chairman of the Republican Party, said Paul was "able to capture some national attention in standing up to the president. My view is that he is an important voice in our party."
Sen. Paul calls himself a "realist," not a neoconservative - and not an isolationist. "This is a divide that has been festering and deepening for a generation," said Thomas Donnelly, a fellow at the American Enterprise Institute. Ford O'Connell, a Republican strategist, says:
You are starting to see a bit of the split between the libertarian-leaning lawmakers and essentially what you see as defense hawks. We are a war-weary nation. While the GOP is still seen as the national defense party, what you are seeing is a rising trend of libertarianism. You are also seeing the Republican Party reset on where it is on national security. Essentially what the libertarians are saying is, "Hey, we have to be more careful about the future because we've just been through 10 years of war here."
Isolationism is a dangerous policy both for the U.S. and for the world, as is interventionism - especially based upon false premises, when U.S. interests and world peace are not directly involved.
Some neoconservatives are prepared to go to war haphazardly, as we did, at their urging, in Iraq. Embracing that philosophy has hardly proven wise - for the Republican Party or the country. But not taking a leadership role in the world is not a legitimate option for the U.S. either. It would make us - and the world - far less stable and secure.
As The Economist points out:
Not every problem is solved by America noisily taking charge. A sharper critique, as advanced in a new national-security strategy from the Project for a United and Strong America, a bipartisan group of ex-envoys and senior officials, compares the emerging world order to a fiercely competitive marketplace, in which Americans must invest, via engagement, to defend the open, rules-based international order vital to American interests.
With regard to the posture being taken by President Obama, The Economist argues that:
Speaking softly suits Mr. Obama. His desire to see other powers stop free riding on American security guarantees is understandable. In a world of shifting power balances, it is sensible to appeal to the self-interests of others, especially after the overreach of the Bush era. But he is taking a risk. Step back too far from big sticks, and when America speaks it may not be heard.
Finally - ten years after the misguided invasion of Iraq - a real debate seems to be starting about what America's role in the post-Cold War world should be. All of us will benefit from such a debate. It is long overdue.
Until recently, Dr. Benjamin Carson was highly regarded - and was widely promoted as a role model for African-American young people. Growing up in poverty in Detroit, he went to Yale and the University of Michigan Medical School, and at 33 became director of pediatric neurosurgery at Johns Hopkins. He gained fame for a series of operations separating conjoined twins, complex procedures that did not always succeed. His 1996 autobiography, Gifted Hands, became a movie starring Cuba Gooding, Jr.
"He is one of the acknowledged leaders of pediatric neurosurgery," said Dr. Donlin Long, a retired chairman of neurosurgery at Johns Hopkins, who first brought Dr. Carson to the department.
In February, speaking at the National Prayer Breakfast, with President Obama in attendance, Carson criticized the Obama administration's health care overhaul. Later, speaking at the Conservative Political Action Conference, he was interrupted by sustained applause when he said, "Let's just say if you magically put me in the White House . . ."
Since then, he has been the victim of vitriolic attack, particularly from black liberal commentators who still find it difficult to understand that an individual's race has nothing to do with his political philosophy and how he views the world.
MSNBC's Toure Neblett declared that:
When you're publicly admitting your party doesn't care enough about black America, then it's time for a new black friend. Enter Dr. Ben Carson.
And Cynthia Tucker, formerly an Atlanta Journal-Constitution editor and now at the University of Georgia's journalism school, said:
It's no wonder that conservatives have started to trumpet him as their Great Black Hope. Psychologists believe that romantic interest increases when people mirror each other's gestures. Carson perfectly reflects the beliefs of his suitors.
Commenting on the attacks upon him, Carson laments that:
That's what you can find on a third grade playground. White liberals are the most racist people there are . . . they put you in a little category, a little box. You have to think this way. How could you dare come off the plantation?
In speeches and writings, Carson describes growing up with a divorced mother whose education stopped at the third grade and who worked two, and sometimes three jobs. The New York Times reports that:
He was teased as a "dummy" because his grades were so bad. But his mother insisted that he and an older brother turn off the television and read, writing weekly book reports that she could only feign understanding.
Dr. Carson says that he was a "flaming liberal" in college and became conservative through his own climb to success. "One thing I always believed strongly in was personal responsibility and hard work," he said. "I found the Democrat Party leaving me behind on that particular issue."
With his wife Candy, Carson founded the Carson Scholars Fund, which awards $1,000 to students to help pay for college, and has endowed Ben Carson Reading Rooms at schools that serve disadvantaged students. He belongs to a Seventh-day Adventist church and draws on the Bible's description of tithing to argue in favor of a flat tax. He advocates an alternative to the Affordable Care Act: most people, he believes, could pay most of their medical bills through health savings accounts, with the government making the contributions for the poor.
One need not agree with any of Ben Carson's political views to recognize that an individual's opinion on public issues reflects that individual's considered judgment - not his race. Black Americans are as diverse in their views as are Americans of other races.
John McWhorter, a respected black academic, notes that:
. . . while Democrats seem to think they have the lock on racial enlightenment, it is actually often Republicans who have the larger understanding of what it is to assess people according to the content of their character. Take affirmative action, for example. To proclaim that it's okay to evaluate black and Latino college applicants partly on grades but also on their contribution to "diversity" - while white and Asian students are required to just put up or shut up - is racist.
In McWhorter's view:
That kind of policy makes sense only as a temporary fix, which is what it should always have been. However, to then slide into the idea that race-based admissions must continue until racism doesn't exist (i.e., never) is to essentially assert that blacks and Latinos are the world's first humans to require perfect conditions to succeed. That's racist, and the right understands that.
In fact, black conservatism is nothing new. It goes back to Frederick Douglass and Booker T. Washington through George Schuyler and Max Yergan - and more recently Clarence Thomas, Thomas Sowell and Walter Williams - and a host of distinguished men and women committed to the principles of individual freedom and a genuinely color-blind society.
The current assault on Ben Carson is an attempt to stifle free speech and diversity within the black community. What some self-proclaimed civil rights spokesmen seem to fear is that Ben Carson and other black conservatives will expose them as speaking only for themselves and misrepresenting the constituency in whose name they repeatedly - and falsely - speak.
Free speech for Ben Carson, whatever one thinks of his views, should not be a controversial proposition.
Americans once frequently quoted Voltaire's declaration: "I disapprove of what you say, but I will defend to the death your right to say it." That is no longer the case at too many of our colleges and universities.
What some have called the "heckler's veto" has been one factor limiting free speech. Nat Hentoff once pointed out that:
First Amendment law is clear that everyone has the right to picket a speaker, and to go inside a hall and heckle him or her - but not to drown out the speaker, let alone rush the stage and stop the speech before it starts. That's called the "heckler's veto."
Now, even a hint of vocal opposition to a speaker seems to be enough to eliminate the possibility of that speaker being heard.
Recently, two respected individuals who were invited to be commencement speakers at Johns Hopkins University and Swarthmore College withdrew in the face of opposition from some vocal students.
In the case of Swarthmore, Robert Zoellick, an alumnus and former president of the World Bank, accepted and then turned down an invitation, after students objected to his support of the Iraq war and his record at the World Bank.
Zoellick, an official in George W. Bush's administration, withdrew after students started a campaign on Facebook calling him "an architect of the Iraq war" and a "war criminal." In fact, while Zoellick did support the war, he had no role in planning it. He was Bush's U.S. trade representative and later worked to resolve the conflict in Darfur as a State Department official. He ran the World Bank from 2007 until 2012.
As the attacks on Zoellick grew, Swarthmore's student paper, the Daily Gazette, mocked the political correctness that characterized the controversy. On April Fool's Day, it wrote that the school "would not be offering degrees to any member of the Class of 2013 who does not plan to found a vegan coffee shop after graduation," calling other professional choices "antithetical to Swarthmore values."
In the case of Johns Hopkins, Dr. Ben Carson, the world-renowned Johns Hopkins neurosurgeon, withdrew as commencement speaker after controversy over a statement opposing gay marriage in which he lumped homosexuality with pedophilia and bestiality - a remark for which he later apologized twice. He said he withdrew because:
My presence is likely to distract from the true celebratory nature of the day. Commencement is about the students and their successes, and it is not about me.
Josh Wheeler, director of the Thomas Jefferson Center for the Protection of Free Expression at the University of Virginia, notes that:
Overall, there seems to be an increased sensitivity to things in the past we might have let roll off our backs. Nowadays, people aren't afraid to express their objections, which isn't a bad thing, but people are more willing to censor [speech] to remove the offending speech or language.
Wheeler calls this phenomenon the "heckler's veto," the ability of a small but vocal group to limit the choices of a much larger majority. He argues that:
We shouldn't ignore [protest] but at the same time to allow a minority to determine what we see or hear is very concerning from a free-speech point of view. Too often, it's easier to eliminate the problem than deal with the controversy.
Many public figures - with a variety of points of view - have been treated in a similar manner. Former Republican vice presidential nominee Sarah Palin faced protests from students and controversy over her fees when she was invited to speak at California State University-Stanislaus in 2010, but she went ahead with her appearance. There were weeks of protest by anti-abortion advocates preceding President Obama's commencement address at the University of Notre Dame in 2009. In April, protests flared at Yeshiva University's Cardozo School of Law after it gave its "International Advocate for Peace Award" to former President Jimmy Carter. Some alumni called on the school's graduates to withdraw their financial support to protest Carter's criticism of Israel.
In March 2006, in violation of its own policies, New York University refused to allow a student group to show the controversial Danish cartoons of Mohammed at a public event. Even though the purpose of the event was to show and discuss the cartoons, an administrator suddenly ordered the students either not to display them or to exclude 150 off-campus guests from attending. "NYU's actions are inexcusable," declared Greg Lukianoff, president of the Foundation for Individual Rights in Education (FIRE).
The very purpose of this event is to discuss the cartoons that are at the center of a global controversy. To say that students cannot show them if they wish to engage anyone outside the NYU community is both chilling and absurd. The fact that expression might provoke a strong reaction is a reason to protect it, not an excuse to punish it.
Lukianoff declared that:
This is a classic case of the heckler's veto. NYU is shamelessly clamping down on an event purely out of fear that people who disagree with the viewpoints expressed may disrupt it.
Beyond the heckler's veto, many universities have adopted speech codes to suppress speech that others find offensive. Alan Charles Kors and Harvey Silverglate, in their work "The Shadow University" (1998), refer to a number of cases where speech codes have been used by universities to suppress academic freedom, as well as freedom of speech.
In one case they describe, the so-called "water buffalo" incident at the University of Pennsylvania, a freshman faced expulsion for calling African-American sorority members, who were making substantial amounts of noise and disturbing his sleep in the middle of the night, "water buffalo." (The charged student said he intended no racial slur: he spoke modern Hebrew, in which "behema" - water buffalo - is slang for a rude or disturbing person. Moreover, water buffalo are native to Asia rather than Africa.) Some saw the statement as racist, while others saw it simply as a general insult. The university eventually dropped the charge amid national criticism.
Texas Tech had a speech code which prohibited "insults," "ridicule," and "personal attacks" and restricted free speech to a 20-foot diameter gazebo referred to as a "Free Speech Zone."
In September 2012, Christopher Newport University in Virginia forbade students to protest an appearance by Rep. Paul Ryan, the Republican vice presidential candidate. Students had to apply 10 days in advance to demonstrate in the college's tiny "free speech zone" - and Ryan's visit was announced on a Sunday, just two days before his Tuesday appearance.
In a study of 392 campus speech codes, FIRE found 65 percent of colleges had policies "that in our view violated the Constitution's guarantee of free speech."
Incoming Harvard freshmen were pressured by campus officials to sign an oath promising to act with "civility" and "inclusiveness" and affirming that "kindness holds a place on par with intellectual attainment." Harry R. Lewis, a computer science professor and former dean of Harvard College, said:
For Harvard to "invite" people to pledge themselves to kindness is unwise, and sets a terrible precedent. It is a promise to control one's thoughts.
In 2009, Yale banned students from making t-shirts with an F. Scott Fitzgerald quotation - "I think of Harvard men as sissies," from his 1920 novel This Side of Paradise - to mock Harvard at the annual football game. The t-shirt was blocked after some gay and lesbian students argued that "sissies" amounted to a homophobic slur. "What purports to be humor by targeting a group through slurs is not acceptable," said Mary Miller, a professor of art history and the dean of Yale College.
Recently, two gay activists at George Washington University demanded that the Rev. Gregory Shaffer, a Catholic chaplain, be fired because he supports his church's teachings about homosexuality and same-sex marriage.
A 2010 study by the American Association of Colleges and Universities of 24,000 college students and 9,000 faculty and staff members found that only 35.6 percent of the students and only 18.5 percent of the faculty and staff strongly agreed that it was "safe to hold unpopular positions on campus."
With speech codes and the heckler's veto - the First Amendment seems to be increasingly endangered on the nation's campuses. Voltaire would weep. *
In the 1960s, the term "imperial presidency" became popular and served as the title of a 1973 volume by historian Arthur M. Schlesinger, Jr., to describe the growing centralization of power in the modern chief executive.
Things have progressed a great deal since those days. Who ever would have thought that a president of the United States would arrogate to himself the power to kill American citizens - without a trial or judicial finding of any kind, making himself, in effect, judge, jury and executioner?
Yet, this appears to be the power now claimed by President Obama. The confirmation hearing for John Brennan as director of the CIA has produced a much-needed discussion of the Obama administration's counterterrorism policy. It has also shown that President Obama's claims of executive power are at odds with the views he once expressed about George W. Bush's counterterrorism policies.
Early in his first term, Obama rejected the protests of the CIA and ordered the public disclosure of secret Justice Department legal opinions on interrogation and torture that had been written in the Bush administration. In the case of his own Justice Department's legal opinions on assassination and the "targeted killing" of terrorism suspects, however, Obama has taken a different approach. Though he entered office promising the most transparent administration in history, he has refused to make those opinions public, notably one that justified the 2011 drone strike in Yemen that killed an American, Anwar al-Awlaki. His administration has withheld them even from the Senate and House Intelligence committees.
Now, with the disclosure of a Justice Department document offering a detailed legal analysis of the targeted killings of Americans - similar to the leaks of interrogation memos in 2004 under President Bush, public discussion about whether and when a president can order the execution of a citizen based on secret intelligence and without a trial has finally emerged.
The American Civil Liberties Union has been consistent. It was harshly critical of Bush administration interrogation policies and now calls the Obama material "chilling." Amnesty International has also been consistent, saying there is increasing evidence that administration practices were "unlawful, violating the fundamental human right not to be arbitrarily deprived of one's life."
Republicans and Democrats, with a few rare exceptions such as Sen. Ron Wyden (D-OR), have not been quite as consistent. Democrats who criticized Bush interrogation policies now support Obama's targeted killings. And Republicans who defended the Bush policies are now criticizing those of the current president. Unfortunately, the philosophy of "my party, right or wrong" seems to dominate much of congressional thinking.
According to the Justice Department white paper, the Constitution and the congressional authorization for the use of force after the attacks of 9/11 gave the president the right to kill any American citizen whom an "informed, high-level official" decides is a "senior operational leader of al Qaeda or an associated force" and who presents an "imminent threat of violent attack."
Yet the paper never defines who an "informed, high-level official" might be, and its authors have redefined the word "imminent" in a way that diverges from its customary meaning.
Anwar al-Awlaki was clearly a dangerous and deadly figure. He had become a leader of the terrorist group al Qaeda in the Arabian Peninsula and was believed to be directly involved in the near-miss "underwear bomber" plot to bring down an airliner on Christmas Day 2009, as well as planting two bombs on Chicago-bound cargo planes in 2010. Still, the fact remains that U.S. citizens have constitutional rights. Do we really want one man - the president - to be able to decide that those rights no longer apply?
Awlaki was certainly a legitimate target. But it should take more than the president or a "high-level official" to make that judgment without appropriate judicial oversight. When the government wants to invade a citizen's right to privacy with wiretaps and other forms of electronic surveillance, a judge from a special panel - the Foreign Intelligence Surveillance Court, established under FISA - has to give approval. There should be at least as much judicial review when it comes to taking a life. It is not difficult to imagine a future president extending the power to kill Americans to U.S. soil. All it would take would be to label a group as "domestic terrorists."
During the 2012 presidential campaign, Republican candidate Mitt Romney expressed agreement with much of what President Obama has done with his powers as chief executive, including an embrace of the president's claim to sole authority to expand drone strikes to kill terrorists abroad. Expressing little interest in Congress' role in declaring war, Romney reserved for the president the right to deploy U.S. military power to world hot spots and to engage in unilateral action against Iran. He supported President Obama's position on indefinite detention of U.S. citizens deemed "enemy combatants" who, the administration argues, are not entitled to habeas corpus.
Republicans and Democrats, it seems, are united in embracing the Imperial Presidency we now have. This is a clear contradiction to the American political tradition of constitutional government, checks and balances, and division of powers. The written and spoken words of the men who launched our nation give us numerous examples of their fear and suspicion of unchecked centralized power. Samuel Adams asserted that:
There is a degree of watchfulness over all men possessed of power or influence upon which the liberties of mankind depend. It is necessary to guard against the infirmities of the best as well as the wickedness of the worst of men.
Therefore, he declared, "Jealousy is the best security of public liberty."
James Madison argued that checks and balances are an essential bulwark for liberty; by setting branch against branch, the structure of our government minimizes encroachments on fundamental rights. Today, the separation of powers is hard to find, as the Imperial Presidency grows. Perhaps the concern over a president acting as judge, jury, and executioner will cause increasing numbers of Americans to rethink their support for such unbridled power in the hands of the chief executive.
It has been a decade since the first strike by an armed U.S. drone killed an al Qaeda leader and five associates in Yemen. Since then, there have been approximately 500 "targeted killing" drone attacks in Pakistan, Yemen, and Somalia - countries where the U.S. is not fighting a conventional war.
Weaponized drones have produced results. They have eliminated 22 of al Qaeda's top 30 leaders. They lessen the need to send our troops into harm's way, reducing the number of U.S. casualties.
The costs of drone strikes have received less attention. The number of innocent civilians killed is far greater than the number of terrorists killed. A recent study conducted by Stanford Law School and the New York University School of Law found that the number of innocent civilians killed by U.S. drone strikes is much higher than what the U.S. government has reported - at least 700 people since 2004, including 200 children.
In Djibouti, a small state on the Gulf of Aden, the U.S. has turned a former French Foreign Legion outpost, Camp Lemonnier, into the most important base for drone operations outside the war zone of Afghanistan. An investigation by The Washington Post found that Predator drones take off round the clock on missions over nearby Somalia and Yemen. Their pilots sit at Creech Air Force Base in Nevada, 8,000 miles away.
It is President Obama himself who approves a "kill list" of targets - one of whom was a U.S. citizen, the radical Muslim cleric Anwar al-Awlaki, killed by a drone in 2011. In this - and other - cases, as conservative columnist Timothy Carney points out, "Obama was the judge and jury. Obama's drones were the executioners."
Both Republicans and Democrats have embraced this use of drones, and the almost unprecedented power given to the president to determine who is worthy of execution, even U.S. citizens who, under the Constitution, have a right to be charged and tried in a court of law.
Critics, on both the left and right, are expressing concern about this unprecedented executive power.
Kurt Volker, who served as U.S. ambassador to NATO under President George W. Bush and is now executive director of the McCain Institute at Arizona State University, argues that drones have made killing too easy:
What do we want to be as a nation? A country with a permanent kill list? A country where people go to the office, launch a few kill shots, and get home in time for dinner? A country that instructs workers in high-tech operations centers to kill human beings on the far side of the planet because some government agency determined that those individuals are terrorists?
In Volker's view:
In establishing a long-term approach, a good rule of thumb might be that we should authorize drone strikes only if we would be willing to send in a pilot or soldier to do the job if a drone were not available. . . . More people have been killed in drone attacks than were incarcerated at Guantanamo Bay. . . . This is not to say that the U.S. should never use drones for targeted attacks. We should. But we should also be creating standards and practices that are entirely defensible, even - and perhaps especially - if others were to adopt them.
Rep. Keith Ellison (D-MN) believes that:
Congress must require an independent judicial review of any executive branch kill list. The U.S. legal system is based on the principle that one branch of government should not have absolute authority. Congress should object to that concentration of power, especially when it may be used against U.S. citizens. A process of judicial review would diffuse executive power and provide a mechanism for greater oversight.
Beyond this is the question of the use of drones within the U.S. itself. While domestic drones are now uncommon, the Federal Aviation Administration has predicted that within 20 years, 30,000 commercial and government drones could be flying in U.S. skies.
Drones could be equipped with surveillance technologies to identify people or license plates. "In the near future," report researchers at the Congressional Research Service:
. . . law enforcement organizations might seek to outfit drones with facial recognition or soft biometric recognition, which can recognize and track individuals based on attributes such as height, age, gender and skin color.
Analyzing past court cases, the researchers conclude that police would likely have to obtain a search warrant to use nano drones or heat-sensing imaging to spy on people within their homes. But the report says that it is unclear how courts will treat drone surveillance of a person's backyard, swimming pool, deck or porch.
Lawmakers have already introduced several bills to limit how police could use drones to gather information. Rep. Austin Scott (R-GA) and Sen. Rand Paul (R-KY) introduced the Preserving Freedom from Unwarranted Surveillance Act to require that police obtain a warrant in most circumstances before using drones.
"The current state of the law is inadequate to address the threat. As drone technology becomes cheaper, the threat to privacy will become more substantial," says Amie Stepanovich, a lawyer with the Electronic Privacy Information Center.
At the present time, unmanned aerial vehicles - drones - are used only by the military and law enforcement agencies. They are to become available for personal and commercial use as early as September 2015, raising new concerns. "Drones operated by private entities open new doors to spying, harassment, and stalking," says Stepanovich.
The New York Times notes that:
The idea of watchful drones buzzing overhead like Orwellian gnats may seem far-fetched to some. But Congress, in its enthusiasm for a new industry, should guarantee the strongest protection of privacy under what promises to be a galaxy of new eyes in the sky.
The questions now being raised about the use of drones abroad - and, in the future, at home - come from only a handful of conservatives and liberals concerned about the potential abuse of executive power and the threat to privacy. Most Republicans and most Democrats have been strangely silent on the subject. It is high time that we had a real national conversation about how to apply our traditional values to this new technology before things get even further out of hand.
In December, Congress received good or excellent marks for its job performance from only 9 percent of likely U.S. voters. This is the lowest approval rating in 38 years of polling.
Despite the low esteem in which it is held, Congress continues to do business in exactly the same self-serving way that has led to its declining reputation. This is true of both Democrats and Republicans.
In January, the "fiscal cliff" bill passed by Congress contained a windfall for lobbyists, extending supports for Puerto Rican rum distillers, Hollywood studios, tribal-lands coal, electric-scooter makers, and other corporate interests that Congress will subsidize through the tax code into the future.
One such windfall is the production tax credit (PTC) for wind energy. Extending the decades-old subsidy will cost more than $12 billion through 2022, according to Congress' Joint Committee on Taxation. Even the wind industry agreed in December that, after two decades of direct assistance, Congress should set the PTC to phase out by 2019. The only change Congress made was to make the subsidy more generous.
The "cliff" bill also extended the 2008 farm bill, which The Washington Post described as "a monstrous money-waster." Or consider another congressional action in January, when senators who play a major role in health care financing helped Amgen, the world's largest biotechnology company, evade Medicare cost-cutting controls by delaying price restraints on a class of drugs used by kidney dialysis patients, including Sensipar, a drug made by Amgen. Amgen has 74 lobbyists in Washington and pushed aggressively for this provision. The delay will cost the Medicare program up to $500 million over a two-year period. Those who pushed the delay were Senators Max Baucus (D-MT), who leads the Senate Finance Committee, and Orrin Hatch (R-UT), the ranking Republican on the committee. The current lobbyists for Amgen include former chiefs of staff for both Senator Baucus and Sen. Mitch McConnell (R-KY). A top aide to Mr. Hatch, who was involved in the negotiations, worked as a health policy analyst for Amgen.
In mid-January, about a dozen members of Congress gathered in New York to discuss, among other things, "Why does America hate us?" Rep. Charlie Dent (R-PA) asked: "Did you hear about the poll? Congress is now rated slightly above or below cockroaches and colonoscopies." (Actually, it was below.)
"We are incentivized to do crazy things," said Rep. Jim Himes (D-CT), who pointed out how angry diatribes delivered on the House floor made celebrities out of lawmakers.
Rep. Peter Welch (D-VT) noted that each party ignores the truth when it does not suit its purpose. "Congress is a fact-free zone," he declared.
Congress, we often fail to understand, responds to the incentive structure we have established. This was explained very well by Nobel Prize winning economist James Buchanan, who died at the age of 93 in January.
Dr. Buchanan, who taught at George Mason University, was a leading proponent of public choice theory, which assumes that politicians and government officials, like everyone else, are motivated by self-interest - getting reelected or gaining more power - and do not necessarily act in the public interest.
He argued that their actions could be analyzed, and even predicted, by applying the tools of economics to political science in ways that yield insights into the tendencies of governments to grow, increase spending, borrow money, run large deficits, and let regulations proliferate.
It was Buchanan's view that the pursuit of self-interest by modern politicians often leads to harmful results. Courting voters, for example, legislators will approve tax cuts and spending increases for projects and entitlements favored by the electorate. This leads to ever-rising deficits, public debt burdens, and increasingly large governments.
Buchanan accurately forecast that deficit spending for short-term gains would evolve into a "permanent disconnect" between government outlays and revenue. No matter which party is in power, as he predicted, government power - and deficits - continue to grow.
Because Congress refuses to eliminate non-germane amendments to key pieces of legislation, special interests are regularly rewarded. In the relief bill for Superstorm Sandy, such amendments included: $25 million to improve weather and hurricane intensity forecasting; $118 million for Northeast Corridor upgrades; $10 million for FBI salaries and expenses; $2 billion for the Federal Highway Administration to spend on roads across the country; and $16 billion for the Community Development Fund that would go not only to Sandy-affected states, but to any major disaster declarations since 2011.
While some of these items may be worthy recipients of federal money, they should be subject to the normal budgetary process and not inserted into a bill designed to help victims of the mega-storm which ripped through parts of the eastern United States in October.
Congress, it is safe to say, will continue doing business as usual until we change the incentive structure we have created. Only 9 percent of Americans may have a favorable view of Congress, but incumbents usually have little trouble being re-elected in the largely one-party districts that state legislatures have gerrymandered on their behalf. As long as we permit this incentive structure to continue, we should prepare ourselves for more of the same.
Many of the social problems we face - from poor academic achievement levels, to teenage pregnancy, to drug use and crime - cannot be properly understood without considering the impact of the absence of fathers in more and more homes.
At the present time, 15 million American children, or 1 in 3, live without a father. In every state, the portion of families where children have two parents, rather than one, has dropped significantly over the past decade. Even as the country added 160,000 families with children, the number of two-parent households decreased by 1.2 million. In Baltimore, 38 percent of families have two parents. In St. Louis, the portion is 40 percent.
The problem is particularly acute in the black community, although those who point this out are usually shunned as being politically incorrect. Among blacks, nearly 5 million children, or 54 percent, live only with their mother.
A report from the Institute for American Values notes that over the past fifty years "the percentage of black families headed by married couples declined from 78 percent to 34 percent." In the thirty years from 1950 to 1980, households headed by black women who never married jumped from 3.8 per thousand to 69.7 per thousand.
"For policymakers who care about black America, marriage matters," wrote the authors, a group of black scholars. They called marriage in black America an important strategy for "improving the well-being of African-Americans and for strengthening civil society."
In his book Enough, the respected black journalist Juan Williams points out that:
The answer to the question of how to create opportunities for the poor is to get them to take school seriously - to set high academic expectations for their children and to insist on high expectations from teachers in good schools. It is also a personal matter of self-control that begins with understanding the power of the family and putting love, romance, and children (as well as knowing how to be good parents) in their proper order.
Linda Chavez, the former head of the U.S. Civil Rights Commission, argues that the "chief cause of poverty today among blacks is no longer racism - it is the breakdown of the traditional family."
Those who make this point, Juan Williams laments, are often accused of "blaming the poor." He states:
They say this answer puts pressure on the poor. They say this with a straight face, even though nearly 70 percent of black children are born to single women, damning a high number of them to poverty, bad schools, and bad influences. They say this knowing that in 1964, in a far more hostile and racist America, 82 percent of black households had both parents in place and close to half of those households owned a business.
President Obama himself made this point in a 2008 Father's Day speech in Chicago:
If we are honest with ourselves, we'll admit that too many fathers are . . . missing . . . from too many lives and too many homes. They have abandoned their responsibilities, acting like boys instead of men. And the foundations of our families are weakening because of it. You and I know how true this is in the African-American community. We know that more than half of all black children live in single parent households, a number that has doubled since we were children.
Children who grow up without a father are five times more likely to live in poverty and commit crime. They are nine times more likely to drop out of school and 20 times more likely to end up in prison.
The Urban Institute finds that the percent of black women who are married declined from 53 percent to 25 percent over the past half century. Marriage is declining among whites and Hispanics as well, although less dramatically. The drop in marriage for white women in the past half century has been from 65 percent to 52 percent, and among Hispanic women from 67 percent to 43 percent.
A recent Department of Education study shows that a child's grades were more closely correlated with how many times the father came to a school event than with any other factor. Children with involved fathers measure as having higher IQs by age three and higher self-esteem, and, in the case of daughters, grow up to be less promiscuous.
A new study from the University of Virginia and the Institute for American Values, "The State of Our Unions," tracks the decline of marriage among Americans of all races who have high school but not college educations. By one estimate cited in the report, which was written by five family scholars, the cost to taxpayers when stable families fail to form is about $112 billion annually - or more than $1 trillion per decade.
In the 1980s, only 13 percent of children were born outside of marriage among moderately educated mothers. By 2010, the number had risen to 44 percent. The lead author of the new study, Elizabeth Marquardt, writes:
Marriage is not merely a private arrangement; it is also a complex social institution. Marriage fosters small cooperative unions - also known as stable families - that enable children to thrive, shore up communities, and help family members to succeed during good times, and to weather the bad times. Researchers are finding that the disappearance of marriage in Middle America is tracking with the disappearance of the middle class in the same communities, a change that strikes at the very heart of the American Dream.
We ignore the absence of fathers in a growing number of families and the decline of marriage itself at our peril. When, in 1965, Daniel Patrick Moynihan issued his report about the alarming rise of African American children born out of wedlock, it set off a major national discussion. The latest findings - equally disturbing and reflecting trends among Americans of all races - are being largely ignored. This may tell us a lot about the strange set of priorities that dominates what passes for public discourse. *
On the brink of the so-called "fiscal cliff," the House voted 257-167 to approve the Senate bill that would avoid major tax hikes and spending cuts. In response, the stock market soared. What went largely unremarked was the fact that neither Republicans nor Democrats made any effort to confront the real financial challenges that lie ahead. As we seem repeatedly to say when it comes to Congress dealing with the problems facing the nation, "they kicked the can down the road."
What we face - and no one in Washington seems prepared to confront - are massive structural deficits as the baby boomers start to retire in large numbers.
In 1900, 1 in 25 Americans was over the age of 65. In 2030, 17 years from now, 1 in 5 Americans will be over 65. Because we have many programs that provide guaranteed benefits to the elderly, this has major budgetary implications.
In 1960, there were about five working Americans for every retiree. By 2025, there will be just over two workers per retiree. In 1975, Social Security, Medicare and Medicaid made up 25 percent of federal spending. Today, they add up to 40 percent. Within a decade, these programs will take over half of all federal outlays.
We have postponed the problem by borrowing heavily for decades. Our debt stands at 100 percent of GDP. Federal spending on everything other than entitlements, defense, and interest on the debt has been shrinking for many years. A recent report from the National Governors Association points out that Medicaid is now the single largest item on state budgets and has grown by over 20 percent for each of the past two years.
This trend is escalating. The Peter G. Peterson Foundation calculates, using Congressional Budget Office numbers, that by 2040 we are likely to spend 10 percent of the GDP on interest payments alone - versus 1.4 percent today.
Congress and President Obama set up the artificial "fiscal cliff" scenario that would, allegedly, force them to do the right thing. Completely ignored, however, were the deep structural reforms that will eventually be needed.
Casting the budget problem as a question of whether the richest 1 or 2 percent of the population should pay more taxes is avoiding the real issues before us. The real problem we face is the bipartisan practice of promising Americans both high government benefits and low taxes at the same time. This may be democracy at work. Most Americans may want high benefits and low taxes. But no one in Washington seems prepared to tell them the hard truth - that we cannot afford benefits we are unwilling to pay for.
The nonpartisan Congressional Budget Office puts it this way:
With the population aging and healthcare costs per person likely to keep growing faster than the economy, the U.S. cannot sustain the federal spending programs that are now in place with the federal taxes as a share of GDP that it has been accustomed to paying.
Washington Post columnist Robert Samuelson faults both parties for the situation we face:
The main reason that we keep having these destructive and inconclusive budget confrontations is not simply that many Republicans have been intransigent on taxes. The larger cause is that Obama refuses to concede that Social Security, Medicare and Medicaid are driving future spending and deficits. So when Republicans make concessions on taxes (as they have), they get little in return. . . . Just as many Republicans don't want to raise taxes a penny, many Democrats don't want benefits cut a penny.
One reasonable example is the proposal to shift from the standard consumer price index (CPI) to a "chained" CPI in adjusting Social Security benefits. From 2013 to 2022, this change is estimated to reduce Social Security spending by $100 billion. Since total Social Security benefits over that decade will run into the trillions of dollars, the cut would amount to a tiny percentage of outlays. Yet Democrats in Congress rejected any serious consideration of this proposal.
As the population ages and health care costs soar, to avoid the country sinking into debilitating debt, revenue must rise and spending - particularly on Medicare, Social Security, Medicaid, and military healthcare - must be brought under control. No solution that pleases extremists on either side - those who reject any increase in revenue and those who oppose any decrease in benefits - has, at this time, any chance of becoming law.
One thoughtful member of Congress, Senator Michael Bennet (D-CO), voted against the "fiscal cliff" deal because it did not include any meaningful deficit reduction. He said:
Going over the cliff is a lousy choice and continuing to ignore the fiscal realities that we face is a lousy choice. . . . The burden of proof has to shift from the people who want to change the system to the people who want to keep it the same. I think if we can get people focused to do what we need to do to keep our kids from being stuck with this debt that they didn't accrue, you might be surprised at how far we can move this conversation.
Senator Bennet laments that:
Washington politics no longer follows the example of our parents and our grandparents who saw as their first job creating more opportunities, not less, for the people who came after them. . . . The political debate now is a zero sum game that creates more problems than solutions.
Unfortunately, things will probably have to get worse than they are at the present time before either Democrats or Republicans will begin to make the hard decisions necessary to set our society on a sustainable economic path for the future. Real leadership is hard to discover in today's Washington.
The fact that our government is increasingly out of control is becoming clear to more and more Americans. We continue to embark upon huge new public programs that, regardless of whether one agrees or disagrees with their merits, we refuse to pay for. How long can any society continue to spend well beyond its means without dire consequences? Eventually, if things continue on their present trajectory, we will see.
The tendency to spend without raising funds to pay the bills continues whichever party is in power. Current discussion of a "fiscal cliff" does not confront the reality of politicians of both parties busy subsidizing the variety of special interest groups from which they raise money to attain office. As many have said, "We have the best Congress money can buy."
The Founding Fathers, if they suddenly arrived in contemporary America, would be disappointed with what they saw. But it is unlikely that they would be surprised. They were, after all, thoughtful students of human nature, and how it influences the nature of the government under which we live.
Many things have changed in society. The Framers of the Constitution could never have foreseen the creation of automobiles, airplanes, television, computers, cell phones and other elements of our modern life. But while things around us have undergone dramatic change, what has remained the same is man himself.
Human nature, for better or worse, is unchanged. If this were not true, we could not, in the 21st century, identify with the writings of Plato or Aristotle, or the characters in Shakespeare. The teachings of Moses and Jesus are as relevant to modern man as they ever were.
The Founding Fathers were not utopians. They understood man's nature. They attempted to form a government that was consistent with, not contrary to, that nature. Alexander Hamilton pointed out that:
Here we have already seen enough of the fallacy and extravagance of those idle theories that have amused us with promises of an exemption from the imperfections, weaknesses, and evils incident to society in every shape. Is it not time to awake from the deceitful dream of a golden age, and to adopt as a practical maxim for the direction of our political conduct that we, as well as the other inhabitants of the globe, are yet remote from the happy empire of perfect wisdom and perfect virtue?
Rather than viewing man and government in positive terms, the framers of the Constitution had almost precisely the opposite view. John Adams declared that, "Whoever would found a state and make proper laws for the government of it must presume that all men are bad by nature." As if speaking to those who place ultimate faith in egalitarian democracy, Adams attempted to learn something from the pages of history.
We may appeal to every page of history we have hitherto turned over, for proofs irrefragable, that the people, when they have been unchecked, have been as unjust, tyrannical, brutal, barbarous and cruel as any king or senate possessed of uncontrollable power. . . . All projects of government, formed upon a supposition of continued vigilance, sagacity, and virtue, firmness of the people when possessed of the exercise of supreme power, are cheats and delusions. . . . The fundamental article of my political creed is that despotism, or unlimited sovereignty, or absolute power, is the same in a majority of a popular assembly, an aristocratic council, an oligarchical junto, and a single emperor. Equally bloody, arbitrary, cruel, and in every respect diabolical.
That government should be clearly limited and that power is a corrupting force was the essential perception of the men who made the nation. In The Federalist Papers, James Madison wrote:
It may be a reflection on human nature that such devices should be necessary to control the abuses of government. But what is government itself but the greatest of all reflections on human nature? If men were angels, no government would be necessary. If angels were to govern men, neither external nor internal controls on government would be necessary. In framing a government which is to be administered by men over men, the great difficulty lies in this: you must first enable the government to control the governed, and in the next place oblige it to control itself.
Yes, if the Founding Fathers arrived in Washington tomorrow, they would be disappointed - but they would not be surprised.
The official national debt of $16 trillion is growing at the rate of $4 billion a day. This, together with what the government owes its various trust funds, is more than 100 percent of gross domestic product. The states' debts are about $3 trillion, and their unfunded liabilities are approximately another $4 trillion. "Debts of this magnitude," says Michael Greve, a scholar at the American Enterprise Institute, "will not be paid. . . . Our politics aims at inspiration on the cheap."
Senator Tom Coburn (R-OK), one of the few members of Congress who appears serious about cutting unnecessary spending, points out that for the past two years the Government Accountability Office (GAO) has shown Congress more than $200 billion in duplicative spending alone. There are, for example, 47 job-training programs across nine agencies that cost $18 billion but are not producing results. GAO found dozens of other areas of costly duplication that if streamlined could improve outcomes while saving money.
GAO identified 209 federal Science, Technology, Engineering, and Mathematics (STEM) programs run by 13 federal agencies, costing taxpayers more than $3 billion annually. According to Sen. Coburn:
Duplication is only part of the problem. Our budget is full of outrageous examples of waste and mismanagement. For instance, our government has borrowed $20 million from future generations in order to send millionaires unemployment checks. Funds meant to help agencies coordinate intelligence and counterterrorism efforts at the Department of Homeland Security "fusion centers" have been spent on flat-screen TVs, SUVs for personal use, and intelligence that has been described as useless and irrelevant. What Washington is lacking is not options for savings but the political courage to make specific decisions. Millions of families and individuals in America are already living in the world of hard decisions and priorities. It's long past time for Washington politicians to join them.
Often, politicians speak of cutting spending and tax breaks that, in reality, amount to savings of very little money. Mitt Romney's pledge to cut the subsidy for PBS, home of Big Bird, is one example. And President Obama declared that, "I want to stop giving tax breaks to companies that ship jobs and factories overseas." He discussed this in his "economic patriotism" ad. Conservative columnist Timothy Carney provided this assessment:
Sounds fine, but what is he actually talking about? Not much, it turns out. One expense businesses incur is moving labor or equipment to new locations. Obama is proposing that the cost of moving facilities out of the U.S. shouldn't be deductible. Those costs should be part of the company's profits. This targeted tax penalty would raise $14 million per year, or 0.0005 percent of the federal budget, according to the Joint Committee on Taxation.
Neither party speaks of reining in expensive government programs of subsidies for corporations or agriculture. Veronique de Rugy, a senior research fellow at the Mercatus Center at George Mason University, writes that:
The big winner of the three presidential debates is government spending. Mitt Romney singled out only a few small programs that he thinks are ripe for cutting. President Obama stayed away from any specifics. . . . But the present levels of spending are not an option. . . . Everything has to be on the table. . . . The bottom line is that there is no silver bullet for balancing the budget. We didn't get in this fiscal mess overnight, and it will take us some time to get out of it.
Clearly, everything should be put on the table - including the Pentagon, which consumes 18 percent of the federal budget. Thus far, however, politicians of both parties seem unwilling to consider cutting programs that have strong constituencies. At the present time, 30 cents of every dollar government spends is borrowed. With the coming explosion in programs such as Social Security, Medicare, and Medicaid - not to mention President Obama's new health care program - we will be drowning in debt.
How bad do things have to get before our politicians begin to pay real attention? Thus far, there are no good answers to this question.
In the past presidential campaign, there was much discussion about those who receive one form or another of government assistance and the need to reduce such aid.
The government aid we seem to focus upon is that received by those at the lower end of the economic spectrum, such as food stamps.
While it is certainly proper to review all forms of government assistance, it is surprising that neither Republicans nor Democrats have had much - if anything - to say about corporate welfare. For politicians who have bailed out Wall Street and the auto industry - and who subsidize many others - to focus all of their attention on forms of government aid other than that for corporations tells us a great deal about how our current politics avoids the real problems we face. Perhaps it would be different if both parties were not so busy raising campaign funds from the very people they subsidize.
A recent Cato Institute study finds that federal business subsidies total almost $100 billion annually. This includes subsidies to small businesses, large corporations, and industry organizations. The subsidies come from programs in many federal departments, including Agriculture, Commerce, Energy, and Housing and Urban Development. As part of the national income accounts, the Bureau of Economic Analysis calculates that the federal government handed out $57 billion in business subsidies in 2010.
A recently issued report by Chris Edwards and Tad DeHaven notes that:
There are several upsides to ending federal subsidies to businesses: it would reduce the amount of money taken from taxpayers and given to big corporations; and it would reduce the incentives for political corruption. A less obvious, but no less important, reason to end corporate welfare is that an economy that doesn't depend on subsidies from government is a more entrepreneurial economy that will grow faster.
Edwards and DeHaven provide examples of the failure of government subsidization in the energy industry, with the complicity of both parties:
An early subsidy effort was the Clinch River Breeder Reactor, which was an experimental nuclear fission power plant in Oak Ridge, Tennessee, in the 1970s. This Republican-backed boondoggle cost taxpayers $1.7 billion and produced absolutely nothing in return. Then we had the Synthetic Fuels Corporation (SFC) approved by President Jimmy Carter in 1980, who called it a "keystone" of U.S. energy policy. The government sank $2 billion of taxpayer money into this scheme that funded coal gasification and other technologies before it was closed down as a failure.
More recently, the Obama administration's failures in subsidizing green energy projects are piling up - Solyndra, Raser Technologies, Ecotality, Nevada Geothermal, Beacon Power, First Solar, and Abound Solar. These subsidy recipients have either gone bankrupt or appear to be headed in that direction. The Washington Post found that, "Obama's green-technology program was infused with politics at every level."
The flow of taxpayer money to business continues to grow, whichever party is in power. A recent New York Times article - "Ties to Obama Aided in Access for Big Utility" - documented how Exelon Corp., a Chicago-based utility, used its political support for President Obama for access and profit. Exelon was one of six utilities that received the maximum $200 million stimulus grant. It also got a $600 million renewable-energy loan guarantee for a solar project in Los Angeles. One of Exelon's subsidiaries received a $200 million grant to install "smart meters" in the Philadelphia area.
Republicans and Democrats are in this together. Sheldon Richman, a senior fellow at The Future of Freedom Foundation and editor of The Freeman, points out, with regard to Republican Vice Presidential candidate Paul Ryan that:
In the Bush years, Ryan voted for everything: No Child Left Behind (which increased the centralization of education), the Medicare drug entitlement, housing subsidies, unemployment-benefits extension, the bank bailouts, and the 2008 subsidies to failing Chrysler and GM. In voting for TARP (the Troubled Asset Relief Program), Ryan said, "Madame Speaker, this bill offends my principles, but I'm going to vote for this bill in order to preserve my principles."
William O'Keefe of the George C. Marshall Institute notes that:
"Crony capitalism," a frequently used term describing firms that seek to invest with taxpayer dollars instead of share owner dollars, will not reduce unemployment, promote robust economic growth, or help the United States compete in the global economy. Reform is needed: shrinking the public trough, creating a level playing field, providing business confidence, and providing true regulatory reform would be a good start.
Thus far, neither party has had anything to say about corporate welfare. This indicates, sadly, that neither party is serious about bringing government spending under control. It is politics as usual - just what has produced the economic morass we are now in. Americans deserve something better. Things, it seems, will have to become even worse before anyone begins to tell the truth about our problems. *
Now that the 2012 election campaign has come to an end, it would be good if Americans could set partisan acrimony aside as the nation prepares to celebrate Thanksgiving.
This holiday has an interesting history and debate continues over where, in fact, the first Thanksgiving took place. Those of us who live in Virginia believe that the Old Dominion has a powerful historical case that others have tended to overlook.
This writer visited the Berkeley Plantation in Charles City County, Virginia, many years ago as the plantation prepared to host a celebration of the 350th anniversary of the first commemoration of Thanksgiving. Plantation owner Malcolm Jamieson displayed letters from President John F. Kennedy and former Massachusetts Governor John Volpe declaring that Berkeley was the site of the first formal Thanksgiving in the New World.
Berkeley is the site of other historical firsts as well. The land on which it stands was part of a grant made in 1619 by King James I to the Berkeley Company and was designated "Berkeley Hundred." On December 4, 1619, the settlers stepped ashore there and in accordance with the proprietors' instructions that "the day of our ships' arrival. . . shall be yearly and perpetually kept as a day of Thanksgiving" celebrated the first Thanksgiving Day more than a year before the Pilgrims arrived in New England.
There is much history at Berkeley. In 1781, it was plundered by British troops under Benedict Arnold. During the Civil War it served as headquarters for General McClellan after his withdrawal from the Battle of Malvern Hill. Federal troops were encamped in its fields, and transports and gunboats were anchored in the James River. While quartered here with McClellan in the summer of 1862, Gen. Butterfield composed "Taps." It is also reported that the first bourbon distilled in America was distilled at Berkeley by an Episcopal minister.
Walking around the grounds at Berkeley is to enter another world. This is where America began. It was strong men and women who built a nation on these often inhospitable shores. They made many mistakes, as people are always wont to do, but they created a new society in which, as George Washington wrote to the Hebrew Congregation at Newport, Rhode Island, there would "be none to make men afraid." We are a young country, but we are also an old one. Our Constitution is the oldest in the world, and we have continuously maintained the freedoms to which we first paid homage. There has been no period of an elimination of freedom of religion, or of the press, or of assembly. We have weathered wars and depressions. We will also weather the difficulties in which we are now embroiled. But we will do so only if Americans begin to recall their history and their values and not give assent to those who seek only to condemn and to destroy.
Several years ago, I visited a U.S. military ceremony in Italy - near Anzio - with my son Peter and grandson Dario. This visit caused me to reflect on the unique nature of American society.
It was instructive to read the names of the American dead. Virtually all nationalities, ethnic groups and religions are represented. In the 1840s, Herman Melville wrote that, "We are heirs of all time and with all nations we divide our inheritance." If you kill an American, he said, you shed the blood of the whole world.
America is more than simply another country. Visiting New Amsterdam in 1643, French Jesuit missionary Isaac Jogues was surprised to discover that 18 languages were spoken in this town of 8,000 people. In his Letters From an American Farmer, J. Hector St. John Crevecoeur wrote in 1782:
Here individuals of all nations are melted into a new race of men, whose labors and posterity will one day cause great changes in the world.
Author Mario Puzo declared that:
What has happened here has never happened in any other country in any other time. The poor who had been poor for centuries - whose children had inherited their poverty, their illiteracy, their hopelessness, achieved some economic dignity and freedom. You didn't get it for nothing, you had to pay a price in tears, in suffering, why not? And some even became artists.
As a young man growing up in Manhattan's Lower East Side, Puzo was asked by his mother, an Italian immigrant, what he wanted to be when he grew up. When he said he wanted to be a writer, she responded that, "For a thousand years in Italy, no one in our family was even able to read." But in America, everything was possible - in a single generation.
In 1866, Lord Acton, the British Liberal historian, said that America was becoming the "distant magnet." Apart from the "millions who have crossed the ocean, who should reckon the millions whose hearts and hopes are in the United States, to whom the rising sun is in the West?"
America has been a nation much loved. Germans have loved Germany, Frenchmen have loved France, Swedes have loved Sweden. This, of course, is only natural. But America has been beloved not only by native-born Americans, but by men and women of every race and nation throughout the world who have yearned for freedom. America dreamed a bigger dream than any nation in the history of man.
As we gather for our Thanksgiving celebrations it is proper that we reflect upon that first Thanksgiving in Virginia. We have come a long way since that time, and most of that way has been good. Happy Thanksgiving!
The history of the world indicates that freedom is not natural to man, but must be carefully cultivated and taught. Through most of recorded history, man's natural state has been to live under one form of tyranny or another. Freedom must be learned and carefully transmitted from one generation to another if it is to endure.
There is little effort in our contemporary society to transmit our history, our culture, and the values upon which a free society is built. In an important new book, America's Best Colleges! Really? (Crossbooks), John Howard, at 90, continues his strenuous efforts as an educator to reverse recent trends.
This book is dedicated to Angus MacDonald, the long-time editor and publisher of The St. Croix Review and to James Crawford, the founding editor of The Herald Examiner of Freeville, New York.
John Howard has lived an extraordinary life. During World War II, he served in the 745th Tank Battalion, First Infantry Division, and received two Silver Stars and two Purple Hearts. From 1960 to 1977 he was president of Rockford College. He then served as president of the Rockford Institute and at present is a Senior Fellow at the Howard Center for Family, Religion and Society.
He believes that our institutions of higher learning have let us down in carrying out their responsibility of introducing our history, culture, and values to the new generation of Americans. He quotes Aristotle: "Of all the things I have mentioned, that which contributes most to the permanence of constitutions is the adaptation of education to the form of government." And Montesquieu, in The Spirit of the Laws, analyzed various forms of government. He stated that each one had a unique relationship with the people, and if that relationship changed, that form of government would perish.
Under a despotism or tyranny, he argued, the government could last as long as the people were afraid of it, doing what they were told to do for fear of severe penalties. A monarchy could last as long as the people were loyal to the crown.
"But a democracy," writes Howard:
. . . or other self-governing regime, depended upon a virtuous populace, which voluntarily abided by the laws and other settled standards of behavior. This free society was the best form of government, and the hardest to achieve and sustain. America's free society was destined for success because the colonists who came to New England and left England for the sole purpose of finding a land where they could practice the Christian faith. . . were already deeply committed to a virtuous life, wholly suited for the government of a free society.
John Howard believes that the Founding Fathers fully understood and supported the cardinal principle proclaimed by Aristotle:
In July 1787, the Continental Congress enacted the Northwest Ordinance. It set forth the plan for the government of the residents of the Northwest Territory and the basis on which a region might qualify for statehood. Article III begins, "Religion, morality, and knowledge being necessary to good government and the happiness of mankind. . . ." Here is an acknowledgment that our self-government is dependent on religion, morality, and education in that order of importance. That document, and the Declaration of Independence, and the U.S. Constitution were so intelligently conceived that they reflect a breadth of knowledge and wisdom often said to be superior to the products of any other deliberative body in world history. Certainly, there have been no comparable accomplishments in recent times.
The stress on religion and morality was echoed in the main body of George Washington's inaugural address. American education's attention to the development of character among students was summarized in a 1979 report published by the Hastings Center. The author was Columbia Professor Douglas Sloan. He wrote:
Throughout the 19th century, the most important course in the college curriculum was moral philosophy, taught usually by the college president and required of all senior students. . . . The full significance and centrality of moral philosophy in the 19th century curriculum can only be understood in the light of the assumption held by American leaders and most ordinary citizens that no nation could survive, let alone prosper, without some common moral and social values. . . . However, moral philosophy did not carry the whole burden of forming the students' character and guiding their conduct, the entire college experience was meant above all to be an experience in character development and the moral life.
The wise political philosopher Edmund Burke declared that political liberty cannot exist unless it is sustained by moral behavior. This truth was embraced by our Founding Fathers. President John Adams' address to Congress in November 1800 was the first delivered in the new Capitol building. In it he urged:
May this residence of virtue and happiness . . . here and throughout our country, may simple manners, pure morals, and true religion flourish forever.
A statement long attributed to President James Madison reads:
We have staked the whole future of American civilization, not upon the power of government, far from it. We have staked the future of all of our political institutions upon the capacity of mankind for self-government upon the capacity of each and all of us to govern ourselves according to the Ten Commandments of God.
Alexis de Tocqueville visited America in the 1830s. His book Democracy in America is a classic description of the government and people of America: "By their practice, Americans show they feel the urgent necessity to instill morality into democracy by means of religion."
John Howard declares: "Instill morality into democracy by means of religion - De Tocqueville saw this as the only means by which liberty can be perpetuated in all democratic nations."
John Howard has dedicated his long life to promoting the values upon which a free society depends. In this book are collected a series of his speeches and essays, as well as his latest thoughts on how to preserve a free society and extend it into the future. Those who seek to understand how the values upon which such a society depends can endure into the future would do well to ponder John Howard's thoughtful words on this subject.
American education is in the grip of an epidemic of cheating on the part of students and, sad to say, teachers as well.
In August, some 125 students at Harvard University were being investigated for cheating on a final examination.
Howard Gardner, professor of cognition and education at the Harvard Graduate School of Education, conducted a study of 100 of Harvard's "best and brightest" students nearly 20 years ago. "The results of that study," he writes:
. . . surprised us. Over and over again, students told us they admired good work and wanted to be good workers. But they also told us they wanted - ardently - to be successful. They feared that their peers were cutting corners, and that if they themselves behaved ethically, they would be bested. And so they told us in effect, "Let us cut corners now and one day when we have achieved fame and fortune, we'll be good workers and set a good example." . . . a classic case of the end justifying the means.
During the past six years, Gardner and colleagues have conducted reflection sessions at elite colleges. They found "hollowness at the core." In one case, that of a dean who was fired because she lied about her academic qualifications, the most common student response was, "Everyone lies on their resumes." In a discussion of the movie, "Enron: The Smartest Guys in the Room," students were asked what they thought of company traders who manipulated the price of energy. Not one student condemned the traders.
The example set by professors, Gardner argues, is not good:
. . . all too often they see their professors cut corners - in their class attendance, their attention to student work, and, most flagrantly, their use of others to do research. Most embarrassingly, when professors are caught - whether in financial misdealings or even plagiarizing others' work - there are frequently no clear punishments . . .
In surveys of high school students, the Josephson Institute of Ethics has found that about three-fifths admit to having cheated in the previous year. Michael Josephson, president of the institute, states that:
Few schools place any meaningful emphasis on integrity, academic or otherwise, and colleges are even more indifferent than high schools.
Some teachers have actually encouraged students to cheat and some have cheated themselves when reporting test scores. In July 2011, a cheating scandal erupted in school systems in and around Atlanta. Georgia state investigators found a pattern of "organized and systemic misconduct" dating back over 10 years. One hundred seventy-eight teachers, and the principals of half the system's schools, aided and abetted students who were cheating on their tests. Top administrators ignored news reports of this cheating. A New York Times story described "a culture of fear and intimidation that prevented many teachers from speaking out."
This was not an isolated incident. In a feature on school testing, CBS News reported:
New York education officials found 21 proven cases of teacher cheating. Teachers have read off the answers during a test, sent students back to correct wrong answers, photocopied secure tests for use in class, inflated scores, and peeked at questions and then drilled those topics in class before the test.
William Damon, professor of education at Stanford and a senior fellow at the Hoover Institution, notes that
It is practically impossible to find a school that treats academic integrity as a moral issue by employing revealed incidents of cheating to communicate to its student body values such as honesty, respect for rules, and trust. . . . I have noticed a palpable lack of interest among teachers and staff in discussing the moral significance of cheating with students. The problem here is the low priority of honesty in our agenda for schooling specifically and child-rearing in general.
In the past, Professor Damon points out:
. . . there was not much hesitancy in our society about using a moral language to teach children essential virtues such as honesty. For us, today, it can be a culture shock to leaf through old editions of the McGuffey Readers, used in most American schools until the mid-20th century, to see how readily educators once dispensed unambiguous moral lessons to students. . . . As the Founders of our Republic warned, the failure to cultivate virtue in citizens can be a lethal threat to any democracy. . . . Honesty is no longer a priority in many of the settings where young people are educated. The future of every society depends upon the character development of its young. It is in the early years of life, when basic virtues that shape character are acquired. . . . Honesty is a prime example of a virtue that becomes habitual over the years if practiced consistently - and the same can be said about dishonesty.
The cheating scandals among students and teachers are, of course, simply the tip of the iceberg of our society's retreat from honesty - and honor. Ethical lapses on the part of Wall Street, Congress, and other sectors of society seem to be growing. Each time a political leader speaks, the fact-checkers fill columns reporting the misstatements. Didn't anyone think that if we stopped teaching morals and ethics - and the difference between right and wrong - society would lose its moral compass? It appears no one did.
New York Times columnist James Reston once noted that writing newspaper columns about the events of the day is like making "footprints in the sand," quickly covered by something new.
Some writers, however, while focusing upon the events of their own time, write for the future as well, applying their philosophy and worldview to the events of the day, but focusing upon the timeless principles that reflect their view of the past as well as the future.
One such writer who graced late 20th century America was Joe Sobran, who died in 2010. He was referred to by Pat Buchanan as perhaps "the finest columnist of our generation."
In 1972, Sobran began working at National Review and stayed for 21 years, 18 as senior editor. He also spent 21 years as a commentator on the CBS Radio "Spectrum" program series and was a syndicated columnist, first with the Los Angeles Times and later with the Universal Press Syndicate.
In a new book, Joseph Sobran: The National Review Years, the Fitzgerald-Griffin Foundation has gathered together some of Sobran's best articles from 1974 to 1991. These cover a wide range of topics, including Christianity, the media, the Constitution, motion pictures, Shakespeare, and baseball. In the foreword, Buchanan writes that, "What is extraordinary about this book of essays is the range of Joe's interests and the quality of his insights."
One essay deals with an incident in 1987 when a gang of young toughs in Queens, New York, beat up three young men. When one of the three, trying to escape, was hit by a car and killed, Mayor Ed Koch called the crime a "racial lynching," because the perpetrators were white and the victims black. The media referred to America as an increasingly "racist" society, even though all indications pointed toward improving race relations.
In what came to be known as the "Howard Beach Incident," Sobran saw a built-in bias on the part of the media at work:
All news is "biased" in that it's the selection of information in accordance with tacit standards of relevance. We notice the bias when the news is chosen to fit a "super story" the audience doesn't necessarily subscribe to. . . . The super story behind the Howard Beach Story was Racist America. The very fact that it was empirically atypical made it all the more dramatic as a synecdoche. . . . The media are so saturated by myth that it's fair to see "news" as an early stage on the assembly line whose final product is a New York Times editorial.
In a review of the book Whatever Happened to the Human Race? by C. Everett Koop and Francis Schaeffer, Sobran confronts the growing advocacy of abortion, infanticide, and euthanasia - what he calls the "cheapening of life." He declares that
. . . as the abortion issue shows, the definition of "defective" has quickly broadened to mean anything not wanted by people in a position to kill. There is the case of a young couple who asked for a prenatal test to determine the sex of the child they are expecting: they said they feared a boy would be a hemophiliac. When the test showed it was a girl, they admitted they actually wanted a boy, because they preferred a boy. The girl was aborted.
In an essay on censorship and stereotypes, Sobran points out that
Religion is still a real and vital part of American life, but it is amazingly "underrepresented" (to use the liberal term) in mass communications. This is not a matter of conspiracy or even conscious avoidance, but of unconscious habit, much like modes of dress: religion simply isn't in the intellectual wardrobe of media people.
Sobran's 1990 essay, "The Republic of Baseball," is accompanied by a picture on National Review's cover of the author in Yankee uniform at Yankee Stadium. To all Americans who grew up in the mid-20th century - particularly men - baseball was central, as Sobran shows:
Not to play means missing out on the common experience of the male sex. And once you get into it, it's easy to get absorbed. In Ypsilanti, Michigan, I spent long winters studying baseball statistics to while away the endless cold grey days until the snow melted. Then, around mid-March, we started our new season in the park, or any empty field. . . . Baseball wasn't just something we played and watched. It was something we lived.
Beyond this, writes Sobran:
The statistics, the discreteness of individual performance, set against the game's stable history, give achievement in baseball a permanence and stature that other sports can seldom confer. . . . The rules are really impartial. . . . There are no "racist" balls and strikes . . . only balls and strikes. . . . In politics, men are elected to bend the rules in someone's favor. It shouldn't surprise us when they break them too. A key difference between baseball and democracy is that in baseball the winners don't get to rewrite the rules. And it never occurs to the losers to blame the rules for their losses.
Sobran was an admirer of the British author G. K. Chesterton, to whom he has been compared. He reports on attending a meeting of the Chesterton Society in Toronto in 1979 and recalls Chesterton's early opposition to "the science of eugenics," whose "consequences he foresaw." Advocates of eugenics included Oliver Wendell Holmes, who supported mandatory sterilization. Of Chesterton, Sobran wrote:
His defense of the poor was rooted in a defense of the family and of liberty against those state planners who pined for population refinement. It is not hard to see the likeness to those enlightened souls who think the state should now promote contraception and abortion among the poor. . . . It reminds us that we who are alive today are the lucky survivors of Nazism and related evils; those of the next generation will be the lucky survivors of abortion "reform."
There is, of course, much more excellent writing - and many thoughtful insights - in these essays, including several advancing Sobran's thesis that the 17th Earl of Oxford was, in fact, the author of the works attributed to William Shakespeare.
In the afterword, author Ann Coulter states that
Joe could say in a sentence what most writers would need an entire column to express. His specialty was to make blindingly simple points that would cut through mountains of sophistry.
One need not agree with all of Sobran's views to appreciate the keen intelligence and moral perspective he brought to his work.
Fran Griffin and the Fitzgerald-Griffin Foundation are to be congratulated for publishing this collection of Joe Sobran's essays. Hopefully, through this book, a new generation of readers will be made aware of some of the best writing of the recent past.
Milton Friedman, the 1976 winner of the Nobel Prize for Economic Science, and the pre-eminent American advocate of free enterprise, was born on July 31, 1912 - a hundred years ago. This is an appropriate time to commemorate his life and reflect upon his achievements in advancing freedom.
It was Milton Friedman's belief that free enterprise was the only form of economic organization consistent with other freedoms. In his important book Capitalism and Freedom, he points out that
The kind of economic organization that provides economic freedom directly, namely competitive capitalism, also promotes political freedom because it separates economic power from political power and in this way enables one to offset the other.
It was his view that
Political freedom means the absence of coercion of a man by his fellow men. The fundamental threat to freedom is power to coerce, be it in the hands of a monarch, a dictator, an oligarchy, or a momentary majority. The preservation of freedom requires the elimination of such concentration of power to the fullest possible extent and the dispersal and distribution of whatever power cannot be eliminated - a system of checks and balances. By removing the organization of economic activity from the control of political authority, the market eliminates this source of coercive power. It enables economic strength to be a check to political power rather than a reinforcement.
Businessmen, Friedman liked to point out, believe in maximizing profits, not necessarily in genuinely free markets. He declared that
With some notable exceptions, businessmen favor free enterprise in general, but are opposed to it when it comes to themselves.
In a lecture entitled "The Suicidal Impulse of the Business Community," he declared that
The broader and more influential organizations of businessmen have acted to undermine the basic foundation of the free market system they purport to represent and defend.
What would Milton Friedman think of the recent bailout of failing banks, supported by both Republicans and Democrats? Wall Street Journal columnist David Wessel points out that
He didn't trust central bankers. He blamed the Bank of Japan for the deflation of the 1990s and the Fed for the Great Depression of the 1930s and the Great Inflation of the 1970s. He would, if his co-author Anna Schwartz is any clue, have condemned the bank bailouts of recent years. "They should not be recapitalizing firms that should be shut down," she told The Journal in 2008.
The issue he devoted most of his time to in his later years was school choice for all parents, and his Friedman Foundation for Educational Choice is dedicated to that cause. He used to lament that
We allow the market, consumer choice, and competition to work in nearly every industry except for the one that may matter most, education.
Friedman was also proud to have been an influential voice in ending the military draft in the 1970s. When his critics argued that he wanted a military of mercenaries, he would respond:
If you insist on calling our volunteer soldiers "mercenaries," I will call those whom you want to draft into service involuntary "slaves."
One of Friedman's former students at the University of Chicago, the respected economist Thomas Sowell, recalls that
Like many, if not most, people who became prominent opponents of the left, Professor Friedman began on the left. Decades later, looking back at a statement of his own from his early years, he said: "The most striking feature of this statement is how thoroughly Keynesian it is." No one converted Milton Friedman, either in economics or in his views on social policy. His own research, analysis, and experience converted him. As a professor, he did not attempt to convert students to his political views. I made no secret of the fact that I was a Marxist when I was a student . . . but he made no effort to change my views. He once said that anybody who was easily converted was not worth converting. I was still a Marxist after taking Professor Friedman's class. Working as an economist in the government converted me.
As a student of Friedman's in 1960, Sowell, who is black, notes that
I was struck by two things - his tough grading standards and the fact that he had a black secretary. This was years before affirmative action. People on the left exhibit blacks as mascots. But I never heard Milton Friedman say that he had a black secretary, though she was with him for decades. Both his grading standards and his refusal to try to be politically correct increased my respect for him.
In the late 1960s, Friedman explained that "there is no such thing as a free lunch." If the government spends a dollar, that dollar has to come from producers and workers in the private economy.
Friedman once said that
The true test of any scholar's work is not what his contemporaries say, but what happens to his work in the next 25 or 50 years. And the thing that I will really be proud of is if some of the work I have done is still cited in the textbooks long after I'm gone.
It seems certain that Milton Friedman will not only be cited in the textbooks but that men and women who value freedom everywhere in the world will recognize in him one of its prophetic voices. He clearly identified the intrinsic link between freedom of speech, religious freedom, the freedom to govern oneself - and economic freedom that, as he often pointed out, is simply democracy applied to the marketplace.
Honor and integrity used to be important in the American society. This writer, as a student at the College of William and Mary, signed the school's Honor Code, which declared that anyone who stole or cheated would be immediately removed from the college. This was the first Honor Code at an American college. It reflected the values of that society. Professors left the room when students took exams, and dormitory rooms were often left unlocked. Honor was more valued than anything that might be gained from dishonor.
Now, our society seems to have embraced a different standard of value, or non-value. Consider just a few recent developments.
* Seventy-one students at New York's elite Stuyvesant High School were involved in cheating on the state's Regents examinations in Spanish, U.S. History, English, and Physics. Stuyvesant admits just the top tier of 8th graders. The students involved in cheating were not expelled - and were not even given a failing mark on the exam. Instead, they remain enrolled in the school and will be able to retake the exam. Commenting on this, Frank W. Abagnale, the subject of the book, movie, and Broadway musical "Catch Me if You Can," declared: "We do not teach ethics at home, and we do not teach ethics in school because the teacher would be accused of teaching morality. In most cases, we do not teach ethics in college or even instill ethics in the workplace."
* A report issued in mid-July by former FBI Director Louis Freeh, after an eight-month investigation, concluded that four of Penn State University's most powerful leaders, including head football coach Joe Paterno and the school's president, covered up allegations of sexual abuse by an assistant coach because they were concerned about negative publicity. Confronted with reports that Jerry Sandusky lured boys to the State College campus where he sexually abused them, Penn State's leadership deferred to "a culture of reverence for the football program" and repeatedly "concealed Sandusky's activities from authorities." Freeh said: "Our most saddening and sobering finding is the total disregard for the safety and welfare of Sandusky's child victims by the most senior leaders at Penn State."
* Congressional ethics, we know, is an oxymoron. Recently, a report was issued by Rep. Darrell Issa (R-CA), chairman of the House Committee on Oversight and Government Reform, about how a group of lawmakers and their staff benefited from a "VIP" loan program not available to the public that waived fees, cut interest rates, and eased borrowing standards. Countrywide Financial offered the special loans in an effort to dissuade lawmakers from voting for stiffer banking regulations. The report names names, with many lawmakers still in Congress, but it did not include a letter from Issa calling for the ethics panel to investigate the matter. Without the letter, the ethics panel is not required to do a thing.
* In Washington, D.C., there are calls for the resignation of Mayor Vincent Gray. He has refused to answer questions about whether or not he knew, before or during the 2010 Democratic mayoral primary, about a secret, well-funded, and illegal "shadow campaign" on his behalf. More than $653,000 was unlawfully used to purchase materials and hire workers to secure his victory over then-Mayor Adrian Fenty two years ago - money allegedly supplied by a prominent businessman with significant contractual interests with the U.S. Government. Mayor Gray's campaign slogan was "character, integrity, leadership." Three members of the D.C. Council - and a host of others in the city - have called for the mayor to resign.
Many books have been written about financial misdeeds on Wall Street, about child abuse and cover-ups within the Roman Catholic Church, and among the Orthodox Jewish community in New York. While it may be true that bad news is news while good news is not, the bad news is increasingly widespread.
Our crisis in values has been building for some time. The May-June 1988 issue of Harvard Magazine published an eleven-page essay, "Ethics, the University, and Society," by President Derek Bok. He declares that
The American nation is greatly in need of some means to civilize new generations of the people, preparing them to serve as honest, benevolent, productive citizens of a free society, and all of Harvard's deliberations and studies and initiatives and earnest concerns have not resulted in any effective means of Character Education.
Derek Bok concludes:
Despite the importance of moral development to the individual and the society, one cannot say that higher education has demonstrated a deep concern for the problem. . . . The subject is not treated as a serious responsibility worthy of sustained discussion and determined action. . . . If this situation is to change there is no doubt where the initiative must lie. Universities will never do much to encourage a genuine concern of ethical issues or to help their students to acquire a strong and carefully considered set of moral values until presidents and deans take the lead.
Needless to say, things have deteriorated a great deal since then. It is not only the values of average Americans that appear to be in free fall; those of our elites may be leading the way. Who in Washington or Wall Street - or at Penn State - is held responsible for what they do?
New York Times columnist David Brooks laments the decline of today's elites. In the past, he writes, elites
. . . had a stewardship mentality, that they were temporary caretakers of institutions that would span generations. They cruelly ostracized people who did not live up to their codes of gentlemanly conduct and scrupulosity. They were insular and struggled with intimacy, but they did believe in restraint, reticence, and service.
Today's elite, in Brooks' view,
. . . is more talented and open but lacks a self-conscious leadership code. The language of meritocracy (how to succeed) has eclipsed the language of morality (how to be virtuous). Wall Street firms, for example, now hire on the basis of youth and brains, not experience and character. Most of their problems can be traced to this. If you read the e-mails from the Libor scandal you get the same sensation from reading the e-mails in so many recent scandals: these people are brats; they have no sense that they are guardians for an institution the world depends on; they have no consciousness of their larger social role.
How to reverse our moral decline is not a subject that is being widely discussed in our contemporary society. It should be. If it is not addressed, all of us - and our children and grandchildren - will be losers.
Government spending, everyone realizes, is out of control. During President George W. Bush's tenure from 2001 through 2009, the national debt doubled. According to Brian Riedl of the Heritage Foundation, the prescription drug bill alone is projected to add nearly $400 billion in its first decade.
Both parties are clearly in this together. According to FactCheck.org:
Spending under President Obama remains at a level that is quite high by historical standards. Measured as a percentage of the nation's economic production, it reached the highest level since World War II in fiscal 2009.
Since 2009, the Obama administration has run trillion-dollar deficits. Writing in Roll Call, Dustin Siggins and David Weinberger report that
If we average spending as a percentage of GDP under Bush from 2001 to 2009, it comes to just over 20 percent. . . . If we do the same for Obama from 2010 to 2012, we get about 24 percent, quite a bit higher than the historical average.
One place to cut spending is to eliminate Depression-era farm subsidies. But since each farm state has two senators, some Democratic, some Republican, this has been difficult to do. Now, in our era of economic decline and skyrocketing government debt, it is time to take a serious look at this wasteful program.
In 2012, the Department of Agriculture is projected to spend $22 billion on subsidy programs for farmers. Veronique de Rugy of the Mercatus Center at George Mason University notes that this program was introduced in the 1930s to help struggling small family farms but
. . . the subsidies have become the poster child for government welfare for the affluent. Farm households have higher incomes, on average, than do non-farm U.S. households. Figures from the USDA show that in 2010 the mean farm household income was $84,400, up 9.4 percent from 2009. This is 25 percent higher than the mean U.S. household income of $67,350 as reported by the U.S. Census Bureau for 2010.
Beyond this, farm subsidies tend to flow toward the largest and wealthiest farm businesses. According to the Environmental Working Group database, in 2010, 10 percent of farms received 74 percent of all subsidies. These recipients are large commercial farms with more than $250,000 in sales, mostly producing crops tied to political interests. The Cato Institute's Tad DeHaven and Chris Edwards calculate that more than 90 percent of all farm subsidies go to farmers of just five crops - corn, wheat, soybeans, rice, and cotton. For every federal dollar spent on farm subsidies, 19 cents goes to small farms, 19 cents to intermediate (middle-income) farms, and 62 cents to the largest commercial farms.
In de Rugy's view,
The tragedy is that while cronyism benefits the haves, all other Americans - especially those with lower incomes - suffer from the resulting distortions. Take the domestic sugar industry as an example. The government protects its producers against foreign competitors by imposing U.S. import quotas, and against low prices generally with a no-recourse loan program that serves as an effective price floor. As a result . . . U.S. consumers and businesses had to pay twice the world price of sugar on average since 1982.
According to the Center for Responsive Politics, the U.S. farm lobby contributes millions of dollars to political campaigns to maintain federal support for the subsidies. The agribusiness sector as a whole spent $124 million on lobbying in 2011. Over the past decade, the amount of money this sector has spent on lobbying has grown by more than 69 percent. Between 1995 and 2009, 23 farmers currently serving in Congress signed up for farm subsidies.
The fact is that the U.S. has the richest, most productive agricultural sector, and the best fed population in the world. Boosted by $136.3 billion in gross sales to other countries, U.S. net farm income hit a record $98.1 billion in 2011. A new Economist Intelligence Unit report ranks the U.S. as the most "food secure" nation in the world, based on the affordability and quality of its food supply. The U.S. provides the equivalent of 3,748 calories per day for each of its roughly 314 million people. That is nearly 1,500 calories more than the minimum necessary for a healthy life.
Still, every five years Congress drafts a farm bill as if U.S. agriculture could not possibly exist in a real free-market economy. At this very moment, farm-state lawmakers, and the lobbyists who swarm around them, are preparing to extend this program of subsidies. The Senate has already passed a measure priced at $969 billion over the next decade. The House has gone on summer vacation without acting, as Republicans weigh the election-year political risks of proceeding with that chamber's own near-trillion-dollar measure.
Editorially, The Washington Times points out that
Like the bank bailouts and TARP, the farm bill illustrates the capture of the legislative process by special interests. The last farm bill in 2008 was the focus of $173.5 million in lobbying expenditure. . . . This is all money spent on what the Mercatus Center's Matthew Mitchell calls "unproductive entrepreneurship" where people are organizing and expending their talent to become rent seekers, and the end result is wealth redistribution, not wealth creation. Real entrepreneurship innovates in ways that are socially useful. Cronyism diverts resources . . . into a system that rewards privileges to favored groups. In the case of the 2008 farm bill, recipients of subsidies of $30,000 or more had an average household income of $210,000.
Many in Congress who speak of their belief in the free market and who decry huge government deficits, nevertheless seem ready to extend farm subsidies into the future. This tells us, unfortunately, that what we are witnessing is politics as usual. And both parties are in it together. No one needs to wonder why we can't bring government spending under control. This is why.
The growth of presidential power in recent years represents a serious threat to representative government. The idea of the executive "executing" the laws passed by the elected representatives of the people in the Congress seems to those in power - whether Republicans or Democrats - to be an old-fashioned notion.
When President Obama unilaterally called a halt to deportation proceedings for certain unauthorized immigrants who came to the U.S. as minors, the eligibility requirements roughly tracked the requirements of the Dream Act, which was never passed by Congress.
In an interview with a panel of Latino journalists last fall, the president said:
This notion that somehow I can just change the laws unilaterally is just not true. We live in a democracy. You have to pass bills through the legislature and then I can sign it.
Gene Healy, vice president of the Cato Institute, notes that,
As it happens, Obama's "royal dispensation" for young immigrants is hardly the most terrifying instance of administration unilateralism. In fact, as a policy matter, it's a humane and judicious use of prosecutorial resources. But given the context, it stinks. It looks uncomfortably like implementing parts of a bill that didn't pass and - carried out as it was with great fanfare and an eye to the impending election - the move sits uneasily with the president's constitutional responsibility to "take care that the laws be faithfully executed."
Or consider the president's claim of "executive privilege" in withholding information about the Justice Department's Operation Fast and Furious, which deliberately put assault weapons in the hands of Mexican drug cartels as part of a sting, and then lost track of hundreds of them. A Border Patrol agent was killed in 2010, apparently by one of these guns.
Executive privilege, affirmed by the Supreme Court in U.S. v. Nixon, is historically limited to the president's own discussions. President Obama has now extended it to his attorney general. This contravenes the president's promises of transparency.
Recent legislation has given legal sanction to the president's power to detain a person indefinitely on suspicion of affiliation with terrorist organizations or "associated forces" - a broad, vague power that can be abused without real oversight from the courts or the Congress.
At the same time, American citizens can now be targeted for assassination or indefinite detention. Recent laws have also canceled the restraints in the Foreign Intelligence Surveillance Act of 1978 to allow unprecedented violations of our right to privacy through warrantless wiretapping and government mining of our electronic communications.
According to The New York Times, President Obama is personally deciding upon every drone strike in Yemen and Somalia and the riskiest ones in Pakistan, assisted only by his own aides. It is reported that suspects are now being killed in Yemen without anyone knowing their names, using criteria that have not been made public.
Editorially, The Times declares that no president
. . . should be able to unilaterally order the killing of American citizens or foreigners located far from a battlefield - depriving Americans of their due process rights - without the consent of someone outside his political inner circle. How can the world know whether this president or a successor truly pursued all methods short of assassination, or instead - to avoid a political charge of weakness - built up a tough-sounding list of kills?
To permit President Obama - or any president - to execute American citizens without judicial review and outside the theater of war gives him the power of judge, jury, and executioner without any check or balance. This is clearly an abuse of presidential power.
For many years, under both parties, the power of the executive has been growing. In his classic work, The American Presidency, written in 1956, Professor Clinton Rossiter writes:
The presidency has the same general outlines as that of 1789, but the whole picture is a hundred times magnified. The president is all the things he was intended to be, and he is several other things as well. . . . The presidency today is distinctly more powerful. It cuts deeply into the power of Congress. In fact it has quite reversed the expectations of the framers by becoming itself a vortex into which these powers have been drawn in massive amounts. It cuts deeply into the lives of the people; in fact, it commands authority over their comings and goings that Hamilton himself might tremble to behold. The outstanding feature of American constitutional development is the growth of the power and prestige of the Presidency.
He also makes the corollary explicit:
The long decline of Congress has contributed greatly to the rise of the presidency. The framers . . . expected Congress to be the focus of our system of government.
That was 1956. The power of the presidency has steadily expanded since then, no matter which party was in power.
When Republican presidents have expanded the power of the presidency, Republicans in the Congress have acquiesced. When Democratic presidents expanded the power of the executive, Democrats in the Congress have embraced that expansion. The result is an executive branch increasingly unaccountable to the elected representatives of the people. That is not the system the authors of the Constitution had in mind. We would do well to return to the constitutional philosophy of checks and balances, and a division of powers. An all-powerful executive is a threat to freedom and accountability, as the Framers of the Constitution understood very well as a result of their own experience and of the experience of the world.
Government spending and government debt have been skyrocketing. Under President George W. Bush, the debt reached unprecedented levels. Under President Barack Obama, it has exploded still further. Whichever party is in power, government gets larger and government debt increases.
Our political system, sadly, rewards big spenders. Every organized special interest group in American society - farmers, teachers, labor unions, manufacturers, Wall Street financiers, realtors, etc. - wants one or another form of government subsidization for itself.
They all have active political action committees, which promise rewards for those who open the government coffers to them, and penalties for those who do not. The incentive is clearly one-sided. As Democrats used to say in the New Deal days, the way to electoral success is clear: "Spend and spend, tax and tax, elect and elect." Now, Republicans too have learned this lesson. Since neither Republicans nor Democrats are too eager to antagonize voters by raising taxes to pay for their extravagant spending, the budget deficits grow each year.
In May, for example, President Obama reauthorized the Export-Import Bank, raising its lending authority 40 percent to $140 billion by 2014, one day before the 78-year-old federal bank would have been shut down had he not signed the bill. During the 2008 presidential campaign, Mr. Obama called the bank "little more than a fund for corporate welfare."
Despite President Obama's frequent criticism of corporate jets, the bill includes $1 billion in subsidies for jet manufacturers, which have experienced a steep decline in demand in recent years. Export-Import Bank supporters in the business community - who speak of "free markets" but campaign vigorously for government subsidies - welcomed the President's support. John Murphy, vice president for international affairs at the U.S. Chamber of Commerce, said that the President's action was "great news for thousands of American workers and businesses of all sizes." The National Association of Manufacturers - and both Republicans and Democrats in Congress - supported the reauthorization of the Export-Import Bank.
Tim Phillips, president of Americans for Prosperity, described the Bank in these terms:
In (its) nearly 80 years, the official credit export agency of the United States has financed over $450 billion in purchases. Ex-Im allows the federal government to pick winners and losers in the market, and all too often, that leads to back room deals and government cronyism. . . . It is a heinous practice that gives money to a small number of politically connected companies while leaving taxpayers with the risk. . . . The American taxpayer does not exist in order to keep businesses from failing.
Republicans and Democrats are co-conspirators in this enterprise. The incentive structure for both parties is precisely the same. Republicans may talk of the "free market" and argue that Democrats are against it, but both parties raise their funds on Wall Street and in corporate boardrooms, and both parties have supported bailouts of failed businesses and subsidies for others.
Voters say that they are against big government, and oppose deficit spending, but when it comes to their own particular share, they act in a different manner entirely. This is nothing new. Longtime Minnesota Republican Congressman Walter Judd once recalled that a Republican businessman from his district
. . . who normally decried deficit spending berated me for voting against a bill which would have brought several million federal dollars into our city. My answer was, "Where do you think federal funds for Minneapolis come from? People in St. Paul?". . . My years in public life have taught me that politicians and citizens alike invariably claim that government spending should be restrained - except where the restraints cut off federal dollars flowing into their cities, their businesses, or their pocketbooks.
If each group curbed its demands upon government it would be easy to restore health to our economy. Human nature, however, leads to the unfortunate situation in which, under representative government, people have learned that through political pressure they can vote funds for themselves that have, in fact, been earned by the hard work of others.
This point was made 200 years ago by the British historian Alexander Tytler:
A democracy cannot exist as a permanent form of government. It can only exist until the voters discover they can vote themselves largess out of the public treasury. From that moment on, the majority always votes for the candidates promising the most benefits from the public treasury - with the result that democracy collapses over a loose fiscal policy, always to be followed by a dictatorship.
Hopefully, we can avoid fulfilling this prediction. It is an illusion to think that such a thing as "government money" exists. The only money which government has is what it first takes from citizens. Many years ago, Senator William Proxmire (D-Wisconsin) pointed out that no one ever petitions members of Congress to "leave us alone." Everyone who comes before Congress, he lamented, wants something. Members of Congress - of both parties - have the same incentive: to give each group what it wants to ensure its support in the future. The result is that government spending - and government debt - steadily grows.
Unless we find a way to change this incentive structure, it seems unlikely that we will bring government spending - and government deficits - under control. As the presidential campaign gets under way, neither party is addressing this crucial question. Politics as usual, unfortunately, will not help us to resolve the very real problems we face.
In June, the effort by labor unions and others to recall Wisconsin Governor Scott Walker was soundly defeated. Governor Walker had not committed any crime or other indiscretion. He was being recalled only because he had implemented the policies he promised during his campaign.
In February 2011, Walker announced his plan to limit the subjects covered by collective bargaining for public employees, compel them to contribute more to their healthcare and pension plans, stop government from collecting dues automatically on unions' behalf, and require public employee unions to hold annual certification elections. Once in office, he implemented these policies.
Wisconsin's recall policy is questionable and, in many ways, a threat to representative democracy. Voters can remove officeholders over policy disagreements at the conclusion of their terms. Wisconsin has had 14 elected state government officials involved in recall elections during the past year alone. The state's largest newspaper, the Milwaukee Journal Sentinel, endorsed Governor Walker, arguing that elected officials should not be recalled simply for policy differences. Some 60 percent of voters in exit polls agreed.
Beyond this, the union effort in Wisconsin has focused a much-needed spotlight on the excesses of public pensions. Over the years, politicians have given in to union demands for higher pensions - rather than wage increases - because they knew that such pensions would be paid many years later, under someone else's watch. Now, these bills are coming due - and many states and cities are in no position to pay them.
In New Jersey, Governor Chris Christie is seeking to reform his state's sick-pay policies. Current law allows public workers to accumulate unused sick pay, which they can cash in upon retirement. "They call them 'boat checks,'" Christie says.
Know the reason they call them boat checks? It is the check they use to buy their boat when they retire - literally.
He tells the story of the town of Parsippany, where four police officers retired at one time and were owed a collective $900,000 in unused sick pay. The municipality didn't have the money and had to float a bond in order to make the payments.
Governor Christie wants to end sick-pay accumulation. "If you're sick, take your sick day," he says. "If you don't take your sick day, know what your reward is? You weren't sick - that was the reward." Newsweek notes that,
It was by the force of such arguments that Christie was able to overcome the powerful teachers' union and force educators to help pay for their healthcare costs, and to win broad rollbacks in benefits for the state's huge public workforce.
Shortly after the vote in Wisconsin, there were landslide victories in San Jose and San Diego, California, of ballot measures meant to cut back public sector retirees' benefits. Warren Buffett calls the costs of public-sector retirees a "time bomb." They are, he believes, the single biggest threat to our fiscal health.
In California, total pension liabilities - the money the state is legally required to pay its public-sector retirees - are 30 times its annual budget deficit. Annual pension benefits rose by 2,000 percent from 1999 to 2009. In Illinois, they are already 15 percent of general revenue and growing. Ohio's pension liabilities are now 35 percent of the state's GDP.
Commentator Fareed Zakaria notes that,
The accounting at the heart of the government pension plans is fraudulent, so much so that it should be illegal. Here's how it works. For a plan to be deemed solvent, employees and the government must finance it with regular monthly contributions as determined by assumptions about investment returns of the plan. The better the investment returns the less the state has to put in. So states everywhere made magical assumptions about investment returns.
David Crane, an economic adviser to former California Governor Arnold Schwarzenegger, points out that state pension funds have assumed that the stock market will grow 40 percent faster in the 21st century than it did in the 20th. While the market has grown 175 times during the past 100 years, state governments are now assuming that it will grow 1,750 times its size over the next 100 years.
Inflated retirement benefits are a major reason for dramatic cuts in everything else that state and local governments spend money on. Last year, California spent $32 billion on employee pay and benefits, up 65 percent over the past 10 years. In that same time period, spending on higher education fell 5 percent. Three-quarters of San Jose's discretionary spending goes to its public safety workers alone. The city has closed libraries, cut back on park services, laid off many civil servants, and asked the rest to take pay cuts. By 2014, San Jose, the 10th largest city in the country, will be served by 1,600 public workers, one-third the number it had 25 years ago.
The Pew Center on the States has quantified the problem. In 2008, the states had set aside $2.35 trillion to pay for public employees' retirement benefits while owing $3.35 trillion in promises. A year later, the trillion-dollar gap had grown by 26 percent. The massive expanding obligation cuts into the provision of government services. Former Los Angeles Mayor Richard Riordan notes that:
A lot of things are going to happen dramatically over the next couple of years and then people will listen. If you close down all the parks and all the libraries, this is political dynamite.
In Wisconsin, as a result of Governor Walker's reforms, the state has balanced its two-year budget without tax increases and local school districts have used their new bargaining power to save money without layoffs or significant increases in class size. While leading Democrats, such as President Obama and former President Clinton, supported the recall effort in Wisconsin, many others, such as the liberal Democratic mayor of San Jose, recognize that it is time to bring the excesses of public sector unions under control. Editorially, The Washington Post declared,
. . . those who voted for Mr. Walker to show approval for his policies, and not just disapproval for the recall itself, had plausible reasons for doing so. . . . Public employee union leaders are pledging to fight . . . new laws in court. . . . They would do better to engage governments in a good-faith effort to restructure and preserve public services for the long term. States and localities face genuine problems, and the unions share responsibility for them.
In recent days, with the extraordinary publicity surrounding the Trayvon Martin case in Florida and an escalation in overheated racial rhetoric, one would think that the real problems facing black Americans are a result of "white racism."
Needless to say, this overlooks the fact that race relations in America have dramatically improved in recent years and that we are well on our way to achieving a genuinely color-blind society.
Writing in The Washington Post, columnist Richard Cohen points out that most Americans
. . . do not know what a miracle has been pulled off - how a nation that once contained so much bigotry now contains so little. I am not a fool on these matters, I think, and I recognize . . . the residue of bigotry, but still the big picture is that Obama is a black man and is the president of the United States. Mamma, can you believe it?
Cohen provides this assessment:
Some insist that not much has changed. They cite a persistent racism. There are such examples - not all that many, actually - but they are newsworthy because they are exceptions to the rule, not what we expect.
Recently, Wesley A. Brown, the first African American to graduate from the U.S. Naval Academy, died. He was the sixth black man admitted, and the only one to successfully endure the racist hazing that had forced the others to quit. He graduated in 1949. Cohen writes that,
When I read the obituary on Wesley A. Brown, I was shocked once again at the depth and meanness of our racism and just plain dumbstruck by how far we have come. The new field house at the Naval Academy is named for Brown. He called it, "The most beautiful building I've ever seen," but he was wrong. It's not a building. It's a monument.
This is not to say that the black community does not face many problems. These problems, however, are not a result of white racism but of internal forces at work within the black community. One such serious problem is crime.
Each year, roughly 7,000 blacks are murdered; 94 percent of the time, the murderer is another black person. According to the Bureau of Justice Statistics, between 1976 and 2011, there were 279,384 black murder victims. The 94 percent figure suggests that 262,621 were murdered by other blacks.
Though blacks are 13 percent of the national population, they account for more than 50 percent of homicide victims. Nationally, the black homicide victimization rate is six times that of whites, and in some cities, it is 32 times that of whites. Blacks are also disproportionately victimized by violent personal crimes, such as assault and robbery.
Economist Walter Williams points out that,
The magnitude of this tragic mayhem can be viewed in another light. According to a Tuskegee Institute study, between the years 1882 and 1998, 3,446 blacks were lynched at the hands of whites. Black fatalities during the Korean War (3,075), Vietnam War (7,243) and all the wars since 1980 (8,107) come to 18,425, a number that pales in comparison with black loss of life at home. Tragically, young black males have a greater chance of reaching maturity on the battlefields of Iraq and Afghanistan than on the streets of Philadelphia, Chicago, Detroit, Oakland, Newark, and other cities.
Sadly, the question is hardly ever discussed by black leaders. In Williams' view,
A much larger issue is how might we interpret the deafening silence about the day-to-day murder in black communities compared with the national uproar over the killing of Trayvon Martin. Such a response by politicians, civil rights organizations, and the mainstream news media could easily be interpreted as blacks killing other blacks is of little concern, but it's unacceptable for a white to kill a black person.
Several black leaders have started to discuss black-on-black crime. When President Obama commented about the Martin case, William Fair, president of the Urban League of Greater Miami, said that "the outrage should be about us killing each other, about black-on-black crime." He asked rhetorically:
Wouldn't you think to have 41 people shot (in Chicago) between Friday morning and Monday morning would be much more newsworthy and deserve much more outrage?
Former NAACP leader Pastor C. L. Bryant said that the rallies organized by Al Sharpton and Jesse Jackson suggest there is an epidemic of "white men killing young black men," adding, "The epidemic is truly black-on-black crime. The greatest danger to the lives of young black men are young black men."
Beyond this, argues Walter Williams,
Not only is there silence about black-on-black crime, there's silence about black racist attacks on whites - for example, the recent attacks on two Virginian-Pilot newspaper reporters set upon and beaten by a mob of young blacks (in Norfolk, Virginia). The story wasn't even covered by their own newspaper. In March, a black mob assaulted and knocked unconscious, disrobed and robbed a white tourist in downtown Baltimore. Black mobs have roamed the streets of Denver, Chicago, Philadelphia, New York, Cleveland, Washington, Los Angeles and other cities, making unprovoked attacks on whites and running off with their belongings.
This is not a new story. In 1974, this writer (with Lincoln Review editor J. A. Parker) wrote a book entitled What The Negro Can Do About Crime (Arlington House), which included an extensive discussion of black-on-black crime and the manner in which black leaders refused to confront it.
On Page 54 is the following passage:
Criticizing those Negroes who have not spoken out against crime, Roy Wilkins, executive director of the NAACP, declared that, "except for a few voices, Negro citizens have given consent to robbery, muggings, assaults, and murder by their silence. They have been intimidated by a curious twisting of the 'us blacks together' philosophy that holds that complaining of black criminals is somehow 'betraying the race.'" This is nonsense. One can be proud of being black without embracing every black mugger, rapist, and auto thief.
For those in the black community genuinely concerned about the future prospects of its young men and women, focusing upon the black-on-black crime wave that now engulfs our inner cities, and has broken out into attacks upon the community at large, is an important place to begin. Thus far, however, this has largely been ignored in favor of repeated attacks upon "white racism," which, by any standard, has receded dramatically. Such racial demagoguery ill serves the very community in whose name it is launched. It is time for a radically different direction.
We are now in an era when we are told that a proper goal for society is for "everyone" to go to college. At the same time, there is a serious mismatch between the jobs now available and the number of individuals qualified to fill them. Manufacturing companies, for example, cannot find enough high-tech machinists, and they are subsidizing tuition at local community colleges in a desperate effort to fill vacancies.
The Cato Institute's Andrew Coulson reports that we spend - in real terms - almost twice as much per student in a public school as we did in 1970. Despite this, academic achievement has remained flat or worsened. Vocational training, a long and important path to gainful employment, has been pushed aside.
Vocational education once played an important part in our schools, designed for those who were not suited for, or had no interest in, higher education. About forty years ago, it began to fall out of fashion, in part because it became a civil rights issue. As Time recently noted:
Vocational education was seen as a form of segregation, a convenient dumping ground for minority kids in Northern cities.
Former New York City schools chancellor Joel Klein says that,
This was a real problem. And the vocational education programs were pretty awful. They weren't training the kids for specific jobs or for certified skills. It really was a waste of time and money.
In an important article, "Learning That Works," Time writer Joe Klein declares that,
Unfortunately, the education establishment's response to the voc-ed problem only made things worse. Over time, it morphed into the theology that every child should go to college (a four-year liberal arts college at that) and therefore every child should be required to pursue a college-prep course in high school. The results have been awful. High school dropout rates continue to be a national embarrassment, and most high school graduates are not prepared for the world of work. The unemployment rate for recent high school graduates who are not in school is a stratospheric 33 percent. The results for even those who go on to higher education are brutal: four-year colleges graduate only about 40 percent of the students who start them, and two-year community colleges graduate less than that, about 25 percent.
Diane Ravitch, a professor of education at New York University, says that,
College for everyone has become a matter of political correctness. But according to the Bureau of Labor Statistics, less than a quarter of new job openings will require a bachelor of arts degree. We're not training our students for the jobs that actually exist.
At the same time, the U.S. is beginning to run out of welders, glaziers, and auto mechanics - jobs that actually keep things running, and cannot be outsourced.
In Arizona and a few other states, things are beginning to change. Vocational education there is now called career and technical education (CTE) and attracts about 27 percent of students. These students score higher on the state's aptitude tests, and are more likely to graduate from high school and go on to higher education, than those who do not enroll.
"It's not rocket science," says Sally Downey, superintendent of the East Valley Institute of Technology in Mesa, Arizona, 98.5 percent of whose students graduate from high school. "It's just finding something they like and teaching it to them with rigor."
At the auto shop at East Valley, there are 40 late-model cars and the latest in diagnostic equipment, donated by Phoenix auto dealers, who are in need of trained technicians. "If you can master the computer science and electronic components," Downey says, "you can make over $100,000 a year as an auto mechanic."
Carolyn Warner, a former Arizona schools chancellor, says tech track students
. . . are more focused, so they're more likely to graduate from two- and four-year colleges. Those who graduate from high school with a certificate of technical expertise in a field like auto repair or welding are certainly more likely to find jobs.
At East Valley, there are 38 programs, with more coming. There are firefighter, police, and EMT programs; a state-of-the-art kitchen for culinary training; and instruction in welding (which can pay $40 per hour), aeronautics, radio broadcasting, marketing, and massage therapy. Almost all of these courses lead to professional certificates in addition to high school diplomas, and many of the students are trained by employers for needed technical specialties.
An interesting example of business participation in technical and vocational education can be seen in the case of a new public school in Brooklyn, New York, called P-Tech, or Pathways in Technology Early College High School. Started last September, it is a partnership of the New York City Department of Education, the New York City College of Technology, the City University of New York, and IBM, whose head of corporate social responsibility, Stanley Litow, used to be the city's deputy schools chancellor.
The goal is to create a science and tech-heavy curriculum to prepare students - some of whom would be the first in their families to graduate from high school - for entry and mid-level jobs at top tech-oriented companies. Each student gets an IBM mentor and there is also a core curriculum focused on English, math, science, and technology.
P-Tech students will graduate with not only a high school diploma but an associate's degree as well. This is important, since 63 percent of American jobs will require postsecondary training by 2018. The U.S. economy will create more than 14 million new jobs over the next decade, but only for people with at least a community college degree. These jobs - positions like dental hygienist, medical laboratory technician, aircraft mechanic and entry level software engineer - will allow millions entry into the middle class. Many of them will require serious technology skills.
Harvard Business School professor Rosabeth Moss Kanter argues that as much as a third of the increase in unemployment in recent years can be attributed to a mismatch between skills and jobs. The gap is greatest in positions that require more than a high school diploma but less than a bachelor's degree. Companies feel that schools are simply not turning out graduates with the skills they need. That was an impetus for IBM's role with New York's P-Tech.
Chicago Mayor Rahm Emanuel is setting up five new STEM schools - the acronym stands for science, tech, engineering and math - in partnership with IBM, Microsoft, Verizon, Cisco and other companies.
Vocational education deserves a serious second look by school systems across the country. Training young men and women for jobs that actually exist in our economy - something our current educational system is not doing very well - is certainly worth doing, both for the sake of the young people involved and for the health of our larger society and economy. *
The facts in the case of the killing of Trayvon Martin in Sanford, Florida remain unclear. As the trial proceeds, such facts should be revealed.
By any standard, the shooting of an unarmed 17-year-old is a tragedy. Did Martin attack the alleged shooter, George Zimmerman, who claims he was defending himself? Was Zimmerman animated by racial animus? All of this, as we move forward, will, one hopes, become known.
What we have seen, however, is a rush to judgment, particularly by those who seem to have a vested interest of their own in painting a bleak picture of race relations in the United States.
In an interview with the Los Angeles Times, the Rev. Jesse Jackson explained that with the election of President Obama:
. . . there was this feeling that we were kind of beyond racism. . . . That's not true. This victory has triggered tremendous backlash. Blacks are under attack.
The New Black Panther Party (NBPP), involved in voter intimidation in Philadelphia but never prosecuted, has offered a $10,000 bounty for the capture of George Zimmerman. The Orlando Sentinel asked NBPP spokesman Mikhail Muhammad whether the call for a bounty was incitement. The response: "An eye for an eye, a tooth for a tooth." In an interview with CNN's Anderson Cooper, Muhammad said that black people were not obliged to obey "the white man's law."
On Capitol Hill, Rep. Bobby Rush (D-IL), who is black, was ousted from the House floor for violating the chamber's dress code after attempting to deliver a statement while wearing a gray hoodie with the hood pulled over his head. Rush contended that the hoodie Trayvon Martin was wearing symbolized the "racial profiling" that led to his death. "Racial profiling has got to stop," Rush said. "Just because someone wears a hoodie does not make them a hoodlum."
Writing in The Washington Post, Reniqua Allen of the New America Foundation argues that the election of a black president has made it more difficult to talk about race in America. In her view:
The Obama presidency is "post-racial" only in the sense that it gives us an excuse not to grapple with race anymore . . . I have encountered many people who seem to believe . . . that Obama's win is proof that America has reached the mountaintop. What more is there to say about race, they ask me, after this country so proudly and overwhelmingly elected a black president? They cite success stories as disparate as Oprah Winfrey, Jay-Z, and former Time Warner chief Dick Parsons. . . . Even the most well-intentioned white people, who fundamentally understand the challenges of race in America, often can't understand why race, as a subject to wrestle with, can never be "over."
There is no doubt that racial problems have not disappeared overnight with the election of a black president. Still, the evidence that race relations have been steadily improving is clear, and it is not helpful for various black spokesmen - from Jesse Jackson to Al Sharpton to Spike Lee - to use any incident, such as the one in Sanford, Florida, to publicly proclaim that nothing - or very little - has, in fact, changed.
Things in the Trayvon Martin case have clearly gotten out of hand. Marcus Davonne Higgins, a Los Angeles man, sent a tweet to several celebrities including what he thought was the street address of alleged shooter George Zimmerman. Film director Spike Lee didn't check but re-tweeted the incorrect address to the 250,000 people who follow him on Twitter.
Columnist Gregory Kane, who is black, reports that:
Suddenly the Sanford home of Elaine McClain, 70, and her 72-year-old husband, David McClain, started receiving hate mail and threats. George M. Zimmerman does not and has never lived at the address that Lee and others published on Twitter. But William George Zimmerman, Elaine McClain's son from a previous marriage, lived there at one time. Higgins had tweeted the wrong address. . . . Lee, an African-American who's always trying to prove how black he is, and how down with the brothers he is, probably couldn't resist what must have come naturally to him. He decided to retweet the address, the better to make a statement about the Martin shooting. The McClains had to move from their home to a hotel. . . .
Despite the demagoguery we have seen in the wake of this incident in Florida, there are abundant signs that America is really moving in the direction of becoming a color-blind society. According to a Manhattan Institute study of U.S. Census data released late in January, residential segregation has been dramatically curtailed. The study, which examined census results from thousands of neighborhoods, found that the nation's cities are more racially integrated than at any time since 1910, and that all-white enclaves "are effectively extinct."
"There is now much more black-white neighborhood integration than 40 years ago," said Professor Reynolds Farley of the University of Michigan's Population Studies Center. "Those of us who worked on segregation in the 1960s never anticipated such decline."
At the same time, interracial marriages in the U.S. have climbed to 4.8 million - a record 1 in 12. A Pew Research Center study, released in February, details an America where interracial unions and mixed-race children are challenging typical notions of race.
"The rise in interracial marriage indicates that race relations have improved over the past quarter-century," said Daniel Lichter, a sociology professor at Cornell University.
Mixed-race children have blurred America's color line. They often interact with others on either side of the racial divide, and frequently serve as brokers between friends and family members of different racial backgrounds.
Black Americans are optimistic about the future. A 2011 survey conducted by the Washington Post-Kaiser-Harvard Poll found that in the midst of our economic downturn, 60 percent of blacks said they believed their children's standard of living would be better than their own, while only 36 percent of whites held this view. On the eve of President Obama's inauguration, 69 percent of black respondents told CNN pollsters that Martin Luther King's vision had been "fulfilled."
From 2002 to 2007, the number of black-owned businesses grew by 60.5 percent to 1.9 million, more than triple the national rate of 18 percent, according to the Census Bureau. Black Americans hold positions of responsibility in every aspect of our society - from President, to Governor, to Attorney General, to Supreme Court justice. We have, in recent years, had two black Secretaries of State. There is no position in our society to which black Americans cannot aspire.
Whatever facts finally emerge in the Trayvon Martin case, we must reject those racial demagogues who seek every opportunity to deny racial progress and to promote themselves as leaders of an embattled and isolated minority. Our society has made dramatic progress. Certainly, there is more progress to be made in the future. But no incident - such as the one in Florida - should be used as a means to deny that progress and paint a dark - and untrue - picture of a society that has moved dramatically to overcome the racial barriers of the past. Those who engage in such tactics are not friends of the black community but may, in the end, be doing it as much harm as genuine racists.
At the present time, many are speaking of launching a pre-emptive strike against Iran. Washington Post columnist Dana Milbank writes that:
It's beginning to feel a lot like 2003 in the capital. Nine years ago . . . there was a similar feeling of inevitability - that despite President George W. Bush's frequent insistence that "war is my last choice," war in Iraq was coming.
In the case of Iraq, one of the key reasons given for launching our attack was that Saddam Hussein was in possession of weapons of mass destruction. This, of course, turned out not to be the case. Once again, some are urging an attack upon Iran because of that country's nuclear program. Our experience in Iraq should give us pause. Not only did we go to war with a country that had not attacked us, had no weapons of mass destruction, and no connection with the terrorists who were responsible for 9/11, but we did serious damage to our economy and lost untold numbers of American lives in an effort which now seems difficult to explain and understand.
Everyone agrees that Iran does not currently have nuclear weapons. U.S. intelligence believes that Iran is several years from achieving a nuclear capacity, and it remains unclear whether Iranian leaders have made a decision to move in that direction. Those who have studied this region are most critical of those who call for war.
Gary Sick, who served on the National Security Council staff as its expert on Iran during that country's Islamic revolution, does not envision a situation in which Iran decides to break out and build a bomb, unless it is first attacked. Actually crossing the nuclear threshold would be "inviting an attack," Sick said, and would not be in Tehran's interest. But if Iran doesn't build a bomb, its demonstrated capability to do so, Sick explains, will make it a member of a small club of nations, such as Japan, Brazil and Sweden, that can acquire a nuclear weapon if they break away from the Non-Proliferation Treaty. In either case, Iran's goal is to assert its position as a major player in the region, one that the world should take seriously and with which it should consult.
The International Atomic Energy Agency (IAEA) has documented that Iran is putting all the pieces in place to have the option to develop nuclear weapons at some point. If Supreme Leader Ayatollah Ali Khamenei decides to produce a bomb, Iran is believed to have the technical capability to produce a testable nuclear device in a year or so and a missile-capable device in several years. But as Director of National Intelligence James Clapper told the Senate Armed Services Committee on February 16, it does not appear that Khamenei has made this decision.
Colin Kahl, an associate professor at Georgetown University's School of Foreign Service, who was deputy assistant secretary of defense for the Middle East from 2009 to 2011, argues:
Khamenei is unlikely to dash for a bomb in the near future because IAEA inspectors would probably detect Iranian efforts to divert low-enriched uranium and enrich it to weapons-grade level at declared facilities. Such brazen acts would trigger a draconian international response. Until Iran can pursue such efforts more quickly or in secret - which could be years from now - Khamenei is unlikely to act.
A full page ad in The Washington Post was headlined: "Mr. President: Say No to War of Choice With Iran." The signatories included General Joseph Hoar (USMC, Ret.), Brigadier General John H. Johns (USA, Ret.), Major General Paul Eaton (USA, Ret.), Tom Fingar, former Deputy Director of National Intelligence for Analysis, and Paul Pillar, former National Intelligence Officer for the Near East and South Asia. They declare:
The U.S. military is the most formidable military force on earth. But not every challenge has a military solution. Unless we, or an ally, are attacked, arms should be the option of last resort. Our brave servicemen and women expect you to exhaust all diplomatic and peaceful options before you send them into harm's way. Preventing a nuclear-armed Iran is rightfully your priority and your red line. Fortunately, diplomacy has not been exhausted and peaceful solutions are still possible. Military action is not only unnecessary, it is dangerous - for the United States and for Israel. We urge you to resist the pressure for a war of choice with Iran.
In Israel, opinion is sharply divided over the question of pre-emptive war. Many respected Israelis believe that a pre-emptive attack against Iran would be a serious mistake for Israel and would do it serious long-term harm. Political scientist Yehezkel Dror, an Israel Prize winner and founding president of the Jewish People Policy Institute, says that with regard to Iran, Israel needs to rely on "ultimate deterrence": an attack on Tehran's nuclear facilities would be counterproductive, and the real danger Israel faces is a gradual wearing away of its staying power.
"Assuming you attack, then what?" he says.
In five years, they will recuperate with absolute determination to revenge. The idea that an Israeli attack will make Iran into a peace-loving country is not on my horizon. I don't know anything like this in history. I know the opposite from history. . . . Iran has a very low probability of being a suicidal state. They have a long culture, a long history, and they are much more involved in the Shia-Sunni conflict than the Israeli side issue. I think no one has any doubt that if Israel's future existence is in danger it will use mass killing weapons.
The Jerusalem Report notes that:
Three men once most closely involved in Israeli efforts to stop Iran - former Mossad chiefs Meir Dagan (2002-2011), Efraim Halevy (1998-2002) and Danny Yatom (1996-1998) - all see a lone Israeli military attack as a last resort, to be avoided if at all possible.
Speaking at the Hebrew University last May, Dagan derided an Israeli strike as "a stupid idea" that might not achieve its goals. It could lead to a long war and, worse, could give Iranian leaders justification to build a nuclear weapon. In Dagan's view, precipitate Israeli action could break up the current anti-Iranian consensus, leading to less pressure on Iran, not more.
According to The Jerusalem Report:
Dagan holds that there is still time; last year he estimated that Iran would not have a nuclear weapon before 2015. . . . Efraim Halevy says Israel should recognize that it is a regional power and act like one. He says the country is too strong to be destroyed and the Israeli people should not have existential fears about Iran or anything else. . . . Israel's strategy should be to work with its allies to convince the Iranian regime to change course without force coming into play. In Halevy's view, this is achievable since the Iranian regime is dedicated primarily to its own survival and will likely back down if it feels threatened by even more crippling sanctions. Israel should be using its international connections to ratchet up pressure on the Iranian regime, while preparing a military option if, and only if, all else fails.
It seems clear that if Iran were ever to develop and use a nuclear weapon there would be massive retaliation, endangering the country's entire population. During the Cold War, the Soviet Union was armed to the teeth with nuclear weapons, and never used them precisely because of a fear of retaliation, what became known as Mutual Assured Destruction. Iran would have to be suicidal to even think of using such a weapon.
The available evidence is that Iran is not suicidal. General Martin Dempsey recently explained that he viewed Iran as a "rational actor." Although some protested this characterization, Time's Fareed Zakaria points out:
Dempsey was making a good point. A rational actor is not necessarily a reasonable actor or one who has the same goals or values that you or I do. A rational actor is someone who is concerned about his survival.
Compared with radical revolutionary regimes like Mao's China - which spoke of sacrificing half of China's population in a nuclear war to promote global Communism - the Iranian regime has been rational and calculating in its actions.
In an essay in the Washington Monthly, former senior U.S. intelligence official Paul Pillar writes:
More than three decades of history demonstrate that the Islamic Republic's rulers, like most rulers elsewhere, are overwhelmingly concerned with preserving their regime and their power - in this life, not some future one.
For the most powerful country in the world even to contemplate a pre-emptive war against a country that has not attacked us, has no nuclear weapons, and may not yet have decided to pursue them would itself be an irrational act. It is time for a serious debate - and serious discussion of the consequences of any such action. And if the time comes when the U.S. decides that an attack on Iran does make sense, it should take the form of a declaration of war by the U.S. Congress, as called for in our Constitution.
Respect for private property is an essential element of a free society. In his Discourse on Political Economy, Rousseau writes that:
It should be remembered that the foundation of the social contract is property; and its first condition, that every one should be maintained in the peaceful possession of what belongs to him.
In The Prince, Machiavelli notes that, "When neither their property nor their liberty is touched, the majority of men live content."
In our own society, there have been increasing efforts to limit the rights of property owners. Fortunately, efforts are now growing to reverse such trends.
In March, the Supreme Court ruled that an Idaho couple facing ruinous fines for attempting to build a home on private property that the federal government considered protected wetlands may challenge an order from the Environmental Protection Agency.
This case was considered the most significant property rights case on the court's docket this year, with the potential to change the balance of power between landowners and the EPA in disputes over land use, development, and the enforcement of environmental regulations.
Critics called the EPA action an example of overreach, as the property in question was a small vacant lot in the middle of an established residential subdivision. The government argued that allowing EPA compliance orders to be challenged in court could severely delay actions needed to prevent imminent ecological disasters.
Justice Antonin Scalia, writing for a unanimous court, said that Michael and Chantell Sackett are entitled to appeal the EPA order, rejecting the agency's argument that allowing landowners timely challenges to its decisions would undermine its ability to protect sensitive wetlands.
In the decision, Justice Scalia wrote:
The law's presumption of judicial review is a repudiation of the principle that efficiency of regulation conquers all. And there is no reason to think that the Clean Water Act was uniquely designed to enable the strong-arming of regulated parties into "voluntary compliance" without the opportunity for judicial review - even judicial review of the question whether the regulated party is within the EPA's jurisdiction.
The EPA issues nearly 3,000 administrative compliance orders a year that call on suspected violators of environmental laws to stop what they're doing and repair the harm they have caused. Business groups, homebuilders, road builders and agricultural interests all came out in opposition to the EPA in the case.
Mr. Sackett said that the Supreme Court ruling affirmed his belief that "the EPA is not a law unto itself." He said that, "The EPA used bullying and threats of terrifying fines, and has made our life hell for the past five years."
Senator John Barrasso (R-Wyoming) said that:
This decision delivers a devastating blow to the Obama administration's "War on Western Jobs." This victory by one Western couple against a massive Washington bureaucracy will inspire others to challenge the administration's regulatory overreach.
The case stemmed from the Sacketts' purchase of a 0.63-acre lot for $23,000 near Priest Lake, Idaho, in 2005. They had begun to lay gravel on the land, located in a residential neighborhood, when they were hit by an EPA compliance order informing them that the property had been designated a wetland under the Clean Water Act. Justice Scalia noted that the property bore little resemblance to any popular concept of a wetland, protected or not.
The Pacific Legal Foundation in Sacramento, which represented the Sacketts, called it
. . . a precedent-setting victory for the rights of all property owners. . . . The Supreme Court's ruling makes it clear that EPA bureaucrats are answerable to the law in the courts like the rest of us.
There are also efforts under way to stop the abuses of the policy of eminent domain. In February, the Virginia General Assembly gave its first approval to a constitutional amendment restoring the sanctity of private property. The measure was made necessary by the 2005 Supreme Court decision in Kelo v. New London, that gave towns and cities free rein to grab land - not for public uses - but for the use and benefit of well-connected developers.
Over the years, the Supreme Court has expanded the scope of government takings by redefining "public use." The Washington Times declares that:
Originally, the term was applied to such things as parks, roads or rail lines - all of which were open for use by the entire community. The high court elasticized the concept to include land intended for a public "purpose" such as eliminating blight or other catch-all categories related to public safety. The Kelo court went further to rule that economic growth, and the tax revenue that would accrue from it, was sufficient to justify a land grab.
Virginia Attorney General Kenneth Cuccinelli and Governor Robert McDonnell have been leading the fight to reform the state's property laws over the objections of the developer interests that, until now, had succeeded in blocking the amendment from consideration. The measure clarifies that eminent domain may be used only for purposes that are truly public. Land could not be transferred by the government to private entities to generate more tax dollars.
Christina Walsh, of the Institute for Justice, which argued the Kelo case before the Supreme Court, states that:
The power of eminent domain is supposed to be for "public use" so government can build things like roads and schools. . . . But starting with the wildly unsuccessful urban renewal efforts of the 1940s and 1950s, "public use" has been stretched to mean anything that could possibly benefit the public. . . . It has been demonstrated time and again that eminent domain is routinely used to wipe out black, Hispanic, and poorer communities, with less political capital and influence in favor of developers' grand plans.
Groups across the political spectrum have recognized the need to limit this abuse of power. The diverse coalition has included the League of United Latin American Citizens, the National Federation of Independent Business and the Farm Bureau. There is now a bipartisan bill, H.R. 1433, making its way through the House that would strip a city of federal economic development funding for two years if the city takes private property to give to someone else for private use.
In all these cases - the Supreme Court decision concerning the EPA, the proposed constitutional amendment in Virginia - and the legislation now being considered in the House, we see a commitment to restore private property rights, an essential ingredient of a genuinely free society. All of these efforts should be encouraged and supported.
The question of economic inequality has become an important part of our national conversation. Recently, the Congressional Budget Office supplied hard data on the widening economic gap. Among Western countries, America stands out as the place where economic and social status is most likely to be inherited.
What is less often discussed are the reasons for this disparity. A key element has been the dramatic changes that have taken place in recent years in family life.
In 1965, the respected liberal intellectual, and later Senator from New York, Daniel Patrick Moynihan, wrote a controversial report on the perilous state of the black family, pointing out that 24 percent of births among blacks and 3 percent among whites were out of wedlock. In retrospect, we can see that the decline in the American family was only beginning. Today, out-of-wedlock births account for 73 percent of births among blacks, 53 percent among Latinos, and 29 percent among whites.
Recently, a front-page article in The New York Times reported that more than half of births to mothers under age 30 now occur out of wedlock. Many are casting aside the notion that children should be raised in a stable two-parent family.
The economic class divide that is attracting increasing attention cannot be considered outside of an understanding of the lifestyle choices of those involved. Almost 70 percent of births to high school dropouts and 51 percent to high school graduates are out of wedlock. Among those with some college experience, the figure is 34 percent and for those with a college degree, just 8 percent.
The breakdown of the family has a significant impact upon children. Children in two-parent families, University of Virginia sociologist Bradford Wilcox shows, are more likely to "graduate from high school, finish college, become gainfully employed, and enjoy a stable family life themselves."
In the new book, Coming Apart: The State of White America, 1960-2010, Charles Murray of the American Enterprise Institute, who focuses on white Americans so that the trends he describes cannot be attributed to the particular problems of minority groups, sees a significant decline in what he considers America's founding virtues - industriousness, honesty, marriage, and religiosity - over the last 50 years.
That decline, he illustrates, has not been uniform among different segments of the white population. Among the top 20 percent in income and education, he finds that rates of marriage and church attendance, after falling marginally in the 1970s, have plateaued at a high level since then. And these people have been working longer hours than ever before.
In contrast, among the bottom 30 percent, those indicators started falling in the 1970s, and have been plunging ever since. Among this group, he reports, one-third of men age 30 to 49 are not making a living, one-fifth of women are single mothers raising children, and nearly 40 percent have no involvement in a secular or religious organization. The result is that children being raised in such settings have the odds stacked against them.
Discussing Murray's book, columnist Michael Barone declares that:
These findings turn some conventional political wisdom on its head. They tend to contradict the liberals who blame increasing income disparity on free-market economics. In fact it is driven in large part by personal behavior and choices. They also undermine the conservatives who say that a liberation-minded upper class has been undermining traditional values to which more downscale Americans are striving to adhere. Murray's complaint against upscale liberals is not that they are libertines but that they fail to preach what they practice.
Society does not, of course, move only in a single direction. Some indicators of social dysfunction have improved dramatically, even as traditional families continue to lose ground. There has, for example, been a dramatic decline in teenage pregnancies among all racial groups since 1990. There has also been a 60 percent decline in violent crime since the mid-90s.
Still, something is clearly happening to the traditional working-class family. Part of it, of course, is a reduction in the work opportunities available to less-educated men as many unskilled jobs move abroad to cheaper labor markets, such as China. Adjusted for inflation, entry-level wages of male high school graduates have fallen, and the share of such workers in the private sector with health benefits has dropped sharply; by 2009, it was down to 29 percent.
In 1996, sociologist William Julius Wilson published When Work Disappears: The New World of the Urban Poor, in which he argued that much of the social disruption among African-Americans popularly attributed to collapsing values was actually caused by a lack of blue-collar jobs in urban areas.
As with all complex social problems, there are many causes. Charles Murray makes an important point about the importance of marriage and family in fostering economic security and well-being, something which cannot be ignored in confronting the question of economic inequality.
Writing in Time, Rich Lowry, editor of National Review, notes that:
No one wants to be preachy about marriage when everyone knows its inevitable frustrations. . . . At the very least, though, we should provide the facts about the importance of marriage as a matter of child welfare and economic aspiration. As a society, we have launched highly effective public-education campaigns on much less momentous issues, from smoking to recycling. It's not hard to think of a spokeswoman. Michelle Obama is the daughter in a traditional two-parent family and the mother in another one that even her husband's critics admire. If she took up marriage as a cause, she could ultimately have a much more meaningful impact on the lives of children than she will ever have urging them to do jumping jacks. For now, the decline of marriage is our most ignored national crisis. As it continues to slide away, our country will become less just and less mobile.
There is growing evidence that our colleges and universities are not teaching students what they need to compete for jobs in our high-tech international economy.
A 2010 study published by the Association of American Colleges and Universities found that 87 percent of employers believe that higher-education institutions have to raise student achievement if the U.S. is to be competitive in the global market. Sixty-three percent say that recent college graduates do not have the skills they need to succeed. And, according to a separate survey, more than a quarter of employers say entry-level writing skills are deficient.
A recent book, Academically Adrift: Limited Learning on College Campuses, by Richard Arum of New York University and Josipa Roksa of the University of Virginia, points out that gains in critical thinking, complex reasoning, and writing skills are either
. . . exceedingly small or nonexistent for a large proportion of students. It has been found that 36 percent of students experience no significant improvement in learning (as measured by the Collegiate Learning Assessment) over four years of higher education.
Most universities do not require the courses considered core education subjects - math, science, foreign languages at the intermediate level, U.S. government or history, composition, literature, and economics.
The American Council of Trustees and Alumni (ACTA) has rated schools according to how many of the core subjects are required. A review of more than 1,000 colleges and universities found that 29 percent of schools require two or fewer subjects. Only 5 percent require economics. Less than 20 percent require U.S. government or history.
ACTA President Anne Neal declares:
How can one think critically about anything if one does not have a foundation of skills and knowledge? It's like suggesting that our future leaders only need to go to Wikipedia to determine the direction of our country.
Eight years ago, leaders at the University of Texas set out to measure something few in higher education had thought to do - how much their students learn before graduation. The answer that emerged was: not very much. The conclusion is based on results from a 90-minute essay test given to freshmen and seniors that aims to gauge gains in critical thinking and communication skills. Both the University of Texas and several hundred other public universities have joined the growing accountability movement in higher education in an effort to quantify collegiate learning on a large scale.
Last year, University of Texas freshmen scored an average 1261 on the assessment, which is graded on a scale similar to that of the SAT. Seniors averaged 1303. Both groups scored well, but seniors fared little better than freshmen. "The seniors have spent four years there, and the scores have not gone up that much," says New York University's Richard Arum.
Needless to say, it is not only our colleges that seem not to be properly preparing our students. Our high schools have fallen dramatically behind in teaching algebra, geometry, and trigonometry. This means, writes economist Walter Williams, that:
There are certain relatively high-paying careers that are probably off-limits for life. These include careers in architecture, chemistry, computer programming, engineering, medicine and certain technical fields. For example, one might meet all of the physical requirements to be a fighter pilot, but he's grounded if he doesn't have enough math to understand physics, aerodynamics and navigation. Mathematical ability provides the disciplined structure that helps people to think, speak, and write more clearly.
Drs. Eric Hanushek and Paul Peterson, senior fellows at the Hoover Institution, looked at the performance of our young people compared with their counterparts in other nations in their Newsweek article, "Why Can't American Students Compete?" last year. In the latest international tests administered by the Organization for Economic Cooperation and Development, they found, only 32 percent of U.S. students ranked proficient in math - coming in between Portugal and Italy, but far behind South Korea, Finland, Canada, and the Netherlands. Seventy-five percent of Shanghai students tested proficient. In the U.S. only 7 percent could perform at an advanced level in mathematics.
In a 2009 New York Times article, "Do We Need Foreign Technology Workers?," Dr. Vivek Wadhwa of Duke University said:
. . . 49 percent of all U.S. science and engineering workers with doctorates are immigrants, as were 67 percent of the additions to the U.S. science and engineering work force between 1995 and 2006. And roughly 60 percent of engineering Ph.D. students and 40 percent of master's students are foreign nationals.
Recently, President Obama proposed making kids stay in school until they are 18. This would not do much to address the nation's educational woes, say education specialists. "It's not the slam bang that it looks like," said Russ Whitehurst, director of the Brown Center on Education Policy at the Brookings Institution. "It's not like you raise the age to 18 and they're going to go ahead and graduate - they're just going to stay in school."
There is much talk about the need for "everyone" to go to college - and very little discussion about what is actually being taught in our colleges. Professor Richard Vedder of Ohio University argues that:
The number going to college exceeds the number capable of mastering higher levels of intellectual inquiry. This leads colleges to alter their mission, watering down the intellectual content of what they do.
Simply put, colleges dumb down courses so that the students they admit can pass them.
Professor Walter Williams notes that:
Much of American education is a shambles. Part of a solution is for colleges to stop admitting students who are unprepared for real college work. That would help reveal the shoddy education provided at the primary and secondary school levels. But college administrators are more interested in larger numbers of students because they translate into more money.
Beyond this, the nation's security is also at risk if schools do not improve, warns a report by a panel led by former Secretary of State Condoleezza Rice and Joel I. Klein, a former chancellor of New York City's school system.
"The dominant power of the twenty-first century will depend on human capital," the report said. "The failure to produce that capital will undermine American security."
The report said that the State Department and intelligence agencies face critical shortages in the number of foreign-language speakers, and that fields like science, defense, and aerospace face a shortage of skilled workers that is likely to worsen as baby boomers retire.
According to the panel, 75 percent of young adults do not qualify to serve in the military because they are physically unfit, have criminal records, or have inadequate levels of education. It said 30 percent of high school graduates do not do well enough on an aptitude test to serve.
In our global, high-tech economy, we cannot afford to continue the educational system we have. It is high time that we turned our attention to making the necessary changes and reforms that would keep America competitive in the twenty-first century. *
There can be little doubt that government spending is out of hand, and that Washington's role in our society has dramatically expanded in recent years. The American people are dismayed about the manner in which our political life has deteriorated. The party out of power, whichever one it may be, seems to want the party in power to fail - so that it can take its place. The long-term best interests of the country are obscured.
Many tend to think of our problems in narrow partisan terms. Some argue, for example, that Democrats favor big government and deficit spending, while Republicans favor balanced budgets and limited government. Our choices in elections would be clear-cut if this were, in fact, the case.
More realistically, we see that whichever party is in power tends to expand government power. Deficits reached all-time highs under President George W. Bush - and have now reached even higher levels - dramatically higher - under President Obama. The dilemma we face is far more complex than partisan political spokesmen permit themselves to admit.
An important new book, Throw Them All Out: How Politicians and Their Friends Get Rich Off Insider Stock Tips, Land Deals, and Cronyism that Would Send the Rest of Us to Prison (Houghton Mifflin Harcourt) by Peter Schweizer explores the world in which our politicians - both Democrats and Republicans - live.
Three years ago, then-House Speaker Nancy Pelosi and her husband, Paul, made the first of three purchases of Visa stock - Visa was holding an initial public offering, among the most lucrative ever. The Pelosis were granted early access to the IPO as "special customers" who received their shares at the opening price, $44. They turned a 50 percent profit in just two days.
Starting on March 18, the speaker and her husband made the first of three Visa stock buys, totaling between $1 million and $5 million. "Mere mortals would have to wait until March 19, when the stock would be publicly traded, to get their shares," writes Peter Schweizer, a scholar at the Hoover Institution. He points out that the Pelosis got their stocks just two weeks after legislation was introduced in the House that would have allowed merchants to negotiate lower interchange fees with credit card companies. Visa's general counsel described it as a "bad bill." The speaker squelched it and kept further action bottled up for more than two years. During that period, the value of her Visa stock jumped more than 200 percent while the stock market as a whole dropped 15 percent.
"Isn't crony capitalism beautiful?" asks Schweizer. The book shows members of Congress enriching themselves through earmarks and unpunished insider trading, politically connected companies being given billions of dollars in stimulus funds and public money intended to help the environment, and many varieties of kickbacks and favors.
Sadly, most of these actions fall within the letter, if not the spirit, of the law and ethics rules governing Congress.
While Senator John F. Kerry (D-MA) was working on healthcare in 2009, he and his wife began buying stock in Teva Pharmaceuticals. The Kerrys purchased nearly $750,000 in November alone. As the bill got closer to passing, the stock value soared. Pharmaceutical companies support these legislative efforts because they would increase the demand for prescription drugs. When President Obama's healthcare bill became law, the Kerrys reaped tens of thousands of dollars in capital gains while holding onto more than $1 million in Teva shares.
Republicans join their Democratic colleagues in these and other enterprises. House Majority Leader Eric Cantor (R-VA) relentlessly attacks run-away government spending. To Cantor, an $8 billion high-speed rail connecting Las Vegas to Disneyland is wasteful "pork-barrel spending." Rep. Cantor set up the "You Cut" website to demonstrate how easy it is to slash government spending. Yet Cantor has been pressing the Transportation Department to spend nearly $3 billion in stimulus money on a high-speed rail project in his home state of Virginia. Newsweek found about five-dozen of the most fiscally conservative Republicans - including Texas Governor Rick Perry and Rep. Ron Paul (R-TX) - trying to gain access to the very government spending they publicly oppose. According to Newsweek:
The stack of spending-request letters between these GOP members and federal agencies stands more than a foot tall, and disheartens some of the activists who sent Republicans to Washington the last election.
Judson Phillips, founder of the Tea Party Nation, says:
It's pretty disturbing. We sent many of these people there, and really, I wish some of our folks would get up and say, you know what, we have to cut the budget, and the budget is never going to get cut if all 535 members of Congress have their hands out all the time.
Former vice presidential candidate Sarah Palin, writing in The Wall Street Journal, declares that:
The corruption isn't confined to one political party or just a few bad apples. It's an endemic problem encompassing leadership on both sides of the aisle. It's an entire system of public servants feathering their own nests.
Now, Republican presidential candidate Newt Gingrich denounces big government. Previously, he enriched himself at its trough. Conservative columnist Timothy P. Carney notes that:
When Newt Gingrich says he never lobbied, he's not telling the truth. When he was a paid consultant for the drug industry's lobby group, Gingrich worked hard to persuade Republican congressmen to vote for the Medicare drug subsidy that the industry favored. To deny Gingrich was a lobbyist requires an Obama-like parsing over who is and who isn't a lobbyist. . . . Newt Gingrich spent the last decade being paid by big businesses to convince conservatives to support the big government policies that would profit his clients.
The fact - which partisans on both sides like to deny - is that both parties are responsible for the sad state of our political life - and our economic decline. Money-making opportunities for members of Congress are widespread. Peter Schweizer details the most lucrative methods: accepting sweetheart deals of IPO stock from companies seeking to influence legislation, practicing insider trading with nonpublic government information, earmarking projects that benefit personal real-estate holdings, and even subtly extorting campaign donations through the threat of legislation unfavorable to an industry. The list is a long one.
Congress has been able to exempt itself from the laws it applies to everyone else. That includes laws that protect whistleblowers - nothing prevents members of Congress from retaliating against staff members who expose corruption - as well as the Freedom of Information Act. Some say that it is easier to get classified documents from the CIA than from a congressional office.
To correct any problem, it is essential first to understand it properly. The problems in our political life are institutional and thinking that a simple change of parties will correct them is to misunderstand reality. Those who seek limited government, balanced budgets, and a respect for the Constitution must understand that both parties are responsible for the current state of affairs. With an appreciation of the real challenge before us, perhaps real solutions will be explored and debated. These, however, do not appear to be on today's political agenda.
Late in December, the Justice Department blocked a new South Carolina law that would require voters to present photo identification, saying the law would disproportionately suppress turnout among eligible minority voters.
The move was the first time since 1994 that the department has exercised its powers under the Voting Rights Act to block a voter identification law. It followed a speech by Attorney General Eric Holder that signaled an aggressive stance in reviewing a wave of new state voting restrictions enacted in the name of fighting fraud.
In a letter to the South Carolina government, Thomas E. Perez, the assistant attorney general for civil rights, said that the new requirement, if allowed to go into effect, would have "significant racial disparities."
Richard L. Hasen, an election law specialist at the University of California at Irvine, predicts that South Carolina will go to court, which could set up a "momentous" decision in the Supreme Court on whether a part of the Voting Rights Act that prevents states like South Carolina from changing their voting rules without federal permission is unconstitutional.
Governor Nikki Haley criticized the decision, accusing the Obama administration of "bullying" the state. She declared: "It is outrageous, and we plan to look at every possible option to get this terrible, clearly political decision overturned so we can protect the integrity of our electoral process and our 10th amendment rights."
Under the Voting Rights Act, an election rule or practice that disproportionately affects minority voters is illegal - even if there is no sign of discriminatory intent. South Carolina is one of several states that, because of a history of discriminatory practices, must prove that a measure would not disproportionately discourage minority voting.
In 2011, eight states - Arkansas, Kansas, Mississippi, Rhode Island, South Carolina, Tennessee, Texas and Wisconsin - passed variations of a rule requiring photo identification for voters. It is unclear if the four states not subject to the Voting Rights Act requirement - Wisconsin, Kansas, Rhode Island, and Tennessee - will face challenges to their laws. These laws have proven popular. In November, Mississippi voters easily approved an initiative requiring a government-issued photo ID at the polls.
Artur Davis, who served in Congress from 2003 to 2011, and was an active member of the Congressional Black Caucus, once vigorously opposed voter ID laws. Now, he has changed his mind. In a commentary in the Montgomery Advertiser, Davis says that Alabama "did the right thing" in passing a voter ID law and admits, "I wish I had gotten it right when I was in political office."
As a congressman, he says, he "took the path of least resistance," opposing voter ID laws without any evidence to justify his position. He simply
. . . lapsed into the rhetoric of various partisans and activists who contend that requiring photo identification to vote is a suppression tactic aimed at thwarting black voter participation.
Today, Davis recognizes that the "most aggressive" voter suppression in the black community "is the wholesale manufacture of ballots at the polls" in some predominantly black districts.
Hans A. von Spakovsky, senior legal fellow at the Heritage Foundation and a former member of the Federal Election Commission, wrote a case study about voter prosecution in one such district, Greene County, Alabama, which is 80 percent black. He writes that,
Incumbent black county officials had stolen elections there for years, perpetrating widespread, systematic voter fraud. The Democratic incumbents were challenged by black Democratic reformers in 1994 who wanted to clean up local government. Voter fraud ran rampant that year. Ultimately, the U.S. Department of Justice won 11 convictions of Greene County miscreants who had cast hundreds of fraudulent votes.
Spakovsky argues that,
There was no question that (fraudulent) tactics changed the election in Greene County in 1994. But the worst thing from the standpoint of the reformers who had complained to the FBI was the reaction of the NAACP and the Southern Christian Leadership Conference (SCLC). The reformers thought those civil rights organizations would be eager to help those whose elections had been stolen through fraud. Instead both organizations attacked the FBI and federal prosecutors, claiming that the voter-fraud investigation was simply an attempt to suppress black voters and keep them from the polls.
One of the black reformers, John Kennard, a local member of the NAACP, wrote a letter to then-NAACP chairman Julian Bond charging the group with "defending people who knowingly and willingly participated in an organized . . . effort to steal the 1994 election from other black candidates." Mr. Bond replied simply that "sinister forces" behind the prosecution were "part and parcel of an ongoing attempt to stifle black voting strength." The NAACP Legal Defense Fund even defended those later found guilty of fraud.
The rhetoric used by the NAACP at that time, states Spakovsky, "is exactly the same kind that is being used today by . . . the NAACP and others who oppose voter ID laws. . . . Mr. Davis was disappointed to see Bill Clinton . . . compare voter ID to Jim Crow."
In Davis's view, voter ID is "unlikely to impede a single good-faith voter - and that only gives voting the same elements of security as writing a check at the store, or maintaining a library card. The case for voter ID is a good one, and it ought to make politics a little cleaner and the process of conducting elections much fairer."
Photo IDs are required to drive a car, cash a check, collect government assistance and fly on a plane - among other things. No one suggests that the need for photo ID during such transactions is "racist." To ask voters to properly identify themselves seems to be simply common sense.
Robert Knight, a senior fellow for the American Civil Rights Union, notes that, "Article I, Section 4 of the U.S. Constitution leaves voting procedures largely to the states. The Voting Rights Act requires stricter scrutiny of some states, but the case for voter suppression has yet to be made."
What is motivating the Obama administration to embark upon a crusade against voters identifying who they are before casting their ballots is less than clear. If they think they are somehow fighting "racism," they are clearly on the wrong track.
For many years there has been an effort to read black Americans who dare to think for themselves out of the black community. To disagree with liberal politics or affirmative action is to be, in some way, rejecting one's blackness.
One of the vocal enforcers of this policy of thought control is Professor Randall Kennedy of Harvard, author of books such as Sellout: The Politics of Racial Betrayal. In Kennedy's view, there should be an expulsion option in the black community for blacks who adopt conservative views. Clarence Thomas, he argues, should turn in his black card. There should be boundaries, he declares, or else the notion of a black community bound by shared struggle disappears.
Fortunately for all of us, this point of view is now in retreat. When Professor Cornel West painted President Obama as cowardly and out of touch with black culture, he was sharply criticized by Professor Melissa Harris-Perry of Tulane. Writing in The Nation, she declared:
I vigorously object to the oft-repeated sentiment that African-Americans should avoid public disagreements and settle matters internally to present a united front. . . . Citizenship in a democratic system rests on the ability to freely and openly choose, criticize, and depose one's leaders. This must obtain whether those leaders are elected or self-appointed. It cannot be contingent on whether the critiques are accurate or false, empirical or ideological, well or poorly made. Citizenship is voice. . . . That African-Americans strenuously disagree among ourselves about goals and strategies is an ancient historical truth.
The media attention given to the criticism of President Obama by Professor West, states Harris-Perry, "can be understood only by the repeated refusal by mainstream media and broader American political culture to adequately grasp the heterogeneity of black thought."
An important new book, Who's Afraid of Post-Blackness? has just appeared. Its author, Toure, is a correspondent for MSNBC, a contributing editor at Rolling Stone, and the author of three previous books. The central point of the book is that there is no single way to be black. Justice Clarence Thomas, in his view, is no less black than Jay-Z. One of his goals, Toure writes, is "to attack and destroy the idea that there is a correct or legitimate way of doing blackness." Post-blackness, he declares, has no patience with "self-appointed identity cops" and their "cultural bullying."
What this means, according to the 105 prominent black Americans interviewed for the book, is a liberating pursuit of individuality. Black artists, like other professionals, now feel free to pursue any interest they like and are no longer burdened with the requirement to represent "the race."
Reviewing Toure's book for The New York Times, Professor Orlando Patterson of Harvard notes that:
. . . this is one of the most acutely observed accounts of what it is like to be young, black, and middle-class in contemporary America. Toure inventively draws on a range of evidence - autobiography, music, art, interviews, comedy and popular social analysis - for a performance carried through with unsparing honesty, in a distinctive voice that is often humorous, occasionally wary and defensive, but always intensely engaging.
Toure says that: "If there are 40 million black Americans, then there are 40 million ways to be black," repeating a line from Harvard University's Henry Louis Gates, Jr.:
I'm telling the self-appointed identity cops, who want to say, "This person isn't black enough," to put down their swords. Fear of post-blackness just inhibits our potential. Stop the bullying, and stop telling people they don't walk right, talk right, think right or like the right things. It's silly and ridiculous and pernicious.
When he was a student at Emory University, Toure made friends with the white students in his dormitory. Then he read The Autobiography of Malcolm X, switched his major to African-American studies, started a black-nationalist newspaper and moved into the Black Student Association's private house.
It was in this all-black house, he says, after a party, in a room full of black people, that he was "loudly and angrily told by a linebacker-sized brother: 'Shut up, Toure! You ain't black!'" This episode led to something of an epiphany, he says. "Who gave him the right to determine what is and is not blackness for me? Who made him the judge of blackness?"
An interesting phenomenon of the emerging 2012 presidential election is the success of Herman Cain among Republican candidates. Ron Christie, a former special assistant to President George W. Bush and a fellow at the Institute of Politics at Harvard's Kennedy School of Government, notes that:
Cain's candidacy is the ultimate extension of the Obama presidency. A contender for the highest office in the land can be taken seriously regardless of race. We are heading into a 2012 election cycle in which Republican and tea party conservatives appear eager to support a candidate who just happens to be black, based on his convictions and ideas.
Several members of the Congressional Black Caucus have called the tea party movement and its backers racist. In August, Rep. Andre Carson (D-Ind.) told an audience at a CBC event in Miami that "some of them in Congress right now of this tea party movement would love to see you and me . . . hanging on a tree." He likened to "Jim Crow" the efforts of the tea party and its supporters in Congress to limit the size of the federal government.
Mr. Christie, who is black, declares that:
There will always be a fringe element in this country that is unable to accept individuals based on the color of their skin. But to me, continuing to paint the tea party as racist - even as Cain is surging - is simply more race baiting by dissatisfied Democrats.
In an interview with CNN, Herman Cain said he thinks at least a third of black voters would be inclined to support his candidacy because they are "open-minded." He declared:
This whole notion that all black Americans are necessarily going to stay and vote for Obama, that's simply not true. More and more black Americans are thinking for themselves, and that's a good thing.
Sadly, for many years, freedom of speech and debate, hailed in the nation at large as an essential element of a thriving democratic society, has been discouraged in the black community in the name of "unity." As Julius Lester, a one-time black radical and later a member of the faculty of the University of Massachusetts, said almost twenty years ago:
For two decades, an honest exchange of ideas in black America has been discouraged in the name of something called unity. Public disagreements have been perceived as providing ammunition to "the Enemy," that amorphous white "they" that works with a relentlessly evil intent against blacks. . . . The suppression of dissent and differences in the name of unity evolved into a form of social fascism, especially on college and university campuses. In some instances, black students were harassed and ostracized for having white friends. . . . Thinking black took precedence over thinking intelligently. . . .
Stifling free speech in the name of "unity," Lester shows, is something quite new in black American history. He notes that:
In the first part of the 19th century, Negro national conventions were held where black leaders debated and disagreed bitterly with each other over slavery and freedom, abolitionism and separatism. Frederick Douglass, the first national leader, and Martin Delany, the first black separatist, were political adversaries and friends. Dissent and disagreement have been the hallmark of black history.
The first black to win a seat in the U.S. House of Representatives in the 20th century and the first to be elected from a Northern state was Oscar De Priest of Illinois, a Republican. He believed in limited government, hard work, and the free market.
Finally, the realization is growing that not all black Americans think alike - nor do white, Hispanic, or Asian Americans. This understanding is long overdue. *
A Washington Post-ABC News poll shows a new high - 84 percent of Americans - disapproving of the job Congress is doing, with almost two-thirds saying they "disapprove strongly." Just 13 percent of Americans approve of how things are going. It has been nearly four years since even 30 percent expressed approval of Congress.
Editorially, The Washington Examiner notes that,
Nobody can remember the last time the public approval rating of Congress was so low. That's because it's never been as low as it is now. . . . It's not hard to see why: the American people are fed up with the bipartisan corruption, endless partisan bickering, and lack of concrete action to address the nation's most pressing problems, especially out-of-control spending and the exploding national debt. . . . Both parties have presided over congressional majorities as Congress sank in public esteem during the past decade.
One reason for public dismay is the manner in which members of Congress often support while in office the interests they then go to work for once out of office. Of equal concern is the manner in which members of Congress vote for subsidies to the groups that have contributed to their political campaigns. This is true of members of Congress from both parties.
For example, soon after he retired last year as one of the leading liberals in Congress, former Rep. William D. Delahunt (D-MA) started his own lobbying firm with an office on the 16th floor of a Boston skyscraper. One of his first clients was a small coastal town that has agreed to pay him $15,000 a month for help in developing a wind energy project.
The New York Times reports that,
Amid the revolving door of congressmen-turned-lobbyists, there is nothing particularly remarkable about Mr. Delahunt's transition, except for one thing. While in Congress, he personally earmarked $1.7 million for the same energy project. So today, his firm, the Delahunt Group, stands to collect $90,000 or more for six months of work from the town of Hull, on Massachusetts Bay, with 80 percent of it coming from the pot of money he created through a pair of Energy Department grants in his final term in office.
Beyond the town of Hull, Delahunt's clients include at least three others who received millions of dollars in federal aid with his direct assistance. Barney Keller, communications director for the Club for Growth, a conservative group that tracks earmarks, says:
I cannot recall such an obvious example of a member of Congress allocating money that went directly into his own pocket. It speaks to why members of Congress shouldn't be using earmarks.
While this case may be somewhat extreme, it is repeatedly duplicated in one form or another by members of Congress. Consider former Senator Rick Santorum of Pennsylvania. A review of Santorum's many earmarks suggests that the federal money he helped direct to Pennsylvania paid off in the form of campaign cash. In just one piece of legislation, the defense appropriations bill for the 2006 fiscal year, Santorum helped secure $124 million in federal financing for 54 earmarks, according to Taxpayers for Common Sense, a budget watchdog group. In that year's election cycle, Santorum's Senate campaign committee and its "leadership PAC" took in more than $200,000 in contributions from people associated with the companies that benefited or their lobbyists. In all, Taxpayers for Common Sense estimated, Santorum helped secure more than $1 billion in earmarks during his Senate career.
Or consider former House Speaker Newt Gingrich, who speaks about being a "Reagan conservative" who supports "limited government," yet received $1.6 million from Freddie Mac over an eight-year period and gave the government-backed mortgage giant assistance in resisting reformers in Congress. Mr. Gingrich denies that he was a "lobbyist," as do some other former members of Congress. The Lobbying Disclosure Act of 1995 has three tests:
(1) Do you make more than $3,000 over three months from lobbying?
(2) Have you made more than one lobbying contact?
(3) Have you spent more than 20 percent of your time lobbying for a single client over three months?
Only a person who has met all three tests must register as a lobbyist. Thus, a former member of Congress who has many lobbying contacts and makes $1 million a year lobbying but has no single client who takes up more than 20 percent of his time would not be considered a lobbyist.
Clearly, it is time to change this rule. A task force of the American Bar Association recommended last year that the 20 percent rule be eliminated, which would require far more people to register as lobbyists, and subject them to ethics and disclosure requirements. The Center for Responsive Politics found that more than 3,000 lobbyists simply "de-registered" after Congress imposed new reporting requirements for lobbyists in 2007.
With regard to Gingrich, Washington Times columnist Don Lambro writes:
Mr. Gingrich . . . is the quintessential Washington insider, peddling influence in government. . . . He denied he was lobbying, insisting that he was hired to be a historian, when he was selling his services to one of the richest bidders in government. He was being paid well out of Freddie Mac's coffers while it was sowing the seeds of a housing scandal that resulted in an economic meltdown that has hurt millions of Americans and cost taxpayers billions of dollars. In other words, as a paid insider, he was part of the problem, not part of the solution.
Cutting the size of government, reducing our debt, and balancing the budget are embraced rhetorically by candidates for public office. Once elected, however, many become part of the system they campaigned against. The incentive structure once in office is to raise money to stay in office, and the way to do this is to vote subsidies to those groups being called upon to contribute. Both parties are engaged in this behavior, and candidates of both parties are rewarded so that special interests will have a friend in office no matter who is elected.
Sadly, the actions of Congress - and the lobbying enterprises of former members of Congress - are legal. This, of course, is because it is Congress itself that writes the laws. There was a time when members of Congress, when they retired or were defeated, returned home. Some still do. Many others, however, remain in Washington, getting rich trying to influence their former colleagues.
This enterprise, of course, is only part of why Congress is viewed in such negative terms by 84 percent of Americans. Narrow partisanship and a greater concern for politics than for the country's well-being is another. All of this is on naked display in today's Washington. The public contempt has been well earned. Whether that public dismay with our current politics can be transformed into an effective effort to alter this behavior remains to be seen. Too many in Washington have a vested interest in today's corrupt system. How to change the incentive structure for those in political life is our real challenge.
On December 31, 2011, President Obama signed the National Defense Authorization Act, which was supported by both Republicans and Democrats in the Congress. This legislation allows for the indefinite detention of American citizens within the United States - without charging them with a crime.
Under this law, those suspected of involvement with terrorism are to be held by the military. The president has the authority to detain citizens indefinitely. While Senator Carl Levin (D-MI) said that the bill followed existing law, "whatever the law is," the Senate specifically rejected an amendment that would exempt citizens, and the administration has opposed efforts to challenge such authority in federal court. The administration claims the right to strip citizens of legal protections based on its sole discretion.
This legislation was passed by the Senate 93 to 7. "The only comparable example was Reconstruction in the South," says constitutional law scholar Bruce Fein.
That was 150 years ago. This is the greatest expansion of the militarization of law enforcement in this country since.
The opposition to this legislation assembled an unlikely coalition of liberal Democrats, the American Civil Liberties Union, constitutional conservatives, libertarians, and three Republican senators - Rand Paul (KY), Mark Kirk (IL), and Mike Lee (UT).
The law, argued Senator Paul:
. . . would arm the military with the authority to detain indefinitely - without due process or trial - suspected al-Qaeda sympathizers, including American citizens apprehended on American soil. I want to repeat that. We are talking about people who are merely suspected of a crime. And we are talking about American citizens. If these provisions pass, we could see American citizens being sent to Guantanamo Bay.
Senator Mark Udall (D-CO), who proposed a failed amendment to strip the language from the bill, said that these provisions would "authorize the military to exercise unprecedented power on U.S. soil."
Writing in The American Conservative, Kelley Beaucar Vlahos notes that:
Already the federal government has broad authority to decide whether terror suspects are detained and held by federal law enforcement agencies and tried in regular courts or carried off by the military under the Military Commissions Act. This new legislation would allow the military to take control over the detention of suspects first - which means no Miranda rights and potentially no trial even on U.S. soil, putting the front lines of the War on Terror squarely on Main Street.
Bruce Fein argues that the ambiguity of words like "associated groups" or "substantially supports" gives the military wide discretion over who is considered a terrorist. "It's a totally arbitrary weapon that can be used to silence people."
Rep. Justin Amash (R-MI), one of the leading critics of the bill in the House of Representatives, issued a fact-checking memo outlining how the language can be abused:
For example, a person makes a one-time donation to a non-violent humanitarian group. Years later, the group commits hostile acts against an ally of the U.S. Under the Senate's NDAA, if the President determines the group was "associated" with terrorists, the President is authorized to detain the donor indefinitely, and without charge or trial.
James Madison warned that, "The means of defense against foreign danger historically have become instruments of tyranny at home."
Senator Paul states that:
The discussion now to suspend certain rights to due process is especially worrisome, given that we are engaged in a war that appears to have no end. Rights given up now cannot be expected to be returned. So we do well to contemplate the diminishment of due process, knowing that the rights we lose now may never be restored. . . . This legislation would arm the military with the authority to detain indefinitely - without due process or trial - suspected al-Qaeda sympathizers, including American citizens apprehended on American soil. . . . There is one thing and one thing only protecting innocent Americans from being detained at will by the hands of a too-powerful state: our Constitution and the checks it puts on government power. Should we err and remove some of the most important checks on state power in the name of fighting terrorism, well, then the terrorists will have won.
In his dissent in Hamdi v. Rumsfeld, Justice Antonin Scalia declared:
Where the government accuses a citizen of waging war against it, our constitutional tradition has been to prosecute him in federal court for treason or some other crime. . . . The very core of liberty secured by our Anglo-Saxon system of separated powers has been freedom from indefinite imprisonment at the will of the executive.
Jonathan Turley, professor of law at George Washington University, points out that:
In a signing statement with the defense authorization bill, Obama said he does not intend to use the latest power to indefinitely imprison citizens. Yet, he still accepted the power as a sort of regretful autocrat. An authoritarian nation is defined not just by the use of authoritarian powers, but by the ability to use them. If a president can take away your freedom or your life on his own authority, all rights become little more than a discretionary grant subject to executive will.
James Madison, Turley recalls,
. . . famously warned that we needed a system that did not depend on the good intentions or motivations of our rulers: "If men were angels, no government would be necessary." Since 9/11, we have created the very government the framers feared: a government with sweeping and largely unchecked powers resting on the hope that they will be used wisely. The indefinite-detention provision in the defense authorization bill seemed to many civil libertarians like a betrayal by Obama. While the president had promised to veto the law over that provision, Senator Levin, a sponsor of the bill, disclosed on the Senate floor that it was in fact the White House that asked for the removal of an exception for citizens from indefinite detention.
Historically, those who seek to expand government power and diminish freedom always have a variety of good reasons to set forth for their purposes. In the case of Olmstead v. United States (1928), Justice Louis Brandeis warned that:
Experience should teach us to be most on our guard to protect liberty when the government's purposes are beneficent. Men born to freedom are naturally alert to repel invasion of their liberty by evil-minded rulers. The greatest dangers to liberty lurk in the insidious encroachment of men of zeal, well meaning but without understanding.
In recent years, in the name of ecology, racial equality, public health, and a variety of other "beneficent" purposes, the power of government has grown and the freedom of the individual has diminished, just as Justice Brandeis feared it would. But it has also diminished in the name of national security, something many conservatives, usually alert to the growth of government power, tend to support - or to acquiesce in. This is a serious mistake, as we now face the new threat of indefinite detention of American citizens. Freedom cannot be preserved by taking it away.
Developments in the Middle East remain chaotic. In the wake of the Arab Spring we have seen the overthrow of autocratic regimes in Tunisia and Egypt, a virtual civil war in Syria, and challenges to such governments as those in Bahrain and Yemen. The brutal Libyan dictator Muammar Ghaddafi has been overthrown. What comes next in this volatile region is difficult to know.
In an important new book, The Invisible Arab, Marwan Bishara, senior political analyst for Al Jazeera's English language service and the editor of its flagship show "Empire," and a former lecturer at the American University of Paris, provides a thoughtful analysis of how Arabs broke their own psychological barrier of fear to kindle one of the first significant revolutionary transformations of the 21st century.
Bishara describes how the historic takeover of Tunisia's November 7 Square, Egypt's Tahrir Square, and Bahrain's Pearl Square, among others, was the culmination of a long social and political struggle: countless sit-ins, strikes, and demonstrations by people who risked and suffered intimidation, torture, and imprisonment. It was aided by the dramatic rise of satellite television networks, including Al Jazeera, which bypass attempts by governments to censor news and information.
"Like most revolutions," he writes,
. . . this one was a long time coming. . . . They were the culmination of a long social and political struggle - countless sit-ins, strikes, pickets, and demonstrations. . . . The story begins with the young Arabs whose networking and organizations brought the people out into the streets. The youth, who make up 60 percent of all Arabs, have been looked upon as a "demographic bomb," an "economic burden," or as a "reservoir for extremism." However, unlike previous generations, this group heralded change.
For decades, Bishara argues, these Arab citizens and their social and political movements
. . . have been either unfairly demonized or totally ignored by the West . . . who saw the region through the prism of Israel, oil, terrorists, or radical Islamism. But today's Arabs are presenting a stark contrast to the distortion . . . heaped upon them. Characterized as unreceptive to democracy and freedom, they are now giving the world a lesson in both.
The more difficult part of this revolutionary journey, he notes, will come as
. . . the Arabs, sooner rather than later, discover that democracy and freedom come with greater responsibility. Defeating dictators is a prerequisite for progress, but does not guarantee it, especially in the absence of functional state institutions, democratic traditions, and modern infrastructure. The prevalence of poverty, inequality, and rising regional and international competition present huge challenges.
The origins of what he calls "the miserable Arab reality" are not civilizational, economic, or philosophical per se. Instead,
. . . . The origins . . . are political par excellence. Like capital to capitalists, or individualism to liberalism, the use and misuse of political power has been the factor that defines the contemporary Arab state. Arab regimes have subjugated or transformed all facets of Arab society.
By the beginning of the 21st century, Arab autocracies represented some of the oldest dictatorships in the world. Zine el-Abidine Ben Ali's dictatorship in Tunisia, the most recently established in the region, ruled for 25 years, followed by 30 years for Egypt's Mubarak, 33 years for Yemen's Ali Abdullah Saleh, and 43 years for Ghaddafi in Libya. In Syria, the al-Assad dynasty has ruled for 43 years, and Saddam Hussein was removed in 2003 after 24 bloody years ruling Iraq. Only the authoritarian Arab monarchies precede these dictatorships in longevity. In Bahrain, a repressive Sunni monarchy has ruled over a Shia majority since independence from Britain in 1971.
Arab states, writes Bishara,
. . . were, for a lack of better words, turned into the private estates of the ruling families. While these regimes boasted of secular republicanism, they were run similar to the Kingdom of Saudi Arabia and the United Arab Emirates, where no political activism was allowed and where the ruling families dominated all facets of political life. . . . The energy-producing Arab states are sustained rentier-type economies, characterized by a trade-off between economic welfare and political representation. Whereas the modern democratic state was founded on the cry of "no taxation without representation" . . . the modern Arab state has turned that notion on its head. With free-flowing petro-dollars pouring into their countries, Arab leaders have been able to sell off national resources and enrich themselves without having to turn to their citizens for personal taxation. . . . It became a ritual in the wealthy monarchies for the kings, emirs, or princes to provide small sums of money to their "subjects," and the poor in particular, as a makrama or "generous gift" that was generated from the natural resources in their land.
According to the U.N. Development Program's (UNDP) first Arab Human Development Report, written exclusively by Arab experts,
. . . Arab countries have not developed as quickly as comparable nations in other regions. Indeed, more than half of Arab women are illiterate; the region's infant mortality rate is twice as high as in Latin America and the Caribbean. Over the past 20 years, income growth per capita has also been extremely low.
In virtually every Arab country, more than half the population is under 30 - more than 140 million people - while a quarter are between the ages of 15 and 29, making this generation the largest youth cohort in the history of the Middle East. This unemployed and increasingly angry demographic has given traction to the "youth bulge" theory, which posits that when population growth outstrips that of jobs, social unrest is inevitable.
The influence of the information revolution has been crucial to developments in the region. As a result, notes Bishara,
. . . The Arab youth were able to think for themselves, freely exchange ideas, and see clearly beyond their ruler's deception, vengeful jihadist violence, or cynical Western calculations. . . . At the beginning of 2011, there were 27 million Arabs on Facebook, including 6 million Egyptians. Within a few nights, 2 million more Egyptians joined, underlining the centrality of the medium to the changes in the country. More than 60 million people in the Arab world are online.
Yemeni activist and 2011 Nobel Peace Prize co-winner Tawakkol Karman described the use of social media:
The revolution in Yemen began immediately after the fall of Ben Ali in Tunisia. . . . As I always do when arranging a demonstration, I posted a message on Facebook, calling on people to celebrate the Tunisian uprising.
This new media, writes Bishara,
. . . had an important cultural, even sociological role to play in patriarchal Arab societies. It helped young people break free from social constraints. It propelled them into uncharted territory, and it helped them mold a certain type of individualism. They began to enjoy an uninhibited space where they could share information and experiences, join chat rooms, and participate with one another. Theirs is a new found egalitarianism. . . .
Bishara laments the fact that,
Arabs have been valued not for their embrace of freedom or respect for human rights, but rather in terms of their proximity to U.S. interests. A subservient ally and energy providing partner made for a good Arab regime, regardless of its despotic or theocratic rule. . . . Western leaders have talked in slogans . . . about democracy and Islam, but have always been as indifferent to the people of the region as their dictators.
What does the future hold? Bishara recognizes that there are great dangers:
Islamist movements, the likes of the Egyptian Brotherhood, have already opened dialogue with the military and with Western powers on the basis of mutual interest and respect. This might be seen as a positive development, that allows for a new sort of regional order on the basis of a new accommodation among Islamists, the generals, and Western leaders. However, this triangle could eventually be as oppressive and totalitarian as the previous dictatorships . . . the Islamists must make sure that they reconcile with the principles of democracy and modern statehood, not a division of labor with the military. . . . Many of the Islamists I spoke to reckon that if they have a majority they have a democratic right to change the constitution and govern as they see religiously fit. They don't recognize democracy as first and foremost a system of government based on democratic values that go beyond the right of the majority to rule, to ensure that the rights and privileges of the minorities are respected and preserved. . . .
The Invisible Arab is a thoughtful contribution to our understanding of the Middle East from one of its articulate new voices. Bishara shows how the revolutions have evolved - and how it could all go terribly wrong. He hopes for a free and democratic Middle East - and he has his fingers crossed. *