Saturday, May 26, 2007

Arrested While Grieving

The New York Times
May 26, 2007

No one is paying much attention, but parts of New York City are like a police state for young men, women and children who happen to be black or Hispanic. They are routinely stopped, searched, harassed, intimidated, humiliated and, in many cases, arrested for no good reason.

Most black elected officials have joined their white colleagues and the media in turning a blind eye to this continuing outrage. And many black cops have joined their white colleagues in the systematic mistreatment.

Last Monday in the Bushwick section of Brooklyn, about three dozen grieving young people on their way to a wake for a teenage friend who had been murdered were surrounded by the police, cursed at, handcuffed and ordered into paddy wagons. They were taken to the 83rd precinct stationhouse, where several were thrown into jail.

Leana Matia, an 18-year-old student at John Jay College, was one of those taken into custody. “We were walking toward the train station to take the L train when all these cops just swooped in on us,” she said. “They cursed us out and pushed the guys. And then they handcuffed us. We kept asking, ‘What are you doing?’ ”

Children as young as 13 were among those swept up by the cops. Two of them, including 16-year-old Lamel Carter, were the children of police officers. Some of the youngsters were carrying notes from school saying that they were allowed to be absent to attend the wake. There is no evidence that I’ve been able to find — other than uncorroborated statements by the police — that the teenagers were misbehaving in any way.

Everyone was searched, but nothing unlawful was found — no weapons, no marijuana or other drugs. Some of the kids were told at the scene that they were being seized because they had assembled unlawfully. “I didn’t know what unlawful assembly was,” said Kumar Singh, 18, who was among those arrested.

According to the police, the youngsters at the scene were on a rampage, yelling and blocking traffic. That does not seem to be the truth.

I spoke individually to several of the youngsters, to the principal of Bushwick Community High School (where a number of the kids are students), to a parent who was at the scene, and others. Nowhere was there even a hint of the chaos described by the police. Every account that I was able to find described a large group of youngsters, very sad and downcast about the loss of their friend, walking peacefully toward the station.

Kathleen Williams, whose son and two nieces were rounded up, was at the scene. She said there was no disturbance at all, and that when she tried to ask the police why the kids were being picked up, she was told to be quiet or she would be arrested, too.

Capt. Scott Henderson of the 83rd Precinct told me that the police had developed a “plan” to deal with youngsters going to the wake because they suspected that the murder was gang-related and there had already been some retaliation. He said he had personally witnessed the youngsters in Bushwick behaving badly and gave the order to arrest them.

Many of the kids were wearing white T-shirts with a picture of the dead teenager and the letters “R.I.P.” on them. The cops cited the T-shirts as evidence of gang membership.

Thirty-two of the youngsters were arrested. Most were charged with unlawful assembly and disorderly conduct. Several were held in jail overnight.

Police Commissioner Ray Kelly did not exactly give the arrests a ringing endorsement. He said, in a prepared statement, “A police captain who witnessed the activity made a good-faith judgment in ordering the arrests.”

A spokesman for the Brooklyn district attorney, Charles Hynes, said, “It wouldn’t be unusual for a lot of this stuff to get dismissed.”

The principal of Bushwick Community High, Tira Randall, said, “My kids come in here on a daily basis with stories about harassment by the police. They’re not making these stories up.”

New York City cops stopped and, in many cases, searched individuals more than a half million times last year. Those stops are not happening on Park Avenue or Fifth Avenue in Manhattan. Thousands upon thousands of them amount to simple harassment of young black and Hispanic males and females who have done absolutely nothing wrong, but feel helpless to object.

It is long past time for this harassment of ethnic minorities by the police to cease. Why it has been tolerated this long, I have no idea.

A Katrina Health Care System

The New York Times
May 26, 2007

This is my third week as a guest columnist. Let’s take a look at the health care news of those three weeks.

First, DaimlerChrysler sold off 80 percent of its Chrysler division for three pebbles and a piece of string. O.K., the cash payment was actually $1.35 billion. But for an 82-year-old company that built more than two million cars and trucks last year, took in $47 billion in revenue, and owns 64 million square feet of factory real estate in North America alone, that’s almost nothing. Yet analysts say that it was a great deal for Daimler. Why? Because the buyer, Cerberus Capital Management, agreed to absorb Chrysler’s $18 billion in health and pension liability costs.

Stop and think about this for a minute. The deal meant that the costs of our job-based health insurance system — costs adding $1,500 to each car Chrysler builds here, but almost nothing to those built in Canada or Europe — have so broken the automaker’s ability to compete that giving it away became the smartest thing Daimler could do. Chrysler’s mistake was to hang around long enough to collect retirees and an older-than-average work force. As a result, it now has less market value than Men’s Wearhouse, Hasbro, the Cheesecake Factory, NutriSystem, Foot Locker and Pottery Barn. Oprah is worth more than Chrysler. This is not good.

Meanwhile, officials at West Jefferson Medical Center outside New Orleans reported that the number of indigent patients admitted there has tripled since Hurricane Katrina. The uninsured are now 30 percent of their emergency room patients. Officials in Houston hospitals are reporting similar numbers. Conditions seem worse rather than better. Katrina caused a vicious spiral. Large numbers of people lost their jobs and, with them, their health coverage. Charity Hospital, the one state-funded hospital in New Orleans, closed. The few open hospital emergency rooms in the area have had to handle the load, but it’s put the hospitals in financial crisis. Four hundred physicians filed a lawsuit against the state seeking payment for uncompensated care, and massive numbers of doctors and nurses have left the area.

In Washington, a conference held by the American College of Emergency Physicians revealed that New Orleans may have it worst, but emergency rooms everywhere are drowning in patients. Mandated to care for the uninsured, they are increasingly unprofitable. So although the influx of patients has grown, 500 emergency rooms have closed in the last decade. The result: 91 percent report overcrowding — meaning wait times for the acutely ill of more than an hour or waiting rooms filled more than six hours per day. Almost half report this occurring daily.

A few days later, the Commonwealth Fund released one of the most detailed studies ever done comparing care in the U.S., Australia, Canada, Germany, New Zealand and Britain. We’ve known for a while that health care here is more expensive than anywhere else and that our life expectancy is somehow shorter. But the particulars were the surprise.

On the good side, the study found that once we get into a doctor’s office, American patients are as likely as patients anywhere to get the right care, especially for prevention. Only Germans have a shorter wait for surgery when it’s needed. And 85 percent of Americans are happy with the care they get.

But we also proved to be the least likely to have a regular doctor — and starkly less likely to have had the same doctor for five years. We have the hardest time finding care on nights or weekends outside of an E.R. And we are the most likely (after Canadians) to wait six days or more for an appointment when we need medical attention. Half of Americans also reported forgoing medical care because of cost in the last two years, twice the proportion elsewhere.

None of this news, however, did more than lift a few eyebrows. So this is the picture of American health care you get after watching for a few weeks: it’s full of holes, it’s slowly bankrupting us and we’re kind of used to it.

That leaves two possibilities: (1) We’ve given up on the country; or (2) We’re just waiting for someone else to be in charge.

I’m pulling for No. 2.

Friday, May 25, 2007

Repairing the Damage Done

By Jules Witcover
Campaigning for History
The New York Times
May 25, 2007

WASHINGTON — More than three decades ago, Nixon White House Counsel John Dean called the Watergate cover-up “a cancer on the presidency.” Another one exists today, posing a challenge for the next president to restore the office as a credible voice in foreign policy.

President Bush’s detour in Iraq off the multilateral track adhered to throughout the Cold War years has caused a deep drop in American prestige abroad, requiring extensive repair by his successor regardless of which party wins in 2008.

While Bush’s invasion and occupation of Iraq has been the immediate trigger for the decline of American influence, just as significant was his original failure to capitalize on the terrorist attacks of 9/11 to mobilize a truly collective global response.

The outpouring of empathy for the United States in the wake of those events was quickly short-circuited by the invasion. In diverting the American military from its legitimate focus against the real perpetrators of the attacks, Bush left the primary job undone in Afghanistan, in order to chase a more ambitious dream of superpower dominance.

A decade earlier, neoconservative theorists in the Republican Party saw in the collapse of the Soviet Union an invitation for America to assume a vastly more assertive, unilateral role in imposing its power and political ideology elsewhere.

Among these theorists at the Pentagon was Paul Wolfowitz, deputy undersecretary to Secretary of Defense Dick Cheney, who worried that with the demise of Soviet communism the strongest rationale for a muscular national defense was gone. Yet serious threats remained, from nuclear ambitions in North Korea and the determination in Iran and Iraq to assure control of their vast oil resources essential to American power.

Under Wolfowitz, a quest was undertaken for a strategy justifying continued American military hegemony. As James Mann wrote in his revealing 2004 book, “The Rise of the Vulcans: The History of Bush’s War Cabinet,” Wolfowitz assigned his chief assistant, I. Lewis “Scooter” Libby, to have a draft prepared that “set forth a new vision for a world dominated by a lone American superpower, actively working to make sure that no rival or group of rivals would ever emerge.”

Libby gave the assignment to another Wolfowitz aide named Zalmay Khalilzad, little known then outside defense circles. He ultimately became the American ambassador to occupied Iraq after the overthrow of Saddam Hussein and the establishment of a new American-sponsored regime in Baghdad, and subsequently ambassador to the United Nations.

A leak of the Khalilzad draft, according to Mann, caused embarrassment and was rewritten, but the finished product became a rough blueprint for the radical new American foreign policy that flowered in the George W. Bush administration.

The draft envisioned a world in which American military power alone would rival or replace the collective security that had marked U.S. containment policy through the Cold War. It even hypothesized, Mann wrote, the possible future need for “preempting an impending attack with nuclear, chemical and biological weapons” — the rationale eventually dusted off for the Iraq invasion.

A side incentive for developing the new strategy was pressure from congressional Democrats for a substantial “peace dividend” after the Cold War’s end. To counter such diversions of defense spending for neglected domestic needs, the Pentagon theorists needed a persuasive argument for a lusty military budget.

When Khalilzad’s draft kicked up criticism that it smacked of hostility to other nations, Libby toned down the language in what became the Defense Policy Guidance of 1992, but the essential message remained. By keeping America militarily all-powerful, other countries would be deterred from attempting to match its strength.

When Bill Clinton took over the White House after the 1992 election, he didn’t, according to Mann, seriously challenge the basic force concept, focusing more on domestic matters. The neoconservative theorists, out of power, nevertheless fretted about Congressional projections of static or shrinking defense budgets.

In 1997, they banded together as the Project for the New American Century to build on the 1992 policy statement. A subsequent paper called for more defense spending to preserve “the current Pax Americana … through the coming transformation of war made possible by the new techniques,” including nuclear weapons, in the hands of new, often regional threats.

The group noted critically that the Pentagon’s Quadrennial Defense Review of 1997 “assumed that [North Korea’s] Kim Jong Il and [Iraq’s] Saddam Hussein each could begin a war—perhaps even while employing chemical, biological or even nuclear weapons—and the United States would make no effort to unseat either ruler.”

The paper observed that “past Pentagon war games have given little or no consideration to the force requirements necessary not only to defeat an attack but to remove these regimes from power and conduct post-combat stability operations …

“The current American peace will be short-lived if the United States becomes vulnerable to rogue powers with small, inexpensive arsenals of ballistic missiles and nuclear warheads or other weapons of mass destruction. We cannot allow North Korea, Iran, Iraq or similar states to undermine American leadership, intimidate American allies or threaten the American homeland itself. The blessings of the American peace, purchased at fearful cost and a century of effort, should not be so trivially squandered.”

According to Gary Schmitt, a co-chairman of the project, George W. Bush, governor of Texas at the time, was neither a member of the group nor as far as Schmitt knows aware at the time of its findings. But among the participants were Wolfowitz and Libby, architects of the basic concept of a muscular defense including preemption of threats of weapons of mass destruction.

Did Bush as president come on his own to embrace the precepts of the project, or was he sold on them by Cheney, Wolfowitz, Libby and others of the circle known as “the Vulcans”? Either way, events of the post-9/11 years have confirmed that those precepts were at the core of the radical foreign policy that has imperiled his presidency and American leadership across the globe.

Among the first challenges for Bush’s successor in 2009 will be to demonstrate dramatically that he or she has learned the hard lesson of that go-it-alone foreign policy, which in the end forced America to go hat-in-hand to the international community. The new president must waste no time putting America back on the track of multilateralism and collective security.

With very good luck and a return to diplomacy, the United States could be out of Iraq by that time, giving the next president, Republican or Democratic, a free hand to restore the reputation of the American presidency in the eyes of friends and foes abroad, and at home as well.

US Congress ratifies Democratic cave-in on Iraq war funding

By Patrick Martin and Barry Grey
25 May 2007

The US House of Representatives and Senate voted Thursday to approve an additional $100 billion to fund the wars in Iraq and Afghanistan, with Democrats supplying ample votes in both chambers to give President Bush all of the money he requested and a free hand to further escalate the military violence in Iraq.

The legislation was the product of negotiations between Democratic Speaker of the House Nancy Pelosi and Democratic Senate Majority Leader Harry Reid, Republican congressional leaders, and the White House. The Democratic leadership abandoned all of its earlier demands for troop withdrawal timetables, enforceable “benchmarks” and other limitations on Bush’s conduct of the war.

At a press conference Thursday morning, in advance of the House and Senate votes, Bush endorsed the war-funding legislation. He is expected to sign it on Friday.

The wide margins in support of the bill in both legislative chambers underscored the abject character of the Democrats’ capitulation to the administration. The measure was passed in the House of Representatives by a vote of 280 to 142, with 86 Democrats voting in favor. Among the Democrats voting “yes” were House Majority Leader Steny Hoyer and Rahm Emanuel, the chairman of the House Democratic Caucus.

It was approved in the Senate by a lopsided vote of 80 to 14, with more than twice as many Democrats voting “yes” as those who voted against. The top Democrat in the Senate, Reid, voted “yes,” along with Richard Durbin, the Democratic majority whip, Joseph Biden, chairman of the Senate Foreign Relations Committee and 2008 presidential contender, and Carl Levin, the chairman of the Senate Armed Services Committee. Among the nominal liberals who supported the bill was Michigan Senator Debbie Stabenow.

Contenders for the Democratic presidential nomination Hillary Clinton, Barack Obama and Christopher Dodd all voted “no.” But they allowed the measure to pass by default, refusing to fight for a filibuster or other procedural device to block its passage.

With the completion of their capitulation to Bush’s war policy, following months of antiwar posturing, the Democrats fulfilled their pledge to pass a war-funding bill that Bush would sign before the Memorial Day recess.

The congressional action is in defiance of the sentiments of the American people, expressed in last November’s congressional election. Only hours before the votes were taken, a new poll commissioned by the New York Times and CBS News found a record level of opposition to the war. The findings included 61 percent believing the US should never have intervened in Iraq, 76 percent saying the war was going badly, and 47 percent who described it as going “very badly.”

Only 30 percent gave President Bush a positive approval rating, with 63 percent opposed. Only 23 percent approved of Bush’s handling of the war. More than three quarters, 76 percent, including a majority of Republicans, said the Bush plan to “surge” additional troops to Iraq had either accomplished nothing or made conditions worse.

One figure sums up the enormous gulf between mass opinion and the sentiments of the US political establishment: 63 percent of those polled said the US should set a date in 2008 for withdrawing troops from Iraq.

The Democrats have sought to navigate between this massive popular opposition to the war and the determination of the Bush administration and the entire US ruling elite to control Iraq’s oil resources and dominate the Persian Gulf. Democratic congressional leaders Reid and Pelosi have attempted to fob off public opinion with antiwar noises, while they proceeded to give the Bush administration everything it asked for in terms of funding to continue the bloodbath in Iraq.

However, the Democrats’ craven cave-in will further antagonize and disgust millions of people who deeply oppose the US aggression in Iraq and voted the Republicans out of power in Congress six months ago in order to bring a speedy end to the war.

In an effort to give rank-and-file House Democrats—many of them elected on the basis of the groundswell in antiwar voting last November—some political cover, Pelosi adopted a cynical parliamentary stratagem. Instead of a single up-or-down vote on the war funding, there were two votes: the first to approve the funding of domestic measures, including aid to Hurricane Katrina victims and an increase in the minimum wage. That part of the bill passed by a vote of 348 to 73. The second vote was on the military portion of the emergency funding bill.

This maneuver ensured that a solid Republican bloc would approve the military funding, with significant Democratic support, while a solid Democratic bloc would approve the domestic funding over mainly Republican opposition. Pelosi herself announced that she would vote against the military funding, although she helped negotiate the agreement with the White House and congressional Republicans that produced the bill, and then approved the parliamentary procedure that ensured its passage.

During the 12-year period of Republican control of the House of Representatives, Republican speakers of the House like Dennis Hastert laid down the rule that no bill would be brought to a vote unless it had the support of the Republican caucus, regardless of whether there was majority support in the House as a whole. This “majority of the majority” principle was invoked repeatedly to prevent any legislation from being passed through a coalition of the Democrats and dissident Republicans.

Facing the first major vote on the most important of issues, war funding, Pelosi adopted the opposite position, in order to make sure that the war-funding measure garnered a sufficiently large Republican vote to succeed.

This decision, in and of itself, demonstrates a major difference between the Democrats and Republicans. The Republicans are more ruthless and determined because they openly represent the interests of the corporate ruling class. The Democrats are just as committed to defending the moneyed elite. But in order to maintain the political monopoly of the two-party system, they have to pretend to represent the interests of working people. Hence the vacillating, half-hearted, intrinsically two-faced character of this party.

Thursday’s debate in the House produced an outpouring of Democrats professing anguish over the prospect of approving war funding, but concluding either that they had to vote for more killing in Iraq in the name of “supporting the troops,” or to vote, for the record, against the funding while supporting a leadership that had worked to make sure the money was authorized.

House Appropriations Chairman David Obey epitomized the duplicity and hypocrisy of the Democrats, declaring, “I hate this agreement. I’m going to vote against it, even though I negotiated it.”

The response of the Bush administration to the capitulation of the Democrats was to press forward with its policy of escalating the violence in Iraq. Bush appeared at a Rose Garden press conference to proclaim his determination to achieve “victory” in Iraq. Repeatedly invoking 9/11, he resorted to his staple tactic of fear-mongering, telling two different reporters that their children could die at the hands of terrorists if the US withdrew from Iraq.

Bush stated flatly that the ensuing months would see an increase in violence and death among both Iraqis and American soldiers. August could be a “bloody” month, he declared.

This is what the Democrats are sanctioning by granting Bush’s war funding request and giving him a free hand to further escalate the war.

In a front-page story May 23, the Washington Post reported that top US commanders and diplomats in Iraq have drafted a detailed plan for intensifying the war over the next 18 months, elaborating both military operations and political interventions such as the purging of Iraq’s government and security forces of elements suspected of undermining the US occupation regime.

According to the newspaper, “The plan anticipates keeping US troop levels elevated into next year,” meaning that the “surge” level of 160,000 troops will be sustained indefinitely, and with it, the increased death toll among both American troops and Iraqi civilians.

May seems likely to become the bloodiest month of the year, and perhaps the bloodiest of the war in terms of American casualties. Nine more soldiers and Marines were killed Tuesday, May 22, bringing the death toll for the month to 81. Wednesday was one of the worst days of the year for Iraqi casualties, with more than 100 people killed and 130 wounded in a series of bombings, shootings and other incidents.

MoveOn.org, the liberal lobbying group founded by former Democratic Party and Clinton administration officials, sent an email alert Wednesday declaring that “every single Democrat must oppose this bill.” Eli Pariser, the group’s executive director, told the press, “This is going to be a very important vote. It will signal who is very serious about ending the war, and who is posturing.”

In fact, as MoveOn well knew, appealing for Democratic congressional action to defeat the war funding bill was an exercise in futility. There is not a single Democratic congressman or senator who is genuinely committed to ending the war. All are posturing, in a variety of ways, but all voted for Pelosi as speaker and Reid as majority leader, and all would vote for them again today.

Pariser added, “The perplexing thing about this moment is that the Democrats have the political wind strongly at their backs, and the country wants them to fight.”

Such apologetics—the stock-in-trade of MoveOn and similar liberal groups—only conceal the central political reality: The Democratic Party is a party of American imperialism, and, as such, is beholden not to the will of the people, but to the demands of the US financial elite. The war was launched—on the basis of lies—to further the economic and geo-political interests of this ruling elite in the Middle East and internationally.

If anything, the massive popular opposition to the war placed even greater pressure on the Democrats to withdraw their tactical objections to Bush’s conduct of the war and give him what he demanded. The Democratic Party has become the critical enabler and facilitator of a neo-colonial war to which the US ruling elite remains fully committed.


Immigrants and Politics

The New York Times
May 25, 2007

A piece of advice for progressives trying to figure out where they stand on immigration reform: it’s the political economy, stupid. Analyzing the direct economic gains and losses from proposed reform isn’t enough. You also have to think about how the reform would affect the future political environment.

To see what I mean — and why the proposed immigration bill, despite good intentions, could well make things worse — let’s take a look back at America’s last era of mass immigration.

My own grandparents came to this country during that era, which ended with the imposition of severe immigration restrictions in the 1920s. Needless to say, I’m very glad they made it in before Congress slammed the door. And today’s would-be immigrants are just as deserving as Emma Lazarus’s “huddled masses, yearning to breathe free.”

Moreover, as supporters of immigrant rights rightly remind us, everything today’s immigrant-bashers say — that immigrants are insufficiently skilled, that they’re too culturally alien, and, implied though rarely stated explicitly, that they’re not white enough — was said a century ago about Italians, Poles and Jews.

Yet then as now there were some good reasons to be concerned about the effects of immigration.

There’s a highly technical controversy going on among economists about the effects of recent immigration on wages. However that dispute turns out, it’s clear that the earlier wave of immigration increased inequality and depressed the wages of the less skilled. For example, a recent study by Jeffrey Williamson, a Harvard economic historian, suggests that in 1913 the real wages of unskilled U.S. workers were around 10 percent lower than they would have been without mass immigration. But the straight economics was the least of it. Much more important was the way immigration diluted democracy.

In 1910, almost 14 percent of voting-age males in the United States were non-naturalized immigrants. (Women didn’t get the vote until 1920.) Add in the disenfranchised blacks of the Jim Crow South, and what you had in America was a sort of minor-key apartheid system, with about a quarter of the population — in general, the poorest and most in need of help — denied any political voice.

That dilution of democracy helped prevent any effective response to the excesses and injustices of the Gilded Age, because those who might have demanded that politicians support labor rights, progressive taxation and a basic social safety net didn’t have the right to vote. Conversely, the restrictions on immigration imposed in the 1920s had the unintended effect of paving the way for the New Deal and sustaining its achievements, by creating a fully enfranchised working class.

But now we’re living in the second Gilded Age. And as before, one of the things making antiworker, unequalizing policies politically possible is the fact that millions of the worst-paid workers in this country can’t vote. What progressives should care about, above all, is that immigration reform stop our drift into a new system of de facto apartheid.

Now, the proposed immigration reform does the right thing in principle by creating a path to citizenship for those already here. We’re not going to expel 11 million illegal immigrants, so the only way to avoid having those immigrants be a permanent disenfranchised class is to bring them into the body politic.

And I can’t share the outrage of those who say that illegal immigrants broke the law by coming here. Is that any worse than what my grandfather did by staying in America, when he was supposed to return to Russia to serve in the czar’s army?

But the bill creates a path to citizenship so tortuous that most immigrants probably won’t even try to legalize themselves. Meanwhile, the bill creates a guest worker program, which is exactly what we don’t want to do. Yes, it would raise the income of the guest workers themselves, and in narrow financial terms guest workers are a good deal for the host nation — because they don’t bring their families, they impose few costs on taxpayers. But it formally creates exactly the kind of apartheid system we want to avoid.

Progressive supporters of the proposed bill defend the guest worker program as a necessary evil, the price that must be paid for business support. Right now, however, the price looks too high and the reward too small: this bill could all too easily end up actually expanding the class of disenfranchised workers.


The Catholic Boom

The New York Times
May 25, 2007

The pope and many others speak for the thoroughly religious. Christopher Hitchens has the latest best seller on behalf of the antireligious. But who speaks for the quasi-religious?

Quasi-religious people attend services, but they’re bored much of the time. They read the Bible, but find large parts of it odd and irrelevant. They find themselves inextricably bound to their faith, but think some of the people who define it are nuts.

Whatever the state of their ambivalent souls, quasi-religious people often drive history. Abraham Lincoln knew scripture line by line but never quite shared the faith that mesmerized him. Quasi-religious Protestants, drifting anxiously from the certainties of their old religion, built Victorian England. Quasi-religious Jews, climbing up from ancestral orthodoxy, helped shape 20th-century American culture.

And now we are in the midst of an economic boom among quasi-religious Catholics. A generation ago, Catholic incomes and economic prospects were well below the national average. They had much lower college completion rates than mainline Protestants. But the past few decades have seen enormous Catholic social mobility.

According to Lisa Keister, a sociologist at Duke, non-Hispanic white Catholics have watched their personal wealth shoot upward. They have erased the gap that used to separate them from mainline Protestants.

Or, as Keister writes in a journal article, “Preliminary evidence indicates that whites who were raised in Catholic families are no longer asset-poor and may even be among the wealthiest groups of adults in the United States today.”

How have they done it?

Well, they started from their traditional Catholic cultural base. That meant, in the 1950s and early ’60s, a strong emphasis on neighborhood cohesion and family, and a strong preference for obedience and solidarity over autonomy and rebellion.

Then over the decades, the authority of the church weakened and young Catholics assimilated. Catholic values began to converge with Protestant values. Catholic adults were more likely to use contraceptives and fertility rates plummeted. They raised their children to value autonomy more and obedience less.

The process created a crisis for the church, as it struggled to maintain authority over its American flock. But the shift was an economic boon to Catholics themselves. They found themselves in a quasi-religious sweet spot.

On the one hand, modern Catholics have retained many of the traditional patterns of their ancestors — high marriage rates, high family stability rates, low divorce rates. Catholic investors save a lot and favor low-risk investment portfolios. On the other hand, they have also become more individualistic, more future-oriented and less bound by neighborhood and extended family. They are now much better educated than their parents or grandparents, and much better educated than their family histories would lead you to predict.

More or less successfully, the children of white, ethnic, blue-collar neighborhoods have managed to adapt the Catholic communal heritage to the dynamism of a global economy. If this country were entirely Catholic, we wouldn’t be having a big debate over stagnant wages and low social mobility. The problems would scarcely exist. Populists and various politicians can talk about the prosperity-destroying menace of immigration and foreign trade. But modern Catholics have created a hybrid culture that trumps it.

In fact, if you really wanted to supercharge the nation, you’d fill it with college students who constantly attend church, but who are skeptical of everything they hear there. For there are at least two things we know about flourishing in a modern society.

First, college students who attend religious services regularly do better than those who don’t. As Margarita Mooney, a Princeton sociologist, has demonstrated in her research, they work harder and are more engaged with campus life. Second, students who come from denominations that encourage dissent are more successful, on average, than students from denominations that don’t.

This embodies the social gospel annex to the quasi-religious creed: Always try to be the least believing member of one of the more observant sects. Participate in organized religion, but be a friendly dissident inside. Ensconce yourself in traditional moral practice, but champion piecemeal modernization. Submit to the wisdom of the ages, but with one eye open.

The problem is nobody is ever going to write a book sketching out the full quasi-religious recipe for life. The message “God is Great” appeals to billions. Hitchens rides the best-seller list with “God is Not Great.” Nobody wants to read a book called “God is Right Most of the Time.”

A New Silent Majority

By Mark Buchanan
Our Lives as Atoms
The New York Times
May 23, 2007

Something seems a little out of whack between the mainstream media and the American people. Take the arguments of the past few days over former President Jimmy Carter’s remarks about the Bush administration and the consequences of its particular brand of foreign policy. Carter didn’t attack President Bush personally, but said that “as far as the adverse impact on the nation around the world, this administration has been the worst in history,” which can’t really be too far out of line with what many Americans think.

In coverage typical of much of the media, however, NBC Nightly News asked whether Carter had broken “an unwritten rule when commenting on the current president,” and portrayed Carter’s words — unfairly it seems — as a personal attack on President Bush. Fox News called it “unprecedented.” Yet as an article in this newspaper on Tuesday pointed out, “presidential scholars roll their eyes at the notion that former presidents do not speak ill of current ones.”

The pattern is familiar. Polls show that most Americans want our government to stop its unilateral swaggering, and to try to solve our differences with other nations through diplomacy. In early April, for example, when the speaker of the House, the Democrat Nancy Pelosi, visited Syria and met with President Bashar al-Assad, a poll had 64 percent of Americans in favor of negotiations with the Syrians. Yet this didn’t stop an outpouring of media alarm.

A number of CNN broadcasts — including one showing Pelosi with a head scarf beside the title “Talking with Terrorists?” — failed even to mention that several Republican congressmen had met with Assad two days before Pelosi did. The conventional wisdom on the principal television talk shows was that Pelosi had “messed up on this one” (in the words of NBC’s Matt Lauer), and that she and the Democrats would pay dearly for it.

So it must have been a great surprise when Pelosi’s approval ratings stayed basically the same after her visit, or actually went up a little.

Or take the matter of the impeachment of President Bush and Vice President Cheney. Most media figures seem to consider the very idea as issuing from the unhinged imaginations of a lunatic fringe. But according to a recent poll, 39 percent of Americans in fact support it, including 42 percent of independents.

A common explanation of this tendency toward distortion is that the Beltway media has attended a few too many White House Correspondents’ Dinners and so cannot possibly cover the administration with anything approaching objectivity. No doubt the Republicans’ notoriously well-organized efforts in casting the media as having a “liberal bias” also have their intended effect in suppressing criticism.

But I wonder whether this media distortion also persists because it doesn’t meet with enough criticism, and if that’s partially because many Americans think that what they see in the major political media reflects what most other Americans really think – when actually it often doesn’t.

Psychologists coined the term “pluralistic ignorance” in the 1930s to refer to this type of misperception — more a social than an individual phenomenon — to which even smart people might fall victim. A study back then found, surprisingly, that most members of an all-white fraternity were privately in favor of admitting black members, though most assumed, wrongly, that their personal views were greatly in the minority. Natural timidity made each individual assume that he was the lone oddball.

A similar effect is common today on university campuses, where many students think that most other students are typically inclined to drink more than they themselves would wish to; researchers have found that many students indeed drink more to fit in with what they perceive to be the drinking norm, even though it really isn’t the norm. The result is an amplification of a minority view, which comes to seem like the majority view.

In pluralistic ignorance, as researchers described it in the 1970s, “moral principles with relatively little popular support may exert considerable influence because they are mistakenly thought to represent the views of the majority, while normative imperatives actually favored by the majority may carry less weight because they are erroneously attributed to a minority.”

What is especially disturbing about the process is that it lends itself to control by the noisiest and most visible. Psychologists have noted that students who are the heaviest drinkers, for example, tend to speak out most strongly against proposed measures to curb drinking, and act as “subculture custodians” in support of their own minority views. Their strong vocalization can produce “false consensus” against such measures, as others, who think they’re part of the minority, keep quiet. As a consequence, the extremists gain influence out of all proportion to their numbers, while the views of the silent majority end up being suppressed.

Think of the proposal to put a timetable on the withdrawal of troops from Iraq, supported, the latest poll says, by 60 percent of Americans, but dropped Tuesday from the latest war funding bill.

Over the past couple of months, Glenn Greenwald has done a superb job of documenting what certainly seems like it might be a case of pluralistic ignorance among the major political media, many (though certainly not all) of whom often seem to act as “subculture custodians” of their own amplified minority views. Routinely, it seems, views that get expressed and presented as majority views aren’t really that at all.

In a typical example in March, NBC’s Andrea Mitchell reported that most Americans wanted to pardon Scooter Libby, saying that the polling “indicates that most people think, in fact, that he should be pardoned, Scooter Libby should be pardoned.” In fact, polls showed that only 18 percent then favored a pardon.

Mitchell committed a similar error in April, claiming that polling showed Nancy Pelosi to be unpopular with the American people, her approval rating being as low as the dismal numbers of former Republican Speaker Dennis Hastert just before the 2006 November elections. But in fact the polls showed Pelosi’s approval standing at about 50 percent, while Hastert’s had been 22 percent.

As most people get their news from the major outlets, these distortions – however they occur, whether intentionally or through some more innocuous process of filtering – almost certainly translate into a strongly distorted image in people’s minds of what most people across the country think. They contribute to making mainstream Americans feel as if they’re probably not mainstream, which in turn may make them less likely to voice their opinions.

One of the most common examples of pluralistic ignorance, of course, takes place in the classroom, where a teacher has just finished a dull and completely incomprehensible lecture, and asks if there are any questions. No hands go up, as everyone feels like the lone fool, even though no student actually understood a single word. It takes guts, of course, to admit total ignorance when you might just be the only one.

Last year, author Kristina Borjesson interviewed 21 prominent journalists for her book “Feet to the Fire,” about the run-up to the Iraq War. Her most notable impression was this:

“The thing that I found really profound was that there really was no consensus among this nation’s top messengers about why we went to war. [War is the] most extreme activity a nation can engage in, and if they weren’t clear about it, that means the public wasn’t necessarily clear about the real reasons. And I still don’t think the American people are clear about it.”

Yet in the classroom of our democracy, at least for many in the media, it still seems impolitic – or at least a little too risky – to raise one’s hand.

Thursday, May 24, 2007

'The Bloody Week' (The Suppression of the Paris Commune, May 1871)

“The Law was given to us to live by. Not to die by…”

Wednesday, May 23, 2007

Pirates and Sanctions

The New York Times
May 24, 2007


Dongguan is a city you’ve probably never heard of, yet it has a population of 10 million people who fill your dressers and closets. By one count, 40 percent of the sports shoes sold in the U.S. come from Dongguan.

Just one neighborhood within Dongguan, Dalang, has become the Sweater Capital of the World. Dalang makes more than 300 million sweaters a year, of which 200 million are exported to the U.S.

Keep towns like this in mind when American protectionists demand sanctions, after the latest round of talks, which ended yesterday, made little progress. Some irresponsible Democrats in Congress would have you believe that China’s economic success is simply the result of currency manipulation, unfair regulations and pirating American movies.

It’s true that China’s currency is seriously undervalued. But places like Dongguan have thrived largely because of values we like to think of as American: ingenuity, diligence, entrepreneurship and respect for markets.

The people in Dalang, the Sweater Capital, used to be farmers, until a Hong Kong investor opened a sweater factory at the dawn of the 1980’s. After a few years, the workers began to quit and open their own factories, and both the bosses and the staff work dizzyingly hard. One factory worker here in Guangdong Province told me that she works 12-hour shifts, seven days a week, 365 days a year, not even taking time off for Chinese New Year. She chooses to work these hours to gain a better life for her son. If protectionists want somebody to criticize for China’s trade success, blame that woman and millions like her.

Remember that China isn’t like 1980s Japan, which had a sustained huge surplus with nearly everybody. China’s global surplus has surged in the last five years, but traditionally its global trade position has been close to a balance, and it still has a trade deficit with many countries.

China imports components, does the low-wage assembly, and then exports the finished products to the U.S. — so the whole value appears in the Chinese trade surplus with the U.S., even though on average 65 percent of the value was imported into China. When a Chinese-made Barbie doll sells in the U.S. for $9.99, only 35 cents goes to China.

Sure, China pirates movies and software — but the U.S. was even worse at this stage of development (when we used to infuriate England by stealing its literary properties without paying royalties). Pirated DVDs are sold openly on the streets of Manhattan, while sellers in China can be far more creative. A couple of days ago, I dropped into a small DVD shop in Beijing to check its wares. Everything seemed legal.

Then the two saleswomen asked if I wanted to see American movies — and tugged at a bookshelf, which rolled forward on wheels. Behind was a door; one of the saleswomen whisked me into a secret room full of pirated DVDs. That’s piracy — but also capitalism at its harshest and hungriest. There are plenty of reasons to put pressure on China, including its imprisonment of journalists and its disgraceful role in supplying the weaponry used to commit genocide in Darfur. But whining about the efficiency of Chinese capitalism is beneath us.

All that said, the Chinese development model is running out of steam.

Labor shortages are growing and pushing up wage costs. Factories are having to spend more money to improve worker safety and curb pollution. The environment is such a disaster that 16 of the world’s 20 most polluted cities are now in China.

China will also be forced to appreciate its undervalued currency, further pushing up costs. The “China price” will no longer be the world’s lowest, and millions of jobs making T-shirts and stuffed toys will move to lower-wage countries like Vietnam and Bangladesh.

So if China is going to continue its historic rise, it will have to move up the technology ladder and shift to domestic consumption as its economic engine. Yet the share of consumption in China’s economy has fallen significantly since 2000.

So as one who has been profoundly optimistic about China for the last 25 years, I think it’s time to sober up. President Hu Jintao is China’s least visionary leader since Hua Guofeng 30 years ago, and China has the burden of unusually weak leadership as it navigates a transition to a new economic model as well as a political transition to a more open society.

I’m betting China will pull it off, but I don’t think the world appreciates the risks and challenges ahead.

Rethinking Old Age

The New York Times
May 24, 2007

At some point in life, you can’t live on your own anymore. We don’t like thinking about it, but after retirement age, about half of us eventually move into a nursing home, usually around age 80. It remains your most likely final address outside of a hospital.

To the extent that there is much public discussion about this phase of life, it’s about getting more control over our deaths (with living wills and the like). But we don’t much talk about getting more control over our lives in such places. It’s as if we’ve given up on the idea. And that’s a problem.

This week, I visited a woman who just moved into a nursing home. She is 89 years old with congestive heart failure, disabling arthritis, and after a series of falls, little choice but to leave her condominium. Usually, it’s the children who push for a change, but in this case, she was the one who did. “I fell twice in one week, and I told my daughter I don’t belong at home anymore,” she said.

She moved in a month ago. She picked the facility herself. It has excellent ratings, friendly staff, and her daughter lives nearby. She’s glad to be in a safe place — if there’s anything a decent nursing home is built for, it is safety. But she is struggling.

The trouble is — and it’s a possibility we’ve mostly ignored for the very old — she expects more from life than safety. “I know I can’t do what I used to,” she said, “but this feels like a hospital, not a home.” And that is in fact the near-universal reality.

Nursing home priorities are matters like avoiding bedsores and maintaining weight — important goals, but they are means, not ends. She left an airy apartment she furnished herself for a small beige hospital-like room with a stranger for a roommate. Her belongings were stripped down to what she could fit into the one cupboard and shelf they gave her. Basic matters, like when she goes to bed, wakes up, dresses, and eats, were put under the rigid schedule of institutional life. Her main activities have become bingo, movies, and other forms of group entertainment. Is it any wonder most people dread nursing homes?

The things she misses most, she told me, are her friendships, her privacy, and the purpose in her days. She’s not alone. Surveys of nursing home residents reveal chronic boredom, loneliness, and lack of meaning — results not fundamentally different, actually, from those reported by prisoners.

Certainly, nursing homes have come a long way from the fire-trap warehouses they used to be. But it seems we’ve settled on a belief that a life of worth and engagement is not possible once you lose independence.

There has been, however, a small band of renegades who disagree. They’ve created alternatives with names like the Green House Project, the Pioneer Network, and the Eden Alternative — all aiming to replace institutions for the disabled elderly with genuine homes. Bill Thomas, for example, is a geriatrician who calls himself a “nursing home abolitionist” and built the first Green Houses in Tupelo, Miss. These are houses for no more than 10 residents, each built around a kitchen and living room rather than a nurses’ station, and furnished with the residents’ own belongings. The bedrooms are private. Residents help one another with cooking and other work as they are able. Staff members provide not just nursing care but also mentoring for engaging in daily life, even for Alzheimer’s patients. And the homes meet all federal safety guidelines and work within state-reimbursement levels.

They have been a great success. Dr. Thomas is now building Green Houses in every state in the country with funds from the Robert Wood Johnson Foundation. Such experiments, however, represent only a tiny fraction of the 18,000 nursing homes nationwide.

“The No. 1 problem I see,” Dr. Thomas told me, “is that people believe what we have in old age is as good as we can expect.” As a result, families don’t press nursing homes with hard questions like, “How do you plan to change in the next year?” But we should, if we want to hope for something more than safety in our old age.

“This is my last hurrah,” the woman I met said. “This room is where I’ll die. But it won’t be anytime soon.” And indeed, physically she’s done well. All she needs now is a life worth living for.

The world as Shakespearean tragedy

By Niall Ferguson
Los Angeles Times
May 21, 2007

“ALL THE WORLD’S a stage,” observes Jaques in “As You Like It.” “And all the men and women merely players.”

No sphere of human life is more theatrical than politics. And seldom has the world's political stage seemed more Shakespearean than it does today — in "The Tragedy of King George." To judge by the number of bodies that currently litter it, we appear to be nearing the end of Act V. By the concluding scenes of Shakespeare's greatest political tragedies — "Hamlet," "Julius Caesar," "King Lear" and "Macbeth" — nearly all the principal characters lie dead. So it is with King George, the tale of an unworldly fellow who ascends the throne of a great empire, responds heroically to an unprovoked attack, then wreaks havoc by turning from retaliation to preemption.

The latest corpse to slump lifeless beneath the proscenium arch is that of Paul Wolfowitz, who last week finally announced that he would resign as president of the World Bank. Another central character — British Prime Minister Tony Blair — has taken the political equivalent of slow-acting poison.

Think back to 2003, to the invasion of Iraq. One after another, the politicians who most strongly supported the decision have been ousted from office.

As in "Julius Caesar," the fault is not in the central characters' stars but in themselves. President Bush's dominant character traits — his decisiveness and tenacity — at first appeared to be strengths. But once he had been convinced by his advisors that the attacks of 9/11 furnished a pretext for the overthrow of Saddam Hussein, these became weaknesses.

As in "Macbeth," King George was soon "in blood stepped in so far" that turning back seemed no more attractive than wading onward. Remember, the corpses that litter this stage can already be counted in the tens, if not the hundreds, of thousands.

And, as in "King Lear," the whole catastrophe has stemmed from a fatal confusion at the outset between the true and the false, enemies and friends. Lear succumbs to the flattery of the ugly sisters, Regan and Goneril, and casts out the blunt but honest Cordelia (not to mention the straight-talking Kent).

The mistaken identity in the tragedy of King George was that of the real enemy in the post-9/11 war on terror. It is almost certain that the hijackers hailed from Saudi Arabia, the United Arab Emirates, Egypt and Lebanon. The chief architect of the plot, Osama bin Laden, also was a Saudi. Contrast this list of countries with the "axis of evil" identified by Bush in his 2002 State of the Union address: North Korea, Iran and Iraq. Bush was right to target Afghanistan in the immediate aftermath of 9/11 because the Taliban regime was sheltering Al Qaeda's leadership. But the decision to overthrow Hussein was one of history's great non sequiturs.

The real enemy in the global war on terror is not the "axis of evil" but the "axis of allies." Today, the countries most likely to produce another 9/11 are not Iran, much less North Korea, but countries long regarded as (after Israel) America's most reliable allies in the greater Middle East. Step forward, Saudi Arabia (almost certainly still the biggest source of funding for radical Islamists) and Pakistan (definitely their one-stop shop for nuclear weaponry).

There is, in short, a twist in this tale. Before the curtain can fall on "The Tragedy of King George," we need at least three more scenes to decide the fates of three crucial characters: the only principals left standing aside from King George himself.

First, we need a scene in Israel. Since the failure of the war against Hezbollah in Lebanon, Prime Minister Ehud Olmert's popularity has been in free fall. His current approval rating is about 2%, by comparison with which King George is a pop idol. Somehow, Olmert is clinging to political life. But he surely cannot last much longer. What happens next will be crucial; if Benjamin Netanyahu returns to power, the probability of a military confrontation with Iran goes above 50%. Remember, Netanyahu compared Iranian President Mahmoud Ahmadinejad to Hitler. "It is the year 1938," he recently declared, "and Iran is Germany."

Then we need a scene in Saudi Arabia. Here the key figure is Prince Bandar bin Sultan, who, as Saudi ambassador to the United States, was one of the leading advocates of the invasion of Iraq. Since October 2005 he has been in Riyadh as secretary-general of the National Security Council, where he is said to be lobbying hard for another attack: This time — you guessed it — on Iran.

Finally, the action needs to shift eastward to Pakistan, where it is the future of President Pervez Musharraf that hangs in the balance. After eight years of military dictatorship, Pakistan's democratic forces are stirring. But watch out — these include the Islamist coalition known as the Muttahida Majlis-e-Amal.

You thought this play was nearly over. But Act V has only just begun. With war looming between Iran and Israel, and Pakistan on the brink of an upheaval that could well end with Islamists in power, the worst bloodshed has yet to come.

Laughing and Crying

The New York Times
May 23, 2007

First I had to laugh. Then I had to cry.

I took part in commencement this year at Rensselaer Polytechnic Institute, one of America’s great science and engineering schools, so I had a front-row seat as the first grads to receive their diplomas came on stage, all of them Ph.D. students. One by one the announcer read their names and each was handed their doctorate — in biotechnology, computing, physics and engineering — by the school’s president, Shirley Ann Jackson.

The reason I had to laugh was because it seemed like every one of the newly minted Ph.D.’s at Rensselaer was foreign born. For a moment, as the foreign names kept coming — “Hong Lu, Xu Xie, Tao Yuan, Fu Tang” — I thought that the entire class of doctoral students in physics was going to be Chinese, until “Paul Shane Morrow” saved the day. It was such a caricature of what President Jackson herself calls “the quiet crisis” in high-end science education in this country that you could only laugh.

Don’t get me wrong. I’m proud that our country continues to build universities and a culture of learning that attract the world’s best minds. My complaint — why I also wanted to cry — was that there wasn’t someone from the Immigration and Naturalization Service standing next to President Jackson stapling green cards to the diplomas of each of these foreign-born Ph.D.’s. I want them all to stay, become Americans and do their research and innovation here. If we can’t educate enough of our own kids to compete at this level, we’d better make sure we can import someone else’s, otherwise we will not maintain our standard of living.

It is pure idiocy that Congress will not open our borders — as wide as possible — to attract and keep the world’s first-round intellectual draft choices in an age when everyone increasingly has the same innovation tools and the key differentiator is human talent. I’m serious. I think any foreign student who gets a Ph.D. in our country — in any subject — should be offered citizenship. I want them. The idea that we actually make it difficult for them to stay is crazy.

Compete America, a coalition of technology companies, is pleading with Congress to boost both the number of H-1B visas available to companies that want to bring in skilled foreign workers and the number of employment-based green cards given to high-tech foreign workers who want to stay here. Give them all they want! Not only do our companies need them now, because we’re not training enough engineers, but they will, over time, start many more companies and create many more good jobs than they would possibly displace. Silicon Valley is living proof of that — and where innovation happens matters. It’s still where the best jobs will be located.

Folks, we can’t keep being stupid about these things. You can’t have a world where foreign-born students dominate your science graduate schools, research labs, journal publications and can now more easily than ever go back to their home countries to start companies — without it eventually impacting our standard of living — especially when we’re also slipping behind in high-speed Internet penetration per capita. America has fallen from fourth in the world in 2001 to 15th today.

My hat is off to Andrew Rasiej and Micah Sifry, co-founders of the Personal Democracy Forum. They are trying to make this an issue in the presidential campaign by creating a movement to demand that candidates focus on our digital deficits and divides. Mr. Rasiej, who unsuccessfully ran for public advocate of New York City in 2005 on a platform calling for low-cost wireless access everywhere, notes that “only half of America has broadband access to the Internet.” We need to go from “No Child Left Behind,” he says, to “Every Child Connected.”

Here’s the sad truth: 9/11, and the failing Iraq war, have sucked up almost all the oxygen in this country — oxygen needed to seriously discuss education, health care, climate change and competitiveness, notes Garrett Graff, an editor at Washingtonian Magazine and author of the upcoming book “The First Campaign,” which deals with this theme. So right now, it’s mostly governors talking about these issues, notes Mr. Graff, but there is only so much they can do without Washington being focused and leading.

Which is why we’ve got to bring our occupation of Iraq to an end in the quickest, least bad way possible — otherwise we are going to lose Iraq and America. It’s coming down to that choice.

The Golden Rule in the Human Jungle

By Mark Buchanan
Our Lives as Atoms
The New York Times
May 21, 2007

News of the past few days and weeks suggests a rather dismal view of humanity. Israel is once again bombing the Palestinians, who are already locked in their own violent internal power struggle. On the streets of Karachi, just over a week ago, Pakistani security forces killed 41 people and injured many more, while preventing a rally for Iftikhar Chaudhry, deposed Chief Justice of the Supreme Court and opponent of President Pervez Musharraf. In the United States, a company compiling data on consumers is making money by helping criminals steal the savings of thousands of retired Americans.

Violence, corruption and greed. What kind of people are we?

But counter all of that with this – a young man in Cleveland has pledged $1 million of his own money to establish scholarships for disadvantaged children. His name is Braylon Edwards, and, O.K., he’s an emerging star for the Cleveland Browns who makes more money in a year than most of us will in a lifetime, but still. He could have bought a yacht and a fleet of sparkling Humvees. Instead, he invested in the future of hundreds of people he doesn’t even know.

“To secure a positive future for our country,” an ESPN article quoted Edwards as saying, “we have to start with these kids. We have to support them.”

So maybe the news is more dismal than it needs to be. But a glance at my previous columns shows that I’ve fallen into a similar pattern, writing on racial prejudice, genocide and entrenched political polarization, while not mentioning the more positive sides of the social atom. Cynicism can be pushed too far, because pure and untainted human altruism really exists – and it’s something to which we should learn to pay a lot more attention.

In a classic experiment of modern behavioral science – one that is now familiar to many people – an experimenter gives one of two people some cash, say $50, and asks them to offer some of it (any amount they choose) to another person, who can either accept or reject the offer. If the second person accepts, the cash is shared out accordingly; if he or she rejects it, no one gets to keep anything.

If we were all self-interested and greedy, then the second person would always accept the offer, as getting something is clearly better than getting nothing. And the first person, knowing this, would offer as little as possible. But that’s most certainly not what happens.

Experiments across many cultures show that people playing this “ultimatum game” typically offer anything from 25 to 50 percent of the money, and reject offers less than around 25 percent, often saying they wanted to punish the person for making an unfair offer.
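The payoff logic of a single round is simple enough to sketch in a few lines of code. This is a toy illustration only; the 25 percent rejection threshold is borrowed from the typical results described above, not from any single study.

```python
# Toy model of one round of the ultimatum game described above.
# The rejection threshold is an assumption drawn from the column's
# summary of typical results, not from any particular experiment.

POT = 50  # dollars handed to the first player (the proposer)

def responder_accepts(offer, threshold=0.25):
    """A strong reciprocator rejects offers below a fairness
    threshold, even though rejecting costs both players everything."""
    return offer >= threshold * POT

def play_round(offer):
    """Return (proposer's take, responder's take) for a given offer."""
    if responder_accepts(offer):
        return POT - offer, offer
    return 0, 0  # the veto: no one gets anything

print(play_round(20))  # a fair-ish offer is accepted: (30, 20)
print(play_round(5))   # a lowball offer is vetoed: (0, 0)
```

A purely self-interested responder would set the threshold to zero and accept any positive offer; the experiments show that real people don't.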

An important point often overlooked about these experiments (and others like them) is that they have been performed very carefully, with participants remaining completely anonymous and playing only once. Everything is set up so that no one can hope to build a reputation, or to receive any future payback for their actions today.

So this really does seem to be pure altruism, and we do care about fairness, at least most of us.

That’s not to say, of course, that we’re not often self-interested, or that human kindness isn’t frequently strategic and aimed at currying favor in the future. The point is that it’s not always like that. People give to charity, tip waiters in countries they’ll never again visit, dive into rivers to save other people or even animals – or set aside $1 million to send poor kids to school – not because they hope to get something but, sometimes, out of the goodness of their hearts.

Social researchers have begun calling this human tendency by the technical term “strong reciprocity”: a willingness to cooperate, and also to punish those who don’t cooperate, even when no gain is possible. And there’s an interesting theory as to why we’re like this.

In theoretical studies, economists and anthropologists have been exploring how self-interest and cooperation might have played out in our ancestral groups of hunter-gatherers. In interactions among individuals, it’s natural to suppose that purely self-interested people would tend to come out ahead, as they’d never get caught out helping others without getting help in return and would also be able to cheat any naïve altruists that come along.

But it is also natural to suppose that when neighboring groups compete with one another, the group with more altruists would have an advantage, as it would be better able to manage collective tasks – things like farming and hunting, providing for defense or caring for the sick – than a group of more selfish people.

So you can imagine a basic tension in the ancient world between individual interactions that favor self-interest and personal preservation, and group interactions that favor individual altruism. Detailed simulations suggest that if the group competition is strong enough, cooperators will persist because of their intense value to group cohesion. But there’s slightly more to the story, too.
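That tension can be made concrete with a little arithmetic. The payoff numbers below are assumed purely for illustration; they are not taken from the studies the column describes.

```python
# Public-goods payoffs inside one group: each cooperator pays a private
# cost so that a benefit is shared by everyone; defectors share the
# benefit without paying. COST and BENEFIT are assumed numbers.

COST, BENEFIT = 1.0, 3.0

def payoffs(n_coop, size):
    """Return (cooperator payoff, defector payoff, total group output)."""
    share = n_coop * BENEFIT / size  # everyone's cut of the public good
    return share - COST, share, size * share

# Within a mixed group, defectors always out-earn cooperators,
# so individual-level selection favors self-interest...
coop_pay, defect_pay, _ = payoffs(n_coop=10, size=20)
assert defect_pay > coop_pay

# ...yet a group of all cooperators out-produces a group of all
# defectors, the edge that group-level competition can seize on.
_, _, all_cooperate = payoffs(n_coop=20, size=20)
_, _, all_defect = payoffs(n_coop=0, size=20)
assert all_cooperate > all_defect
```

Whether cooperators actually persist then depends, as the simulations suggest, on how strong the between-group competition is relative to the within-group advantage of defecting.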

Further work shows that groups really thrive if the altruists are of a special sort – not just people who are willing to cooperate with others, but people who are also willing to punish those they see failing to cooperate.

This work is only suggestive, but it raises the interesting idea that it’s a long history of often brutal competition among groups that has turned most of us into willing cooperators, or, more accurately, strong reciprocators. We’re not Homo economicus, as Herbert Gintis of the University of Massachusetts Amherst puts it, but Homo reciprocans – an organism biologically prone to cooperative actions, and for good historical reasons.

No doubt this is what many people probably thought all along, without the aid of any theory or computer simulations. It just goes to show how theorists can labor for years to re-discover the obvious. Then again, re-discovery often casts the familiar in a not-so-familiar light, and leads us to reconsider what we thought we already knew.

We’ve been so busy over the past half century glorifying the power of markets driven by self-interest that we’ve overlooked how many of our most important institutions depended not on self-interest but on something more akin to a cooperative public spirit. If an impulse toward cooperation rather than self-interest alone is the “natural” human condition, then we’ve been poor stewards of a powerful social resource for the collective good. The United States health care system, to take one example, has by design been set up around the profit motive, based on the belief that only this narrow motivator of individual action can be counted on to produce anything good. It’s perhaps no surprise that it is among the most expensive in the world, and far from the most effective.

In a press conference at the Cannes Film Festival, following a screening of his new film “Sicko,” Michael Moore criticized how financial interests play such a foundational role in health care in the United States. “It’s wrong and it’s immoral,” he said. “We have to take the profit motive out of health care. It’s as simple as that.”

But it’s not quite that simple. It’s not that profits shouldn’t play any role, because we are indeed motivated in part by self-interest. It’s just that we have other motivations, too, and helping others is one of those. We need to be just as open to the better parts of human nature as we are protective against the narrowly materialistic ones, whether we’re considering health care or anything else, including education.

You don’t need a new breed of experimental economists to tell you that. Just ask Braylon Edwards.

Pass the Clam Dip

The New York Times
May 23, 2007


It’s no wonder Al Gore is a little touchy about his weight, what with everyone trying to read his fat cells like tea leaves to see if he’s going to run.

He was so determined to make his new book look weighty, in the this-treatise-belongs-on-the-shelf-between-Plato-and-Cato sense, rather than the double-chin-isn’t-quite-gone-yet sense, that he did something practically unheard of for a politician: He didn’t plaster his picture on the front.

“The Assault on Reason” looks more like the Beatles’ White Album than a screed against the tinny Texan who didn’t get as many votes in 2000.

The Goracle does concede a small author’s picture on the inside back flap, a chiseled profile that screams Profile in Courage and that also screams Really Old Picture. Indeed, if you read the small print next to the wallet-sized photo of Thin Gore looking out prophetically into the distance, it says it’s from his White House years.

A subliminal clue to his intentions, perhaps? He must be flattered that many demoralized leading Republicans and Bush insiders think a Gore-Obama ticket would be unbeatable. And he must be gratified that his rival Hillary has never cemented her inevitability, even with Bill Clinton’s lip-licking Web video pushing her.

But though he’s on a book tour clearly timed to build on his Oscar flash and Nobel buzz, and take advantage of the public’s curiosity about whether he’ll jump in the race, he almost seems to want to sigh and roll his eyes when he’s asked about it.

“I’m not a candidate,” he told Diane Sawyer on “Good Morning America.” “This book is not a political book. It’s not a candidate book at all.”

Of course, his protestation was lost given the fact that he was sitting in front of a screen blaring the message “The Race to ’08,” and above a crawl that asked “Will he run for the White House?”

He is so fixed on not seeming like a presidential flirt that he risks coming across as a bit of a righteous tease or a high-minded scold, which is exactly what his book is, a high-minded scolding.

He upbraided Diane about the graphics for his segment, complaining about buzzwords and saying “That’s not what this is about.”

Diane was not so easily put off as he turned up his nose at the horse race and the vast wasteland of TV, and bored in for the big question: “Donna Brazile, your former campaign manager, has said, ‘If he drops 25 to 30 pounds, he’s running.’ Lost any weight?”

Laughing obligingly, he replied: “I think, you know, millions of Americans are in the same struggle I am on that one. But look, listen to your questions. And you know, if the horse race, the cosmetic parts of this — and look, that’s all understandable and natural. But while we’re focused on, you know, Britney and KFed and Anna Nicole Smith and all this stuff, meanwhile, very quietly, our country has been making some very serious mistakes that could be avoided if we the people, including the news media, are involved in a full and vigorous discussion of what our choices are.”

He explained to James Traub of The New York Times Magazine that TV induces a sort of national trance because the brain’s fear center, the amygdala, receives only a fraction of electrical impulses from the neocortex, and couldn’t resist lecturing about the amygdala — “which as I’m sure you know comes from the Latin for ‘almond.’ ”

Mr. Traub said that, as he followed him around, the Goracle was “eating like a maniac: I watched him inhale the clam dip at a reception like a man who doesn’t know when his next meal will be coming.”

So if Al Gore is really unplugged and unleashed and uncensored, as Tipper and his fans say, then he is no longer bound by the opinions of gurus and focus groups. He can be himself, and inhale away and still run if he wants.

Barack Obama is as slender as an adolescent and exercises constantly, but he still sometimes seems strangely tired on the campaign trail. He blamed fatigue when he overstated the death toll of the Kansas tornadoes, saying it was 10,000 when it was 12.

Doug Brinkley, the presidential historian, said that even though the fashion now is for fit candidates, after the Civil War, there was a series of overweight presidents. “It showed you had a zest for life,” he said. The excess baggage may make Bill Clinton and Bill Richardson look roguish, but unfortunately, too many cheeseburgers and ice cream sundaes make Mr. Gore look puffy and waxy. “Maybe,” Mr. Brinkley suggested, “Gore can sit in Tennessee and do it via high-definition satellite — like McKinley, just eat and sit on the porch.”

Tuesday, May 22, 2007

When Government Was the Solution

Jean Edward Smith
Campaigning for History
The New York Times
May 21, 2007

For more than a generation, Americans have been told that government is the problem, not the solution. The mantra can be traced back to Barry Goldwater’s presidential bid in 1964. It provided the mind-set for the Reagan administration, and it has come to ultimate fruition during the presidency of George W. Bush.

On college campuses and at think tanks across the country, libertarian scholars stoke the urge to eliminate government from our lives. This thinking has led to the privatization of vital government functions such as the care of disabled veterans, the appointment to regulatory commissions of members at odds with the regulations they are sworn to enforce, the refusal of the Environmental Protection Agency to protect the environment, and the surrender of the government’s management of military operations to profit-seeking contractors.

A look back at Franklin D. Roosevelt’s presidency shows how differently Americans once viewed the government’s role, how much more optimistic they were and how much more they trusted the president.

F.D.R., like his cousin Theodore, saw government in positive terms. In 1912, speaking in Troy, N.Y., F.D.R. warned of the dangers of excessive individualism. The liberty of the individual must be harnessed for the benefit of the community, said Roosevelt. “Don’t call it regulation. People will hold up their hands in horror and say ‘un-American.’ Call it ‘cooperation.’ ”

When F.D.R. took office in 1933, a quarter of the nation’s work force was unemployed. Agriculture was destitute, factories were idle, businesses were closing their doors, and the banking system teetered on the brink of collapse. Violence lay just beneath the surface.

Roosevelt seized the opportunity. He galvanized the nation with an inaugural address that few will ever forget (“The only thing we have to fear is fear itself.”), closed the nation’s banks to restore depositor confidence and initiated a flurry of legislative proposals to put the country back on its feet. Sound banks were quickly reopened, weak ones were consolidated and, despite cries on the left for nationalization, the banking system was preserved.

Roosevelt had no master plan for recovery but responded pragmatically. Some initiatives, such as the Civilian Conservation Corps, which employed young men to reclaim the nation’s natural resources, were pure F.D.R. Others, such as the National Industrial Recovery Act, were Congressionally inspired. But for the first time in American history, government became an active participant in the country’s economic life.

After saving the banks, Roosevelt turned to agriculture. In Iowa, a bushel of corn was selling for less than a package of chewing gum. Crops rotted unharvested in the fields, and 46 percent of the nation’s farms faced foreclosure.

The New Deal responded with acreage allotments, price supports and the Farm Credit Administration. Farm mortgages were refinanced and production credit provided at low interest rates. A network of county agents, established under the Agricultural Adjustment Act, brought soil testing and the latest scientific advances to every county in the country.

The urban housing market was in equal disarray. Almost half of the nation’s homeowners could not make their mortgage payments, and new home construction was at a standstill. Roosevelt responded with the Home Owners’ Loan Corporation. Mortgages were refinanced. Distressed home owners were provided money for taxes and repairs. And new loan criteria, longer amortization periods and low interest rates made home ownership more widely affordable, also for the first time in American history.

The Glass-Steagall Banking Act, passed in 1933, authorized the Federal Reserve to set interest rates and established the Federal Deposit Insurance Corporation to insure individual bank deposits. No measure has had a greater impact on American lives or provided greater security for the average citizen.

The Tennessee Valley Authority, also established in 1933, brought cheap electric power and economic development to one of the most poverty-stricken regions of the country. Rural electrification, which we take for granted today, was virtually unknown when Roosevelt took office. Only about one in 10 American farms had electricity. In Mississippi, fewer than 1 in 100 did. The Rural Electrification Administration, which F.D.R. established by executive order in 1935, brought electric power to the countryside, aided by the construction of massive hydroelectric dams, not only on the Tennessee River system, but on the Columbia, Colorado and Missouri rivers as well.

To combat fraud in the securities industry, Roosevelt oversaw passage of the Truth in Securities Act, and then in 1934 established the Securities and Exchange Commission. As its first head he chose Joseph P. Kennedy. “Set a thief to catch a thief,” he joked afterward.

By overwhelming majorities, Congress passed laws establishing labor’s right to bargain collectively and the authority of the federal government to regulate hours and working conditions and to set minimum wages.

An alphabet soup of public works agencies — the C.W.A. (Civil Works Administration), the W.P.A. (Works Progress Administration) and the P.W.A. (Public Works Administration) — not only provided jobs, but restored the nation’s neglected infrastructure. Between 1933 and 1937, the federal government constructed more than half a million miles of highways and secondary roads, 5,900 schools, 2,500 hospitals, 8,000 parks, 13,000 playgrounds and 1,000 regional airports. Cultural projects employed and stimulated a generation of artists and writers, including such luminaries as Willem de Kooning, Jackson Pollock, John Cheever and Richard Wright.

Roosevelt saw Social Security, enacted in 1935, as the centerpiece of the New Deal. “If our Federal Government was established … ‘to promote the general welfare,’ ” said F.D.R., “it is our plain duty to provide for that security upon which welfare depends.”

For the first time, the government assumed responsibility for unemployment compensation, old-age and survivor benefits, as well as aid to dependent children and the handicapped. At F.D.R.’s insistence, Social Security was self-funding – supported by contributions paid jointly by employers and employees. (In most industrialized countries, the government provides the major funding for pension plans.) “Those taxes are in there,” Roosevelt said later, “so that no damn politician can ever scrap my Social Security program.”

The government’s positive role did not end when the New Deal lost effective control of Congress in 1938. Neither Wendell Willkie, the G.O.P. standard-bearer in 1940, nor Thomas E. Dewey, in 1944 and ’48, advocated turning back the clock.

The G.I. Bill of Rights, adopted unanimously by both houses of Congress in 1944, provided massive government funding for university and vocational training for returning veterans. The G.I. Bill changed the face of higher education by making universities accessible to virtually every American.

The Eisenhower administration continued to see government in positive terms. President Eisenhower added 15 million low-income wage earners to Social Security, and he launched the interstate highway system – which also was self-funding, through additional gasoline taxes. Only the federal government could have organized so vast an undertaking, the benefits of which continue to accrue.

The ideological obsession of the Bush administration to diminish the role of government has served the country badly. But perhaps this government’s demonstrated inability to improve the lives of ordinary Americans will ensure that future efforts to “repeal the New Deal” are not successful.

Fred Thompson's Jerk Ethic

In today's [Sunday, May 20, 2007] Tennessean, columnist Larry Daughtrey -- who may well be the paper's only redeeming feature -- addresses Fred Thompson's famously missing work ethic. Thompson was evidently tagged as lazy even in his high school days! Maybe that's why the slacker party loves him so.

And how many laws does Thompson break by sucking on his Cuban cigars? Guess we'll find out if or when the Law and Order guy actually enters the race.

An excerpt from Daughtrey's column:

Oh, to be in the somewhat sizable shoes of Fred Dalton Thompson.

Ol' Freddie, or Moose, as they remember him in Lawrenceburg, has stumbled yet again into a memorable acting role. This one is about how to run for president by not running for president.

Fred has got that supposedly all-powerful world of the bloggers, at least those of the neo-conservative persuasion, all atwitter by, well, not doing anything at all. It is a role for which he is perfectly suited.

In the Washington world of workaholics, Fred is remembered as a virtual teetotaler. During his eight years in the U.S. Senate, an insertion into the Congressional Record amounted to heavy lifting.

His high school yearbook sized him up early: "The lazier a man is, the more he plans to do tomorrow."

Maybe Fred will get around to running for president. Maybe not. But by not running, he is running as high as second in some polls of a less-than-overwhelming field of Republicans. Some Republicans believe he is Ronald Reagan, arisen.

Sliding back and forth between the fantasy worlds of the silver screen and politics is nothing new for him. Back in 1992, the Gucci loafers, Lincoln Continental and high-dollar lobbying fees of Fred D. Thompson, Esquire, weren't playing too well at the political box office in Tennessee. So, he bought an old red pickup and a pair of $100 boots, tuned up the drawl and beat a Harvard man for the Senate in Big Orange country.

As Reagan said, here he goes again. Read more . . .

[Acknowledgements to Tennessee Guerilla Women.]

Monday, May 21, 2007

"The War On Poverty is over. And the Poor lost..."

American Cities and the Great Divide
The New York Times
May 22, 2007

A public high school teacher in Brooklyn told me recently about a student who didn’t believe that a restaurant tab for four people could come to more than $500. The student shook his head, as if resisting the very idea. He just couldn’t fathom it.

“How much can you eat?” the student asked.

When, at my request, a teacher at a second school raised the same issue with her students, one of the responses was, “Is this a true story?”

A lot of New Yorkers are doing awfully well. There are 8 million residents of New York City, and roughly 700,000 are worth a million dollars or more. The average price of a Manhattan apartment is $1.3 million. The top 25 hedge fund managers average $363 million a year in earnings.

The estimated worth of the mayor, Michael Bloomberg, ranges from $5.5 billion to upwards of $20 billion.

You want a gilded age? This is it. The elite of the Roaring Twenties would be stunned by the wealth of the current era.

Now the flip side, which is the side those public school students are on. One of the city’s five counties, the Bronx, is the poorest urban county in the nation. The number of families in the city’s homeless shelters is the highest it has been in a quarter of a century. Twenty-five percent of all families with children in New York City — that’s 1.5 million New Yorkers — are trying to make it on incomes that are below the poverty threshold established by the federal government.

The streets that are paved with gold for some are covered with ash for many others. There are few better illustrations of the increasingly disturbing divide between rich and poor than New York City.

“I get to walk in both worlds,” said Larry Mandell, the president of the United Way of New York City. “In a given day I might be in a soup kitchen and also in the halls of Fortune 500 companies dealing with the senior executives. I’ve become acutely aware that the lives of those who are well off are not touched at all by contact with the poor. It’s not that people don’t care or don’t want to help. It’s that they have very little awareness of poverty.”

I’d always thought of the United Way as a charitable outfit. But Mr. Mandell has committed his organization to the important task of raising the awareness of Americans and their political leaders to the pressing needs of America’s cities, and especially the long-neglected, poverty-stricken neighborhoods of the inner cities.

It’s a measure of how low the bar has been set for success in America’s cities that New York is thought to be doing well, even though 185,000 of its children ages 5 or younger are poor, and 18,000 are consigned to homeless shelters each night. More than a million New Yorkers get food stamps, and another 700,000 are eligible but not receiving them. That’s a long, long way from a $500 restaurant tab.

Only 50 percent of the city’s high school students graduate in four years. And if you talk to the kids in the poorer neighborhoods, they will tell you that they don’t feel safe. They are worried about violence and gang activity, which in their view is getting worse, not better.

This is what’s going on in the nation’s most successful big city.

Mr. Mandell is upset that urban issues, which in so many cases are related to poverty, have played such a minuscule role in the presidential campaign so far. “People need to become more aware of the issue of poverty,” he said. “It’s discouraging, frankly, to have it barely mentioned at all in the debates.

“It’s true that John Edwards is the one candidate who seems concerned about it, but to actually have the issue come up just briefly in the debates, and not at all in the Republican debate — well, my view is that we have to change that.”

The United Way of New York has issued a white paper on “America’s Urban Agenda” that says, “The greatest single challenge most American cities face lies in the increasing divide between the haves and have-nots.”

There was a time, some decades ago, when urban issues and poverty were important components of presidential campaigns. Now the poor are kept out of sight, which makes it easier to leave them farther and farther behind. We’ve apparently reached a point in our politics when they aren’t even worth mentioning.

America’s Admissions System

The New York Times
May 22, 2007

Harvard is tough to get into. To be admitted to a school like that, students spend years earning good grades, doing community service and working hard to demonstrate their skills. The system has its excesses, but over all it’s good for Harvard and it’s good for the students beginning their climb to opportunity.

The United States is the Harvard of the world. Millions long to get in. Yet has this country set up an admissions system that encourages hard work, responsibility and competition? No. Under our current immigration system, most people get into the U.S. through criminality, nepotism or luck. The current system does almost nothing to encourage good behavior or maximize the nation’s supply of human capital.

Which is why the immigration deal reached in the Senate last week is, on balance, a good thing. It creates a new set of incentives for immigrants and potential immigrants. It encourages good behavior, in the manner of a demanding (though overly harsh) admissions officer. It rewards the bourgeois virtues that have always been at the heart of this nation’s immigrant success, and goes some way to assure that the people who possess these virtues can become U.S. citizens.

Let’s look at how this bill would improve incentives almost every step of the way.

First, consider the 10 to 12 million illegal immigrants who are already here. They now have an incentive to think only in the short term. They have little reason to invest for the future because their presence here could be taken away.

This bill would encourage them to think in the long term. To stay, they would have to embark on a long, 13-year process. They’d have to obey the law, learn English and save money (to pay the stiff fines). Suddenly, these people would be lifted from an underclass environment — semi-separate from mainstream society — and shifted into a middle-class environment, enmeshed within the normal rules and laws that the rest of us live by. This would be the biggest values-shift since welfare reform.

Second, consider the millions living abroad who dream of coming to the U.S. Currently, they have an incentive to find someone who can smuggle them in, and if they get caught they have an incentive to try and try again.

The Senate bill reduces that incentive for lawlessness. If you think it is light on enforcement, read the thing. It would not only beef up enforcement on the border, but would also create an electronic worker registry. People who overstay their welcome could forfeit their chance of being regularized forever.

Moreover, aspiring immigrants would learn, from an early age, what sort of person the U.S. is looking for. In a break from the current system, this bill awards visas on a merit-based points system that rewards education, English proficiency, agricultural work experience, home ownership and other traits. Potential immigrants would understand that the U.S. is looking for people who can be self-sufficient from the start, and they’d mold themselves to demonstrate that ability.

Third, consider the people who are admitted to the U.S. under the bill’s guest-worker program. By forcing these workers to spend a year away after two years of work here, this section encourages them to think of the U.S. as a place to earn some money before building their long-term futures back home. It encourages these young workers to be as flexible as possible, to go wherever the jobs are, so they can maximize earnings during each two-year window.

Nobody can like all aspects of this compromise bill. It has needless complexities and touchback mechanisms. The guest-worker part threatens to set up a permanent and un-American divide between temporary and skilled workers. But, over all, this bill finally gives this meritocratic nation a meritocratic immigration system.

Personally, I’d like to see it go farther. I’d prefer a system in which potential immigrants were admitted on an audition basis. An engineer from China who ran a neighborhood association would get citizenship. A construction worker from Mexico who was promoted to crew chief would get citizenship. This would be a system that rewarded hard work and perseverance as much as it rewarded I.Q. and advanced degrees. People who qualified could bring their nuclear families with them, since families are the foundries of responsible behavior.

In the meantime, this bill is a step. Despite its ramshackle and unforgiving nature, there’s still a little of the spirit of Ben Franklin flickering inside. There is still enough encouragement for the ambitious young striver, desperate to make good.
