Saturday, November 25, 2006

THE WORLD IS AN OIL WELL

Cheney [in that gruff Cheney voice]: “It’s always a pleasure doing business with you, Kingsie…”

King Abdullah: “If those stupid little rubes out there only knew the half of it…Hehehe…”

Cheney: “Hell…If Waxman gets too close to the truth we can always blame everything on Israel…”

King: “Inshallah…!”

“LOOK OUT EVILDOERS, WHEREVER YOU ARE…!”

Waxman has Bush administration in sights

LOS ANGELES - The lawmaker poised to cause the Bush administration's biggest headaches when Democrats take control of Congress may just be a grocer's son from Watts who's hardly a household name off Capitol Hill.

Rep. Henry Waxman has spent the last six years waging a guerrilla campaign against the White House and its corporate allies, launching searing investigations into everything from military contracts to Medicare prices from his perch on the Government Reform Committee.

In January, Waxman becomes committee chairman — and thus the lead congressional hound of an administration many Democrats feel has blundered badly as it expanded the power of the executive branch.

Waxman's biggest challenge as he mulls what to probe?

"The most difficult thing will be to pick and choose," he said.

The choices he makes could help define Bush's legacy.

"There is just no question that life is going to be different for the administration," said Rep. Tom Davis, R-Va., the current committee chairman. "Henry is going to be tough. ... And he's been waiting a long time to be able to do this."[MORE]

From the THEY DON’T CALL IT BANGKOK FOR NOTHING…files

BANGKOK (Reuters) - A Thai Buddhist monk cut off his penis with a machete because he had an erection during meditation and declined to have it reattached, saying he had renounced all earthly cares, a doctor and a newspaper said on Wednesday. MORE

COMMENT: Now that’s carrying ‘spiritual enlightenment’ just a bit too far…

President-elect of Christian Coalition resigns

ORLANDO, Fla. (AP) - The Reverend elected to take over as president of the Christian Coalition of America said he will not assume the role because of differences in philosophy.

The Rev. Joel Hunter, of Longwood's Northland, A Church Distributed, said Wednesday that the national group would not let him expand the organization's agenda beyond opposing abortion and gay marriage.

This is the latest setback for the group founded in 1989 by religious broadcaster the Rev. Pat Robertson. Four states - Georgia, Alabama, Iowa and Ohio - have decided to split from the group over concerns it's changing direction on issues like the minimum wage, the environment and Internet law instead of core issues like abortion and same-sex marriage.

Hunter, who was scheduled to take over the socially conservative political group Jan. 1, said he had hoped to focus on issues such as poverty and the environment.

"These are issues that Jesus would want us to care about," Hunter said.

He resigned Tuesday during an organization board meeting. Hunter said he was not asked to leave.

"They pretty much said, 'These issues are fine, but they're not our issues, that's not our base,'" Hunter said.

A statement issued by the coalition said Hunter resigned because of "differences in philosophy and vision." The board accepted his decision "unanimously," it states.

The organization, headed by President Roberta Combs, claims a mailing list of 2.5 million.

"To tell you the truth, I feel like there are literally millions of evangelical Christians that don't have a home right now," Hunter said.

NO ONE TO LOSE TO

A Civil War?

By MAUREEN DOWD

After the Thanksgiving Day Massacre of Shiites by Sunnis, President Bush should go on Rupert Murdoch’s Fox News and give an interview headlined: “If I did it, here’s how the civil war in Iraq happened.”

He could describe, hypothetically, a series of naïve, arrogant and self-defeating blunders, including his team’s failure to comprehend that in the Arab world, revenge and religious zealotry can be stronger compulsions than democracy and prosperity.

But W. is not yet able to view his actions in subjunctive terms, much less objective ones. Bush family retainers are working to deprogram him, but the president is loath to strip off his delusions of adequacy.

W. declined to tear himself away from his free-range turkey and pumpkin mousse trifle at Camp David and reassure Americans about the deadliest sectarian attack in Baghdad since the U.S. invaded. More than 200 Shiites were killed and hundreds more wounded by car bombs and a mortar attack in Sadr City. October was the bloodiest month yet for civilians, and in the last four months, some 13,000 men, women and children have died.

American helicopters and Iraqi troops did not arrive for two hours after Sunni gunmen began a siege on the Health Ministry controlled by the Shiite cleric Moktada al-Sadr, who has a militia that kills Sunnis and is married to the Maliki government.

Continuing the cycle of revenge yesterday, Shiite militiamen threw kerosene on six Sunnis and set them on fire, as Iraqi soldiers watched, and killed 19 more. The New York Times and other news outlets have been figuring out whether it’s time to break with the administration’s use of euphemisms like “sectarian conflict.” How long can you have an ever-deepening descent without actually reaching civil war?

Some analysts are calling it genocide or clash of civilizations, arguing that civil war is too genteel a term for the butchery that is destroying a nation before our very eyes. Anthony Shadid, The Washington Post reporter who won a Pulitzer Prize for his Iraq coverage, went back recently and described “the final, frenzied maturity of once-inchoate forces unleashed more than three years ago by the invasion. There was civil-war-style sectarian killing, its echoes in Lebanon a generation ago. Alongside it were gangland turf battles over money, power and survival; a raft of political parties and their militias fighting a zero-sum game; a raging insurgency; the collapse of authority; social services a chimera; and no way forward for an Iraqi government ordered to act by Americans who themselves are still seen as the final arbiter and, as a result, still depriving that government of legitimacy. Civil war was perhaps too easy a term, a little too tidy.”

It will be harder to sell Congress on the idea that America’s troops should be in the middle of somebody else’s civil war than to convince them that we need to hang tough in the so-called front line of the so-called war on terror against Al Qaeda.

With Iraq splitting, Tony Snow indulges in the ludicrous exercise of hair-splitting. He said that in past civil wars, “people break up into clearly identifiable feuding sides clashing for supremacy.” In Iraq, “you do have a lot of different forces that are trying to put pressure on the government and trying to undermine it. But it’s not clear that they are operating as a unified force.” But Lebanon was a shambles with multiple factions, and everybody called that a civil war.

Mr. Snow has said this is not a civil war because the fighting is not taking place in every province and because Iraqis voted in free elections. But that’s like saying that the Battle of Gettysburg only took place in one small corner of the country, so there was no real American Civil War. And there were elections during our civil war too. President Lincoln was re-elected months before the war’s end.

The president’s comparison to how Vietnam turned out a generation later, his happy talk that Iraq is going to be fine, is preposterous.

As Neil Sheehan, a former Times reporter in Vietnam who wrote the Pulitzer Prize-winning “A Bright Shining Lie,” told me: “In Vietnam, there were just two sides to the civil war. You had a government in Hanoi with a structure of command and an army and a guerrilla movement that would obey what they were told to do. So you had law and order in Saigon immediately after the war ended. In Iraq, there’s no one like that for us to lose to and then do business with.”

The questions are no longer whether there’s a civil war or whether we can achieve a military victory. The only question is, who can we turn the country over to?

At the moment, that would be no one.

Friday, November 24, 2006

DOUBLE THUMP

By JOHN PODHORETZ

November 24, 2006 -- PRESIDENT Bush contributed a new word to the political lexicon when he called the GOP defeat on Election Day a "thumpin'." Now, two weeks after the election, the full nature of the "thumpin' " is coming through pretty clearly - and it's devastating news for Republicans and conservatives and even more disastrous for Bush.

According to vote-cruncher Jay Cost of Realclearpolitics.com, 54 percent of the ballots in open races were cast for Democrats and 46 percent for Republicans. Between 2004 and '06, the GOP's share of the vote fell an astonishing 10 percentage points.

Cost puts it like this: "Republicans should thus count themselves very lucky. With this kind of vote share prior to 1994, the Democrats would have an 81-member majority, as opposed to the 29-member majority they now enjoy." Only certain structural changes in U.S. politics since 1990 prevented that mega-thumpin'. That is, Republicans in the House were spared a decimation of their ranks by forces beyond their control.

But those forces aren't beyond Democratic control - which should panic Republican politicians. Many of the structural changes that saved them this time can be undone, especially after the census of 2010 leads to new congressional maps - which it appears will be supervised in a majority of the states by legislatures controlled by Dems.

Happy-talkers on the Right initially tried to explain away this enormous gap by pointing to lopsided vote tallies like Hillary Clinton's victory here in New York. That line of thought lasted only as long as it took others to point out that other lopsided vote tallies elsewhere in the country benefited GOP candidates, balancing out any "excessive" Democratic totals.

There's no good news whatever for Republicans in the exit polls or anywhere else. The talk that they suffered at the polls this time because GOP voters were disenchanted by the party? Nonsense: By all accounts, more than 90 percent of Republican voters cast their ballot for GOP candidates, and turnout was high. GOP voters didn't revolt against the Republican Party. Independent and conservative Democrats did.

This is a very big deal, because it discredits or revises the governing voting theory of the Bush years. Karl Rove argued that the number of genuinely independent voters whose ballot choice is up for grabs every year has shrunk almost to nothing - 6 percent to 8 percent. Thus, the best way to win wasn't to appeal to the independents but to wring every last vote out of GOP-aligned folks who might be too busy or too distracted to go to the polls on Election Day.

To make sure the GOP could do that, Rove and the Republican National Committee built an extraordinary national database and a community-outreach system to fire people up in the 72 hours before the election. It worked brilliantly in 2002 and '04, and was pretty effective in '06.

Partly due to Rove's brilliance, participation in the electoral process has been revitalized - and not only on the Right. Democrats figured out how to do the 72-hour thing too this year, neutralizing the GOP advantage.

Even more important, independents are back in force in American politics. The one thing you can say about independents is this: They're independent in large measure because they're repelled by ideological passion. They also tend to know less about politics and to follow it less closely - and are susceptible to hollow pseudo-guarantees to get in there and fix what's broken.

Conservatives will be arguing over the meaning of the defeat and how to change things for the better. But we need to understand a key aspect of the defeat - a cultural aspect.

For decades, Americans whose lives did not revolve around politics believed that Democrats were trying to use politics to revise the rules of society - to force America to "evolve" in a Left-liberal direction.

They didn't like the bossiness implied by this attitude and they were appalled by the unintended consequences of the changes instituted by left-liberals, mainly when it came to confiscatory tax policy and the refusal to maintain social order and safe streets. These consequences were marks of profound incompetence in the management of the country, and the Democrats were punished for it.

But over the past few years, Americans began to get the sense that Republicans had become the party of social revision - that the GOP had allowed its own ideological predilections to run riot, and that a new form of political correctness had overtaken a party that had once seemed more sensible and more in line with their way of thinking.

And, of course, there was and is Iraq. On all sides, partisans are trying to make the case that the election didn't revolve around Iraq. But it did, at least in this sense: Can anyone doubt that if we had won in Iraq in 2005, Republicans would have strengthened their hold on Congress in 2006 rather than losing both Houses? That voters would have rewarded the party of George W. Bush rather than delivering the "thumpin'" of a lifetime?

COMMENT: Just between you & me…

I believe the GOP as we knew it --from Reaganism to Bush the Lesser--is finished. (& just in time too…) The only thing that can possibly artificially extend its lifespan is the Dems focking up again. Therefore? Don’t fock up, Speaker Pelosi. (& Uncle Bernie? Please keep a low profile. Taking the Congress by stealth, for starters, would be a job well-done in the best Trotskyite tradition…“Better they shouldn’t know we’re here…” After the bloodless Revolution, free potato knishes & lean corned beef on rye for everybody…& don’t forget the Dr. Brown’s Cream Soda…)

CRAPITALIST CRIMINALS VIE FOR CULTURAL RESPECTABILITY

Russia Inaugurates Book Prize. It’s Big.

By SOPHIA KISHKOVSKY

MOSCOW, Nov. 22 — A new Russian national book prize that claims to offer the second largest cash award, after the Nobel, was presented for the first time on Wednesday night to Dmitry Bykov, a prolific journalist, novelist and essayist, for his biography, “Boris Pasternak.”

The prize — sponsored by the Russian government and backed by Russian oligarchs who made their fortunes in oil, commodities and banking — is known as Bolshaya Kniga, or Big Book, and came with 3 million rubles, or just over $113,000. The Nobel carries an award of about $1.4 million.

The winning biography of the author of “Doctor Zhivago” matched the prize’s title, weighing in at nearly 900 pages.

Aleksandr Kabakov, famous here for his anti-utopian writings during the final years of Soviet rule, took second prize and 1.5 million rubles, for his novel “Everything Can Be Put Right,” while third place and 1 million rubles went to Mikhail Shishkin, a Russian novelist who lives in Switzerland, for “Maidenhair.”

The winners were announced at a ceremony at the Central House of Writers, the setting for decades of literary intrigue during the Soviet era. Hundreds of people drank Russian sparkling wine and clustered about in the stark hall, which dates to Stalinist times. Most of the audience was dressed quite casually, including T-shirts and sweaters. Fekla Tolstaya, a well-known television personality and great-great-granddaughter of Leo Tolstoy, was the night’s mistress of ceremonies. The plan is to make the Big Book competition an annual event.

“I think this is colossal if this happens next year and the year after that,” Mr. Kabakov said in an interview following the ceremony. “Soviet literature lived from Lenin Prize to Lenin Prize. In essence this can replace that function.”

By Western standards it was an odd literary event. The billionaire oligarchs not only financed the prize, they were members of the new literary academy, or jury, that picked among the 14 finalists. The jury included writers like Eduard Radzinsky, the co-chair of the jury; Alyona Doletskaya, the editor of Russian Vogue; and business leaders like Viktor Vekselberg, the oil and metals tycoon who bought Malcolm Forbes’s Fabergé egg collection.

Aleksandr Mamut, a financier who was a co-presenter of the third prize to Mr. Shishkin, is an owner of a chain of Russian bookstores and recently bought three publishing houses. “Literature and music are what made Russia great in the eyes of the world, more than the atom bomb,” said Pyotr Aven, a banker who presented the prize with Mr. Mamut.

Critics of the prize have described it as an attempt by the oligarchs to get some public relations mileage out of literature.

“They want to show a cultural image, they want to show their significance,” said Aleksandr Shatalov, a literary critic who attended the ceremony, but was not a jury member.

Though the most lucrative, the Big Book prize is not the only literary award in Russia. Mikhail Khodorkovsky, a former oil magnate now in a Siberian jail, had financed a Russian Booker Prize in recent years. He is writing tracts from his cell.

Each year the awarding of the Booker had spurred furious debate about the state of post-Soviet Russian literature. Big Book promises to provide another rich forum for squabbles and hand-wringing.

The state of modern Russian literature was Topic A at a roundtable discussion of Big Book finalists held on the eve of the ceremony at Russian State University for the Humanities. Only a dozen students showed up, but the four finalists who participated grew quite spirited in their comments, fretting about the impact of the Internet and other issues.

Mr. Kabakov, for instance, was troubled by the striking success of Oksana Robski, a popular new novelist who specializes in recreating the lives of Russia’s richest women, including scenes of mind-boggling materialism, bodice-ripping passion and mountains of cocaine.

He pointed, also, to “Dukhless” (or, “Soulless”), the latest best seller to plumb the depths of the moneyed class, depicting the moral disintegration of a young business executive into an orgy of cocaine and casual sex. It reads like “Bright Lights, Big City” Russian style, yet the book has drawn praise from many, something that distressed Mr. Kabakov.

“You can say ‘The Forsyte Saga’ is a book about the middle class, and so is ‘Dukhless,’ ” he said, scoffing. Yet he denied feeling any jealousy for the success of such books. Such writers, he said, are working in “a different profession.”

Thursday, November 23, 2006

When Votes Disappear

By PAUL KRUGMAN

November 24, 2006
Op-Ed Columnist

You know what really had me terrified on Nov. 7? The all-too-real possibility of a highly suspect result. What would we have done if the Republicans had held on to the House by a narrow margin, but circumstantial evidence strongly suggested that a combination of vote suppression and defective — or rigged — electronic voting machines made the difference?

Fortunately, it wasn’t a close election. But the fact that our electoral system worked well enough to register an overwhelming Democratic landslide doesn’t mean that things are O.K. There were many problems with voting in this election — and in at least one Congressional race, the evidence strongly suggests that paperless voting machines failed to count thousands of votes, and that the disappearance of these votes delivered the race to the wrong candidate.

Here’s the background: Florida’s 13th Congressional District is currently represented by Katherine Harris, who as Florida’s secretary of state during the 2000 recount famously acted as a partisan Republican rather than a fair referee. This year Ms. Harris didn’t run for re-election, making an unsuccessful bid for the Senate instead. But according to the official vote count, the Republicans held on to her seat, with Vern Buchanan, the G.O.P. candidate, narrowly defeating Christine Jennings, the Democrat.

The problem is that the official vote count isn’t credible. In much of the 13th District, the voting pattern looks normal. But in Sarasota County, which used touch-screen voting machines made by Election Systems and Software, almost 18,000 voters — nearly 15 percent of those who cast ballots using the machines — supposedly failed to vote for either candidate in the hotly contested Congressional race. That compares with undervote rates ranging from 2.2 to 5.3 percent in neighboring counties.

Reporting by The Herald-Tribune of Sarasota, which interviewed hundreds of voters who called the paper to report problems at the polls, strongly suggests that the huge apparent undervote was caused by bugs in the ES&S software.

About a third of those interviewed by the paper reported that they couldn’t even find the Congressional race on the screen. This could conceivably have been the result of bad ballot design, but many of them insisted that they looked hard for the race. Moreover, more than 60 percent of those interviewed by The Herald-Tribune reported that they did cast a vote in the Congressional race — but that this vote didn’t show up on the ballot summary page they were shown at the end of the voting process.

If there were bugs in the software, the odds are that they threw the election to the wrong candidate. An Orlando Sentinel examination of other votes cast by those who supposedly failed to cast a vote in the Congressional race shows that they strongly favored Democrats, and Mr. Buchanan won the official count by only 369 votes. The fact that Mr. Buchanan won a recount — that is, a recount of the votes the machines happened to record — means nothing.
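
The scale of the problem is easy to check with back-of-the-envelope arithmetic. Here is a minimal sketch in Python, using only the figures reported in this column; the 53/47 split of the missing votes is an assumption for illustration, since the Sentinel reported only that they strongly favored Democrats.

```python
# Back-of-the-envelope check of the Sarasota undervote, using the figures
# reported in the column. The 53/47 Democratic lean of the missing votes is
# an ASSUMPTION for illustration; the Orlando Sentinel reported only that
# the undervoters "strongly favored Democrats."

machine_ballots = 18_000 / 0.15          # ~120,000 ballots cast on the machines
normal_undervote_rate = 0.053            # high end of neighboring counties' 2.2-5.3%
expected_undervotes = machine_ballots * normal_undervote_rate
excess_undervotes = 18_000 - expected_undervotes

assumed_dem_share = 0.53                 # hypothetical lean of the missing votes
net_dem_gain = excess_undervotes * (2 * assumed_dem_share - 1)

print(f"Expected undervotes at 5.3%:   {expected_undervotes:,.0f}")  # ~6,360
print(f"Excess (missing) votes:        {excess_undervotes:,.0f}")    # ~11,640
print(f"Net Dem gain at a 53/47 split: {net_dem_gain:,.0f}")         # ~700
print("Official margin: 369 votes")
```

Even that mild assumed split yields a net Democratic gain of roughly 700 votes, nearly double the official 369-vote margin - which is why a recount limited to the recorded votes settles nothing.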

Although state officials have certified Mr. Buchanan as the victor, they’ve promised an audit of the voting machines. But don’t get your hopes up: as in 2000, state election officials aren’t even trying to look impartial. To oversee the audit, the state has chosen as its “independent” expert Prof. Alec Yasinsac of Florida State University — a Republican partisan who made an appearance on the steps of the Florida Supreme Court during the 2000 recount battle wearing a “Bush Won” sign.

Ms. Jennings has now filed suit with the same court, demanding a new election. She deserves one.

But for the nation as a whole, the important thing isn’t who gets seated to represent Florida’s 13th District. It’s whether the voting disaster there leads to legislation requiring voter verification and a paper trail.

And I have to say that the omens aren’t good. I’ve been shocked at how little national attention the mess in Sarasota has received. Here we have as clear a demonstration as we’re ever likely to see that warnings from computer scientists about the dangers of paperless electronic voting are valid — and most Americans probably haven’t even heard about it.

As far as I can tell, the reason Florida-13 hasn’t become a major national story is that neither control of Congress nor control of the White House is on the line. But do we have to wait for a constitutional crisis to realize that we’re in danger of becoming a digital-age banana republic?

R & B LEGEND RUTH BROWN: FIGHTER FOR MUSICIANS’ RIGHTS


















By JON PARELES
November 18, 2006

Ruth Brown, the gutsy rhythm and blues singer whose career extended to acting and crusading for musicians’ rights, died on Friday in Las Vegas. She was 78 and lived in Las Vegas.

The cause was complications following a heart attack and a stroke she suffered after surgery, and Ms. Brown had been on life support since Oct. 29, said her friend, lawyer and executor, Howell Begle.

“She was one of the original divas,” said the singer Bonnie Raitt, who worked with Ms. Brown and Mr. Begle to improve royalties for rhythm and blues performers. “I can’t really say that I’ve heard anyone that sounds like Ruth, before or after. She was a combination of sass and innocence, and she was extremely funky. She could really put it right on the beat, and the tone of her voice was just mighty. And she had a great heart.”

“What I loved about her,” Ms. Raitt added, “was her combination of vulnerability and resilience and fighting spirit. It was not arrogance, but she was just really not going to lay down and roll over for anyone.”

Ms. Brown sustained a career for six decades: first as a bright, bluesy singer who was called “the girl with a tear in her voice” and then, after some lean years, as the embodiment of an earthy, indomitable black woman. She had a life of hard work, hard luck, determination, audacity and style. Sometimes it was said that R&B stood as much for Ruth Brown as it did for rhythm and blues.

As the 1950s began, Ms. Brown’s singles for the fledgling Atlantic Records — like “(Mama) He Treats Your Daughter Mean” and “5-10-15 Hours” — became both the label’s bankroll and templates for all of rock ’n’ roll. She could sound as if she were hurting, or joyfully lusty, or both at once. Her voice was forthright, feisty and ready for anything.

After Ms. Brown’s string of hits ended, she kept singing but also went on to a career in television, radio and movies (including a memorable role as the disc jockey Motormouth Maybelle in John Waters’s “Hairspray”) and on Broadway, where she won a Tony Award for her part in “Black and Blue.” She worked clubs, concerts and festivals into the 21st century.

“Whatever I have to say, I get it said,” she said in an interview with The New York Times in 1995. “Like the old spirituals say, ‘I’ve gone too far to turn me ’round now.’ ”

Ms. Brown was born Ruth Weston on Jan. 12, 1928, in Portsmouth, Va., the oldest of seven children. She made her debut at 4, when her father, the choir director at the local Emmanuel African Methodist Episcopal Church, lifted her onto the church piano. In summers, she and her siblings picked cotton at her grandmother’s farm in North Carolina. “That made me the strong woman I am,” she said in 1995.

As a teenager, she would tell her family she was going to choir practice and perform instead at U.S.O. clubs at nearby naval stations. She ran away from home at 17, working with a trumpeter named Jimmy Brown and using his last name onstage. She married him, or thought she did; he was already married. But she was making a reputation as Ruth Brown, and the name stuck.

The big-band leader Lucky Millinder heard her in Detroit late in 1946, hired her for his band and fired her in Washington, D.C. Stranded, she managed to find a club engagement at the Crystal Caverns. There, the disc jockey Willis Conover, who broadcast jazz internationally on Voice of America radio, heard Ms. Brown and recommended her to friends at Atlantic Records.

On the way to New York City, however, she was seriously injured in an automobile accident and hospitalized for most of a year; her legs, which were smashed, would be painful for the rest of her life. She stood on crutches in 1949 to record her first session for Atlantic, and the bluesy ballad “So Long” became a hit.

She wanted to keep singing ballads, but Atlantic pushed her to try upbeat songs, and she tore into them. During the sessions for “Teardrops From My Eyes,” her voice cracked upward to a squeal. Herb Abramson of Atlantic Records liked it, called it a “tear,” and after “Teardrops” reached No. 1 on the rhythm and blues chart, the sound became her trademark for a string of hits.

“If I was getting ready to go and record and I had a bad throat, they’d say, ‘Good!’,” she once recalled.

Ms. Brown was the best-selling black female performer of the early 1950s, even though, in that segregated era, many of her songs were picked up and redone by white singers, like Patti Page and Georgia Gibbs, in tamer versions that became pop hits. The pop singer Frankie Laine gave her a lasting nickname: Miss Rhythm.

Working the rhythm and blues circuit in the 1950s, when dozens of her singles reached the R&B Top 10, Ms. Brown drove a Cadillac and had romances with stars like the saxophonist Willis (Gator Tail) Jackson and the singer Clyde McPhatter of the Drifters. (Her first son, Ronald, was given the last name Jackson; decades later, she told him he was actually Mr. McPhatter’s son, and he now sings with a latter-day lineup of the Drifters.)

In 1955 Ms. Brown married Earl Swanson, a saxophonist, and had a second son, Earl; the marriage ended in divorce. Her two sons survive her: Mr. Jackson, who has three children, of Los Angeles, and Mr. Swanson of Las Vegas. She is also survived by four siblings: Delia Weston of Las Vegas, Leonard Weston of Long Island and Alvin and Benjamin Weston of Portsmouth.

Her streak of hits ended soon after the 1960s began. She lived on Long Island, raised her sons, worked as a teacher’s aide and a maid and was married for three years to a police officer, Bill Blunt. On weekends she sang club dates in the New York area, and she recorded an album in 1968 with the Thad Jones-Mel Lewis Big Band. Although her hits had supported Atlantic Records — sometimes called the House That Ruth Built — she was unable at one point to afford a home telephone.

The comedian Redd Foxx, whom she had once helped out of a financial jam, invited her to Los Angeles in 1975 to play the gospel singer Mahalia Jackson in “Selma,” a musical about civil rights he was producing.

She went on to sing in Las Vegas and continued a comeback that never ended. The television producer Norman Lear gave her a role in the sitcom “Hello, Larry.” She returned to New York City in 1982, appearing in Off Broadway productions including “Stagger Lee,” and in 1985 she went to Paris to perform in the revue “Black and Blue,” rejoining it later for its Broadway run.

Ms. Brown began to speak out, onstage and in interviews, about the exploitative contracts musicians of her generation had signed. Many hit-making musicians had not recouped debts to their labels, according to record company accounting, and so were not receiving royalties at all. Shortly before Atlantic held a 40th-birthday concert at Madison Square Garden in 1988, the label agreed to waive unrecouped debts for Ms. Brown and 35 other musicians of her era and to pay 20 years of retroactive royalties.

Atlantic also contributed nearly $2 million to start the Rhythm and Blues Foundation, which pushed other labels toward royalty reform and distributed millions of dollars directly to musicians in need, although it has struggled to sustain itself in recent years.

“Black and Blue” revitalized Ms. Brown’s recording career, on labels including Fantasy and Bullseye Blues. Her 1989 album “Blues on Broadway” won a Grammy Award for best jazz vocal performance, female. She was a radio host on the public radio shows “Harlem Hit Parade” and “BluesStage.” In 1995 she released her autobiography, “Miss Rhythm” (Dutton), written with Andrew Yule; it won the Gleason Award for music journalism. She was inducted into the Rock and Roll Hall of Fame in 1993.

She toured steadily, working concert halls, festivals and cabarets. This year she recorded songs for the coming movie by John Sayles, “Honeydripper,” and was about to fly to Alabama to act in it when she became ill.

Ms. Brown never learned to read music. “In school we had music classes, but I ducked them,” she said in 1995. “They were just a little too slow. I didn’t want to learn to read no note. I knew I could sing it. I woke up one morning and I could sing.”

----

Afternotes:

Background materials on Ruth Brown

Far-left gains in Dutch election


AMSTERDAM -- The Netherlands is facing political uncertainty after the Christian Democrats retained power in the general election but failed to win a majority in parliament. Jan Peter Balkenende, the Prime Minister, won 41 seats in the 150-seat parliament and could not claim a majority even if his Government renewed an alliance with the liberal VVD party.

The opposition Labour party lost 10 seats and now has 32 as voters switched to the far-left Socialists, who almost tripled their representation to 26 seats. The far-right, anti-immigration Party for Freedom won nine seats, a reminder of voters’ concerns about Muslim integration.

The Christian Democrats and Labour differ on many important issues that could mean battles in government over tax, pensions and immigration policy, which would make a coalition unlikely, experts said. (Reuters)

----

Afternotes:

Dutch general election, 2006

Socialist Party (Netherlands)

====

Jack Werber, 92, a Rescuer of Many at Nazi Death Camp, Dies


By DENNIS HEVESI

Jack Werber, a Holocaust survivor who helped save more than 700 children at the Buchenwald slave labor camp in the last months of World War II, then prospered after arriving in the United States by manufacturing coonskin caps during the Davy Crockett craze of the mid-1950’s, died on Saturday. He was 92 and lived in Great Neck, N.Y.

The cause was a heart attack, his son Martin said.

Mr. Werber, a son of a Jewish furrier from the Polish town of Radom, was the barracks clerk at Buchenwald in August 1944 when a train carrying 2,000 prisoners arrived, many of them young boys. By then, with the Russians advancing toward Germany, the number of Nazi guards at the camp had been reduced. Working with the camp’s underground — and with the acquiescence of some guards fearful of their fate after the war — Mr. Werber helped save most of the boys from transport to death camps by hiding them throughout the barracks.

During a trip to Israel in 1999, Mr. Werber was honored for his efforts by Israel Meir Lau, a former chief rabbi of Israel and one of the boys that Mr. Werber saved.

In 1996, with William Helmreich, director of the Center for Jewish Studies at Queens College, Mr. Werber wrote “Saving Children: Diary of a Buchenwald Survivor and Rescuer” (Transaction Books). In it, he wrote, “Suffering a great personal loss drove me in my obsession to save children.”

That loss was the knowledge that his first wife, Rachel, and 3-year-old daughter, Emma, had been killed by the Nazis.

“He heard this from an eyewitness who arrived at the camp,” Professor Helmreich said. “He felt he had nothing to live for.” But soon after, the train bearing the children arrived.

“It’s clear that some Nazi guards knew what the underground was doing,” Professor Helmreich said. “They knew there would be trials and said, ‘Remember that I did this for you.’ ”

Jacob Werber was born to Josef and Faija Werber on Sept. 28, 1914, the youngest of five brothers and three sisters. His oldest brother, Max, 32 years his senior, had already emigrated to the United States. Max and Jacob were the only immediate family members to survive the war.

Mr. Werber was arrested by the Nazis in 1939 and sent to Buchenwald with about 3,200 other men. Of that original contingent, only 11 survived.

In late 1945, while both were searching for relatives, Mr. Werber met Mildred Drezner and soon married her. Besides his wife and son Martin, Mr. Werber is survived by another son, David; six grandchildren; and two great-grandchildren.

Soon after arriving in the United States in 1946, Mr. Werber and a cousin started a company that made novelty items like fur coats for dolls and pompoms for ice skates. By the mid-50s, the Disney television show, starring Fess Parker as Davy Crockett, had little boys all over America clamoring for a coonskin cap.

Mr. Werber’s company was not the only one to seize on the fad. “So many hats were being made that it was hard to get raccoon fur,” Martin Werber said. “Dad came up with an idea: a plastic patch covering the top, sort of like a yarmulke, with the fur around it.”

Martin Werber could not estimate his father’s share of the market, but said, “he sold thousands” — enough to invest in real estate. Eventually, the elder Mr. Werber owned 30 three-family homes and several apartment buildings, most in Queens.

But bad memories did not fade. A photograph, now infamous, emerged after the war, Professor Helmreich pointed out. It shows three prisoners at Buchenwald. Two are hanging by ropes tied to their hands behind their backs, suspended from a tree. A third prisoner is on the ground. It is Mr. Werber, the professor said, “an officer standing over him with a stick under his arm, looking down, a foot jutting into him.”

Lost in the Desert

By Maureen Dowd
The New York Times

Wednesday 22 November 2006

Iraq now evokes that old Jimmy Durante song that goes, "Did you ever have the feeling that you wanted to go and still have the feeling that you wanted to stay?"

It's hard to remember when America has been so stuck. We can't win and we can't leave.

The good news is that the election finished what Katrina started. It dismantled the president's fake reality about Iraq, causing opinions to come gushing forth from all quarters about where to go from here.

The bad news is that no one, and I mean no one, really knows where to go from here. The White House and the Pentagon are ready to shift to Plan B. But Plan B is their empty term for miraculous salvation.

(Dick Cheney and his wormy aides, of course, are still babbling about total victory and completing the mission by raising the stakes and knocking off the mullahs in Tehran. His tombstone will probably say, "Here lies Dick Cheney, still winning.")

Even Henry Kissinger has defected from the Plan A gang. Once he thought the war could work, but now he thinks military victory is out of the question. When he turns against a war, you know the war's in trouble. He also believes leaving quickly would risk a civil war so big it could destabilize the Middle East.

Kofi Annan, who thought the war was crazy, now says that the United States is "trapped in Iraq" and can't leave until the Iraqis can create a "secure environment" - even though the Iraqis evince not the slightest interest in a secure environment. (The death squads even assassinated a popular comedian this week.)

The retired Gen. Anthony Zinni, who thought Mr. Bush's crusade to depose Saddam was foolish and did not want to send in any troops, now thinks we may have to send in more troops so we can eventually get out.

Lt. Gen. Raymond Odierno, whose soldiers pulled Saddam out of his spider hole and who is returning to Iraq to take charge of the day-to-day fight, has given up talking about a Jeffersonian democracy and now wishes only for a government in Iraq that's viewed as legitimate. He has gone from "can do" to "don't know." He talked to The Times's Thom Shanker about his curtailed goals of reducing sectarian violence and restoring civil authority, acknowledging: "Will we attain those? I don't know."

At a Senate hearing last week, Gen. John Abizaid sounded like Goldilocks meets Guernica, asserting two propositions about the war that are logically at war with each other. He said we can't have fewer troops because the Iraqis need us, but we can't have more because we don't want the Iraqis to become dependent on us.

He contended that increasing the number of our troops would make the Iraqi government mad, but also asserted that decreasing the number would intensify sectarian violence.

This is a poor menu of options.

As Peter Beinart wrote in The New Republic this week, "In a particularly cruel twist, the events of recent months have demolished the best arguments both for staying and for leaving." Noting in the same magazine that "we are approaching a Saddam-like magnitude for the murder of innocents," Leon Wieseltier worried that the problem may be deeper than the number of our troops; it may be Iraq itself. "After we invaded Iraq, Iraq invaded itself.... We are at the mercy of Iraq, where there is no mercy."

Kirk Semple, The Times's Baghdad correspondent, wrote about Capt. Stephanie Bagley, the daughter and granddaughter of military policemen who was enthusiastic a year ago about her job of building a new Iraqi police force. But that was before the militia so inexorably began to infiltrate the police, presumably with the support of some leaders in Iraq's dysfunctional government. Now, with the police begging the Americans not to make them patrol Baghdad's mean streets and showing her their shrapnel wounds, she just wants to get her unit home safely, without losing another soldier. She said her orders were to train a local force to deal with crimes like theft and murder, not to teach them how to fight a counterinsurgency.

Aside from telling Israel to be nicer to the Palestinians, as if therein lies Iraq’s salvation, James Baker will mostly try to suggest that the US talk to Iran and Syria. Yesterday, after the Lebanese Industry Minister Pierre Gemayel, an opponent of Syria, was assassinated in Beirut, President Bush said he suspected that Iran and Syria were behind the murder.

Maybe Mr. Baker had better find Plan C.

The Pentagon is trying to decide whether we should Go Big, Go Long or Go Home.

Go figure.

Steam Train Maury, 5-Time Hobo King, Is Dead at 89

By DOUGLAS MARTIN

Steam Train Maury, who started life as Maurice W. Graham until a train whistle’s timeless lament compelled him to hop a freight to freedom and, much later, fame, as the first and only Grand Patriarch of the Hobos, died on Nov. 18 in Napoleon, Ohio, near Toledo.

Mr. Graham was 89 and chief caretaker of the hobo myth, a cornerstone of which is the hobos’ term for death: “taking the westbound.” In his case, that last westbound freight left the yard when he suffered the last of several strokes and slipped into a coma, Phyllis Foos, manager of Walter Funeral Home in Toledo, said.

Mr. Graham wrote a book about his life on “the iron road,” was a founding member of the Hobo Foundation and helped establish the Hobo Museum in Britt, Iowa. At the National Hobo Convention in Britt, he was crowned king five times — in 1973, 1975, 1976, 1978 and 1981 — and, in 2004, was anointed grand patriarch.

No one else has ever been named a hobo patriarch. Mr. Graham also had the title Life King of the Hobos East of the Mississippi.

When itinerant men gathered around stewpots in “hobo jungles” during the Depression and for years afterward, Mr. Graham stirred the pot. He told a wonderful story about a hobo riding Halley’s Comet while brandishing a torch.

He told of characters like the Pennsylvania Kid, who shaved with a piece of glass from a Coke bottle. When The Washington Times asked Mr. Graham in 1989 whether it was true that some hobos used deodorant, he answered:

“It’s a shame, but I don’t know what we can do about it.”

Hobos belong to that part of the American imagination where real history merges with showmanship. Since the Civil War, itinerant men have sneaked free rides on freight trains, and as field hands, loggers and miners they had much to do with building the American West and shaping industry. During the Depression, more than a million desperate people rode the rails in search of work.

They were admired as much as pitied. Steinbeck called hobos “the last free men,” and by the late 19th century, hobos had formed their own tongue-in-cheek union, Tourist Union Local 63. Britt officials offered Local 63 their town for its annual convention in 1900 and were shocked when big-city reporters showed up and did not treat the event as the joke it was intended to be.

By 1933, Britt, by then known as “the hobo town,” decided to capitalize on the unlikely confab. It marketed the convention far and wide, gave away mulligan stew and crowned hobo royalty. The gathering, about 100 miles north of Des Moines, became a four-day affair, drawing tens of thousands.

But now hobos are getting scarce, as boxcars have been sealed and the prosecution of trespassers has tightened. Mr. Graham, who took to showing up at the Britt convention in a camper, said some pretenders were “show-bos, not hobos.”

Mr. Graham was one of the last of the authentic, undisputed, old-time hobos. He gave the crowds what they were looking for, including a flowing white beard, a walking stick decorated with owl feathers, and stories about friends like Frying Pan Jack. He even strove to elevate his itinerant, idiosyncratic ilk, emphasizing that hobos are not bums, winos or reprobates.

“A hobo is a man of the world, who travels to see and observe and then shares those views with others,” he said.

Mr. Graham was born on June 3, 1917, in Atchison, Kan. Because of domestic problems, he was shuffled among parents, an aunt and married siblings. He escaped by hopping a train in 1931, at the age of 14.

He eventually settled down, learned the cement-mason trade and set up a school for masons in Toledo. He was an Army medical technician during World War II.

By 1971, he was a day laborer with a wife, two children and a bad hip that kept him from working much. His hanging around the house was getting on his wife’s nerves, The Los Angeles Times reported in 1989.

So one day in 1971, he hopped a freight on the edge of town with a vague idea he would relive hobo memories and see his wife, Wanda, in a few weeks.

It was 1981 when Mr. Graham finally returned. He had not communicated for more than a decade. Wanda agreed to go out for dinner and talk. (She paid, of course.) He wanted to come home, and she ultimately could not resist his charm.

“It was better than living alone,” she told The Times.

In addition to his wife of 69 years, Mr. Graham is survived by his daughters, Alice Spangler and Karen Carson; five grandchildren; and seven great-grandchildren.

After his return Mr. Graham stayed home, except for trips with his wife to hobo events and visits to people in hospitals and prisons. He lived mainly off Social Security.

In 1990, Mr. Graham and Robert J. Hemming wrote “Tales of the Iron Road: My Life as King of the Hobos.” A review in The Los Angeles Times wondered if it neglected “a darker, hard-drinking, womanizing, gambling side of Graham’s nature” in its emphasis on hobo chivalry.

Mr. Graham returned annually to Britt, where he presided over the yearly gravesite service for hobos interred under a large cross made of railroad ties. The hobos’ ritual is to circle the plot, walking sticks held high over the tombstones.

CLASS STRUGGLE

By Jim Webb
The Wall Street Journal

Wednesday 15 November 2006

The most important - and unfortunately the least debated - issue in politics today is our society's steady drift toward a class-based system, the likes of which we have not seen since the 19th century. America's top tier has grown infinitely richer and more removed over the past 25 years. It is not unfair to say that they are literally living in a different country. Few among them send their children to public schools; fewer still send their loved ones to fight our wars. They own most of our stocks, making the stock market an unreliable indicator of the economic health of working people. The top 1% now takes in an astounding 16% of national income, up from 8% in 1980. The tax codes protect them, just as they protect corporate America, through a vast system of loopholes.

Incestuous corporate boards regularly approve compensation packages for chief executives and others that are out of logic's range. As this newspaper has reported, the average CEO of a sizeable corporation makes more than $10 million a year, while the minimum wage for workers amounts to about $10,000 a year, and has not been raised in nearly a decade. When I graduated from college in the 1960s, the average CEO made 20 times what the average worker made. Today, that CEO makes 400 times as much.
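
The ratios in that paragraph can be checked with simple arithmetic. A minimal sketch follows; the 2,000-hour work year and the 2006 federal minimum wage of $5.15 an hour are assumptions used to reproduce the "about $10,000 a year" figure, and the $25,000 average-worker salary is inferred from the 400-to-1 ratio rather than stated in the column.

```python
# Checking the pay figures cited above. ASSUMPTIONS (not in the column):
# a 2,000-hour work year and the 2006 federal minimum wage of $5.15/hour.

ceo_pay = 10_000_000                  # "more than $10 million a year"
min_wage_annual = 5.15 * 2_000        # ~$10,300 - matches "about $10,000 a year"
avg_worker_pay = ceo_pay / 400        # implied by the 400-to-1 ratio: ~$25,000

print(f"Minimum-wage annual pay: ${min_wage_annual:,.0f}")
print(f"Implied average worker:  ${avg_worker_pay:,.0f}")
print(f"CEO vs. average worker:  {ceo_pay / avg_worker_pay:.0f}x (vs. 20x in the 1960s)")
print(f"CEO vs. minimum wage:    {ceo_pay / min_wage_annual:,.0f}x")
```

The figures are internally consistent; the one number the sketch cannot verify independently is the 400x ratio itself, which is taken as given from the column.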

In the age of globalization and outsourcing, and with a vast underground labor pool from illegal immigration, the average American worker is seeing a different life and a troubling future. Trickle-down economics didn't happen. Despite the vaunted all-time highs of the stock market, wages and salaries are at all-time lows as a percentage of the national wealth. At the same time, medical costs have risen 73% in the last six years alone. Half of that increase comes from wage-earners' pockets rather than from insurance, and 47 million Americans have no medical insurance at all.

Manufacturing jobs are disappearing. Many earned pension programs have collapsed in the wake of corporate "reorganization." And workers' ability to negotiate their futures has been eviscerated by the twin threats of modern corporate America: If they complain too loudly, their jobs might either be outsourced overseas or given to illegal immigrants.

This ever-widening divide is too often ignored or downplayed by its beneficiaries. A sense of entitlement has set in among elites, bordering on hubris. When I raised this issue with corporate leaders during the recent political campaign, I was met repeatedly with denials, and, from some, an overt lack of concern for those who are falling behind.

A troubling arrogance is in the air among the nation's most fortunate. Some shrug off large-scale economic and social dislocations as the inevitable byproducts of the "rough road of capitalism." Others claim that it's the fault of the worker or the public education system, that the average American is simply not up to the international challenge, that our education system fails us, or that our workers have become spoiled by old notions of corporate paternalism.

Still others have gone so far as to argue that these divisions are the natural results of a competitive society. Furthermore, an unspoken insinuation seems to be inundating our national debate: Certain immigrant groups have the "right genetics" and thus are natural entrants to the "overclass," while others, as well as those who come from stock that has been here for 200 years and have not made it to the top, simply don't possess the necessary attributes.

Most Americans reject such notions. But the true challenge is for everyone to understand that the current economic divisions in society are harmful to our future. It should be the first order of business for the new Congress to begin addressing these divisions, and to work to bring true fairness back to economic life. Workers already understand this, as they see stagnant wages and disappearing jobs.

America's elites need to understand this reality in terms of their own self-interest. A recent survey in the Economist warned that globalization was affecting the U.S. differently than other "First World" nations, and that white-collar jobs were in as much danger as the blue-collar positions which have thus far been ravaged by outsourcing and illegal immigration. That survey then warned that "unless a solution is found to sluggish real wages and rising inequality, there is a serious risk of a protectionist backlash" in America that would take us away from what they view to be the "biggest economic stimulus in world history."

More troubling is this: If it remains unchecked, this bifurcation of opportunities and advantages along class lines has the potential to bring a period of political unrest. Up to now, most American workers have simply been worried about their job prospects. Once they understand that there are (and were) clear alternatives to the policies that have dislocated careers and altered futures, they will demand more accountability from the leaders who have failed to protect their interests. The "Wal-Marting" of cheap consumer products brought in from places like China, and the easy money from low-interest home mortgage refinancing, have softened the blows in recent years. But the balance point is tipping in both cases, away from the consumer and away from our national interest.

The politics of the Karl Rove era were designed to distract and divide the very people who would ordinarily be rebelling against the deterioration of their way of life. Working Americans have been repeatedly seduced at the polls by emotional issues such as the predictable mantra of "God, guns, gays, abortion and the flag" while their way of life shifted ineluctably beneath their feet. But this election cycle showed an electorate that intends to hold government leaders accountable for allowing every American a fair opportunity to succeed.

With this new Congress, and heading into an important presidential election in 2008, American workers have a chance to be heard in ways that have eluded them for more than a decade. Nothing is more important for the health of our society than to grant them the validity of their concerns. And our government leaders have no greater duty than to confront the growing unfairness in this age of globalization.

-------

Mr. Webb is the Democratic senator-elect from Virginia.

Wednesday, November 22, 2006

Pace of Global Warming Causes Alarm

'Very different and frightening world' coming faster than expected, scientists warn

by Seth Borenstein

Animal and plant species have begun dying off or changing sooner than predicted because of global warming, a review of hundreds of research studies contends.

These fast-moving adaptations come as a surprise even to biologists and ecologists because they are occurring so rapidly.

At least 70 species of frogs, mostly mountain-dwellers that had nowhere to go to escape the creeping heat, have gone extinct because of climate change, the analysis says. It also reports that between 100 and 200 other cold-dependent animal species, such as penguins and polar bears, are in deep trouble.

"We are finally seeing species going extinct," said University of Texas biologist Camille Parmesan, author of the study. "Now we've got the evidence. It's here. It's real. This is not just biologists' intuition. It's what's happening."

Her review of 866 scientific studies is summed up in the journal Annual Review of Ecology, Evolution and Systematics.

Parmesan reports seeing trends of animal populations moving northward if they can, of species adapting slightly because of climate change, of plants blooming earlier, and of an increase in pests and parasites.

Parmesan and others have been predicting such changes for years, but even she was surprised to find evidence that it's already happening; she expected it would be another decade away.

Just five years ago biologists, though not complacent, believed the harmful biological effects of global warming were much farther down the road, said Douglas Futuyma, professor of ecology and evolution at the State University of New York in Stony Brook.

"I feel as though we are staring crisis in the face," Futuyma said. "It's not just down the road somewhere. It is just hurtling toward us. Anyone who is 10 years old right now is going to be facing a very different and frightening world by the time that they are 50 or 60."

While over the past several years studies have shown problems with certain species, animal populations or geographic areas, Parmesan's is the first comprehensive analysis showing the big picture of global-warming induced changes, said Chris Thomas, a professor of conservation biology at the University of York in England.

It's impossible to prove conclusively that the changes are the result of global warming, but the evidence is so strong and other supportable explanations so lacking, Thomas said, that it is "statistically virtually impossible that these are just chance observations."

The most noticeable changes in plants and animals have to do with earlier springs, Parmesan said. The best example can be seen in earlier cherry blossoms and grape harvests and in 65 British bird species that in general are laying their first eggs nearly nine days earlier than 35 years ago.

Parmesan said she worries most about the cold-adapted species, such as emperor penguins that have dropped from 300 breeding pairs to just nine in the western Antarctic Peninsula, or polar bears, which are dropping in numbers and weight in the Arctic.

The cold-dependent species on mountaintops have nowhere to go, which is why two-thirds of a certain grouping of frog species have already gone extinct, Parmesan said.

Populations of animals that tolerate warmth or can move and live farther north are faring better than other populations of the same species, Parmesan said.

"We are seeing a lot of evolution now," Parmesan said. However, no new gene mutations have shown themselves, not surprising because that could take millions of years, she said.

Lieberman hires neo-con chameleon

By Evan Derkacz
http://www.alternet.org/bloggers/evan/44604/

With the acquisition of former Christian Coalition Legislative Affairs director, Marshall "Bull Moose" Wittman, Lieberman continues his glacial break from the Democratic Party. Maybe he'll talk about it tonight with Hannity & Colmes on Fox?

Wittman, a former cohort of the CC's scandal-plagued leader, Ralph Reed, has made his home at the withering Democratic Leadership Council's PPI think tank and as an adviser to John McCain, who recently began pandering to right wing bigots like Jerry Falwell in his bid for the '08 presidency. Wittman praised the move as "unconventional." Because, you know, pandering to someone you don't agree with in politics doesn't come often.

He abandoned his "Bull Moose" blog with this sentiment:


The great and grand political development of the past year has been the triumph of Independent-Democrat Senator Joe Lieberman. Joe has bravely revived the great tradition of Scoop Jackson that is so critically needed at this time of international challenge and crisis.

For those unfamiliar with Henry "Scoop" Jackson, he was a (proto-neo-)conservative Democrat who vigorously supported the Vietnam War, nuclear arms, Japanese internment during WWII, and a steroidal military in general. In a 2002 profile, the Guardian UK wrote of Jackson: "One man more than any other can credibly claim the intellectual and political credit for the Bush administration's bellicose showdown with Iraq and its muscular new doctrine of pre-emption."

Wittman, like Paul Wolfowitz, Richard Perle (who worked for "Scoop" and retains his Democratic registration in his honor), Elliot Abrams, and Douglas Feith, is a follower of Jackson, as, of course, is Joe Lieberman.

Jackson, Wittman, Lieberman, and the neocons are a tough bunch to pin down in some ways. They are largely social liberals (they generally, at least rhetorically, support some semblance of environmental responsibility, concern for the poor, and equality for people of color), though their foreign policy is riddled with White Man's Burden-style optimism.

In some oafishly narrow sense they seek to liberate the world (parts of it anyway) from tyranny. The problem with this omelet, of course, is all the eggs that have to be cracked along the way. Delivering liberation at the tip of a gun, they've managed to push for Vietnam and much of our misguided Middle East policy, ironically sapping our military and turning perception of America on its head.

Lieberman has strong ties to the Christian Zionist/Conservative Jewish network, including Pat Robertson, Jerry Falwell, John Hagee and the Left Behind players, and so does Wittman, from his days in Robertson's Christian Coalition.

With the hiring of Wittman, Lieberman rounds the homestretch to neo-con-dom. Nothing new here... just the final creaking sounds of a ship going under.

----

Evan Derkacz is an AlterNet editor. He writes and edits PEEK, the blog of blogs.

BUT WHO WILL PARDON THE PARDONER…? (& which one is the real turkey…?)

Turkey Spared After Scare From Barney

By THE ASSOCIATED PRESS

WASHINGTON -- He was going to pardon the National Thanksgiving Turkey anyway, but President Bush figured he really owed the bird this time. His dog had just scared the stuffing out of it.

Bush spared the turkey -- named "Flyer" in an online vote -- during a Rose Garden ceremony on Wednesday. The backup bird, "Fryer," was also pardoned but nowhere to be seen on this raw day.

The president explained that his Scottish Terrier, Barney, got involved this year. The presidential dog typically gets his exercise by chasing a soccer ball around the Rose Garden.

"He came out a little early, as did Flyer," Bush said. "And instead of chasing the soccer ball, he chased the bird. And it kind of made the turkey nervous. See, the turkey was nervous to begin with. Nobody's told him yet about the pardon I'm about to give him."

Bush announced that the birds would be sent off to Disneyland in California to be the honorary grand marshals of a Thanksgiving Day Parade, just like their predecessors a year ago.

At one point, Bush moved in for a closer look at Flyer, a well-behaved bird raised in Missouri. He petted the turkey's head and back before inviting a couple dozen Girl Scouts to come up and join him.

"It's a fine looking bird, isn't it?" Bush said.

The popular pardon ceremony dates to the days of President Harry Truman in 1947.

Yet savoring turkeys, not saving them, is the agenda for millions of people on Thanksgiving Day.

The typical American consumes more than 13 pounds of turkey a year, with a good serving of it coming at Thanksgiving.

People for the Ethical Treatment of Animals urged Bush to send the pardoned turkeys to an animal sanctuary, where "they will get the exercise and socialization that they need to live longer, happier lives."

In return, the group offered Bush a feast of tofu turkey, vegetarian stuffing and a vegan apple pie.

Just back from a trip to Asia, Bush and his wife Laura will spend the holiday at Camp David before another international trip early next week to the Baltics and the Middle East.

The Bushes left the White House early Wednesday afternoon and arrived at the presidential retreat.

The first family's menu for Thanksgiving includes free-range roasted turkey, cornbread dressing, zucchini gratin, whipped maple sweet potatoes, basil chive red potato mash and pumpkin pie.


A Free-for-All on Science and Religion

By GEORGE JOHNSON
The New York Times
November 21, 2006

Maybe the pivotal moment came when Steven Weinberg, a Nobel laureate in physics, warned that “the world needs to wake up from its long nightmare of religious belief,” or when a Nobelist in chemistry, Sir Harold Kroto, called for the John Templeton Foundation to give its next $1.5 million prize for “progress in spiritual discoveries” to an atheist — Richard Dawkins, the Oxford evolutionary biologist whose book “The God Delusion” is a national best-seller.

Or perhaps the turning point occurred at a more solemn moment, when Neil deGrasse Tyson, director of the Hayden Planetarium in New York City and an adviser to the Bush administration on space exploration, hushed the audience with heartbreaking photographs of newborns misshapen by birth defects — testimony, he suggested, that blind nature, not an intelligent overseer, is in control.

Somewhere along the way, a forum this month at the Salk Institute for Biological Studies in La Jolla, Calif., which might have been one more polite dialogue between science and religion, began to resemble the founding convention for a political party built on a single plank: in a world dangerously charged with ideology, science needs to take on an evangelical role, vying with religion as teller of the greatest story ever told.

Carolyn Porco, a senior research scientist at the Space Science Institute in Boulder, Colo., called, half in jest, for the establishment of an alternative church, with Dr. Tyson, whose powerful celebration of scientific discovery had the force and cadence of a good sermon, as its first minister.

She was not entirely kidding. “We should let the success of the religious formula guide us,” Dr. Porco said. “Let’s teach our children from a very young age about the story of the universe and its incredible richness and beauty. It is already so much more glorious and awesome — and even comforting — than anything offered by any scripture or God concept I know.”

She displayed a picture taken by the Cassini spacecraft of Saturn and its glowing rings eclipsing the Sun, revealing in the shadow a barely noticeable speck called Earth.

There has been no shortage of conferences in recent years, commonly organized by the Templeton Foundation, seeking to smooth over the differences between science and religion and ending in a metaphysical draw. Sponsored instead by the Science Network, an educational organization based in California, and underwritten by a San Diego investor, Robert Zeps (who acknowledged his role as a kind of “anti-Templeton”), the La Jolla meeting, “Beyond Belief: Science, Religion, Reason and Survival,” rapidly escalated into an invigorating intellectual free-for-all. (Unedited video of the proceedings will be posted on the Web at tsntv.org.)

A presentation by Joan Roughgarden, a Stanford University biologist, on using biblical metaphor to ease her fellow Christians into accepting evolution (a mutation is “a mustard seed of DNA”) was dismissed by Dr. Dawkins as “bad poetry,” while his own take-no-prisoners approach (religious education is “brainwashing” and “child abuse”) was condemned by the anthropologist Melvin J. Konner, who said he had “not a flicker” of religious faith, as simplistic and uninformed.

After enduring two days of talks in which the Templeton Foundation came under the gun as smudging the line between science and faith, Charles L. Harper Jr., its senior vice president, lashed back, denouncing what he called “pop conflict books” like Dr. Dawkins’s “God Delusion,” as “commercialized ideological scientism” — promoting for profit the philosophy that science has a monopoly on truth.

That brought an angry rejoinder from Richard P. Sloan, a professor of behavioral medicine at Columbia University Medical Center, who said his own book, “Blind Faith: The Unholy Alliance of Religion and Medicine,” was written to counter “garbage research” financed by Templeton on, for example, the healing effects of prayer.

With atheists and agnostics outnumbering the faithful (a few believing scientists, like Francis S. Collins, author of “The Language of God: A Scientist Presents Evidence for Belief,” were invited but could not attend), one speaker after another called on their colleagues to be less timid in challenging teachings about nature based only on scripture and belief. “The core of science is not a mathematical model; it is intellectual honesty,” said Sam Harris, a doctoral student in neuroscience and the author of “The End of Faith: Religion, Terror and the Future of Reason” and “Letter to a Christian Nation.”

“Every religion is making claims about the way the world is,” he said. “These are claims about the divine origin of certain books, about the virgin birth of certain people, about the survival of the human personality after death. These claims purport to be about reality.”

By shying away from questioning people’s deeply felt beliefs, even the skeptics, Mr. Harris said, are providing safe harbor for ideas that are at best mistaken and at worst dangerous. “I don’t know how many more engineers and architects need to fly planes into our buildings before we realize that this is not merely a matter of lack of education or economic despair,” he said.

Dr. Weinberg, who famously wrote toward the end of his 1977 book on cosmology, “The First Three Minutes,” that “the more the universe seems comprehensible, the more it also seems pointless,” went a step further: “Anything that we scientists can do to weaken the hold of religion should be done and may in the end be our greatest contribution to civilization.”

With a rough consensus that the grand stories of evolution by natural selection and the blossoming of the universe from the Big Bang are losing out in the intellectual marketplace, most of the discussion came down to strategy. How can science fight back without appearing to be just one more ideology?

“There are six billion people in the world,” said Francisco J. Ayala, an evolutionary biologist at the University of California, Irvine, and a former Roman Catholic priest. “If we think that we are going to persuade them to live a rational life based on scientific knowledge, we are not only dreaming — it is like believing in the fairy godmother.”

“People need to find meaning and purpose in life,” he said. “I don’t think we want to take that away from them.”

Lawrence M. Krauss, a physicist at Case Western Reserve University known for his staunch opposition to teaching creationism, found himself in the unfamiliar role of playing the moderate. “I think we need to respect people’s philosophical notions unless those notions are wrong,” he said.

“The Earth isn’t 6,000 years old,” he said. “The Kennewick man was not a Umatilla Indian.” But whether there really is some kind of supernatural being — Dr. Krauss said he was a nonbeliever — is a question unanswerable by theology, philosophy or even science. “Science does not make it impossible to believe in God,” Dr. Krauss insisted. “We should recognize that fact and live with it and stop being so pompous about it.”

That was just the kind of accommodating attitude that drove Dr. Dawkins up the wall. “I am utterly fed up with the respect that we — all of us, including the secular among us — are brainwashed into bestowing on religion,” he said. “Children are systematically taught that there is a higher kind of knowledge which comes from faith, which comes from revelation, which comes from scripture, which comes from tradition, and that it is the equal if not the superior of knowledge that comes from real evidence.”

By the third day, the arguments had become so heated that Dr. Konner was reminded of “a den of vipers.”

“With a few notable exceptions,” he said, “the viewpoints have run the gamut from A to B. Should we bash religion with a crowbar or only with a baseball bat?”

His response to Mr. Harris and Dr. Dawkins was scathing. “I think that you and Richard are remarkably apt mirror images of the extremists on the other side,” he said, “and that you generate more fear and hatred of science.”

Dr. Tyson put it more gently. “Persuasion isn’t always ‘Here are the facts — you’re an idiot or you are not,’ ” he said. “I worry that your methods” — he turned toward Dr. Dawkins — “how articulately barbed you can be, end up simply being ineffective, when you have much more power of influence.”

Chastened for a millisecond, Dr. Dawkins replied, “I gratefully accept the rebuke.”

In the end it was Dr. Tyson’s celebration of discovery that stole the show. Scientists may scoff at people who fall back on explanations involving an intelligent designer, he said, but history shows that “the most brilliant people who ever walked this earth were doing the same thing.” When Isaac Newton’s “Principia Mathematica” failed to account for the stability of the solar system — why the planets tugging at one another’s orbits have not collapsed into the Sun — Newton proposed that propping up the mathematical mobile was “an intelligent and powerful being.”

It was left to Pierre Simon Laplace, a century later, to take the next step. Haughtily telling Napoleon that he had no need for the God hypothesis, Laplace extended Newton’s mathematics and opened the way to a purely physical theory.

“What concerns me now is that even if you’re as brilliant as Newton, you reach a point where you start basking in the majesty of God and then your discovery stops — it just stops,” Dr. Tyson said. “You’re no good anymore for advancing that frontier, waiting for somebody else to come behind you who doesn’t have God on the brain and who says: ‘That’s a really cool problem. I want to solve it.’ ”

“Science is a philosophy of discovery; intelligent design is a philosophy of ignorance,” he said. “Something fundamental is going on in people’s minds when they confront things they don’t understand.”

He told of a time, more than a millennium ago, when Baghdad reigned as the intellectual center of the world, a history fossilized in the night sky. The names of the constellations are Greek and Roman, Dr. Tyson said, but two-thirds of the stars have Arabic names. The words “algebra” and “algorithm” are Arabic.

But sometime around 1100, a dark age descended. Mathematics became seen as the work of the devil, as Dr. Tyson put it. “Revelation replaced investigation,” he said, and the intellectual foundation collapsed.

He did not have to say so, but the implication was that maybe a century, maybe a millennium from now, the names of new planets, stars and galaxies might be Chinese. Or there may be no one to name them at all.

Before he left to fly back home to Austin, Dr. Weinberg seemed to soften for a moment, describing religion a bit fondly as a crazy old aunt.

“She tells lies, and she stirs up all sorts of mischief and she’s getting on, and she may not have that much life left in her, but she was beautiful once,” he lamented. “When she’s gone, we may miss her.”

Dr. Dawkins wasn’t buying it. “I won't miss her at all,” he said. “Not a scrap. Not a smidgen.”

Tuesday, November 21, 2006

Military Documents Hold Tips on Antiwar Activities

By ERIC LICHTBLAU and MARK MAZZETTI

WASHINGTON, Nov. 20 — An antiterrorist database used by the Defense Department in an effort to prevent attacks against military installations included intelligence tips about antiwar planning meetings held at churches, libraries, college campuses and other locations, newly disclosed documents show.

One tip in the database in February 2005, for instance, noted that “a church service for peace” would be held in the New York City area the next month. Another entry noted that antiwar protesters would be holding “nonviolence training” sessions at unidentified churches in Brooklyn and Manhattan.

The Defense Department tightened its procedures earlier this year to ensure that only material related to actual terrorist threats — and not peaceable First Amendment activity — was included in the database.

The head of the office that runs the military database, which is known as Talon, said Monday that material on antiwar protests should not have been collected in the first place.

“I don’t want it, we shouldn’t have had it, not interested in it,” said Daniel J. Baur, the acting director of the counterintelligence field activity unit, which runs the Talon program at the Defense Department. “I don’t want to deal with it.”

Mr. Baur said that those operating the database had misinterpreted their mandate and that what was intended as an antiterrorist database became, in some respects, a catch-all for leads on possible disruptions and threats against military installations in the United States, including protests against the military presence in Iraq.

“I don’t think the policy was as clear as it could have been,” he said. Once the problem was discovered, he said, “we fixed it,” and more than 180 entries in the database related to war protests were deleted from the system last year. Of the 13,000 entries in the database, many of them uncorroborated leads on possible terrorist threats, several thousand others were also purged because, he said, they had “no continuing relevance.”

Amid public controversy over the database, leads from so-called neighborhood watch programs and other tips about possible threats are down significantly this year, Mr. Baur said. While the system had been tightened, he said he was concerned that the public scrutiny had created “a huge chilling effect” that could lead the military to miss legitimate terrorist threats.

Mr. Baur was responding to the latest batch of documents produced by the military under a Freedom of Information Act request brought by the American Civil Liberties Union and other groups. The A.C.L.U. planned to release the documents publicly on Tuesday, and officials with the group said they would push for Democrats, newly empowered in Congress, to hold formal hearings about the Talon database.

Ben Wizner, a lawyer for the A.C.L.U. in New York, said the new documents suggested that the military’s efforts to glean intelligence on protesters went beyond what was previously known. If intelligence officials “are going to be doing investigations or monitoring in a place where people gather to worship or to study, they should have a pretty clear indication that a crime has occurred,” Mr. Wizner added.

The leader of one antiwar group mentioned repeatedly in the latest military documents provided to the A.C.L.U. said he was skeptical that the military had ended its collection of material on war protests.

“I don’t believe it,” said the leader, Michael T. McPhearson, a former Army captain who is the executive director of Veterans for Peace, a group in St. Louis.

Mr. McPhearson said he found the references to his group in the Talon database disappointing but not altogether surprising, and he said the group continued to use public settings and the Internet to plan its protests.

“We don’t have anything to hide,” he said. “We’re not doing anything illegal.”

The latest Talon documents showed that the military used a variety of sources to collect intelligence leads on antiwar protests, including an agent in the Department of Homeland Security, Google searches on the Internet and e-mail messages forwarded by apparent informants with ties to protest groups.

In most cases, entries in the Talon database acknowledged that there was no specific evidence indicating the possibility of terrorism or disruptions at the antiwar events, but they warned of the potential for violence.

One entry on Mr. McPhearson’s group from April 2005, for instance, described a protest at New Mexico State University in Las Cruces at which members handed out antimilitary literature and set up hundreds of white crosses to symbolize soldiers killed in Iraq.

“Veterans for Peace is a peaceful organization,” the entry said, but added there was potential that future protests “could become violent.”

Robert Altman (1925-2006)

Outstanding auteur of modern cinema.

Milton Friedman 1912-2006: “Free market” architect of social reaction

By Nick Beams
21 November 2006

In his afterword to the second edition of Capital in 1873, Karl Marx noted that the scientific character of bourgeois economics had come to an end about 1830. At that point the class tensions generated by the development of the capitalist mode of production itself made further advances impossible. “In place of disinterested inquirers there now stepped forward hired prize-fighters; in place of genuine scientific research, the bad conscience and evil intent of apologetics.”

The economist Milton Friedman, who died last Thursday aged 94, will be remembered in years to come as one of the classic representatives of this tendency. Indeed his own career, culminating in his rise to the position of intellectual godfather of the “free market” over the past four decades, is a graphic example of the very processes to which Marx had pointed.

In the post-war boom, now looked back on as a kind of “golden age” for capitalism, at least in the major economies, Friedman was very much on the margins of bourgeois economics. When this writer began a university study of economics in the latter half of the 1960s, Friedman and the free market Chicago School in which he was a central figure were regarded as eccentrics, if not oddities. This was the heyday of Keynesianism, based on the notion that regulation of “effective demand” by government policies—increased spending in times of recession, cutbacks in periods of economic growth and expansion—could prevent the re-emergence of the kind of crisis that had devastated world capitalism in the 1930s.

All that was about to change. The breakdown of the post-war economic boom in the early 1970s, bringing deep recession as well as rapid inflation and high unemployment, saw the collapse of the Keynesian prescriptions. Under the Keynesian program, inflation was regarded as the antidote to unemployment. Now the two were taking place in combination—giving rise to the phenomenon of “stagflation”.

The boom’s demise was not the product of the “failure” of Keynesianism. Rather it was caused by the re-emergence of deep-seated contradictions within the capitalist economy. This meant that the bourgeoisie in the major capitalist countries could no longer continue with the program of class compromise based on concessions to the working class—the pursuit of full employment and the provision of social welfare measures that had characterised the boom—but had to undertake a sharp turn.

Friedman provided the ideological justification for the new orientation: the denunciation of government intervention as the cause of the crisis and insistence on a return to the principles of the “free market” which had been so discredited in the 1930s. Less than a decade after the collapse of the boom, Friedman’s “eccentric” theories had become the new orthodoxy and Keynesianism the new heresy.

In October 1976, the Swedish Academy in Stockholm, sensing the shift in the winds, awarded Friedman the Nobel Prize for economics. One month before, in a major speech to the British Labour Party conference, prime minister James Callaghan summed up what was to become the new conventional wisdom and its implications for government policy.

“We used to think that you could spend your way out of a recession and increase employment by cutting taxes and boosting government spending. I tell you in all candour that that option no longer exists, and in so far as it ever did exist, it only worked on each occasion since the war by injecting a bigger dose of inflation into the economy, followed by a higher level of unemployment as the next step.”


The Great Depression

Milton Friedman was born in Brooklyn, New York, the fourth child of immigrants from central Europe. He later wrote that while the family income was “small and highly uncertain” and financial crisis was a constant companion, there was always enough to eat, and the family atmosphere was warm and supportive.

After graduating from high school before his sixteenth birthday, Friedman won a scholarship to study at Rutgers University, New Jersey. He initially planned to become an actuary and studied mathematics, but his interest in economics grew under the impact of the Great Depression. Graduating in both mathematics and economics in 1932, he gained a master’s degree from the University of Chicago 12 months later. Friedman initially obtained a government job at the National Resources Committee—a creation of Roosevelt’s New Deal—and then joined the National Bureau of Economic Research. When the war began he was involved in the development of federal tax policy and is credited with developing the federal withholding tax, which forms the basis of the pay-as-you-go system.

After receiving his doctorate from Columbia University in 1946, Friedman returned to the University of Chicago to teach economic theory. He remained there until his retirement in 1976, the head of what had become known as the Chicago School of economics, based on the free market and an insistence on the importance of the quantity of money in determining the business cycle.

Friedman was active in Republican policy circles. In 1964 he served as an informal adviser to the presidential candidate and standard-bearer for the Republican right wing, Barry Goldwater, and was an adviser to both Richard Nixon in 1968 and Ronald Reagan in 1980. When Reagan won office, Friedman served as a member of his Economic Policy Advisory Board and in 1988 received the Presidential Medal of Freedom. In 2002, President George W. Bush honoured him for “lifetime achievements” and hailed him as a “hero of freedom” at a White House function on the occasion of his 90th birthday.

Friedman’s work on economic theory was guided by an adherence to what is known as the quantity theory of money. Friedman used this theory, which has a long history going back to the Scottish philosopher David Hume, to formulate his opposition to the Keynesian perspective of demand management and government intervention. According to Friedman, if too much money were created by the monetary authorities, prices would increase—inflation, he insisted, was always a monetary phenomenon. The task of government, he claimed, was not to regulate the economy through spending, but to ensure a sufficient expansion of the money supply to account for natural economic growth, and to allow the market to solve the problems of unemployment and recession.
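For readers unfamiliar with the theory, its core relation is the equation of exchange (a standard textbook formulation, added here as a gloss rather than drawn from the obituary itself):

MV = PQ

where M is the stock of money, V its velocity of circulation, P the price level and Q real output. Friedman held that V is relatively stable and that Q grows at a “natural” rate set by real factors, so on his account a sustained rise in P, that is, inflation, can only be produced by excessive growth in M; hence his prescription of a steady, rule-bound expansion of the money supply.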

However, if the Keynesians were to be refuted, Friedman saw that it was essential that the battle take place on their ground, with historical and statistical analyses. This was the background to his major theoretical work, “A Monetary History of the United States, 1867-1960”, written jointly with Anna Schwartz and published in 1963. Through an examination of economic history, Friedman and Schwartz sought to reveal the crucial role of the supply of money in determining the level of economic activity and, in doing so, to establish the necessary guidelines for future policy.

In its statement announcing the awarding of the Nobel prize to Friedman, the Swedish Academy placed special emphasis on this work. “Most outstanding,” the citation read, “is, perhaps, his original and energetically pursued study of the strategic role played by the policy of the Federal Reserve System in sparking off the 1929 crisis, and in deepening and prolonging the depression that followed.”

But it is through an examination of the 1930s depression—the most important economic event of the twentieth century—that the theoretical bankruptcy of Friedman’s work stands most clearly revealed. According to Friedman, what would have been a normal recession in 1929-30 was transformed into an economic disaster by a series of policy mistakes made by the Federal Reserve, the body responsible for regulating the money supply.

In the first instance, he maintained, the Federal Reserve had wrongly started to tighten monetary policy in the spring of 1928, continuing until the stock market crash of October 1929 under conditions that were not conducive to tighter money—the economy had only just started to move out of the previous business cycle trough in 1927, commodity prices were falling and there was no sign of inflation. The Federal Reserve, however, considered it necessary to rein in the speculative use of credit on the stock market.

In Friedman’s view, however, the most significant impact of the Federal Reserve’s policies was not in sparking the depression but in bringing about the collapse of 1931-32. As banks were going into liquidation, the Federal Reserve, instead of expanding credit and stabilising the financial system, cut the money supply and exacerbated the crisis. Altogether, he and Schwartz found that the money supply in the US contracted by one third between 1929 and 1933. As critics of Friedman have pointed out, this fall was as much a product of the contraction in economic activity as an active cause.


Human “freedom”

Notwithstanding such objections, Friedman’s analysis served important political purposes—it transferred attention from the failures of capitalism and its free market to the role of governments. As Friedman expounded in an interview with Radio Australia in July 1998, the Great Depression was not a “result of the failure of the market system as was widely interpreted” but was “instead a consequence of a very serious government failure, in particular a failure in the monetary authorities to do what they’d initially been set up to do” and prevent banking panics.

The obvious question then was: why did the Federal Reserve fail to prevent a collapse? According to Friedman, the board of the New York Federal Reserve was wracked by a series of conflicts following the death of its powerful governor, Benjamin Strong. These prevented the implementation of correct policy.

“The fact that bad monetary policy was carried out,” he explained in a television interview for the PBS series “The First Measured Century”, “was, in part, the result of a real accident, which was that the dominant figure in the Federal Reserve System, Benjamin Strong ... had died in 1928. It is my considered opinion that if he had lived two or three more years, you might very well not have had a Great Depression.”

Such were the absurd lengths to which Friedman was prepared to go in order to prevent any critical examination of the role of capitalism and the “free market” in bringing about the greatest economic collapse in history. What was perhaps even more absurd was that his analysis was taken seriously in academic circles, which launched a search to discover Strong’s real views and whether he would have acted differently.

Friedman’s ascent to the ranks of “leading economists” had little to do with the intellectual and scientific value of his work. Rather, it was the result of his continuing efforts to extol the virtues of the free market and private property in opposition to the prevailing orthodoxy. Consequently, when the post-war compromise ended, and new prize-fighters were required, he was installed as chief propagandist for a new, socially regressive era based on the unfettered accumulation of wealth by a tiny minority ... all in the name of human “freedom”.

The basis of Friedman’s ideology was the conception that human freedom was inseparable from the unfettered operation of the market and the system of private property. Moreover, the market was not a particular social formation arising at a definite point in the history of human society but had a timeless quality. Just as the ruling classes in feudal times had the priests on hand to assure them that their place in the hierarchy was God-given, so Friedman assured the ruling classes of the present day that the social system which showered wealth and privileges upon them was rooted in the very nature of human social organisation itself.

In his book Capitalism and Freedom, published in 1962, he wrote: “Historical evidence speaks with a single voice on the relation between political freedom and the free market.” Expanding on this theme in a lecture delivered in 1991, he went on to identify the market with all forms of human social interaction.

“A free private market,” he wrote, “is a mechanism for achieving voluntary co-operation among people. It applies to any human activity, not simply to economic transactions. We are speaking a language. Where did that language come from? Did some government construct the language and instruct people to use it? Was there some commission that developed the rules of grammar? No, the language we speak developed through a free private market.”

Friedman’s attempt to turn the development of language, and by implication every human activity, into a market phenomenon collapses upon even the most preliminary analysis. The free market presupposes the existence of separate individuals who exchange the products of their private labour. In language, however, people do not exchange their private creations. In order to understand and in turn be understood, the individual must learn the language that has already been developed by socialised humanity. Friedman’s assertion makes about as much sense as would a claim that individual atoms engage in a “market transaction” when they “exchange” electrons to form a compound.


The Chile “experiment”

If Friedman’s free market dogmas had no scientific content, they were nonetheless extremely valuable in the service of definite class interests, as the experience of Chile was to graphically demonstrate.

In 1975, following the overthrow of the elected Allende government in a military coup on September 11, 1973, the head of the junta, Augusto Pinochet, called on Friedman and his “Chicago boys”—economists trained under his tutelage—to reorganise the Chilean economy.

Under the direct guidance of Friedman and his followers, Pinochet set out to implement a “free market” program based on deregulation of the economy and privatization. He abolished the minimum wage, rescinded trade union rights, privatised the pension system, state industries and banks, and lowered taxes on incomes and profits.

The result was a social disaster for the mass of the Chilean population. Unemployment rose from just over 9 percent in 1974 to almost 19 percent in 1975. Output fell by 12.9 percent in the same period—a contraction comparable to that experienced by the United States in the 1930s.

After 1977, the Chilean economy enjoyed something of a recovery, with the growth rate reaching 8 percent. Ronald Reagan proclaimed Chile as a “model” for Third World development, while Friedman claimed that the “Chile experiment” was “comparable to the economic miracle of post-war Germany.” In 1982 he heaped praise on the dictator Pinochet who, he declared, “has supported a fully free-market economy as a matter of principle. Chile is an economic miracle.”

But the recovery was short-lived. In 1983 the economy was devastated, with unemployment rising, at one point, to 34.6 percent. Manufacturing production contracted by 28 percent. Between 1982 and 1983, gross domestic product contracted by 19 percent. Rather than bringing freedom, the free market resulted in the accumulation of vast wealth at one pole and poverty and misery at the other. In 1970, 20 percent of Chile’s population had lived in poverty. By 1990, the last year of the military dictatorship, this had doubled to 40 percent. At the same time, real wages had declined by more than 40 percent. The wealthy, however, were getting wealthier. In 1970 the top one-fifth of the population controlled 45 percent of the wealth compared to 7.6 percent by the bottom one-fifth. By 1989, the proportions were 55 percent and 4.4 percent respectively.
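To keep the distributional figures straight, they can be set side by side (all numbers are those cited in the paragraph above; the wealth shares are for 1970 and 1989, the poverty rates for 1970 and 1990):

                                    1970      1989-90
Top fifth's share of wealth         45%       55%
Bottom fifth's share of wealth      7.6%      4.4%
Share of population in poverty      20%       40%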

The Chilean experience was no isolated event. It was simply the first demonstration of the fact that, far from bringing human freedom, the unleashing of the capitalist free market could only take place through the organised violence of the state.

In the United States, the monetarist free market program implemented during the Reagan administration was accompanied by the destruction of the trade unions, starting with the smashing of the air traffic controllers’ union, PATCO, in 1981. As Federal Reserve Board chairman Paul Volcker was later to remark: “The most important single action of the administration in helping the anti-inflation fight was defeating the air traffic controllers’ strike.”

Likewise in Britain, the Thatcherite economic counter-revolution, based on the ideas of Friedman and one of his most influential mentors, Friedrich Hayek, led directly to the smashing of the miners’ union through a massive intervention by the police and other state forces in the year-long strike of 1984-85.

Elsewhere the same processes were at work—notably in Australia, where the program of privatization, deregulation and the free market saw state-organised suppression of the workers’ movement, all carried out by the Hawke-Keating Labor governments between 1983 and 1996.

As Friedman went to his grave, the plaudits filled the air. Bush hailed him as “a revolutionary thinker and extraordinary economist whose work helped advance human dignity and human freedom.” Margaret Thatcher praised his revival of the “economics of liberty” and described him as an “intellectual freedom fighter”. US treasury secretary Henry Paulson said he would always be counted “among the greatest economists.” The New York Times obituary described Friedman as a “giant of economics” for whom criticism of his actions in Chile was “just a bump in the road.” Australian prime minister John Howard called him “a towering figure of world economic theory” while an editorial in Rupert Murdoch’s newspaper the Australian called him “liberty’s champion”.

And so it went on. Nothing, it seems, gratifies the rich and powerful so much as the justification of their elevated position in terms of freedom and liberty. In the coming period, however, under changed social conditions and in different political circumstances, the name Milton Friedman will evoke a very different response.