Thursday, November 27, 2008

Thanksgiving, dammit

Once again Thanksgiving has come and once again we are treated to the historical revisionism that has become as traditional as turkey and cranberry sauce. In place of the happy talk mythologies of peace, love, and harmony we were spoon-fed as children we find people snarling out dark tales of murder and mayhem by the bloodthirsty "Pilgrims."

One such offering is found at The Mahatma X Files and as much as I admire James and regard him as a compatriot, on this I have to call bullshit.

Truth be told, I almost let this pass with just a brief note of dissent because I have long since gotten tired of the annual vituperation I receive from various quarters when I try to bring some sanity to this discussion - but I decided I couldn't. The fact is, the passages James quotes display a truly colossal level of historical ignorance matched to a transparent bias that does more to undermine the well-hidden valid argument about the treatment of native Americans over the years than it does to advance it.

Headlined, as such things usually are, as "The Real Story of Thanksgiving," it (including the sources cited) improperly conflates all the native peoples of the area, refers to incidents sixteen years apart involving different natives as if nothing could have changed in the interim, distorts the historical record, and frankly manufactures claims out of thin air.

Let's start with the only contemporaneous description of the so-called "first Thanksgiving." It was contained in a letter written by, it is believed, Edward Winslow (although no name is actually attached). It's dated December 11, 1621 and thus would have been written shortly after the actual event. It was published in 1622 in a book commonly called Mourt's Relation.
Our harvest being gotten in, our governor sent four men on fowling, that so we might after a special manner rejoice together after we had gathered the fruits of our labors. They four in one day killed as much fowl as, with a little help beside, served the company almost a week. At which time, amongst other recreations, we exercised our arms, many of the Indians coming amongst us, and among the rest their greatest king Massasoit, with some ninety men, whom for three days we entertained and feasted, and they went out and killed five deer, which they brought to the plantation and bestowed on our governor, and upon the captain and others. And though it be not always so plentiful as it was at this time with us, yet by the goodness of God, we are so far from want that we often wish you partakers of our plenty.
The only other near-contemporaneous account comes from William Bradford who wrote about it in his journal (published as Of Plymouth Plantation, 1620-1647) probably some 10 or 12 years later. Even there he just sort of brushes by it, endorsing Winslow by referring to "not feigned but true reports."
They now began to gather in the small harvest they had, and to fit up their houses against the winter, being all well recovered in health and strength and had all things in good plenty. For as some were thus employed in affairs abroad, others were exercised in fishing, about cod and bass and other fish, of which they took in good store, of which every family had its portion. All the summer there was no want; and now began to come in store of fowl, as winter approached, of which this place did abound when they came first (but afterward decreased by degrees). And besides waterfowl there was great store of wild turkeys, of which they took many, besides venison, etc. Besides they had about a peck of meal a week to a person, or now since harvest, Indian corn to the proportion. Which made many afterwards write so large of their plenty here to their friends in England, which were not feigned but true reports.
That's pretty much it; that's pretty much what the entire mythology was built on.

A few other points: No time is given for the celebration except it being post-harvest, which likely would have made it around late September or October. This was not a "thanksgiving," which was to them a religious occasion, a holy day, one set aside as occasion arose to thank God for some special and unexpected blessing. It was, rather, a very traditional, very secular, harvest feast: It was traditional among English that if you had a good harvest, you had a feast to which you invited neighbors and workers who had been helpful to you over the year. Other than the deer and fowl mentioned in Winslow's account, it's uncertain what they ate at the feast, although some reasonable guesses can be made.

Okay. With that in mind, on to James' post. My first correction may be nit-picky, but I decided to include it.
Only two written accounts of the three-day event exist, and one of them, by Governor William Bradford, was written 20 years after the fact.
As I noted, Bradford doesn't really describe the event. And by his own account he started writing his history in 1630, just nine years later. He started with the congregation's escape from England in 1608 and, in that same year of writing (1630), got as far as the arrival in Plymouth in December 1620. He said he finished the rest "in pieces" over the ensuing years. It seems very unlikely it took him another 11 years to get to the following fall in the story.

James cites another source as saying - and I have to quote the whole passage - this:
'Thanksgiving' did not begin as a great loving relationship between the pilgrims and the Wampanoag, Pequot and Narragansett people. In fact, in October of 1621 when the pilgrim survivors of their first winter in Turtle Island sat down to share the first unofficial 'Thanksgiving' meal, the Indians who were there were not even invited! There was no turkey, squash, cranberry sauce or pumpkin pie. A few days before this alleged feast took place, a company of 'pilgrims' led by Miles Standish actively sought the head of a local Indian chief, and an 11 foot high wall was erected around the entire Plymouth settlement for the very purpose of keeping Indians out!
It's hard to believe that many historical errors could fit in one paragraph.

First, no native people of the time would have called themselves "Wampanoag." That is a native word meaning roughly “people of the east” or “people of the dawn” and could refer to anyone living to the east of you. While it has been adopted as a modern, general term for the natives of eastern Rhode Island and Massachusetts (including Cape Cod), for a native of the time to have called themselves “Wampanoag” would have been nonsensical; it would have meant something like “I live to the east of where I live.”

Second, it is true enough that the childhood mythology of the "Pilgrims" threw all of the native peoples into one pile - but why is this author doing it, and doing it in a way that distorts the history? Massasoit's people, the Pequot, and the Narragansett no more regarded themselves as one people than English, French, and Spanish did despite their common identification as "Europeans."

Third, the "not even invited" business is nonsense, one in which James himself joins:
Was Chief Massasoit invited to bring 90 Indians with him to dine with 52 colonists, most of them women and children? This seems unlikely. ... It is much more likely that Chief Massasoit either crashed the party, or brought enough men to ensure that he was not kidnapped or harmed by the Pilgrims.
Another nit-pick: It was actually 53 and whether "most" were women and children is a judgment call: There were 22 grown men and nine adolescent boys ranging in age from about 12 to 18.

In any event, what seems far more unlikely is that Massasoit would have shown up unannounced to an event which, if uninvited, he would have no reason to even know of. There is no indication in the account that the settlers were surprised or disturbed by his arrival and in fact inviting him would hardly seem out of place: Remember that inviting to the feast those who had helped you was part of the tradition and certainly the natives had done that.

What's more, the claim that he was defending against the possibility of being "kidnapped or harmed" is built on pure vapor with no basis in fact whatsoever. In actual fact, the very next thing Edward Winslow says after his quote above is this:
We have found the natives very faithful in the covenant of peace with us, very loving and ready to pleasure us. We often go to them, and they come to us; some of us have been fifty miles by land in the country with them....
There is absolutely no basis in the historical record to suggest that Massasoit, whom the settlers regarded as an ally and with whom "we entertained and feasted," had any reason to fear being "kidnapped or harmed."

(Oh, and by the way, another nit-picky correction: The settlers would not have called themselves "pilgrims," with or without the capital P. That entire notion is built on a single phrase from Bradford, who in describing the original voyage from England, says "they knew they were pilgrims." It appears nowhere else in any even near-contemporaneous account.)

Next, as I noted above, outside of fowl and deer there is no direct knowledge of what was actually eaten. Which means the statement "there was no turkey, squash, cranberry sauce or pumpkin pie" is, again, built on nothing. There undoubtedly was no cranberry sauce (which didn't appear until much later) and very likely no cranberries (known as "fenberries" at the time, they were thought too bitter) and no pumpkin pie as we would know it. But there could well have been a "pie" made from pumpkins, there likely was squash (a normal feature of household gardens), and there very likely was turkey - note that Bradford mentions "great store of wild turkeys, of which they took many." But again, no one knows for sure - which also means no one can say they didn't have turkey or squash (or even a type of pie with pumpkins).

The next sentence in the quoted passage contains so many distortions and boners that I have to repeat it here for reference:
A few days before this alleged feast took place, a company of 'pilgrims' led by Miles Standish actively sought the head of a local Indian chief, and an 11 foot high wall was erected around the entire Plymouth settlement for the very purpose of keeping Indians out!
First, there is nothing "alleged" about the feast. It is a documented historical fact.

Next, the only possible incident to which this could refer did not take place a few days before but a few months before, in the latter part of June. The passage also omits the cause of the incident: The settlers had been told that Massasoit, with whom they had what amounted to a mutual defense treaty, had been "put from his country" by the Narragansett, and that their interpreter Squanto (or Tisquantum) had been murdered by a sachem ("chief" is an Anglo term) under Massasoit named Corbitant - who was at Namasket (the nearest native town to Plymouth) preaching insurrection against Massasoit. So they were acting both in their own interest and in defense of Massasoit. The part about how Standish "actively sought the head of a local Indian chief" is at best gross exaggeration if not pure fiction: Standish had been instructed to execute Corbitant if he had killed Squanto as he had threatened to do and was feared to have done.

They sent an armed party to Namasket to check this out. When they got there they found Squanto was alive and Corbitant had fled. There were a few injuries, which the settlers helped heal, but no one was killed.

As for the "wall," the "11 foot high" figure is plucked out of the air. In September 1623, a visitor to Plymouth named Emmanuel Altham said the fence was eight feet. It's also not entirely clear just how substantial this wall really was.

More to the point, it was not built until months after the harvest feast! Remember that the feast was at the end of the harvest, which would have been late September or early October. A ship named "Fortune" arrived in late November and left on December 13.
Soon after this ship's departure[, Bradford writes,] that great people of the Narragansetts, in a braving manner, sent a messenger unto them with a bundle of arrows tied about with a great snakeskin, which their interpreters told them was a threatening and a challenge.
That is, this was in December or January, more than two months after the feast. In his book Good News from New England, published in 1624, Edward Winslow writes at length about the incident, including how it was sparked in part by a native messenger who was bearing gifts from Plymouth to Canonicus, the sachem of the Narragansett. Apparently, he stole the best of the gifts for himself, which made the remainder look more like an insult than an honor. The town stared down the challenge (they replaced the arrows with gunpowder and musket balls and sent it back), but
[i]n the mean time, knowing our own weakness, notwithstanding our high words and lofty looks towards them, and still lying open to all casualty, having as yet (under God) no other defense than our arms, we thought it most needful to impale our town, which with all expedition we accomplished in the month of February and some few days....
So in short, this has not one single damned thing to do with the harvest feast. It was done as a military defense in response to a direct threat from a sachem who was a rival of their ally Massasoit and occurred at least four months after the feast.

Wait, we're not finished.
Dr. Tingba Apidta ... surmises that [at the feast] the settlers “brandished their weaponry” early and got drunk soon thereafter. He notes that “each Pilgrim drank at least a half gallon of beer a day, which they preferred even to water. This daily inebriation led their governor, William Bradford, to comment on his people's ‘notorious sin,’ which included their ‘drunkenness and uncleanliness’ and rampant ‘sodomy.’”
This is just garbage, a concoction of ignorance and selective quoting that is ideology, not history. For one thing, the only imaginable source for the reference to brandishing weapons is in Winslow's saying "we exercised our arms" - which he describes as a "recreation." If it had been meant in any way to intimidate the natives, there is no reason at all to doubt that Winslow would have said so; neither he nor Bradford was shy about noting such occasions.

What's more, the whole "beer-drunkenness" business is a moth-eaten canard based on sniggering fantasy. Yes, they preferred beer to water: Water, unlike beer, was believed to offer no nutrition and was often thought potentially dangerous. (It was said at the time that no one would dare drink from any spring found within the walls of London: "The water may look sweet and clear, but there's death in that cup.") Everyone drank beer; even children, once they were weaned, drank watered beer. That also meant that by the time you were an adult, there simply was no way that drinking a couple of quarts of beer a day would get you drunk. Not when you'd been drinking beer since you were a child. It's bull.

(There's also the matter that the strength of the beer, particularly that available to the settlers, which they had to make themselves, didn't match today's. And unlike much of what Dr. Apidta writes, that's not based on speculation. It's based on people following 17th century directions for making beer and seeing what came out.)

So the idea of "daily inebriation" is utter, complete nonsense. The highly selective quotes from Bradford about drunkenness are no evidence to the contrary: He was a real bluenose, and quoting him on personal judgments of the behavior of the community (as opposed to descriptions of events) is very much like quoting some right-winger on the philosophy and ethics of the left - with each equally disposed to take the actions of one or a few and ascribe them to the target group as a whole.

Next up is the reference to "brutish" Miles Standish, "soon after the feast," getting "his bloody prize."
He went to the Indians, pretended to be a trader, then beheaded an Indian man named Wituwamat. He brought the head to Plymouth, where it was displayed on a wooden spike for many years, according to Gary B. Nash, ‘as a symbol of white power.’
"Soon," in this case, being somewhere around early April 1623 - more than a year and a half later. And again, we have a truly gross case of selective quoting, one so massive I can't imagine it was not deliberate.

The details of this incident are long and rather complex, but here is the Reader's Digest version:

Some natives had become angry at some English settled in the Massachusetts Bay area (not Plymouth) and resolved to attack them. But they were concerned that if they did so, the Plymouth militia would help fellow English. So they tried to organize a conspiracy among all the local natives to attack all the English all at once.

Around this time, Massasoit fell ill. Edward Winslow tended to him and helped him recover. Massasoit said that those of Plymouth had thus proved their friendship to him, so he would now prove his friendship to them. He was the one who told them of the plan and he was the one who named Wituwamat and others and he was the one who “advised us to kill the men of Massachusetts, who were the authors of this intended mischief.”

After a fair amount of discussion, including at a town meeting, it was agreed the town needed to act on Massasoit's advice, but only against those who were actually involved in the plan. So yes, some members of the militia went up there under Standish's command, saying they were there to trade, trading being something they had done before. But that's rather different from Standish "pretending to be a trader" as if he tried to conceal his identity.

In the ensuing skirmish, Wituwamat and five other natives were killed.

One other thing here: Yes, Wituwamat's head was cut off and displayed outside the north gate of Plymouth and yes, the idea was to strike fear into anyone else who might have similar thoughts. But calling it "a symbol of white power" is absurd and displays, yet again, an appalling ignorance of the history it intends to judge. This had nothing to do with his being a native. It had to do with his being considered an “enemy of the State.”

Back in Europe, the same fate awaited anyone held guilty of treason or insurrection. In England, the heads were displayed along London Bridge and it was so common that shopkeepers on the bridge were known to say things like “you can find my shop - it’s by the fifth skull along the bridge.”

So yes, there was a skirmish and yes, there was a head taken. But yet again, we see the settlers, in this case in the form of Miles Standish, being condemned as bloodthirsty and "brutish" - with no consideration whatsoever given to either cause or context.

I say again: Apidta is not engaging in history, but ideology, ideology trying to hide behind a veil of supposed scholarship.

One final comment: Contrary to James, "the first, official all-Pilgrim 'Thanksgiving'" did not have "to wait until 1637." It occurred in July 1623. There had been a drought, threatening the crops. The town had a day of humiliation, a holy day set aside when the need arose, devoted to prayer and fasting to ask God's forgiveness for whatever they had done for him to bring such a thing down on them. That very afternoon there began a gentle, soaking rain that saved the crops, so the town had a day of thanksgiving, again a holy day done when the occasion arose, this one for prayers to thank God for their rescue.

I remember a friend of mine some years ago talking about “the urge to find angelic forces in the world,” that is, the seeming need many of us have to fix on some group, some movement, some something that we can convince ourselves is utterly pure in its motives and behavior. In our attempts to find some better balance in our understanding of what was done to the natives of North America, the cruelties inflicted on them, the racism and bigotry which targeted them, too many of us in considering the “Pilgrims” of Plymouth have simply swapped the mythology of savage natives and noble settlers for the perhaps more satisfying but equally false mythology of noble natives and savage settlers.

Balance, it seems, is still a long way off.

Footnote: I'm skipping the whole Pequot War except to wonder what events sixteen years later with a different nation of natives have to do with the so-called "first Thanksgiving" of 1621.

Wednesday, November 26, 2008

Everything you need to know...

...about military and "defense" policy under Barack Obama:

1. Robert Gates is staying on as Secretary of Defense.
An official close to the Obama transition team said it was likely Gates would be named defense secretary when the president-elect begins to unveil his national security team in announcements expected next week.

A former government official who has advised the Obama transition said that it was "99 percent certain" that Gates would remain as defense secretary for about a year in the Obama administration.
2. Joe Lieberman is happy with Barack Obama's cabinet.
"Everything that President-elect Obama has done since election night has been just about perfect, both in terms of a tone and also in terms of the strength of the names that have either been announced or are being discussed to fill his administration," Lieberman said during a visit to Hartford.
Tips via TalkingPointsMemo.

I missed this... by almost a week, but I still want to mention it:

November 20 was the 25th anniversary of the broadcast of "The Day After."

Something else that obviously could not survive the "chilling effect" of the Fairness Doctrine.

What's fair is fair

Writing at the Political Animal a few days ago, Steven Benen said he's
been fascinated of late with the far-right hysteria about the reemergence of the "fairness doctrine," because conservative activists are gearing up for a knock-down brawl against an enemy that doesn't exist. ...

And yet, the nonsense doesn't stop. Perusing the news this morning, there are still more conservative columnists railing against the "plan" to bring back the fairness doctrine, and unhinged propaganda about the "unprecedented government assault upon the First Amendment" that is allegedly on the way.
He goes on to say that no one in Congress or the incoming Obama administration is pushing for a return of the Fairness Doctrine, chalking the whole thing up to "Republican paranoia" and the need to find a new "rallying cry."

Some others have made comments along similar lines, all laughingly dismissing the whole business while tut-tutting that no one is arguing for a return of the Fairness Doctrine.

Which is not completely true but is true as far as officialdom goes; while there are a few Senators and a couple of House members who will say they would support its return, it certainly appears none of them, at least none in any position of authority, have any intention of doing anything about it.

But Avedon Carol of The Sideshow asks a simple question:
[W]ould a return of the Fairness Doctrine be such a bad thing?
And the only reasonable answer is absolutely not. Rather, it would be a definite good thing. Which is what is so infuriating about so much of the reaction among the self-described "progressives" on this: Their attitude extends the dismissal beyond right-wing fantasies to the idea of the Fairness Doctrine itself. And that is a shame constructed of equal parts ignorance and technophilia.

I was involved (as an individual citizen) in the attempts to block the elimination of the rule: writing letters, submitting public comments, that sort of thing. So I'm at least passingly familiar with what it was about.

So first, let me clear up some of the mass of misunderstanding there is out there about the Fairness Doctrine. For example, it did not require "equal time." (The Equal Time rule was an entirely separate rule that related to stations endorsing political candidates.) It did not require that every or even any individual show be "balanced." It did not require stations that broadcast talk shows hosted by right-wing flakes carry an equal number of hours of talk shows hosted by liberals. It did not require that every conservative guest on every show be "balanced" with a liberal guest.

It required only two things: One, that licensed broadcasters cover issues "of public interest," including some controversial ones. Two, that overall, the station's coverage of those issues be reasonably "fair" with various sides having a "reasonable" opportunity to be heard.


Moreover, the fact is, there had been some version of a Fairness Doctrine in US broadcasting not only since before there was an actual Fairness Doctrine but since before there was an FCC.
In the Radio Act of 1927, Congress mandated the FCC’s forerunner, the Federal Radio Commission (FRC), to grant broadcasting licenses in such a manner as to ensure that licensees served the “public convenience, interest or necessity.”
In 1928, in the first direct hints of what became the Fairness Doctrine, the FRC required broadcasters to show “due regard for the opinions of others.”
From the early 1940s, the FCC[, created by the Communications Act of 1934,] had established the "Mayflower Doctrine," which prohibited editorializing by stations. But that absolute ban was softened somewhat by the end of the decade, allowing editorializing only if other points of view were aired to balance the station's own. During these years, the FCC had established dicta and case law guiding the operation of the doctrine.
Finally, in 1949, the FCC, having felt its way through to a reasonable balance, ruled
that station licensees were "public trustees," and as such had an obligation to afford reasonable opportunity for discussion of contrasting points of view on controversial issues of public importance. The Commission later held that stations were also obligated to actively seek out issues of importance to their community and air programming that addressed those issues.
In 1959, Congress amended the Communications Act to specify that "a broadcast licensee shall afford reasonable opportunity for discussion of conflicting views on matters of public importance.” Ten years later, in the case of Red Lion Broadcasting Co. v. FCC, the Supreme Court upheld the constitutionality of the Fairness Doctrine, arguing in essence that the ability to hear differing views was closely related to the ability to express them.

But of course the corporations were never happy about this: Broadcast licenses had famously been described as licenses to print money - and they wanted the license to print the money without any actual obligations to anything other than their bottom line to go along with it.

With the arrival of the Reagan gang, the broadcasters, seeing their opportunity, started openly lobbying for repeal of the rule. They claimed it had a "chilling effect" on their "free speech" and that with the development of cable, there was no "scarcity" of outlets for a variety of views. But their main argument against the rule was that it was unnecessary because broadcasters could be trusted to address controversies, and do it in a way fair to all sides, on their own. I remember that because I also remember wondering at the time why, if that was true and the rule only required them to do what they'd do anyway, they were devoting such time and energy to getting it repealed. (There is an old saying that "some questions need only be asked.")

In 1985 they were on the verge of success, as an FCC with a Reagan-appointed majority issued a report that parroted industry claims. Then two court cases gave the agency the chance to act on the industry's behalf.

First, in TRAC v. FCC, a panel of the DC Court of Appeals dismissed what would appear to be the plain meaning of the 1959 amendment noted above in order to find that "the fairness doctrine is not a 'binding statutory obligation' under the Communications Act of 1934," and so the FCC was not obligated to enforce it. In December 1986, a request for a hearing before the full court was denied, ending the case. Then, in January 1987 the same court ruled in Meredith Corporation v. FCC that the agency had not given adequate consideration to the company's constitutional challenge to a case involving a Fairness Doctrine violation and sent the case back to the FCC for reconsideration.

That was the opening: The FCC responded by eliminating the doctrine altogether that August.

Interestingly, in a show of Congressional intent that was a clear rejection of the finding in TRAC, in the spring of 1987, before the FCC acted, Congress passed legislation to specifically make the doctrine law. Reagan vetoed it.

And we can see how well it has all worked out. There are so many controversial issues discussed in such a fair and balanced way. A show like "See It Now" could never have been broadcast during the dark days of the Fairness Doctrine. We are so much better informed than we used to be and the mass broadcast media probes so tellingly that it's become impossible, for example, for a president to lie us into a war or for significant numbers of people to honestly believe that a centrist US senator is actually a Muslim terrorist or for the government to give $2 trillion in loans to private corporations without anyone noticing - or for people to be unaware that majorities of their fellow citizens agree with them on issues like national health care. All because we are so well served by our media who need no prompting to discuss serious, controversial issues in a fair manner that gives full voice to actual progressive views just like it does to centrist and conservative ones.

The Fairness Doctrine? Pah! That's so, so... so pre-internet!

Footnote: I wanted to note that the piece on the Fairness Doctrine from the Museum of Broadcast Communications linked above goes to some pains to diss it.

For example, the author, who wrote on broadcast ethics, says "journalists ... considered it a violation of First Amendment rights" and "simply avoided any coverage of some controversial issues" (the "chilling effect"). But that is fundamentally untrue: It was the corporations, not journalists, that dodged controversy for fear of upsetting some viewers - and therefore, more importantly, their sponsors. Journalists had a hell of a lot more trouble getting stories past their bosses than past the Fairness Doctrine.

He also says, echoing the industry, cable eliminated "scarcity" and offered "many other voices in the marketplace of ideas." All those left-wing dominated cable channels are clear proof of that.

To wrap it up, he refers to the 1987 legislation as requiring the FCC to enforce the doctrine, "like it or not" and that the doctrine "remains just beneath the surface of concerns over broadcasting and cablecasting" in the face of the "threat" of legislation.

Footnote Again: According to a Rasmussen study from this summer,
[n]early half of Americans (47%) believe the government should require all radio and television stations to offer equal amounts of conservative and liberal political commentary,
with 39% opposed. That is, a clear plurality of Americans support a level of balance that is far stricter than anything envisioned by the Fairness Doctrine. So maybe, "progressive" pundits, the Fairness Doctrine is neither so silly nor so archaic after all.

Monday, November 24, 2008

Happy Geekiversary

I almost missed it but wanted to squeeze it in. Today, Monday, is the 149th anniversary of the publication of The Origin of Species by Charles Darwin.

I wonder if any plans are afoot for next year? With evolution almost constantly under attack by various wackos (usually) wielding Bibles that they don't actually understand, I think a celebration of the 150th anniversary would be a good thing.

I found out about the anniversary via Crooks & Liars which has a link to a neat page called "This Week in History," containing info about events related to peace and justice, at

Footnote: Just FYI, the "evolution" link above is to posts here at Lotus that mention evolution. That also means, however, that a number of them merely mention it without discussing either it or the attacks on it from the hordes of ignorance. So you'll have to scan some.

Gloom and doom

Updated A colleague at work told me the other day that she's recently read a book (the title of which I did not get) arguing that "every four generations" some sort of significant event occurs which ultimately results in a societal shift toward a different view on whatever spurred that particular crisis. She was wondering if the current economic trends in the US and beyond are such an event and if so what changes that will bring.

Certainly there are dark signs all around us. The collapse of the financial industry; the drying up of credit; the bankruptcy of Lehman Brothers; the federal rescue of AIG; the forced sale of Merrill Lynch; Fannie Mae and Freddie Mac going into what amounted to federal receivership.

And those are just the headlines. What's behind them shows an entire financial industry (and thus an entire economy which runs on credit) even shakier than we've been led to (or allowed to) believe and far worse than the one seen in the reassuring words of Treasury Secretary Henry Paulson,
who told National Public Radio a week ago that people were no longer worried about the possibility of a major bank failure. “I’ve got to tell you,” he said. “I think our major institutions have been stabilized. I believe that very strongly.”
And, he could have added, the fundamentals are sound. Which is doubtless why a week later, at the end of trading on Thursday, stock market wealth was down $8.3 trillion from what it was 13 months earlier and why the government has just agreed to a $306 billion bailout for Citigroup, the nation's second largest bank, which had recently lost half its market value in just three days. (The announcement came just hours after Dana Perino insisted she knew of no talks going on between Citigroup and the feds for financial aid. Geez, at least Tony Snow was a good liar.)

Meanwhile, the actual amount of taxpayer support already being given to corporate America was not the nearly $350 billion spent by Paulson and company, it was not even the $700 billion promised on our behalf by Congress. That, Jim Hightower says, is
only the ante. There’s also a secret bailout that Bloomberg News says has now topped $2 trillion! These are emergency rescue loans from U.S taxpayers that the Federal Reserve has quietly committed to America’s biggest banks, investment firms, and insurance companies.

Which financial outfits got this money, and how much did each get? That’s a secret, say the Bushites and the bankers. Well, what did the beneficiaries put up as collateral to protect taxpayers? None of your business, say the insiders.
(Hightower also points out a picture-perfect example of how the game is played:
If you're not a bank and therefore technically ineligible to party with Paulson, don't worry, for he will simply declare you to be a bank. That's what he did for American Express. When the credit card giant knocked at Treasury's door this month, Paulson redefined it as a bank-holding company, gave it the secret password and let AmEx reach in for a $3.6 billion party favor from you and me.
It pays to have friends - literally.)

But of course what loans the government gives to private corporations is our business. It is, after all, our money and ultimately our risk. (Besides, isn't "It's our money!" what the wingers squeal whenever the government wants to spend money on the poor? Where are they now?)

Except apparently, in the view of Paulson and his posse, it's not our business and perhaps not even our money. Not when the business involves business. Not when it involves largess to Wall Street and its nationwide minions.

But while they are succeeding at least for now in keeping that secret, in keeping the whole worm-eaten, weather-beaten edifice afloat the way Monty Python's "the Amazing Mystico" kept buildings up - it only works so long as you believe in it - other very public information tells a story of recession and worse that can't be covered up.

One well-trod example is the possible bankruptcy of GM and the failure of the Big 3 domestic automakers. (I'll get into my thoughts on that in a different post since discussing it first requires dispelling several myths about the current state of the domestic auto industry.) Another, all too familiar and all too painful, example is the rising number of foreclosures, which exploded in 2007 and continued a dramatic climb in 2008: According to RealtyTrac, a company that tracks the figures, foreclosures in the 3rd quarter (i.e., July-September) of 2008 were up an astounding 71% over the same period in 2007. And there's no sign of a let-up.
The number of homeowners caught in the wave of foreclosures in October grew 25 percent nationally over the same month in 2007, data released Thursday showed.

More than 279,500 U.S. homes received at least one foreclosure-related notice in October, an increase of 5 percent over September, according to RealtyTrac Inc. ...

More than 84,000 properties were repossessed in October, RealtyTrac said.
The company predicts that by the end of the year, there will be over a million bank-owned properties on the market. That is, one-third of all properties for sale in the US will be the result of foreclosures. The desperate attempt by Fannie Mae and Freddie Mac to forestall a further collapse with a 45-day freeze on foreclosures on mortgages they hold is unlikely to do more than push many of those foreclosures into the new year.

But there's still more. There are other statistics, numbers that despite their significance too often are a one-day story because they don't have some high official or big corporate PR firm pushing them. But they still tell a devastating story of an economy on the brink of potentially epic collapse.

- New claims for unemployment benefits hit a 16-year high last week. The four-week average was even worse: It was the highest in over 25 years.
- The number of people on unemployment is at its highest level in nearly 26 years. Less than half of the unemployed get benefits.
- The official unemployment rate rose to 6.5% in October, the highest in 14 years; it will be higher in November and many economists expect it will hit 8% or even 8.5% in 2009.
- Even that figure is clearly too low: The "official" figure doesn't include the self-employed, part-timers, those working for commission, the underemployed (those who work part-time but want full-time work), or, significantly, "discouraged" workers, those who have just given up on finding work. At any moment, the real unemployment rate is likely about double the official rate. The National Jobs for All Coalition calculates October unemployment at 13.6%.
- In September, retail sales dropped 2.4%, the largest monthly drop since the data collection began in 2003. In October, sales dropped another 1.5%.
- Imports of consumer goods dropped nearly 8% in September.
- As a result of all that plus the collapse of the credit market, consumer confidence is at an all-time low.
- In October, both housing starts and new construction permits were at their lowest levels since the Commerce Department started tracking the figures nearly 50 years ago.
- Exports, which had helped to prop up the domestic economy, fell victim to the world economy and saw their biggest drop in seven years.

The only bright spot, it seemed, was an uptick in mortgage applications - which happened only because they were coming off an eight-year low.

It is bad, it is getting worse, and it is getting worse than was expected faster than was expected.
"This is obviously very, very serious deterioration in the labor market, more than a lot of people had expected even a couple of months ago," said Scott Brown, chief economist with Raymond James & Associates in St. Petersburg, Fla.

"We are looking at the biggest financial crisis since the Great Depression and the biggest economic crisis we have had in the United States since the early 1980s."
Stuart Schweitzer, a global markets strategist for J.P. Morgan Private Bank, calls it “a full-blown, self-feeding downturn.”

The prospect of this dropping down through recession right into a genuine depression is now sufficiently mainstream (even a former chair of Goldman Sachs is talking about it) that people are starting to speculate on what a 2009 depression would look like. (Hint: not at all like the 1930s.)

But the truth is that for many, a depression-type life already exists.
Some 691,000 children went hungry in America sometime in 2007, while close to one in eight Americans struggled to feed themselves adequately even before this year's sharp economic downturn, the Agriculture Department reported Monday.

The department's annual report on food security showed that during 2007 the number of children who suffered a substantial disruption in the amount of food they typically eat was more than 50 percent above the 430,000 in 2006 and the largest figure since 716,000 in 1998.

Overall, the 36.2 million adults and children who struggled with hunger during the year was up slightly from 35.5 million in 2006. That was 12.2 percent of Americans who didn't have the money or assistance to get enough food to maintain active, healthy lives.

Almost a third of those, 11.9 million adults and children, went hungry at some point. That figure has grown by more than 40 percent since 2000. ...

[James Weill, president of the Food Research and Action Center, an anti-hunger group,] predicted the 2008 numbers will show even more hunger because of the sharp economic downturn this year.

"There's every reason to think the increases in the number of hungry people will be very, very large based on the increased demand we're seeing this year at food stamp agencies, emergency kitchens, Women, Infants and Children clinics, really across the entire social service support structure," [he] said....
And note this well: That increase in hunger was, again, from 2007. The sharp increase in foreclosures began in the spring of 2007. October 2008 was the fourth straight month of declining retail sales. Official unemployment has increased more or less steadily since its low of 4.4% in March 2007.

This is not a crisis that suddenly appeared in the fall; it is one that has gripped millions for some time. We do not have hunger and unemployment triggered by an economic crisis, we have an economic crisis triggered by blind greed and the extremists who celebrate it, a crisis that has exacerbated existing hunger and unemployment.

And what do we hear as an answer? More of the same. More bailouts, more favors to corporations "too big to fail," more propping up of the whole rotten system that regards the hunger of tens of millions as an unfortunate side effect and the futures of tens of millions of workers as expendable but sees rich corporate executives forgoing some of their bonuses as a significant news event. The fat cats even had the gall to be "concerned" about the move by Fannie Mae and Freddie Mac to suspend foreclosures on the grounds that it might come "at the expense of profit" and they're afraid that the two acting in a "public policy role" might continue after the federal conservatorships end.

Meanwhile, in what could be the most galling response of all, Commerce Secretary Carlos Gutierrez said
Congress could do its part by approving three pending free trade agreements with Colombia, Panama and South Korea, while the White House keeps pushing for an agreement in the long-running Doha round of world trade talks....
The race to the bottom that is globalization is the answer, always the answer, in their fetid little minds.

But from George Bush, bless his Grinch-sized heart and third-rate intellect, came an unexpected (and wholly unintentional) truth. Speaking at the Manhattan Institute on November 13, he said:
In the wake of the financial crises, voices from the left and right are equating the free enterprise system with greed and exploitation and failure.

It's true this crisis includes failures ... but the crisis was not a failure of the free-market system.
[N.B.: The quote is from the video found at the link, not from the article.]

He's absolutely right: The crisis is not a failure of the free-market system. It is the free-market system. It is a core feature, a basic, persistent, feature of a system that by its nature rewards greed and selfishness, that by its nature creates haves and have-nots, the elite and the evicted, that by its nature goes through boom-and-bust on a regular basis, cycles that over time serve to concentrate wealth in fewer and fewer hands, that by its nature turns everything into a commodity whose worth is measured merely in money, in its potential for profit, that by its nature values the work that people do not on how it, to quote a well-respected document, "promote[s] the general welfare" but on how it promotes some investor's bank account, that by its nature cannot eliminate poverty or unemployment or hunger because by its nature it invariably, unavoidably, necessarily, favors cold "efficiency" over human justice.

It is possible - and some are already suggesting - that this is a crisis that system will not survive. It is safe to say that "the American free-market system" as it has existed will not; in fact, it already hasn't because no matter how hard the elites may try to ignore or downplay it, it is unavoidably true that the federal government is now a bigger player in the private sector than it ever was before, enough that some right-wing flakes are talking about it being "socialism."

That role could be leveraged in the public interest. There could be restrictions on any bailouts, restrictions such as, off the top of my head, oversight of investment decisions or even a direct say in those decisions. A direct public role in management. Credit controls, including demanding that the decision to issue credit, while it must take into account the ability of the borrower to pay it back, should be based on the benefit to the community and judged in accordance with its environmental impact rather than on its potential to produce profit for the borrower.

Would that last drive away a good number of potential developers more interested in the green in their wallets than the green in the environment, more concerned with the cocktail parties they go to than the human parties they employ? Yes, and good riddance to them. Would that stop beneficial development? Of course not: Do you really believe there aren't people and organizations that would be interested in doing non-profit (or low-profit) housing and commercial projects if there was financing at rates they could afford?

But I said that role could be leveraged in the public interest. Will it? I doubt it, at least right now, and surely the haves will not give up their privileged positions of power easily.

There's an old (and false) notion that the Chinese character for "crisis" combines characters for "danger" and "opportunity." Even though the story isn't true (the characters for "crisis" actually read something like "critical moment danger," which is exactly what a crisis is) there is still a degree of wisdom in the notion that a crisis might present an opportunity. I keep thinking about my co-worker's book and how crises can lead to change. The Great Depression did have one salutary effect: It smashed the "The chief business of the American people is business ... The man who builds a factory builds a temple" thinking that drove the Roaring '20s - and the country straight into the depression. It opened a space for increased power of the working classes and labor organizing, and for a new social covenant between government and the public, including such programs as Social Security, a federal minimum wage, and unemployment insurance. It's taken the elites seventy years to roll back some of those gains (particularly involving organized labor) and even now some basic ones seem beyond their reach.

So yes, we are facing a crisis, yes, we are likely to experience the worst recession since the Great Depression and yes, we may even hit another depression. And yes, we must struggle, we must demand, demand, that Congress and the president act on our behalf, not that of the corporations whose only reason for existence should be to act as economic conduits for our needs and desires rather than as giant vacuum cleaners sucking up economic resources for the 1% of the population that controls 38% of the wealth.

But we need to go beyond those immediate demands. We also need to be thinking about new ways to organize our economy and about how we get from here to there. Now, those ideas are already out there; we don't have to create a vision from scratch. So we could have an opportunity here: The forms to which we have become accustomed will not survive. The changes may be dramatic, they may be subtle but still significant. In either event, we can be assured that the powers-that-be will do their best to make them invisible, to pretend that nothing has really changed. But it already has and it will. We need to take this opening to push new ideas into that gap between what was and what will be.

I intend to drag out some of my old ideas - ones I wrote about in the past one place or another - and toss them up here to see what the reaction might be. I invite anyone reading this to do the same. (Here in comments, preferably, so I'll be sure to see them, but in short form, please.)

Footnote: One of those bits to get you through a cold winter, when every bit of hope is gone.
As more Americans turn to charity amid worsening economic gloom, operators of food banks and other aid groups are relying on the surprisingly resilient generosity of their neighbors and finding that even when times are tough, people still give. ...

The Center on Philanthropy at Indiana University says that historically, charitable giving has been recession-proof.

Contributions to American charities have increased during 39 of the past 40 years in today's dollars, and a change in the tax laws - not the stock market crash - can be blamed for the drop in 1987, said Melissa Brown, associate director of research for the center. Between 69 and 72 percent of people give routinely, she said. ...

"At a time when people have things and they know that other people don't, Americans' generosity wins out," said Justin Greeves, senior vice president of Harris Interactive, which regularly polls Americans about their charitable giving.
In fact, according to a survey by the Christian relief group World Vision, 2008 could actually be a better-than-usual Christmas for donations to charitable organizations.

So let me be the first to cover it all from Thanksgiving right through New Year's Day: Happy Holidays. The human heart lives.

Updated with the information about housing and construction starts and the link to the Hightower Lowdown at "extremists who celebrate it."

Thursday, November 20, 2008

What's wrong, in one easy lesson

Senator Barbara Boxer has announced that come the new session of Congress in January, she intends to introduce a "streamlined" version of her climate change bill.
Boxer also stated that she would introduce a bill to provide $15 billion a year in support for a clean energy program that supports wind, solar, geothermal and other renewable energy sources. ....

Such a bill would represent a huge increase in the amount of money dedicated to clean energy, and there remains the question of how Congress will raise the money.
We are now spending $250 billion a year for wars in Iraq and Afghanistan. None of our Serious Thinkers say "How will we pay for this?" We commit to $700 billion to bail out the financiers. No Serious Thinkers worry about raising the cash.

But suggest spending $15 billion a year on renewable energy and heading off climate disaster, and they slump in their chairs and with their hands over their eyes, rock back and forth and moan "Where will we ever get the money? Dear God, how will we ever come up with the money?"

It's coming!

Rahm Emanuel. Hillary Clinton. Tom Daschle. Eric Holder, Assistant Attorney General under Bill Clinton. Greg Craig, former Special Counsel to Clinton. David Axelrod, long-time consultant to Democrats.

I feel the tide of "change" just washing over me.

Wednesday, November 19, 2008

Just for fun

I don't expect a damn thing to come from this, but it's fun anyway, especially considering where it happened. From Reuters:
A grand jury in South Texas indicted U.S. Vice President Dick Cheney and former Attorney General Alberto Gonzales on Tuesday for "organized criminal activity" related to alleged abuse of inmates in private prisons. ...

The grand jury in Willacy County, in the Rio Grande Valley near the U.S.-Mexico border, said Cheney is "profiteering from depriving human beings of their liberty," according to a copy of the indictment obtained by Reuters.

The indictment cites a "money trail" of Cheney's ownership in prison-related enterprises including the Vanguard Group, which owns an interest in private prisons in south Texas.

Former attorney general Gonzales used his position to "stop the investigations as to the wrong doings" into assaults in county prisons, the indictment said.
Thanks to the folks at Crooks and Liars for the tip. They have up a video clip of the report of the local TV station that apparently broke the story.

Footnote: David Kurtz at Talking Points Memo says the news accounts "don't seem to implicate Cheney or Gonzales directly in any wrongdoing." In Cheney's case that's true and perhaps he's named as an unindicted co-conspirator. But the description of Slagzone's role could easily describe obstruction of justice.

Tuesday, November 18, 2008

Payoffs, Part Two

Now for the other half of that dwindling base, greedy corporate interests.
Firing off another decision that is angering environmental groups, the Bush administration has issued new regulations to develop oil shale deposits straddling almost two million acres of public lands in Colorado, Utah and Wyoming.

The rules lay out the framework to develop these deposits over the next decade, including royalty rates, how to evaluate bids for leases, mitigation requirements and other procedural elements.
Last fall, Congress foolishly allowed a moratorium on the development of oil shale, a singularly dirty and polluting process, to lapse,
[b]ut most experts had expected the rules on how to develop the deposits to be left to the next administration.
Instead, they were fast-tracked to get them out before November 20. That's 61 days before the new Congress would start its session - and therefore beyond the 60-day time frame established by the Congressional Review Act, during which Congress can repeal any new rules by majority vote. That is, they were issued early enough that rescinding them would require a whole new regulation-writing process, which could take months.

The White House goons insist this is for our own good.
“Oil shale is a strategically important domestic energy source that should be developed to reduce the nation’s growing dependence on oil from politically and economically unstable foreign sources,” said James Caswell, the director of the B.L.M. [Bureau of Land Management]
But instead it is just another giveaway to the oil companies who are hoping to lock in industry-preferred regulations and favorable leases on public lands for an oil shale industry that doesn't exist - there are no oil shale mines or refineries - and is unlikely to produce any oil for decades.

How bad is oil shale? The Natural Resources Defense Council calls it "the dirtiest fuel on the planet" because, among other reasons, it is expected to emit two to four times more global warming pollution than production of conventional gasoline. What's more, no one knows for sure how much energy will be required or how much air and water pollution it will generate. But it can be said that it will require three or more barrels of water - liquid gold in the arid regions of the US west where the oil shale deposits lie - for every barrel of oil produced. And because it will be more expensive than conventional fuel, it probably wouldn't even cut gas prices.

To top it off, oil shale is lousy fuel.
Pound per pound, oil shale contains just one-tenth the energy of crude oil, one-sixth that of coal, and one-fourth that of recycled phone books. ... Dung cakes have four times more energy than does oil shale. ... Oil shale has one-third the energy density of Cap’n Crunch....
Does nothing have as low an energy density as oil shale? Yes, there is something: a baked potato.

So it's unproven, inefficient, expensive, contributes to global warming, and is dirty. But still, the folks at the White House just gotta give the oil industry their heart's desire: more. (The reference to "heart" is purely figurative since the industry clearly lacks one.) And they didn't stop with regulations: The bailout plan contained a provision providing $4 billion in taxpayer handouts to the oil industry to develop tar sands, oil shale, or liquid coal. More, more.

And then - I almost said finally but I doubt it is - there is the fact that
[e]arlier this month, the Bureau of Land Management expanded its oil and gas lease program in eastern Utah to include tens of thousands of acres on or near the boundaries of three national parks,
a decision made without consulting with the National Park Service. More, more, more.

The incoming Obama administration will be able, if it chooses, to pretty much negate the effect by simply refusing to put leases out for bid until the regs are re-written (or, better yet, rescinded). However, we can be sure it will be under considerable pressure from the energy giants to lease out the lands and how the administration responds might be a good indication of how things are going to go on the energy and corporate fronts.

Payoffs, Part One

The Shrub gang's fanatical devotion to fanatics is continuing right up to the end.
A last-minute Bush administration plan to grant sweeping new protections to health care providers who oppose abortion and other procedures on religious or moral grounds has provoked a torrent of objections, including a strenuous protest from the government agency that enforces job discrimination laws.

The proposed rule would prohibit recipients of federal money from discriminating against doctors, nurses and other health care workers who refuse to perform or to assist in the performance of abortions or sterilization procedures because of their “religious beliefs or moral convictions.”

It would also prevent hospitals, clinics, doctors’ offices and drugstores from requiring employees with religious or moral objections to “assist in the performance of any part of a health service program or research activity” financed by the Department of Health and Human Services.
As an indication of just how important this is to the WHS*, consider that
[t]he White House Office of Management and Budget received the proposal on Aug. 21 and cleared it on the same day, according to a government Web site that keeps track of the rule-making process.
What's more,
[t]o avoid the usual rush of last-minute rules, the White House said in May that new regulations should be proposed by June 1 and issued by Nov. 1. The “provider conscience” rule missed both deadlines.

Under the White House directive, the deadlines can be waived “in extraordinary circumstances.” Administration officials were unable to say immediately why an exception might be justified in this case.
I'm prepared to assume they were not "unable" but rather unwilling, since I would argue the reason is to do a final favor to the wingnuts who are half of their dwindling base.

The thing is, federal law already prohibits discrimination in employment based on religion
and the courts have defined “religion” broadly to include “moral or ethical beliefs as to what is right and wrong, which are sincerely held with the strength of traditional religious views.” ...

Under the Civil Rights Act, an employer must make reasonable accommodations for an employee’s religious practices, unless the employer can show that doing so would cause “undue hardship on the conduct of its business.”
(As a sidebar, I note that you should thank two Vietnam-era draft resisters, that is, two very definitions of DFHs, named Daniel Seeger and Elliott Welsh for that protection of non-religious-based ethical and moral beliefs.)

What the Shrub gang wants to do would overturn 40 years of law, regulations, and court decisions and look to establish a new, broader standard at a single stroke, one that
[p]harmacies said ... would allow their employees to refuse to fill prescriptions for contraceptives and could “lead to Medicaid patients being turned away.” State officials said the rule could void state laws that require insurance plans to cover contraceptives and require hospitals to offer emergency contraception to rape victims.

The Ohio Health Department said the rule “could force family planning providers to hire employees who may refuse to do their jobs” - a concern echoed by Cecile Richards, president of the Planned Parenthood Federation of America.
Significantly, according to senior staff at the Equal Employment Opportunity Commission this was undertaken without any consultation with the agency. Two members of the Commission and its legal counsel have called for the proposed rule to be withdrawn. So have
the National Association of Chain Drug Stores, the American Hospital Association, the American Medical Association, 28 senators, more than 110 representatives and the attorneys general of 13 states.
Who supports it? Outside of the White House mouth-breathers, no surprise: the US Conference of Catholic Bishops and the Catholic Health Association, a trade association of Catholic hospitals.
Sister Carol Keehan, president of the Catholic Health Association, said that in recent years, “we have seen a variety of efforts to force Catholic and other health care providers to perform or refer for abortions and sterilizations.” [emphasis added]
That emphasized phrase holds a key. They want not only to be able to refuse to perform abortions - for the most part, they already have that right - but to be able to just turn people away, to go from saying "We don't do that here; you'll have to go here or here for that" to "We don't do that here, go away, sinner" - and to do it without risking any possible federal grants or programs. It's "give us the bucks and shut up."

I think that's one of the things I find most offensive about this whole business. Someone's conscience tells them they can't do abortions? Fine. I have no problem with that. The individual conscience is the closest thing to supreme there is in my worldview. But a first principle is that individuals have consciences. Institutions do not. Institutions, I believe, can be legitimately constrained by law in ways that individuals cannot. We can, and assuredly should, make reasonable allowances; we should, for example, accept that Catholic hospitals will follow Catholic teachings and not expect abortions to be among the services offered there so long as there is a reasonable alternative available elsewhere.

But we should not, we must not, accept that such institutions can inhibit the ability of others to obtain those same services (by, for example, refusing to do referrals) and even less should we accept that long-standing rules should be changed to allow them to do so without any cost to themselves. Conscience should be more selfless, not more selfish; it should be more about the benefit brought to others, not lessening the burden on yourself. I can't shake the feeling that what we are seeing here is less about morality than it is about money - and that is not conscience, it is convenience.

And for the Bush cabal, it's less about justice than about favoritism. And it's never about conscience.

Footnote: Some may see a contradiction between my stand here about health care personnel being involved in abortions and my previous comments about pharmacists refusing to fill prescriptions for birth control. I said there that any pharmacist who could not do so should not continue to be a pharmacist. But there is no contradiction: Performing abortions is not part and parcel of being a doctor or a nurse. Filling prescriptions is part and parcel of being a pharmacist. Refusing to do so for any reason that is not health-related (e.g., a drug interaction or a known allergy) is a failure to do the job.

*WHS = White House Sociopaths, a label I expect to be able to retire after January 20. Obama and his crew are corporatist centrists, but they are not, so far as anyone can tell so far, sociopaths.

Another reason for national health care

The Boston Globe has uncovered data showing that the amount insurance companies reimburse hospitals in Massachusetts can vary dramatically from hospital to hospital.
Private insurance data obtained by the Globe's Spotlight Team show that the Brigham [& Women's Hospital], Mass[achusetts] General, Children's Hospital[, all in Boston,] and a few others are, on average, paid about 15 percent to 60 percent more than their rivals by insurance companies such as Blue Cross Blue Shield of Massachusetts and Harvard Pilgrim Health Care. The gap is even more striking for many individual procedures, which can be two or three times more expensive in one hospital than in another.
A driving (but not the only) force behind this, the Globe reports, is Partners Health Care, originally formed as an alliance between Brigham and Mass General - both among the nation's top 10 hospitals according to US News & World Report's annual listing - to combat what the founders saw as the stinginess of insurance companies. Their prominence allowed them to tell the insurance companies, in effect, "pay us what we say or else" - the "or else" being that if a company balked, the hospitals would refuse to accept its insurance, which could cost the company thousands of subscribers panicking at the thought of being denied access to those top-drawer facilities. One company that tried to resist - Tufts Health Plan - caved within days.

The result is that Partners Health Care hospitals, overall and on average, get paid 30% more than non-affiliated hospitals for the same services.

It's important to note that these differences in no way correlate with quality of care, and they persist not only for identical procedures using identical equipment but even for treatment by the same physicians - the only difference being which hospital the patient is in. In fact, sometimes the correlation is negative:
Massachusetts General Hospital, for example, earns 15 percent more than Beth Israel Deaconess Medical Center for treating heart-failure patients even though government figures show that Beth Israel has for years reported lower patient death rates.
Simply put, price does not equal quality. What it does equal is profit: Partners has netted $1.7 billion over the past few years.

Ultimately, this has done two things: shift the center of power in the health care industry in Massachusetts somewhat from the insurance companies toward the hospitals and drive up the cost of health insurance in the state. What it decidedly has not done is improve the quality of health care.

Nor has it improved access to health care; indeed, it may well have damaged it: Massachusetts requires that everyone in the state have health insurance and offers subsidized insurance to those who can't otherwise afford it. Rising premiums in the private market mean more people who can't afford the cost, throwing them back on the state system, raising its cost and inhibiting the goal of universal coverage.

This sort of thing will continue until the profit is taken out of the health care industry. We hear about "national health insurance." Screw that. We hear about "single payer." Screw that too. You want to know what I want to see? Very quickly, it's this:

I want to see a national health care system, layered from neighborhood-level clinics through community hospitals and regional health centers up to a small number of national district hospitals for special, rare, or unusually complex treatments. The workers in all those facilities are federal employees. Ethical and financial oversight is exercised by committees of the public and health care workers at each level. The system is primarily financed through taxes with payment, if any, for services based strictly on ability to pay.

If alongside that a private system persists for those who can afford the luxury, fine. In fact, good, because those people will still be paying their full share of taxes to support the system (no tax deductions for private insurance) while reducing the demands on it.

My wife is a registered nurse who often laments the idea that the health care industry is becoming ever-more "industry" and ever-less "health care." She continues to cling to the ideal that the needs of the patient, not the needs of accountants or investors, should be the focus of health care workers. Ultimately, a not-for-profit national health care system is the only way to get there.

Saturday, November 15, 2008

Tales from the Geek, Part Six

Pictures of Mars? Big deal. We got pictures of planets around other stars! I'm sure you've heard about this and probably seen the pictures, but it is just too cool to let pass without mention here.
The first pictures of planets outside our solar system have been taken, two groups report in the journal Science.

Visible and infrared images have been snapped of a planet orbiting a star 25 light-years away.

The planet is believed to be the coolest, lowest-mass object ever seen outside our own solar neighbourhood.

In a separate study, an exoplanetary system comprising three planets, has been directly imaged, circling a star in the constellation Pegasus.
The single planet is orbiting the visually bright star Fomalhaut ("mouth of the whale" or "mouth of the fish") in the constellation Pisces. It's a blue star, hotter and younger than our Sun, with about double its mass. In the picture to the left, the light from Fomalhaut is blocked out in order to make out the planet, which otherwise would be lost in the glare.

The presence of the planet in the disc of dust and gas around the star (the leftovers of the star's formation), plus the fact that the disc is so sharply defined, adds strength to the leading hypothesis of planetary formation, accretion, in which bits of dust in the disc cling together by gravitational or electrostatic attraction. Over time, these bits collide with other bits to form bigger bits, and so on.
The team estimates that the planet, dubbed Fomalhaut b, is 11bn miles away from its star, about as massive as Jupiter and completes an orbit in about 870 years. It may also have a ring around it.
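Those numbers hang together, by the way. Kepler's third law ties orbital distance, stellar mass, and period: with the semi-major axis a in astronomical units and the star's mass M in solar masses, the period in years is sqrt(a³/M). A minimal sketch, assuming the article's ~11 billion miles and taking Fomalhaut at roughly twice the Sun's mass (the exact values here are my assumptions, not from the report):

```python
import math

# Sanity check on the Fomalhaut b figures: ~11 billion miles out,
# orbital period "about 870 years". The AU conversion and the
# stellar mass are assumed round numbers, not from the article.
MILES_PER_AU = 92.96e6            # miles in one astronomical unit
a_au = 11e9 / MILES_PER_AU        # semi-major axis in AU (~118 AU)
m_star = 2.0                      # stellar mass in solar masses (assumed)

# Kepler's third law in solar units: T^2 = a^3 / M
# (T in years, a in AU, M in solar masses)
period_years = math.sqrt(a_au ** 3 / m_star)

print(f"{a_au:.0f} AU, {period_years:.0f} years")  # ~118 AU, period near 900 years
```

That lands within a few percent of the team's "about 870 years" - close enough, given the rounded inputs.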
The three-planet system is around a star 130 light years away. Designated HR 8799, it is perhaps just visible to the naked eye in the constellation Pegasus. The planets were imaged using the infrared light coming from them.
According to a theoretical model that accounts for the light coming from the planets, they range in size from five to 13 times the mass of Jupiter and are probably only about 60 million years old.
Which accounts for the infrared light: They are still pretty hot, having only recently (in astronomical terms) formed.

The really exciting thing about this is that because the planets can be seen directly, rather than their presence being inferred from the dimming of their star's light as they transit it or from their gravitational tug on the star's motion, it will be possible to study them - including any atmospheres - in far greater detail.

There are now 228 known exoplanets around nearby stars. Just a few years ago people were seriously wondering if there were any.

Tales from the Geek, Part Five

As a footnote to the preceding, opal had previously been found at Gusev Crater on Mars by the Mars rover Spirit.

And speaking of Spirit and Opportunity, they are still at it.

Having climbed out of Victoria Crater, where it has spent the last two years, Opportunity has begun a marathon trip to Endeavour Crater, which is 20 times larger than Victoria - but it's seven miles away, nearly as far as the entire distance Opportunity has traveled since landing on Mars nearly five years ago. Even so,
"things are looking great," [NASA's Steve] Squyres noted. "That robot is in some very difficult terrain at the moment, but still routinely driving some 260 feet (80 meters) or more per Mars day."
Spirit, hit with a dust storm that darkened the skies and covered its solar panels with a coating of red dust, was feared to be in real trouble, as it produced only 89 watt hours of energy one day last weekend - the lowest recorded for either rover and less than what it needs to be fully operational.

However, on Thursday, Spirit communicated with ground control right when it was supposed to, proving it was still alive, albeit weakened, and able to communicate normally. Now they just have to figure out the best course of action. "But this is all good news," said Project Manager John Callas.

Spirit and Opportunity are now in day 1,730 and day 1,711, respectively, of their 90-day missions.

Oh, and the picture? It's of a Martian sunset, taken by Spirit in May 2005.

Tales from the Geek, Part Four

RIP Phoenix.

The Mars Phoenix lander, which came down on the northern plains of that planet on May 25, has not been heard from since November 2 and is presumed dead, frozen to death in the Martian winter.
The US space agency says it will continue to try to contact the craft but does not expect to hear from it.

"We are actually ceasing operations, declaring an end to operations at this point," Phoenix mission project manager Barry Goldstein said at Nasa's Jet Propulsion Laboratory in Pasadena, California.
Don't feel let down, though: The lander's mission was supposed to be for three months and it lasted more than five.
During its ground operations, the robot dug, scooped, baked, sniffed and tasted the Martian soil to test whether it has ever been capable of supporting life.
And, memorialized in the photo above, it did something never done before: touched Martian water. More exactly, ice. Actual honest to gosh water ice, right below the topsoil. What's more, it found minerals which form only in the presence of liquid water, water which could have supported life.

You want more? We got more.

The Mars Reconnaissance Orbiter has been in orbit around Mars since 2006. One of the things it has recently discovered is that hydrated silica, better known as opal, is spread across large regions of the planet.
The find suggests liquid water remained on Mars' surface a billion years later than scientists had previously thought. ...

The discovery adds to the growing body of evidence that water played a crucial role in shaping the Martian landscape and - possibly - in sustaining life.
NASA's Mars Science Laboratory, which will look for signs of life there, is set to be launched in 2009. The European Space Agency also has plans for a Mars rover, but that mission, called ExoMars, is still several years out, the target date being 2016.

Tales from the Geek, Part Three

In what could become an important breakthrough in AIDS treatment, genetically engineered immune cells known as T-cells
were able to recognize other cells infected by HIV and slow the spread of the virus in lab dishes,
Reuters reported earlier this week. What made this especially important is that one of the reasons that the AIDS virus is so hard to combat is that it can mutate in ways that enable it to in effect hide from the body's immune system. But
[n]ot only could the engineered T-cells see HIV strains that had escaped detection by natural T-cells, "but the engineered T cells responded in a much more vigorous fashion so that far fewer T-cells were required to control infection," James Riley [of the University of Pennsylvania, who was among the researchers], said in a statement.

"In the face of our engineered assassin cells, the virus will either die or be forced to change its disguises again, weakening itself along the way," added Andy Sewell of Britain's Cardiff University.
Initial patient trials are to start as soon as next year.

Footnote: The same article also noted a report that an experimental vaccine prevented AIDS infection in six monkeys. The vaccine uses a virus dangerous to humans and so is not ready for human tests, but researchers said it showed there is still hope for a vaccine against (not just a treatment for) AIDS.

Tales from the Geek, Part Two

Oetzi the iceman may have been frozen in time in more ways than one. NPR reported recently that
British and Italian scientists have painstakingly spelled out the entire genome of the mitochondria — the tiny powerhouses within each cell — taken from intestinal cells of the remarkably well-preserved iceman. ...

The new analysis, published in the Nov. 1 issue of Current Biology, shows that his mitochondrial genes don't match up with any retrieved so far from modern-day humans. That virtually rules out any descendants from Oetzi's maternal lineage....

Mitochondrial DNA is passed down only from mothers to their offspring. It's an important tool for constructing family trees and tracing the movement of people across time and space. Mitochondrial DNA has only about 16,500 genetic units, called base-pairs, instead of the 3 billion in the entire human genome. And the mitochondrial genome is peppered with lots of mutations, unique genetic tags that make it easier for scientists to track genetic lineages.
What this means is that there are, as far as is known, no living descendants from the maternal side of Oetzi's family - no one descended from any aunts or sisters. That doesn't, however, rule out the possibility of descendants: If he had sons, they would not have received his mitochondrial DNA - which is, again, passed down only through females - but they would have inherited his Y chromosome.

So while Martin Richards of the University of Leeds says that "the maternal lineage of the iceman has apparently gone extinct," he's thinking of sequencing Oetzi's Y chromosome - a far bigger undertaking because it contains far more genetic material than the mitochondrial genome. But Oetzi just may still have some distant relatives around.

Understand that this is "not earth-shattering," in the words of molecular anthropologist Ann Stone of Arizona State University, because "a sample size of one is difficult to do much with." Still, it ultimately might provide some additional clues to life in the copper age.

And besides, as Stone says, it's "kind of cool."

Tales from the Geek, Part One

There have been some neato-keen geeky stories recently, so I'm going to plow through some of them. Savor these few moments; the bad news will return soon enough.

This time, it's news via the BBC that
[f]ive lines of ancient script on a shard of pottery could be the oldest example of Hebrew writing ever discovered, an archaeologist in Israel says. ...

Experts at Hebrew University said dating showed it was written 3,000 years ago - about 1,000 years earlier than the Dead Sea Scrolls.
The writing is in proto-Canaanite, a precursor of the Hebrew alphabet, and only a few words - including judge, slave, and king - have been deciphered so far. Researchers say they believe it's Hebrew because of the presence of a word that is only used in that language, but others aren't so sure: Lead archaeologist Yosef Garfinkel's
colleagues at Hebrew University said the Israelites were not the only ones using proto-Canaanite characters, therefore making it difficult to prove it was Hebrew and not a related tongue spoken in the area at the time.
But whether it's Hebrew or not,
Hebrew University archaeologist Amihai Mazar said the inscription was "very important", as it is the longest proto-Canaanite text ever found.
The idea of "lost languages" - ancient languages that can't be translated because not enough sufficiently long texts survive to allow analysis - has long fascinated me. In fact, the very idea of language itself interests me. Noam Chomsky made his name by arguing that there is a basic, common, underlying syntax to all languages - which strongly suggests that not only the ability for, but the root structure of, language is hard-wired into the human brain. (His famous line "Colorless green ideas sleep furiously" was meant to illustrate the idea: Even though the statement is complete, internally contradictory nonsense that is unlikely ever to have been spoken before he came up with it, we instantly recognize its syntactic correctness.)

Others - notably Steven Pinker - have challenged that idea, but even he admits that the capacity for language is inherent in people.

Related to that is the recent finding that birds "learn to sing from a hymn sheet in their head," as the BBC put it a couple of days ago.
Swiss researchers have identified a region of the Zebra Finch brain which they believe has an internal recording of how the birds ought to be singing.

A separate region seems to enable the birds to identify mistakes in their songs, they wrote in Nature journal.
What happened was that a team from the University of Zurich monitored the brain activity of the birds as they sang and as they listened to a recording of other zebra finches singing. They found that while the birds were singing, parts of their brain associated with listening were always active and other neurons became active when the birds made a mistake. According to team leader Professor Richard Hahnloser,
"This is a proof of concept that birds do actually listen to their own songs, and they do seem to be comparing it to something that they expect, or would like to hear." ...

The authors believe their research could also shed light on how humans learn to speak.

It has long been assumed that, like songbirds, humans learn complex vocal patterns by first listening to their speech and then comparing it to patterns stored in the brain.

But very little is known about the neural mechanisms involved.
This could well add to that knowledge.

Friday, November 14, 2008

Not all the news is bad

As a follow-up to this post from September about an attempt by the National Park Service to circumvent a court decision regarding public protests along the route of the presidential inaugural parade, I hear from Eli at Left I on the News that the NPS has apparently backed down.

More specifically, the Washington Post said yesterday that
[f]ewer bleachers will be set up this time, meaning more space for standing-room crowds. It has nothing to do with the record crowds expected for the Jan. 20 celebration, although it will allow more people to attend the parade for free.

The changes stem from a lawsuit filed by war protesters, who said they were unfairly swept aside during President Bush's parade. In 2005, so many bleachers were set up, for people who bought tickets, that space was extremely tight for those who wanted to stand curbside and watch for free.

There will be 8,700 reserved bleacher seats this year, compared with 20,000 in 2005, officials said.

Of course, with all the Obama-love going on I'm not sure what kind of protests to expect. The ANSWER Coalition, on whose behalf the suit was originally filed, says it will be there, focusing on foreclosures, but considering the people who during the last presidential go-round opposed plans for demonstrations at the GOPper convention for fear they'd be labeled "disrespectful," wondered in all seriousness why anyone other than "the usual loonies" and "the crazies" would protest at the Dimcrat convention, and decried the idea of anti-Bush demonstrations at his inaugural in 2005 on the grounds "it would make us look like sore losers," I can't imagine any groundswell of support now even for some action that said "Congratulations, President Obama and we're really on your side but we hope you'll do these things."

But ultimately, that wasn't the point, at least as far as I was concerned. The point was to keep the inauguration of a new president from being turned into a private event available only to the well-heeled and well-connected. And for the moment, it hasn't been.
