Playing the Indian Card

Monday, January 29, 2007

Social Science

Sailing down the Mekong in a very slow boat, I met a young Australian. Cheerful and idealistic as young Australians are inclined to be, and also probably a little drunk, he told me about his life so far. He was having trouble settling into a career. Twice, he had tried to qualify to become a teacher, in two different fields, but he couldn’t take it. “The corruption was too much for me,” he explained. “I mean the whole system was too corrupt.”

He did not elaborate. I presume that to his mind what he meant was obvious enough that it did not need saying.

Nor did I dare ask him. If I claimed I did not understand, I feared, I would appear to him a part of that corruption. I am, after all, a teacher of sorts myself, though at a different level and in a different country.

But I guess I do know what he means.

It is not that teachers are individually corrupt. The issue is, as he says, systemic—the good people shake their heads and try their best to work around it. It is as Vaclav Havel said of the communists at the fall of the Berlin Wall: it is not fair to blame or punish the communist functionaries for what communism has done. Most communist functionaries I have met, just as most teachers I have met, are honourable people trying to do their best. In a bad system, everyone must make unpleasant choices.

The problem, rather, is that teaching, which is properly a vocation, indeed one of the charisms of the Holy Spirit, decided instead a generation or so ago to become a “profession,” making the same mistake journalism made at about the same time.

Ever since, note, both teaching and journalism have been in moral decline. A vocation is something you do out of a sense of mission and a love of your fellow man. A profession is something you do, ultimately, out of self-interest. A society whose teachers and journalists are out for their own self-interest is a society in trouble.

Both teaching and journalism made the further mistake of basing their certification process on the social sciences. This may not have been an error, exactly; it may have been done cynically. But the result is the same. The social sciences simply do not work. They have produced essentially no new knowledge since their creation by Marx more than a century and a half ago. Worse, they can be directly credited with most of the worst horrors of that same period: Nazism, social Darwinism, eugenics, Freudian psychology and the obsession with sex it produced, behaviourism, Marxism, Stalinism, feminism, the Khmer Rouge, the ennui of the modern suburb, the ugliness of the international style, and on and on.

If the social sciences worked, here is the evidence we would expect to see: the prisons would be emptying over time. Mental illness would be declining over time. Divorces would be declining over time. The general avowed level of human happiness would be rising over time.

Instead, we see the reverse: the prison populations are growing, the rate of mental illness is expanding geometrically, divorce is leveling off after a period of great growth, and the suicide rate is going up.

The evidence seems clear. Not only does social science not work. It does harm.

At any given moment, mind, there is one theory somewhere that many are prepared to herald as the “first finding of social science that is neither trivial nor wrong.” The idea of the unconscious; of repressed memories; of language universals; of comparative advantage. Note that all of these but the last have now been generally disproved. I still feel the last may prove correct—but as a product of logical inference from first principles, not of social science.

It is time to throw in the towel.

For, I submit, social science never will work. It is based on a logical error as fundamental as the classic one about “how many angels can dance on the head of a pin.” Indeed, it is really the same error.

“Social science” is the notion that we can take the tools of empirical science, designed to study the natural world, and apply them to human beings, or more precisely, human minds. This is no more reasonable on the face of it than supposing that we can drill a hole with a hammer. This is not what the tool is for. But it is also wrong on several other specific grounds:

First, it is fundamentally immoral to conduct controlled experiments on other human beings. This makes all social science morally objectionable.

Second, it is fundamentally impossible to conduct controlled experiments on other human beings. They are by nature too complex to ever reasonably ensure that you have controlled for all other variables.

Third, you face an insurmountable observer paradox: science itself, let alone the individual scientific paper or experiment, is a much less complex system than the human mind. This is necessarily so, since it is a product of the human mind. Therefore, it is wildly unlikely if not exactly impossible that the one could comprehend the other. For analogy, it is as if someone set out to swallow himself whole using chopsticks. Or it is as if someone carved or cast for himself a statue out of some precious metal, and then set it up on a pedestal and worshipped it as his creator.

Fourth, social science obliges one to think of other human beings not as equals, but as objects. This violates the Golden Rule and categorical imperative, something which Kant has shown is not just immoral, but logically incoherent. (For teaching, it also makes for appalling training for what should be a vocation to one’s fellow man.)

Fifth, empirical experiments are based on sense observations: rather than applying universal rules, one is relying on what can be directly observed. Yet the human soul or mind is in the realm of the unobservable, in sensory terms. It cannot be seen, smelled, tasted, heard, or touched. By its nature, it cannot be directly studied by this method.

Sixth, scientific procedure requires objectivity—that is, one observes not oneself as a conscious entity, but objects. This is exactly upside down for the human mind, because our own mind is the only one we can actually directly observe in any sense. We know the minds of others, with the possible exception of God, only indirectly.

All social science should be eradicated by any enlightened society. It is by its nature opposed to human equality, and so a danger to any society founded on the doctrine of human equality or the rights of man. It is by its nature anti-democratic, and so a danger to democracy. It is also necessarily anti-religious, and so a danger to any society’s spiritual and moral health, as well as to the spiritual and moral health of its individual members. These features may be more or less concealed or denied by individual social science movements; but in the long run they are always so. The logic demands it sooner or later.

Friday, January 26, 2007

Letter in National Post

A letter of mine is in today's National Post.

Beyond Words

There seem to be technical difficulties between YouTube and Blogger these days. Odd, since both are now owned by Google.

So you may have to click over on your own to YouTube and search for "Traditional Catholic Monks of Papa Stronsay, Scotland".

Obama Passes

It looks as though, as predicted here, the Barack Obama bubble is already bursting. Polls and Al Sharpton suggest African-Americans have noticed he is not, in any meaningful sense, African-American. He's just a white guy with black skin. African-American voter support breaks heavily in favour of Hillary Clinton. Clinton also doubles Obama's support in polls of the Democratic faithful.

But I still don't think the eventual nominee will be Clinton. She has more negatives than a Ford Motor Company balance sheet.

Thursday, January 25, 2007

The Village Atheist Writes

“Every religious sect, without empirical proof, believes its view is superior to everyone else’s.” With that first sentence, Jeff Harmsen employs a fudge factor big enough to invalidate the rest of his letter in today’s National Post (“In praise of atheism,” January 25). There are proofs other than empirical proofs. Indeed, as philosophical proofs go, empirical ones are generally less than compelling.

Most religious groups worthy of the name can indeed marshal logical reasons why their beliefs are correct. This is called theology.

Harmsen goes on:

“In essence, the underlying message is that, if we don’t believe as they do, we are less of a person than they are. Through atheistic humanism, on the other hand, we are all truly equal, because, instead of divisive creeds, we simply believe in each other.”

In essence, the underlying message is that, if we don’t believe as he does, and are not atheists, we are less of a person than he is.

“What if our ideology were based on a concept such as universal empathy, whereby mere birth warranted the unadulterated respect of all other fellow human beings? The consideration we show will be reciprocated. How could it not be?”

He might ask a Christian. They’ve been trying this little experiment for two thousand years.

“By seeing each other, not as Jews or Christians or Muslims, but as fellow human beings, we can obsolesce [sic] war and terrorism and maybe, just maybe, live in an unprecedented era of enlightenment.”

Not quite unprecedented: “There is neither Jew nor Greek, slave nor free, male nor female, for you are all one in Christ Jesus.” Galatians 3:28.

On Snow White

The traditional academic approach to fairy tales (Märchen) is to see them as undigested bits of paganism or expressions of Freudian passions suppressed by Christianity.

This is silly; the people who have told them have been Christians for many generations. It is a classist assumption that the average person is too stupid to realize a conflict with their own religion. And it is even worse to assume, as Freudians and Jungians do, that the average person is actually unconscious.

On seeing Disney’s Snow White—actually for the first time, though of course I already knew the story—nothing could be more obvious than the specifically Christian overtones of the tale. It is surely perverse to overlook the similarity between the apple Snow White eats, against the warnings of the dwarfs, and the apple Eve eats in Eden. In both cases, the effect is the same, too. “On that day, you will surely die.” A spiritual sleep, a spiritual death, until awakened by love.

Snow White is recognizable as the human soul, always feminine (Greek Psyche), and her story is the epic of human salvation. We begin innocent, like Snow White, but eventually we all eat the apple. Original sin.

The Wicked Queen is Lucifer. Like Lucifer, she is beautiful—the most beautiful of the angels—and like Lucifer her chief sin is pride. In the Muslim conception, Satan falls when he refuses God’s command to bow before man—envy of humanity is implied here. And note that, as here, in many Medieval and Renaissance depictions, the serpent in the garden is female. She is female again in Gibson’s Passion of the Christ. She is our step-parent and our ruler—Satan is “the Lord of this World.”

The Handsome Prince who awakens her with his love is, of course, the Christ; and he takes her in the end to live happily ever after in his father’s house, which has many mansions.

Some day our prince will come—indeed.

The seven dwarfs? They are the saints, the believing church, the faithful, who intercede for and protect the soul pending the second coming of the Christ. They know where the true gold is, buried deep in this world. They are small of stature in this world, because that is how true Christians are supposed to be—the “little ones” of the Beatitudes, the anawim. In the world, but not of it.

It is indeed also possible to find nature references in the story. This pleases those who want to see it all as a pagan survival, because paganism is supposed to be about nature worship. Paganism is in fact not at all about nature worship; but never mind. One can see Snow White as the earth itself, barren in a deathlike sleep over the winter. The Handsome Prince is the vegetation God, like Adonis or Dionysus, who disappears each fall and reappears each spring.

The seven dwarfs could be interpreted as the seven stars of the Big Dipper, who remain loyally in the sky, overlooking the prostrate earth, throughout the fallow period, while many other stars fall below the horizon. They represent faithfulness, because they are always in the sky, and because they always point to the north pole, the centre of the sky’s motion, the centre of all things. One can even see them as miners, digging the other stars up from the earth, where they disappear for part of the year, like diamonds from a mine.

Continuing the celestial imagery, the bright red apple is the sun, which, devoured by darkness, causes winter to come. The Queen’s magic mirror that sees all upon the earth is perhaps the moon, which sees in dark places with reflected light. And the Wicked Queen, possessing both these things, as if one in either hand, is time, which is measured by the sun and moon, and which gives birth to and devours all things—like a mother who is also homicidal.

But note that this explanation works rather less well than the Christian one. For example, it makes the claim that Snow White lives “happily ever after” literally false. It makes no sense that the earth should retire, as Snow White does, to the house of the seven stars/dwarfs—i.e., the sky—for the winter: it doesn’t. Nor does the Big Dipper point to the earth; can Snow White represent both the changing earth and the permanent North Star? While digging diamonds sounds like a reference to the stars in the night sky, this is Disney’s addition: in the Grimm story, they dig gold. And so on.

It is necessary, therefore, I think, to see the Christian signification as the primary one, and these interesting allusions to nature as secondary.

Why are they there? It is possible that there is an earlier pagan story underlying the story we know, which has been altered to make it Christian. But there is, I think, a more plausible alternative explanation. I would argue that all art, and certainly all mythoi, seek to uncover the logos, which is to say, the basic structure or order or plan underlying all things. Think of Hesse’s Glass Bead Game, which I take to be his grand image of art. To the extent that it does so, makes sense of a variety of diverse elements of human experience, can demonstrate the same pattern underlying disparate things, it works as art. And it also works as cosmology, mythos, making sense of human experience and showing God through his plan of creation.

So, while the story’s main intent is to show the basic pattern of human life—which is, after all, of greater intrinsic importance to any of us than the mere cycles of nature—it succeeds better as art if it can also make this a coherent and compelling narrative, plus an accurate description of the cycles of nature, plus an accurate description of the organization of human society, plus a completely logical process, plus referring in correct detail to the movements of the night sky that represent time, and so on and on.

It is therefore a measure of the success of Snow White as art and mythos that it works so well at once as a story, as a schema of human salvation, and as a description of the cycles of nature and of the sky.

But there is nothing pagan about that. The logos itself is an essential Christian concept.

It is also, incidentally, what science is all about—that other quintessentially Christian activity.

I only wish more recent Disney products had some tiny fraction of the artistic depth of Snow White. But genius is genius, and it is rare.

Tuesday, January 23, 2007

A Party of Firsts

So Bill Richardson has just announced for the 2008 Democratic nomination, in hopes of becoming the “first Hispanic president.” He joins frontrunners Barack Obama, “first African-American president,” and Hillary Clinton, “first woman president.”

You’d think the Democrats were vying for a place in the Guinness Book of World Records, rather than for the presidency itself.

But actually, each one’s claim is flawed; Guinness might have reason to refuse them the title even if elected. Richardson is not a particularly Hispanic name. His father, William Blaney Richardson, was the scion of a prominent Boston family. His mother and grandmother are Hispanic.

Barack Obama’s name sounds pretty African American; but his father, from whom he inherited it, left the family when Obama was two. His mother was from Wichita, Kansas, and he was raised by his maternal grandparents. His upbringing, in other words, must have been more or less identical to that of a white child. If he is “African-American,” it is purely on the basis of having the blood of a father who was racially African.

But if that standard holds, Richardson must be considered white, from his father’s blood. Conversely, if Richardson is to be considered Hispanic, then Obama must be considered white, following his mother.

Hillary Clinton, by contrast, is fairly obviously a woman. On the other hand, she fairly obviously arrives at her present political prominence through her husband. Any feminist ought to cringe at the thought that one should attain high office through one’s spouse; this is simply the position of “First Ladies,” and wives, throughout history. This is what feminism rose against. George Wallace’s wife also followed him as Alabama governor; Juan Peron’s wife followed him as Argentine president. Were these “progressive” regimes? Were these breakthroughs for women?

All this illustrates that the Democrats are, in the pure sense of the word, still deeply racist, as they have been for much of their history. They see people as members of a particular race even when it is perfectly arbitrary to do so. They refuse to see people as unique individuals even when it is almost necessary to do so, as with Richardson and Obama. And they can be appealed to and counted on to vote on the basis of sex, class or race.

Monday, January 22, 2007

Reliving My Lost Youth

Having been in Vietnam recently, I have also been reading, as I mentioned here, A Popular History of the Vietnam War, bought on the streets of Ho Chi Minh City, written by an avowed Marxist, Jonathan Neale. Coming, as it does, from a Marxist, comments such as the following have added credibility:

- The Viet Minh was not a bunch of peasants or proletarians. “Its members were mostly, like Ho Chi Minh, the educated sons and daughters of landlords and government bureaucrats.” (p. 18). “Vo Nguyen Giap, for instance, was …the son of a mandarin.” (p.20). “In North Vietnam, … party bureaucrats, not workers, ran the state.” (p. 109).

- The Viet Minh was, indeed, popular among the peasants. Beyond the traditional deference to social superiors, in particular to those with a better education, this was because of their promise to take the land from the landlords, and give it to the peasants. Not hard to understand the appeal.

- They were not popular, though, among the proletariat. So much for Marxism properly speaking. The leader of the Communist underground in Saigon during the Tet offensive later admitted publicly that the organization of the workers was “worse than bad.” (p. 107) “The Communists did better at organizing the managers… [who were] the same sort of people with the same sort of education.” (p. 109).

This seems true everywhere. Communism does worst in terms of gaining power in industrialized countries, and best in feudal countries. This is the opposite of what Marx predicts.


- “[Conventional] wisdom says that anti-Communism was invented and led by Senator Joe McCarthy, when in fact it was organized by President Harry Truman and FBI Director J. Edgar Hoover…. It was in fact started by Democrats.” (p. 50). Neale cites Hubert Humphrey and Adlai Stevenson as especially cold Cold Warriors. He does not mention the young Robert Kennedy, but might have. He claims that McCarthy was scapegoated, and that the scapegoating came once the liberals had become “ashamed of what they had done” (p. 57).

Maybe; at a minimum, "liberals" do seem to have changed their "principles" faster than principled people are usually inclined to.


- “Those who believe that Kennedy had said America should not be involved in Vietnam need to explain both why he sent troops there and why all his senior advisors except Ball wanted to send more troops.” – p. 67.


- Neale points out that the domino theory was perfectly reasonable in historical terms. “There are many examples of how the domino theory has worked in great social movements.” The Russian Revolution led to copycat revolutions in Hungary, Germany, China, Korea, Vietnam, and Yugoslavia. Fascism’s victory in Italy led to copycat governments in Germany, Austria, Hungary, and Spain. The defeat of Portuguese colonialism in Angola and Mozambique meant the days of white-ruled Rhodesia and South Africa were numbered. Dominoes all.

- “Overall blacks accounted for 12.6 percent of combat deaths in Vietnam, roughly equivalent to their share of the [US] population” (p. 129). It was not, as often claimed, a case of black people made to fight a white man’s war, or of blacks being sent disproportionately into the line of fire.


- “Until 1970, blacks, low-income families, and the over-sixties were the only sections of the population in which greater numbers favoured withdrawal rather than escalation.” – p. 131. The higher the education level, the more likely people were to support the war.

It was actually the rednecks and the old fogies who were against the war. The young and the college educated were for it.

Friday, January 19, 2007

Science Says You're Wrong, Mr. Einstein!

In the field of comparative religion, they—we—talk about a religion called “Scientism.” Without being recognized as such, it is the dominant religion of our day. It is the belief that “science” is a body of certain knowledge, and that it can potentially and will eventually explain all things: life, the universe, and everything.

Both assumptions are nonsense, but they are the bedrock beliefs of the general public. Back in the intensely scientistic early Sixties, I used to read DC comics. Almost every one had, as a public service, a page called “Science says you’re wrong if you believe that…” I wish I could still find a copy of one of those pages; but I recall one tidbit: “Science says you’re wrong if you believe different races differ in their abilities.” That’s Scientism.

DC’s basic idea seems to have been that it was an obvious public service to propagate “science”; that science was a set body of certainties; and that science and public morality were somehow the same thing. But real science is a method, and if it is working, today’s conclusions may be overturned tomorrow. Indeed, even today, science is never going to speak with one voice. “Science” says nothing at all. Scientists do, and they will always disagree.

Here’s a scientific article disputing some of our common assumptions about AIDS. It argues that there is no real evidence of an epidemic of HIV infection in Africa, that AIDS in Africa is not clearly caused by HIV, and that there is no real evidence that HIV can even be transmitted heterosexually.

Is it “unscientific”? Hardly. This is what real science is supposed to do: challenge established assumptions. Science says you’re wrong if you believe science says anything is finally wrong.

Similarly, some “expert” recently proposed decertifying any TV weatherman who refused to accept the doctrine of “global warming.” This is Scientism at its worst: “science” as a body of set beliefs one must subscribe to, or else one is a heretic.

Here’s a brief rebuttal from an ABC weatherman.

In the end, for all the harm it does, Scientism does more harm to science than to anything else.

Thursday, January 18, 2007

Mallick Aforethought

Heather Mallick writes a column for the CBC about what she calls a “surreal police overreaction.”

I see it a little differently. I think Mallick is being shockingly classist. She apparently holds it an unquestionable truth that laws should not be enforced against university professors.

One is apparently only supposed to arrest members of the lower class; certainly not professors, even for the very same act. God forbid that a member of the upper class should have to suffer a ride in a “fetid paddywagon” (sic). A policeman ought not to question such a man, and ought to accept questioning of his own authority, or being called a “hominid,” with good grace, because, after all, a professor socially outranks any mere policeman. The policeman should have realized this immediately when the suspect had “an accent that resembles that of the Queen.” The stupid peasant. The “thug,” to use Mallick’s word.

Mallick is pleased to report, however, that at least the Atlanta police chief was reprimanded over the incident. And, of course, the professor was released by a judge without charge the instant, as Mallick recounts it, the latter heard the Queenly accent: once he had said “three or four words.” This, of course, to Mallick, is plain justice. God forbid such a thing as justice should be blind. God forbid that we should all receive equal protection of law, no matter how refined we might be.

For that "fetid" ride, though, and for spending eight hours in a police station, Mallick declares the poor long-suffering professor a “martyr.”

Better yet, Mallick reveals, it all explains American actions in Somalia.

Yep. They’re probably nefariously trying to spread democracy and equality before the law even there.

Tuesday, January 16, 2007

DiNovo Answers Coyne on the Minimum Wage

Cheri DiNovo, MPP for Parkdale High Park and author of the “Living Wage” Bill, has written to the National Post to “refute” Andrew Coyne’s column on the minimum wage with “the facts.” Coyne has rather gallantly declined to rebut it, and actually says she has “torn it to shreds.”

But what she has actually done is provide us what could be a textbook example of almost the full range of logical fallacies.

Here are her stated reasons for supporting a rise in the minimum wage to $10 per hour—all her stated reasons in turn.

1. “Andrew Coyne is being flippant.”

Ad hominem.

2. “because that is what Campaign 2000 — the campaign to end child poverty by 2000, signed on to by all the major political parties — has asked for.” (And, she adds, Ontario Social Development Council, the Interfaith Social Assistance Reform, social planning councils, children’s aid societies, unions, immigrant associations and others.)

Appeal to authority. It does not matter who has said a thing—what matters are their reasons for saying it. Is Campaign 2000 justified in this request? We do not know.

3. “I also propose an end to the claw back of the National Child Supplement provincially, new housing, and support for small business in equalizing property tax. I would also support the sort of actions Ireland has undertaken that have brought its poverty rate down to a third of ours.”

Red herrings. These are separate questions, and have no bearing on the wisdom of the minimum wage law.

4. “…one would think Ireland was sabotaging its economy by raising the minimum wage to over $11 per hour. But as everyone knows, it is one of Europe’s great success stories.”

This is the fallacy of the hasty generalization: a case cannot be made from only one example, even if it is Ireland. There are too many other possible factors involved.

Oh yes, and “everyone knows”? Ad populum.

5. “Ten dollars an hour will allow someone who works 40 hours a week to actually pay the rent and feed their children without using a food bank.”

Begging the question. DiNovo is simply ignoring Coyne’s prior rebuttal, that almost nobody works full time for the minimum wage—and that ten dollars an hour is no more clearly a “living wage” in any objective sense than $8 or $12.

6. “Giving a single mother a chance to stop receiving social assistance and return to work without losing money is another good justification for a $10 minimum wage.”

Begging the question again. Coyne has already answered this point, and DiNovo ignores his answer: “When we think about it, it’s not a minimum wage we’re really aiming for: it’s a minimum income. If so, then the proper approach is to supplement the incomes of the working poor, through the tax-and-transfer system -- not fix their wages and hope for the best.”

7. “It’s good for the economy to raise wages, because it lessens the social service cost which costs all of us, and gives families money to spend.”

Begging the question. This ignores Coyne’s main point, that raising the minimum wage causes unemployment. This would presumably raise social service costs and give families less money to spend.


8. “Raising wages always costs money to business.”

Imputing motive. A form of ad hominem. Motive is irrelevant; the actual points made must be addressed.

But it is also begging the question, because Coyne has already pointed out that this is not so: “employers can always sidestep any attempt to impose a ‘just’ wage simply by hiring fewer workers.”

9. “That is why slavery was supported.”

False dilemma, aka the fallacy of the false alternative. It is entirely possible to reject a minimum wage without supporting slavery.

10. “But the issue of paying people enough to live on is about ethics, not just economics. Poverty costs us,…”

Straw man: Coyne is not advocating poverty. His argument is that the minimum wage causes it.

11. “Approximately 200,000 earn minimum wage. Again, if you are one of them, your economic circumstances would certainly change for the better under a $10 minimum wage.”

Begging the question. Coyne has already argued that your circumstances in this case could indeed change for the worse—you could lose your job.

12. “By his [Coyne’s] reasoning, we should all be unemployed by now.”

Fallacy of the false alternative. There are indeed other possibilities somewhere in between having no effect on employment, and throwing everyone out of work.

13. “I have heard the cry from women who work long hours and want for everything. The cry is for the well-being of their children. I will listen. I will respond.”

Appeal to pity. Aka tear-jerking. The outrageous implication is that we should agree with her because she claims to be more emotionally committed to her position than is Coyne to his.

The title of the piece, “Andrew Coyne, have you no heart?” is also of course imputing motive. But titles are from editors, not writers.

But note that, when you discount all the logical fallacies, there is absolutely nothing left of her argument.

Monday, January 15, 2007

More on Vietnam and Face

“The day before the coup the French were the respected masters of the country; the day after it they were uninvited guests with the worst of reputations.” – Paul Mus fleeing Hanoi in 1945, as quoted in Frances Fitzgerald, Fire in the Lake.

This is the effect of losing face. The French had lost Vietnam to the Japanese, and had not returned until after the war was won. Bah! Frauds!

The Bao Dai and Ngo Dinh Diem governments that followed never acquired any face to begin with, because they did not win office by vote or force of arms or the inspirational power of their ideas or their clever organization—in other words, by proving themselves more capable than anyone else. It was handed to them by the French. As Fitzgerald notes, at the village level, they were the same government as under the French—that is, the average Vietnamese dealt with all the same guys as before. In 1959 one third of active civil servants were still holdovers from the French regime. This is most significant: when the regime loses face, all members of the regime lose face. Confucian honour requires all officials to resign when a dynasty falls, and not to serve with a successor dynasty. For an official to continue in office is a further loss of face.

Generally the capital is even moved. Yet Saigon’s ministries remained in the same buildings, with the same people sitting behind the same desks.

This was obviously not going to work. Both Bao Dai and Ngo Dinh Diem must have known it. It was of course the right thing to do from the Western perspective, capitalizing on the supposed continuity with the previous government, appearing as its “legal successor”; but everything worked just the other way in a Confucian society. Bao Dai or Ngo Dinh Diem could do nothing. They had no organization or personal following. They could not have pulled off firing everyone; they had no one to replace them with. In the meantime, they would have alienated their French or American masters, and being supported by the foreigners was the sole source of their power.

Bao Dai had some claim to the imperial line. Accordingly, he voluntarily abdicated. He knew he could not hope to rule; but being associated with the project would lose face, and he had some face to lose. Having no reputation to begin with, Diem was prepared to put on a show for the foreigners, but his own understanding that it was never going to work doubtless inspired the overwhelming corruption that marked his and the later military regimes. Knowing it was all a sham and must end, the thing to do was to squirrel away as much as possible for yourself and your family in foreign bank accounts before the Potemkin Village collapsed.

And so on for every other official down the line.

Accordingly, the US attempt to establish a strong anti-communist government in the South was doomed. Each general who took the reins in turn knew he had no legitimacy other than as a US protégé, and his only object was to acquire as much loot as possible before this became apparent.

Was there any way out for the Americans? It looks as if there was one. There were three non-communist organizations that might have been capable of taking power and holding it in the Confucian system—that is, three organizations that held some social respect and organizational structure: the Buddhist orders, the Catholic Church, and the Cao Dai. Unfortunately, all were invisible to the Americans, because all were religious organizations.

The Buddhists actually made a bid for power, and probably could have won a popular election, had the US allowed it. Better yet, a coalition of the three religions might have been arranged; this would actually have fit very well with Vietnamese traditions, and when the Buddhists rose against Diem they had explicit Catholic support. And, as religious organizations, they all had every reason to be opposed to Communism.

Sadly, just as with the Shiites in Iraq today, the Americans would not allow it.

They may have feared Communism. But they feared religion even more.

Sunday, January 14, 2007

More on Mormon Presidents

A full page in yesterday’s National Post (“Words from the Prophet,” p. A19) is devoted to the burning issue of whether it is safe for the US to elect a Mormon President—more evidence of the growing religious intolerance in North America. With the smirking subhead “Is America Ready for a Mormon President?” As if the issue were simply the timing. The piece, by Damon Linker, reveals that Mormonism differs from both Protestantism and Catholicism in holding that “revelation,” prophecy, continues even today. This is scary, he suggests, because it means a President Romney would be taking orders from the current prophet in Salt Lake City, as opposed to published writings or an established body of belief.

Shades of the old rumours that, if Al Smith won the Presidency, the Pope would move the Vatican to New York City.

This is wrong; there is no real difference between Mormonism and mainstream Christianity on this. Both Catholicism and Protestantism also believe that revelations continue today. Prophecy is one of the gifts of the Holy Spirit cited in the New Testament. The many Catholic apparitions of Mary often offer private revelations. Among Protestants, Pat Robertson has just released his annual list of prophecies for the new year.

And any Catholic or Protestant president sincere in his faith would also be taking advice from a spiritual advisor of some sort.

The author of the Post article is trying to make Mormonism sound stranger and less mainstream than it is: appealing to prejudice, in effect.

As for me, I would far rather elect somebody to office who admitted a moral check on his actions than someone who did not.

The piece raises the question, what if Prophet Hinckley in Utah ordered Romney to murder someone? Would he go and do it?

But the question is silly: “Thou shalt not kill” is in the Ten Commandments for Mormons just as it is for Catholics or Protestants. Moreover, when someone embraces a religious teaching, he does not surrender his reason, or his conscience, or his sense of right and wrong. These things are absolutes, and are not subject to personal whim. They cannot be changed. A religious teaching informs and educates our conscience and reason; it does not go against them, or we simply, if we are not either madmen or evil, change our religion.

Only the non-religious suppose these things, conscience and reason, can be altered at will or on command. That is the doctrine of relativists, and religions are, per se, absolutist.

If, therefore, Prophet Hinckley ordered something obviously immoral, he would simply be revealing himself, to Romney just as much as to the rest of us, as a false prophet. “By their fruits you shall know them.” Just as, if the Pope in Rome came out advocating something obviously immoral ex cathedra, he would simply be demonstrating to me as a Catholic that he is not legitimately pope, or not really infallible.

At this point, if I were an American and eligible to vote in their primaries, I think I’d support Romney purely as a matter of principle, to fight anti-religious prejudice.

Thursday, January 11, 2007

Vietnam, 1968: Happy New Year!

Could the war in Vietnam really have been won? It is a common belief among American conservatives that it could have, had it been fought with no holds barred—and had the troops not been betrayed by politics back home.

I resist this argument. For one thing, it sounds too much like the “stabbed in the back” myth popular in Germany after the First World War. It is too easy for people who have lost to convince themselves they were cheated.

It is indeed clear from the records of both sides that, in fact, the Tet offensive in 1968 was a huge defeat for the Communists. It was actually their last-gasp attempt to seize power. “The Front had been fighting for a long time. Its generals were not sure they could continue. One big push, they told their people.” (Neale, A People’s History of the Vietnam War, p. 106).

Their desperate hope was that, given the opportunity, the cities would rise and support them. It did not happen. Their organization in the South was virtually wiped out in the operation.

Worse, according to Communist operatives, they lost a great deal of their popular support. They had been levying huge taxes on the promise that they would overthrow the government. They failed, and their wells of support were accordingly drying up.

In the event, though, the offensive convinced the American public that they wanted out.

Could the outcome have been different, then, had the Americans chosen a different course? General Westmoreland in fact saw a great opportunity at the time, and wanted to administer the coup de grace. He wanted two hundred thousand more troops, which he would use to invade Cambodia and Laos to cut off the Ho Chi Minh Trail, and for an “Inchon-style” landing behind the front lines in North Vietnam.

The possibilities are intriguing. But personally, I don’t think it would have made any difference in the end. There were, reportedly, close to 170,000 Chinese troops in North Vietnam. Presumably, they were there as the proverbial “trip-wire”: if the US invaded the North, China was saying, it would come in.

And the domestic American situation really does need to be considered. As Johnson, the most skilled of politicians, understood, the American public would not have accepted this. The American armed forces might even have mutinied. When Nixon eventually did go into Cambodia, there were serious consequences on the home front and in terms of US prestige abroad.

Meantime, the Communists had the perfect political platform to retain the support of the countryside. They simply promised that, if in power, they would take the land from the big landowners, and dole it out to the peasants. Since there were a lot more poor peasants than rich landowners, this won a lot of support.

Of course, it was a cruel lie. Once in power, they didn’t really do that. They took the land from the rich landowners, and gave it to themselves, as the government. The peasants continued to work someone else’s land for next to nothing. The party officials who benefited were even from the same class, as Neale points out, as the rich landowners.

But this proposition had the double advantage of being instantly understandable by the peasants, and obviously to their benefit. Abstractions like the right to vote or a free market were much less clear and much less clearly attractive.

This put the US in an untenable situation morally and ideologically from the start. They were supposed to be there for democracy. But the fact was that, had free elections been held, the Communists would have won.

Perhaps the one way the US might have won the war, once they had passed up on the Buddhist option, was by taking the money they were investing in war materiel, and using it instead to obviously improve the life of the peasants. It could itself have bought the land from the large landowners for a fair price, for example, and doled it out to the peasants working it. And the US could have offered tax breaks to American firms setting up industries in the cities, factories offering good wages. This could even have been a profitable venture.

But then again, an even better, more honest, approach might have been to hold genuinely free and fair elections, watch the Communists win, and declare mission accomplished.

Wednesday, January 10, 2007

Coyne's prose

I just can't resist quoting a bit from Andrew Coyne's piece in today's National Post. Nobody does this sort of relentless logic better:


"The whole point of a minimum wage is that the market wage for some workers — the wage that would just balance the supply of and demand for unskilled, transient, or young workers in highly unstable service industries — is deemed to be too low. If, accordingly, it is fixed by law above the market level, it must be at a point where the supply exceeds the demand. Economists have a technical term for that gap. It’s called “unemployment.”

Advocates of minimum wages either reject that elementary logic, or they don’t care. ..."


A thing of beauty.
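Coyne's logic can be put in numbers. Here is a minimal sketch, with purely hypothetical linear supply and demand curves (the figures are my own illustration, not from Coyne): when a legal minimum is fixed above the market-clearing wage, labour offered exceeds labour demanded, and that gap is the unemployment he describes.

```python
# Hypothetical linear model of an unskilled labour market.
# All numbers are illustrative only.

def demand(wage):
    """Hours of labour firms want to hire at a given wage."""
    return max(0.0, 1000 - 50 * wage)

def supply(wage):
    """Hours of labour workers offer at a given wage."""
    return max(0.0, 100 * wage - 200)

# Market-clearing wage: 1000 - 50w = 100w - 200, so w = 8.
market_wage = 8.0
assert demand(market_wage) == supply(market_wage)  # 600 hours each; no gap

# Now fix a legal minimum above the market level:
minimum_wage = 10.0
excess_supply = supply(minimum_wage) - demand(minimum_wage)
print(excess_supply)  # 800 offered minus 500 demanded = 300.0 hours unemployed
```

The point of the sketch is only the direction of the effect, not its size: any floor above the clearing wage opens some gap in this kind of model, which is exactly the "elementary logic" Coyne invokes.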

Marxism and Racism

I have said before in this space that Marxism’s notion of class and class warfare segues naturally and almost inevitably, as it did in Germany and Italy in the twenties, into racism and race warfare. Once you start the hunt for scapegoats, a foreign scapegoat is the best. And the very term “bourgeois” is telling: it means the people of the town. The people who, unlike peasants and landowners, were not tied down to the land. The people, mostly traders, most likely to have come from elsewhere, and to have contacts elsewhere.

I was aware already that in Vietnam, the hostility to “bourgeois” almost automatically became a hostility to the Chinese minority who tended to run the businesses in the South. Hence the sufferings and exodus of the “boat people.” I was unaware, until pointed out by Neale in his People’s History of the Vietnam War, that the same was true of the “Killing Fields” of Cambodia. Wiping out the “bourgeoisie” there actually began with wiping out the Muslim Cham and the Vietnamese minority in Cambodia.

Tuesday, January 09, 2007

The Philippines: Islands Lost in Time

It is always reassuring to be back in the Philippines. To return to China is to be more than a little nostalgic at how it has changed. To return to Vietnam is to see all the cyclos gone and new scooters everywhere. But to go back to the Philippines is mostly to see everything the same as it was. Just a little seedier.

Why can’t the Philippines develop at the pace everywhere else in East, Southeast, and now even South Asia seems to? They even started ahead—in 1960 they were wealthier than Korea or Taiwan. They also have the obvious advantages of a large English-speaking population, perfect for call centres, and an American-modelled system that should be comfortable for US investors. They even seem to have a good education system, relatively speaking. Why isn’t the investment pouring in?

Indeed, it’s downright embarrassing, for Americans, that their piece of Asia stands out only for standing still.

An American friend argues that it is the Americans’ fault: they left the unfortunate Philippines not with the American system so much as with the New Deal and its tendency to bureaucracy. And bureaucracy in the Philippines is stifling.

But that doesn’t really explain it, to my mind. For one thing, bureaucracy has not stifled America itself in the years since 1945—even though American bureaucracy has actually grown over that same period.

Nor is bureaucracy really likely to be a heavier burden in the Philippines than in pseudo-communist countries like China and Vietnam.

And much bureaucracy is made necessary by corruption. When documents are commonly forged, for example, it is that much more necessary to ask for additional forms of ID. When bribes are commonly paid, it makes sense to have two screenings instead of one.

The Filipinos themselves, to their credit, do not put the blame on their colonial experience. Nor does it seem fair. After all, they have been on their own since 1945. If Germany and Japan could rebuild, in the same period, from the rubble of lost war into world leadership, and if Eastern Europe is already pulling out of the trauma of Soviet dominance that lasted until 1989, there is little solace there for the Philippines.

My American friend also, frankly, blames Filipino laziness. A Thai with some Philippines experience does too. He insists the Philippines will never develop as quickly as Thailand because the Filipinos are just plain lazy by comparison.

It is not politically correct to say such things. But we must, in honesty, admit that there are national characteristics, and that laziness may be one of them. And observing Filipinos at home does leave me with this strong impression. One evening, I had supper in the hotel restaurant. When I wanted to pay the bill, all the waiters had disappeared. When one finally returned, and I offered her a bill requiring change, she would not get it for me. Instead, she waved me to the cashier to get it myself. Despite the fact that I was offering her a tip. (And despite the fact that she delivered my meal with no silverware.)

A couple of days earlier, I ordered an iced café latte in a Starbucks in Manila. When the order came, it was a hot café latte. When I brought it to the server and pointed this out, as politely as I could, she handed me a paper cup full of ice.

Another time, I walked into an empty restaurant for supper. I counted three waitresses and one waiter visible at the counter. It took me twenty minutes to catch someone’s eye and place an order. And that required raising my voice.

In Camiguin, a beautiful volcanic island which has become a tourist haven, our vacation was cut short by lack of funds. It is not that we didn’t have the money in the bank; it is that none of the banks on the island have an ATM hooked up to the international networks, and none will accept travellers’ checks. They apparently just cannot be bothered with the business.

This is not a case of a shortage of work: it is a shortage of people wanting to work. When you go out on errands in the morning, you never know what percentage you will be able to complete; because some shopkeepers will always have decided to take a spontaneous holiday. Government workers abandon their posts for days if there seems to be something better to do.

Mind, Filipinos abroad seem to work quite hard. (And if it is not prejudice to note this, why is it prejudice to note the opposite of those who stay home?)

When I pointed this out to my American friend, he suggested that this was because anyone who really wanted to work left the country.

The left traditionally blames the problem on the lack of land reform. They suppose that, if larger land holdings were seized by the government and doled out to the poor, all would be well.

Why not? It worked in Zimbabwe.

Or not.

It is hard to see how this would help the Philippine economy. In the first place, land is almost irrelevant to wealth in the post-industrial world. An influx of landless labour to the cities is just as likely—or more—to cause a boom in manufacturing or service industries. And in the second place, efficiencies probably argue, in the Philippines as in Canada, for farms to be larger, not smaller. Breaking up bigger farming operations is liable to slow the agricultural sector, so long as it is already in private hands. Indeed, when I asked a local agriculture department rep why the Philippines did not grow rubber (or cinnamon, or tea, or nutmeg…), a hugely profitable crop in Vietnam, she explained that it requires a large plantation, and these were rare in the Philippines. It is a nation of subsistence farmers.

The Filipinos themselves blame corruption. A taxi driver this morning claimed international studies reckon the Philippines the second most corrupt nation on earth.

This would also explain the apparent lack of initiative: corruption saps the drive to do anything, because results are not related to effort expended.

Partly, I suspect that “corruption” is a convenient alibi for many Filipinos. The country is indeed grossly corrupt—the fourteenth most corrupt in the world, according to Transparency International, though not the second most corrupt. The tendency to exaggerate this might suggest a too-great willingness to believe it. If you don’t want to work hard in the first place, convincing yourself that there’s no advantage to doing so anyway helps salve the conscience.

And after all, if this is the problem, and everyone knows it is the problem, how did it happen? After all, the Philippines are a functioning democracy, and have been, except in the Marcos era, for some time. Why can’t the voters fix the problem by voting the rascals out?

Oddly, because the voters themselves seem to vote for corruption. My Filipina wife explains how the candidate who pays most for votes always wins in her home town. Now, I can understand poor people being prepared to promise their vote in exchange for money. But I can’t see why they would then vote the way they said, in the privacy of the booth. In the Philippines, a bought vote stays bought, as a matter of honour.

Filipinos can also, it seems, be counted on to vote for the scion of the local “big family”; it almost goes without saying that the senator from Camiguin province will have the last name Romualdo, while the senator from Agusan del Norte will be named Plaza. Surely, if either the corruption of the current elite or the concentration of land in too few hands were a genuine concern among the people, this would not happen.

Instead, the Philippines seems to be a genuine elective aristocracy. The people consider themselves tied to their feudal lords by mutual bonds: they owe the lord allegiance, and the lord, in turn, is supposed to look after them. That is, these days, to regularly give them money in one form or another. It does not seem to register that it is their own money.

An aristocracy, however, as opposed to a good old-fashioned bourgeois democracy, is a system that promotes idleness, not business. On the one hand, idleness is a trait of the upper classes, and is admired as such. It would be a scandal among these classes, as with the British upper class a century ago, to soil those hands with “trade.” A gentleman does not work for a living.

One sees traces of the same attitude in Filipinos all the time. Nothing is so important to the Filipina as to have pale skin, probably because this implies having avoided any hard outdoor labour. My wife was deeply impressed, when she shook Imelda Marcos’s hand, at the softness of her skin. Though young and fit, my wife also insists, when home, on riding everywhere in a tricycle, where I would walk as a matter of course.

To work hard, accordingly, is rather in bad taste. Better to win a lottery, or bet on a cockfight, or pan for gold, or marry a rich foreigner. At worst, perhaps better even to con, or pick pockets, or walk the streets for trade.

Every Filipino kid knows three words of English: “give me money!”

And, if one gets money, one immediately incurs certain obligations. One has a perceived moral obligation, as do the big families, to hand it around generously to relatives and neighbours—or failing this, to let them take it. This is difficult, if one does not hold a position allowing access to tax money, and acts in turn as a strong disincentive to actually acquiring anything. You’ll only end up losing it, or all your friends.

If one must come up with one word for this attitude, I guess “laziness” might do. But it is really a far more complex set of social expectations. Whatever word you use, it is an obvious impediment to material development.

Given all this, sending foreign aid to the Philippines is probably just a waste of money. So, perhaps, is investing in the Philippines. Accepting Filipino immigrants and guest workers may be something we can do: letting those who want to work do so.

And there may be something to the Filipino system, after all. Those who want to work, as my American friend observed, can leave. Historically, they always have. The Filipinos are seafarers; for centuries they have formed a good proportion of the world’s ships’ crews. Today, they account for about one fifth worldwide.

As long as working abroad is possible, from their point of view, why not preserve this part of the world, so beautiful and so tropical and so easy to live in, as a vacation or shore leave or retirement paradise? Why not keep it as they remember it? Let it develop, and it would only be more expensive to retire to.

Then where would they be?

Monday, January 08, 2007

Happy Feet: Unidentified Flying Objections

Happy Feet offers an interesting message to our children. The new animated feature is visually striking, though, crucially, it cannot convincingly convey its own central image: a penguin tap dancing. But a large part of the theme of the piece seems to be that religion is a bad thing.

There are two obviously religious figures in the movie: an old penguin with a Scottish accent, “Noah the Elder,” who advocates the worship of a divine penguin, “Guin,” and who is the recognized religious leader of the flock; and a fat rockhopper penguin, “Lovelace,” with a bit of plastic lid from a six-pack stuck around his neck as a “sacred talisman,” who leads another group. Noah obviously represents Calvinism: he is mortally opposed, of course, to dancing. He is also opposed to all new ideas, all non-conformity. And, the movie points out, he contradicts himself. Not a sympathetic portrayal of one large branch of Christianity.

Lovelace, equally obviously, represents Pentecostalism. He claims to have the answers to all questions, thanks to his “mystic inspiration”; but demands money before he will pronounce. He is also sexually profligate, and admits later to being a plain fraud.

There is one more religious image—of a small church on a hill. It appears at the end of a journey, and I thought briefly that it might represent some sort of positive reference. But no—the camera’s eye sweeps down the hill, to pollution below. It is there only, I gather, to show hypocrisy, as introduction to and apparent symbol for a race of beings referred to as “the destroyers” (humans, of course). They kill everything that comes near and leave the waters littered with garbage. Presumably if the church were not there this would not be so.

Talk about being clubbed on the head like a baby seal with anti-Christian propaganda.

These harsh caricatures might be acceptable, and lift the film out of the realm of mere hate speech, if the movie offered some valid spiritual, and, properly, recognizably Christian, alternative. But what is its proposed truth, to answer man’s and penguin’s spiritual strivings? To solve the intractable problems of the world?

UFOs. No kidding. UFOs and alien abductions. Okay, and maybe an international ban on fishing. No credulously believing in things like God and the afterlife here; our children are urged instead to believe that our ultimate welfare is in the hands of a superior race of alien beings. And what is our proper response to this revelation? Simple: they will treat us well if we are cute and can amuse them.

I worry. The same day we saw the movie together, I was sitting with my five year old, and talking with him about a Christmas tree. He pointed to the star on top, and I told him it represented the star in the night sky over Bethlehem that guided the wise men to the baby Jesus.

He turned to me and responded, matter-of-factly, “There is no Jesus.”

Where, at his age, is he getting this? Not growing up in a family of practicing Catholics. At kindergarten? On the playground? Or from the media?

Talking, tapdancing penguins he has no trouble with. But Jesus? Obviously piffle.



A brief word about our sponsors…

To be fair, perhaps here I should give my own opinion of the whole UFO-visiting-aliens concept. I have no problem in principle with the idea that aliens from other planets might visit earth. Yet interstellar travel, as I understand it, requires some of the accepted laws of physics to be overcome. Of course, our knowledge is not perfect, and this may be possible. But believing in visiting aliens is therefore obviously unscientific, in a way that believing in God clearly is not.

Second, let’s allow that some other group of physical beings might be sufficiently advanced in material technology to be capable of visiting our planet. Why would they want to? Presumably, if they are that advanced, there is nothing much we can offer them in either material or intellectual terms. And if there were, what would be worth hauling back across those vast interstellar distances?

All I can think of is scientific study, or the idle curiosity we feel visiting a zoo. But if so, such a materially advanced civilization would surely have no need to come here physically. They could accomplish as much with, say, captured light and radio waves; at an extreme, with robot drones.

No, Occam’s Razor suggests that, when they are not swamp gas or weather balloons, UFOs are much more likely to be spiritual or mental than physical entities.

Sunday, January 07, 2007

Coming Soon to a Classroom near You: Lord of the Flies

I was trapped a few years ago in a room full of earnest educators for a talk by a high-paid expert from McGill. He had been flown across the continent to advise all of us in the English Department of that College on how to teach writing. In a full day of talking, he made only one point, or rather, assertion: writing must be taught in groups. All group decisions must be by consensus. No students should have any defined roles or tasks.

What if the students didn’t like it? Anyone who regularly dissented was to be disciplined as a “bully.”

This approach has obviously been sanctioned by social science studies, and officially found to be highly effective—because I keep coming across it. It is something of a social science dogma. This one instance sticks especially in mind because of the obvious weirdness of teaching writing in particular, a profoundly solitary pursuit, as a group activity. Yet as far as one could tell from this expert, this was the whole trick—nothing else need be done by the teacher but to enforce the group model, and good writing would apparently result.

Odd that Shakespeare or Milton never twigged to the idea.

I suspect that a large part of the attraction of the “group work” model in teaching writing—maybe in teaching other things as well—is that it covers up a tricky problem: nobody actually knows how to teach writing, and most writing teachers probably cannot even write particularly well themselves. Concentrate on group work, and you’re off the hook. You can even refuse to answer any student questions germane to the subject.

Of course, the students don’t learn anything.

Sometimes group work is valuable and necessary, and I am the first to advocate that students be taught how to conduct a meeting. But note the bizarre riders, that decisions must be by consensus and nobody should have an assigned task. Group meetings can work if and only if run under proper rules of order. These riders short-circuit any attempt to do that. They seem designed to force the groups into stalemate over procedure and so to avoid tackling the subject itself. The more so since most people find it easier to talk than to write: group discussion seems an excellent excuse to avoid writing altogether.

In all cases, group work is likely to frustrate any particularly good student. It is not just that he or she will have to carry the rest of the group intellectually (it is unfair to expect more work of the bright, but that would not be the big problem). It is that he will have to explain all his ideas and win the rest of the group over to them. In this, he will be fighting against two tendencies: first, resentment against someone who “does all the talking” or gets all of his ideas adopted—against, in sum, anyone smarter than we are; second, the genuine difficulty the less bright have in grasping the ideas of the more bright. For the poor prodigy, it will be like wearing lead shoes while swimming in molasses. He is very likely to be singled out, as well, as a “bully” for being the one the rest of the group resents—or for expressing his resentment of the process.

Any student who wants to work too hard will face many of the same problems: group consensus will fight against him.

So much for excellence.

As for the weakest students, group work gives them useful camouflage. In a group, they can avoid work; group members who give a damn will be forced to carry them. They can pass, even without having to work or learn.

Worse, given the sort of universal group approach currently advocated, they will graduate without anyone knowing this.

Without rules, it is most likely to be the participant with the strongest will—the natural bully, in fact—who wins out. He is prepared to push the hardest; the others, without rules to appeal to, will eventually give up, back off, throw in the towel out of exhaustion. It would all be rather like Lord of the Flies, that fictional paradigm of children living in groups without rules. Ralph, the responsible one, is doomed. Piggy, the bright one, is doomed. Simon, the odd one, is doomed.

Welcome to the brave new world of education.

Lots of fun for Jack and the choir, though.

Saturday, January 06, 2007

The Real World of Democracy

Here’s the riddle: how could the British hold Iraq, under mandate and then informal sway, for many years—from about 1920 to 1958—with nominal forces, yet the Americans now cannot keep reasonable order with two hundred thousand?

Or consider the parallel of Vietnam. How could the French hold all of Indochina from 1885 to 1945 with only 14,000 or so troops, yet the US could not hold it with 500,000?

Are the Americans simply incompetent?

Not militarily; perhaps diplomatically. The answer seems to be that it is much easier to hold a country as a colony than to turn it into a democracy.

And this stands to reason. Making a democracy is not merely a matter of holding elections. Elections are not going to produce a viable government unless there are existing organizations—parties, if you will—with established lines of authority, ready to run in them and then, if they win, step in and take over government in an orderly fashion.

Such organizations may be thick on the ground in established democracies. But in a country that has previously had an autocratic government, this is unlikely to be the case. For the simple reason that, in order to seize and hold power, an autocratic government is likely to systematically suppress all such organizations.

There are only two likely exceptions. Most obvious is the military. By its nature, it has the discipline and command structure to operate as a large unit. This is far more important than its possession of weapons. Autocratic governments are caught in a bind here: if they hobble or suppress the army, they leave themselves vulnerable to threats from without. Accordingly, the army is the most common source of government overthrow.

Unfortunately, in Iraq, the US disbanded the army. Therefore, it could not be used as an alternative government. In Vietnam, similarly, there was no established, professional Vietnamese officer corps when the Americans became involved. Defense had been the business of the French, at least at higher levels.

The second possibility is a religious organization. This is why the mosque, and “Islamism,” is the centre of opposition to Middle Eastern governments generally today: it is not easy for an autocratic government to suppress a majority religion, though many have tried. That religious structure, organization, and system of loyalties can then be adapted as a political tool. This worked very effectively, for example, in casting off repressive governments in Communist Poland, in British India, and among blacks in the US South. It is why the government of China is so concerned about Falun Gong.

Unfortunately, with the Shiites in Iraq, as with the Buddhists in Vietnam, the US finds this an unacceptable option. Indeed, faced with a Buddhist demand for popular elections in Vietnam, the US opposed them—because they feared the Buddhist parties would win. Similarly, Donald Rumsfeld is on record saying that Iraq can have any sort of government it wants—except a religious one.

This perhaps reveals a disturbing and self-destructive anti-religious prejudice on the part of US policymakers—anti-Buddhist in the 1960s, anti-Islam today.

But there is also some reason for concern. Majority rule is always dangerous to minorities. Democracy can, accordingly, be a disaster in terms of human rights. The US tends to think of the two as going together; they do not. Hitler, for example, was democratically elected. To create a democracy that is prepared to respect the rights of the minority is no small accomplishment.

And so, in both Vietnam and Iraq, minorities have had reason to fear democracy: Catholics, Cao Daists, and Montagnards in Vietnam, Catholics, Kurds, and Sunnis in Iraq. Accordingly, the attempt to establish majority rule is quite likely to lead to civil war. It happened in South Vietnam; it happened in Ireland; it happened in India. Heck, it happened in the US. It seems to be happening in Iraq.

And then there’s the very hardest bit: not only do you need organizations outside government and plausible guarantees of respect for minority rights; you also need a gentleman’s agreement that, once in power, you will leave the doors open to the other side to one day overthrow you. And perhaps then hang you. That’s asking a lot, and requires massive trust on all sides. But without it, the first free election is also going to be the last.

Unfortunately, the US may have just blown that one as well, with the hanging of Saddam Hussein.


So: in Iraq, how does one get there from here? Given all the mistakes that have already been made?

The Americans are not likely to want to hear it, but my answer is this: historically, the situation most likely to transition to a working democracy appears to be a monarchy. An established, secure monarch can always run the experiment with the proviso that, should things get out of hand, he can step in and call a new election or appoint a new prime minister or even take back the reins of government for a time. If trusted personally, he can stand as guarantor for any threatened minority, and for any defeated government. This has worked to peacefully produce democracy in the UK, Denmark, Sweden, Norway, the Netherlands, Spain, Thailand, Canada, Australia, New Zealand, India, Malaysia, Belgium; to a lesser extent, a period of monarchy seems to have aided the transition in Japan, Italy, Cambodia, the US, and Greece.

It also, not incidentally, worked passably well for the British in Iraq. Finding the country too hot to handle at first, they brought in Faisal ibn Husayn, made him king through a plebiscite, gave him a constitution including an elected assembly, and retired comfortably to the background for the rest of the mandate.

It worked once, and could still work again.


Failing this, military regimes can also sometimes produce democracy; again with the military able to step in to referee if things go sour. This has worked, if not always smoothly, in Turkey, Argentina, Chile, Korea, Taiwan. In a sense, it worked in the US: George Washington, a military leader, got things on their feet, and eventually handed over to a civilian regime.


Can a religious regime also produce democracy? Yes indeed: religious regimes have done so in Massachusetts, Connecticut, Rhode Island, Utah, Maryland, Maine, New Hampshire, Holland, Switzerland. One was the origin of one of the world’s oldest democracies, San Marino. Again, the religious authority, if generally respected, can serve as a guarantor and a referee in those first few, awkward years of building trust. The same process may be taking place right now in Iran—but again, the US doesn’t like it, because its leaders have come, despite their own history, to fear religion.

And religious regimes have a natural check—the ethics of that religion—against turning into a bloodthirsty tyranny, which military regimes and aristocracies do not.

The specific moral I would derive here is that the US should lose its anti-religious prejudice, and cooperate fully with the Shiite clerics in Iraq. Given the sectarian tensions, it might be well, too, to float the idea of a return of the Hashemite throne of Iraq; though that would at this late date require a rewritten constitution.

The general moral: strong independent organizations outside government are our ultimate guarantee of liberty. Resist any government that seeks to disempower them.

Friday, January 05, 2007

A Child's Right to Work

A recent worldwide survey claims teenagers in the Third World are up to six times happier than those in the developed world.

Now, it is one thing to accept that wealth cannot buy happiness. But this seems to suggest that it actually leads to misery.

But I suspect the real issue here is not wealth at all. It is education.

Do-gooders in the West protest a great deal the use of “child labour” in the undeveloped world. After all, these kids should be in school, right?

Leaving aside the obvious problem that there are often no schools to go to, is school really more fun or fulfilling than work? Did those who think so have the same teachers I did?

Not usually. And, perhaps more to the point, school pays far worse. Teenagers in the Third World are already adults, in control of their own lives, and making their own decisions. That feels pretty good. Teenagers in the First World, full of the vitality of youth, are stuck in endless classes, required to obey teachers’ arbitrary commands, almost always taught things that, even when true or somewhat interesting, will be of little use to them in their later lives. How many jobs really require calculus or fluency in French?

This, of course, would be even truer in the Third World, where the only job on offer is usually agricultural labour.

We enlightened first-worlders nevertheless force our young through years of appalling ennui, poverty, and external control. They are fully equipped for adult life, physically and intellectually, but we will not let them take control of their own lives, marry, and have children. We will not let them earn a living, or do as they wish with their days. Imagine doing this to any distinct segment of the adult population, and the proper term for it would be “slavery.”

And we have the gall to condemn the Third World for not following suit? While at the same time insisting, with perfect inconsistency, that they must make their women work outside the home, but must not permit their children to do so.

Those from the First World who go off to teach in the Third are almost always struck by the same thought: outside the West, there seems to be no such phenomenon as “adolescence.” People just seem to go one day from childhood to adulthood.

Those who well remember adolescence will recognize this as a good thing.

Let’s send the little beggars down the mines where they belong.

Thursday, January 04, 2007

George Romney

A search of Google's news archive finds no mentions of George Romney's Mormonism at the time he announced his candidacy. QED.

Of course, to be fair, we should ask: is it the US public that has grown less tolerant, or the US media?

Mitt Romney

Mitt Romney has launched his exploratory committee. And every story covering this seems to bring up his Mormonism, with the suggestion that Americans may not be ready to elect a Mormon president.

This strikes me as odd. After all, his father also ran for president, back in 1968, and I don't recall his Mormonism being made much of at the time. Has the US grown less religiously tolerant?

I suspect so.

Did anyone make a fuss, at the time, about Dwight Eisenhower having been raised a Jehovah's Witness?

And note that there has never been another Catholic president, after JFK, despite Catholicism being the largest denomination in the US.