New information suggests the Vatican had plans to move to Portugal and elect a new pope had Hitler kidnapped Pius XII.
http://www.telegraph.co.uk/news/newstopics/religion/5195584/Vatican-planned-to-move-to-Portugal-if-Nazis-captured-wartime-Pope.html
Thursday, December 31, 2009
Was Shakespeare Catholic?
Yes, he was. This much is uncontroversial. On the evidence of his plays, most Shakespeare scholars agree that his own religious persuasion was plainly Catholic, whether or not he was open about it.
But now the English College in Rome thinks it may have evidence that Shakespeare studied there:
http://www.telegraph.co.uk/news/worldnews/europe/vaticancityandholysee/6864627/Shakespeare-was-a-secret-Catholic-new-exhibition-shows.html
This would explain, for example, his taste for Italian settings for his plays.
Who knows?
Wednesday, December 30, 2009
Everything I Need to Know I Learned in Kindergarten
or:
Are You Smarter than a Five-Year-Old?
Children today are not always taught the old fairy tales, the old nursery stories. Indeed, there is a systematic campaign in some quarters not to teach them. Too often, they are not politically correct; they have no place in the common Marxist/“cognitive science” agenda to alter the culture and control thought for political purposes.
But mostly, we just don't understand them, and don't get the point. I recall in grad school the professorial assertion that they were products of the human unconscious, mostly of psychological interest; the argument was over whether they were truly “pure,” or “mixed with ego.”
No kidding—apparently they were written by people when they were unconscious. I wonder if Hans Christian Andersen realized he was asleep?
Apparently, too, they made no more sense to the Freudians and Jungians than the contents of a typical dream.
Yet fairy tales and nursery tales are the encapsulated wisdom of mankind. In this, the Marxists are partly right: they are our initial social programming. Cultural literacy is an issue here; but more than that, without our fairy tales, we lack our user's manual for life, and are doomed to repeat all our ancestors' worst mistakes, without benefit of their prior experience.
Which, of course, predictably, is what is commonly happening these days, since we have largely forgotten them.
Because of global warming, for example, as we all know, the world is going to end. A correspondent wrote recently, “we are spitting ourselves out... of existence.” Before that, we were going to destroy the world with overpopulation, or pollution, or ozone holes, or resource depletion; we are still running out of oil, and water. But in fact, even in the worst-case scenario, even if we grant that it is completely proven, there is no real probability that global warming could end human life on the planet. Nor, realistically, could overpopulation. We ought by now to realize that the idea that “the end of the world is nigh” is almost a human instinct, and we ought accordingly to be skeptical of any such assumptions, ever.
But isn't that the moral of the story of Henny Penny? That foolish people are always prone to stampede into such panics, and that this makes them prey to any unscrupulous person ready to exploit this instinct for his own ends? Any child who was paying proper attention during Henny Penny's sad tale should be proof against millennial con games of all sorts—including those used so skillfully by charlatans like Stalin, Mao, and Hitler.
How about the laments of feminists? Many of us have accepted their claims following the simple logic that, if there were not something wrong with their current lot in life, they would not be complaining. This is a common formula throughout the feminist ethic, and the victimhood game more generally: if a woman feels oppressed, or feels threatened, or feels harassed, ipso facto, obviously, she is. Recognized “victim groups” have gotten quite far on the same formula—believing it, to be fair, completely, themselves.
But this simple logic should, in a properly-educated child, be immediately tempered by the story of the Princess and the Pea. It is, instead, precisely those most used to privilege who will complain most loudly of their lot—for they are those least inured to oppression, threat, harassment, or discomfort of any kind. One who has always been a slave—why and when should he dare take it into his head to object?
Accordingly, recent immigrants from South Asia, Africa, or the Arab world complain loudly of the discrimination they have faced, here and at home, and are given affirmative action programs. But these are wealthy members of the upper class in the countries they come from. Did the dirt-poor Irish, Polish, Ukrainians, or Armenians of two or three generations ago complain similarly? Just the reverse—because they really were poor, and really did come from a history of oppression.
Missing this important insight, we tend to systematically increase the privileges of the most privileged, and the oppression of the most oppressed, all the while believing we are doing the opposite.
And how much of the world's current folly could have been avoided if only a five-year-old who knew the story of The Emperor's New Clothes had been consulted? It has been rightly observed that the craziest things of all are those things commonly believed by academics: global warming, Marxism, feminism, psychoanalysis, postmodernism, affirmative action, political correctness, speech codes, queer studies, Fascism in its day. Most are based on a simple logical fallacy that should be apparent to a casual observer within a few minutes. The Emperor's story explains succinctly why this is entirely likely to be so. If it is possible for any intellectual phony or slacker to convincingly present something as “believed by all the best authorities,” then, even if we personally suspect it makes no sense at all, most people who want to appear intelligent and well-educated will pretend to believe it as well, and assert it with that much more energy to avoid the suspicion that they do not really get the point. Anyone who becomes an academic, in turn, considering the hard slog it requires, is probably deeply invested in projecting the idea that they are unusually smart, and deeply insecure about it. The child who truly understands this is protected against most such nonsense, and knows enough to think for himself.
Aesop's Fables, of course, are full of such lessons. The perpetual urge for a big, powerful government with detailed laws and regulations to impose proper order upon us all is well analysed in the fable of King Log and King Stork. The scapegoating of “rich corporate interests,” “rich capitalists,” “rich Jews,” and so forth, and the notion that governments can pay for everything by simply confiscating their wealth, has been a common fallacy, or con job, in Marxism and well beyond. Hitler, Mugabe, Amin, all tried it. Most left-leaning governments base their core policies on it, in milder form. All with eventual results easily predicted by any child who knew the tale of the Goose that Laid the Golden Eggs.
And the apparently shocking, unheard-of notion that scoundrels might take the chasuble of priesthood in order to exploit the vulnerable? It should not so surprise anyone who was once read the story of the Wolf in Sheep's Clothing. They would be far less inclined, as children, to fall for it; nor would they be inclined as adults to suppose it reflected somehow on the truth of the Catholic faith. Any more than the wolf's actions reflected on the true nature of sheep. How cockeyed is that?
We will probably never end the horror of child abuse. But our present efforts, besides being terribly expensive, are probably also making matters worse, by scapegoating fathers, by weakening families, and by handing children over to professional bureaucrats who necessarily have no special feeling for them. Here's a perfectly cost-free measure we all can take: any child with access to nursery stories would at least be partly armoured against emotional abuse, by far the worst aspect of the problem. They would know the dangers of evil “step-parents,” and what they are capable of. They would understand that parents do not always love their children, and that parents can also envy their children. They would learn that the fault was not necessarily theirs, and understand that there is still hope for the future. This is indeed one of the most common lessons of the nursery tales: consider Cinderella, or Snow White, or Rapunzel, or the Ugly Duckling.
Sexual abuse of children, pedophilia, is another hot topic currently. Want to “street-proof” your kids against sexual abuse? What could be better “street-proofing” than the stories of Hansel and Gretel, or Little Red Riding Hood? The nursery tales are far more realistic than modern treatments as to its probable source. Don't even trust grandma—you never know...
Sadly, these lessons are largely or completely lost in modern retellings. In the versions most common nowadays, everything scary or violent has been stripped out, supposedly to protect little ears from any possible hint that all is not wonderful in this world. It isn't. Real wolves are not huggable. In sanitizing the tales, we are in fact setting our children up for all the real horrors the stories only ask them to imagine.
Of course, the politically correct will object that our nursery tales are only about our own culture; they are no longer “appropriate” for a globalized, multicultural world. Perhaps they are cultural imperialism, in a multicultural classroom; perhaps they teach intolerance.
Nothing could be further from the truth: only academics could believe this. Nursery stories delight in that which happened not only “long ago,” but “far, far away.” As a result, they systematically encourage multiculturalism, globalism, and an interest in other cultures. Of all forms of literature, they are the most open to assimilating from other cultures. Few familiar English nursery tales are originally English: Grimm is from Germany, Andersen from Denmark, La Fontaine from France, Aesop from Greece, Uncle Remus from Africa, the 1001 Nights from Arabia, Persia, and India. The Nightingale is transparently from China, and the original Cinderella lived in Korea. Folklorists find near-identical stories told in areas and cultures as widely dispersed as Tajikistan and Tonawanda—among the Iroquois Indians.
Besides being of vital importance to individual children, and to the adults they become, this stratum of nursery wisdom could, if emphasized in our education systems, actually become an important element of international, and indeed global, understanding.
Make sure your kids don't leave home without it.
Sunday, December 27, 2009
Western Civilization in One Volume
According to the theories of E.D. Hirsch, developing reading fluency is largely a matter of “cultural literacy”—that is, knowing the information that a good writer will assume in his audience. This is an important way in which reading differs from listening: a speaker can judge his current audience and their comprehension, and adjust accordingly. An author cannot, and must make assumptions. Cultural allusions, metaphors, and casual references are not in a dictionary: if a student does not catch the reference, he may not even recognize it for an allusion.
Students entering college, therefore, must have the cultural background the authors they read will typically assume, or they are going to struggle with the readings.
Hirsch found this to be a problem for ill-educated native speakers. But it is bound to be doubly a problem for ESL students, coming from a possibly quite different culture. What does this mean but a different set of cultural references and assumptions?
This leads to an interesting, and vitally important, speculation: what are the snippets of information that a foreign student should have, and may not have, in order to be able to read English fluently at the college level? Hirsch has his own ideas, of course, but they are specifically for American students studying in America; and, of course, one is free to differ on what is important.
I'm thinking in particular of ESL students, many of whom can be from a dramatically different culture, from China, Africa, or the Arabian Gulf. They may well need a background, not just in English-speaking culture, but in European civilization generally. What do they need to catch up on?
This will of course differ widely from country to country. The precise mix could best be determined by each individual institution, or even teacher, through a standard questionnaire testing for knowledge of each element of this set of basic materials.
Of course, some will raise the objection that Hirsch's ideas have faced in America: that such an established canon “privileges” the culture of dead white European males, and so is a sort of cultural imperialism.
That is not our affair. We are not, presumably, obliging anyone to learn English, or to study in North America. Assuming that they do want to learn English, however, and to study in North America, the authors they are going to have to read in a North American or British college are, by and large, going to be dead Europeans. If we have Marxist notions of perfecting the world by deliberately changing the culture, our ESL students are not the place to do it; any more than we have the right to alter the rules of English grammar to suit our own preferences. That would simply be malpractice.
Here are a few ideas I have come up with. Other suggestions are welcome:
Plato's Cave
Aristotle's Law of Non-Contradiction
Aristotle on the syllogism
Aristotle's argument for the Prime Mover
Anselm's Proof of the Existence of God
Occam's Razor
Descartes' Meditations
Shakespeare's “Julius Caesar”
Genesis
Exodus (highlighting the Ten Commandments)
23rd Psalm
John 1
Luke's birth narrative
The Sermon on the Mount from Matthew
Matthew's passion
The Lord's Prayer
The Nicene Creed
Jonathan Swift, “A Modest Proposal”
The US Declaration of Independence
The US Bill of Rights
ML King Jr., “I Have a Dream”
Concise summary of Robert's Rules of Order
The Wedding Ceremony from the Book of Common Prayer
The Miranda Statement
Faust Legend
Story of Jonah
Story of Daniel
Story of Job
Story of the Odyssey
Story of the Iliad
Story of Robinson Crusoe
Story of King Lear
Story of Romeo and Juliet
Story of Hamlet
Story of The Merchant of Venice
Story of Moby Dick
Marlowe, “The Passionate Shepherd to His Love”
Donne, “No Man is an Island”
Rudyard Kipling, “If...”
“Casey at the Bat”
“In Flanders Fields”
“’Twas the Night Before Christmas”
It seems to me that all of this could fit into one printed volume, and might be dealt with in one semester of work. I think every ESL college prep program should include this course. Had they read all of this, I suspect that the average ESL student would in fact be better prepared for reading at the college level than is the typical native speaker at the time of college graduation; for, as Hirsch pointed out, our own schools now neglect to teach this.
Thursday, December 24, 2009
NORAD Tracks Santa
It's 4:30 PM on Christmas Eve here in the Persian Gulf, and NORAD has already started picking up Santa on their DEW system. He's currently over Australia.
Merry Christmas, everyone!
Don't miss the Santacams.
Wednesday, December 23, 2009
Juramentado
In the wake of the 9/11 attacks, we read a great deal about how this kind of suicide bombing attack was “unprecedented.” This shows, above all, the general lack of a historical memory in the US. Everyone seemed to have forgotten already the Japanese kamikaze tactics of only fifty-five years ago. Or, for that matter, the death of Rajiv Gandhi in India only ten years earlier.
Was this, at least, something new in America's experience of Islam?
Actually, no. Americans have seen this before. From 1899 to 1913, Americans fought a pitched war against Filipinos to annex their islands. They remained in possession until 1946. During this time they tangled in particular with the Moros, the Muslim ethnic group that inhabits areas in the south and west of Mindanao, and that continues to be in a state of more or less permanent insurrection today. And they became familiar with the Muslim Moro practice locally called “juramentado.” A description from Wikipedia:
"a juramentado was a dedicated, premeditated, and often highly-skilled killer who prepared himself through a ritual of binding, shaving, and prayer in order to accomplish audacious public religious murder armed only with edged weapons...."

Zsolt Aradi, writing for CatholicCulture.org, reports from local Christian sources that the juramentado often “mixes in a crowd celebrating a Christian holy day” in order to “kill as many persons as he can.”
Here are some choice passages from a Time magazine article on the continuing problem. Dateline December 1, 1941:
“Through the Isle of Jolo spread a familiar, deadly-chilling fear. On that speck in the Sulu Archipelago, southwesternmost part of the Philippines, the Moros were going juramentado again.
When a Moro goes juramentado, he takes a fanatic oath to kill as many Christians as he can before he is killed himself.
... For the past two months, juramentado murders in Sulu have averaged one every other day. In Jolo, the biggest city (pop. 6,000), Moro Aharaji went juramentado after being conscripted, chopped off the head of a Chinese baker, killed one Filipino soldier and slashed another before he was stopped by a policeman's shotgun blast. He fell dead on exactly the spot where the same policeman had killed another juramentado ten days earlier. Townspeople shivered, waited for the next attack.”
Sound familiar?
The same Time article, speaking in a somehow much simpler time, offers a solution to the problem that, it says, has worked before:
“The story goes that General John J. Pershing, when he commanded in Sulu, developed a workable formula. Once when the Moros went wild, Pershing asked their Sultan to stop them. The Sultan said it was impossible. Pershing had warships shell the coastal villages. When the Sultan demanded that the shelling be stopped, he was told that the Navy had gone juramentado too. After that, Pershing and the Moros got along much better.”
Juramentado has been going on in the Southern Philippines for centuries.
Accordingly, contrary to much that has been written recently, this type of attack is nothing new in Islam. It is not something that appeared with Wahhabism in the nineteenth century, and Wahhabism accordingly, does not deserve the blame for it. The Crusaders encountered it in the Assassins (Hashshashin) during the Crusades. It has been a longstanding and traditional, if always very much a minority, interpretation of the Muslim doctrine of jihad.
It also seems silly to ponder what the motive of Major Hasan might have been at Fort Hood, as the newspapers have been doing. He was following a recognizable Muslim tradition.
Most top Muslim authorities would probably have condemned it at any point in the history of Islam. It would be wrong, therefore, to blame all Muslims for it, any more than we should blame all Hindus for the Thuggee practice of ritual human sacrifice. To my judgement, it is a corruption of Islam, and cannot really be justified by it. It is more of a folk-Muslim belief, like the witch hunts within Christianity, which were usually condemned by the proper religious authorities.
It is time, however, that we did a better job of studying history, and of studying Islam. Our lives might depend on it.
Was this, at least, something new in America's experience of Islam?
Actually, no. Americans have seen this before. From 1899 to 1913, Americans fought a pitched war against Filipinos to annex their islands, and remained in possession until 1946. During this time they tangled in particular with the Moros, the Muslim ethnic group that inhabits areas in the south and west of Mindanao and that continues in a state of more or less permanent insurrection today. And they became familiar with the Muslim Moro practice locally called “juramentado.” A description from Wikipedia:
"a juramentado was a dedicated, premeditated, and often highly-skilled killer who prepared himself through a ritual of binding, shaving, and prayer in order to accomplish audacious public religious murder armed only with edged weapons....
"Undertaken as an unorthodox form of personal jihad, mag-sabil, 'who endure the pangs of death,' were selected from fanatical Muslim youth inspired to martyrdom by the teaching of Imams.
"At the moment of attack, the mag-sabil would approach a large group of Christians, shout "La ilaha il-la'l-lahu" ("There is no god but Allah"), draw kris or barong and then rush into the group swinging his sword.”
Zsolt Aradi, writing for CatholicCulture.org, reports from local Christian sources that the juramentado often “mixes in a crowd celebrating a Christian holy day” in order to “kill as many persons as he can.”
Here are some choice passages from a Time magazine article on the continuing problem. Dateline December 1, 1941:
“Through the Isle of Jolo spread a familiar, deadly-chilling fear. On that speck in the Sulu Archipelago, southwesternmost part of the Philippines, the Moros were going juramentado again.
When a Moro goes juramentado, he takes a fanatic oath to kill as many Christians as he can before he is killed himself.
... For the past two months, juramentado murders in Sulu have averaged one every other day. In Jolo, the biggest city (pop. 6,000), Moro Aharaji went juramentado after being conscripted, chopped off the head of a Chinese baker, killed one Filipino soldier and slashed another before he was stopped by a policeman's shotgun blast. He fell dead on exactly the spot where the same policeman had killed another juramentado ten days earlier. Townspeople shivered, waited for the next attack.”
Sound familiar?
The same Time article, speaking in a somehow much simpler time, offers a solution to the problem that, it says, has worked before:
“The story goes that General John J. Pershing, when he commanded in Sulu, developed a workable formula. Once when the Moros went wild, Pershing asked their Sultan to stop them. The Sultan said it was impossible. Pershing had warships shell the coastal villages. When the Sultan demanded that the shelling be stopped, he was told that the Navy had gone juramentado too. After that, Pershing and the Moros got along much better.”
Juramentado has been going on in the Southern Philippines for centuries.
Accordingly, contrary to much that has been written recently, this type of attack is nothing new in Islam. It is not something that appeared with Wahhabism in the nineteenth century, and Wahhabism, accordingly, does not deserve the blame for it. The Crusaders encountered it in the Assassins (Hashshashin). It has been a longstanding and traditional, if always very much minority, interpretation of the Muslim doctrine of jihad.
It also seems silly to ponder what the motive of Major Hasan might have been at Fort Hood, as the newspapers have been doing. He was following a recognizable Muslim tradition.
Most top Muslim authorities would probably have condemned it at any point in the history of Islam. It would be wrong, therefore, to blame all Muslims for it, any more than we should blame all Hindus for the Thuggee practice of ritual human sacrifice. In my judgement, it is a corruption of Islam, and cannot really be justified by it. It is more of a folk-Muslim belief, like the witch hunts within Christianity, which were usually condemned by the proper religious authorities.
It is time, however, that we did a better job of studying history, and of studying Islam. Our lives might depend on it.
Tuesday, December 22, 2009
William Kurelek's Passion of Christ
Here, available online, is the full set of Canadian artist William Kurelek's illustrations of the Passion according to Matthew:
http://www.niagarafallsartgallery.ca/passion.htm
William Kurelek is a wonderful artist, and a personal favourite. As a young man, because of an abusive upbringing, he suffered from depression so severe that it was diagnosed as schizophrenia. His art, and his devotion (he was a convert to the Catholic Church), seemed to pull him out of it.
Lost Civilizations
In the Old Testament, God advocates genocide. The Hebrews are encouraged to take the land of Canaan from its original inhabitants, and in doing so to slaughter entire cities. How can this be justified?
Entire cities? Wait—didn't Harry Truman slaughter entire cities in Japan with the first two atom bombs? Didn't Churchill kill a large part of the population of Dresden? Apparently, then, there are circumstances in which it is morally justified to kill civilians in war. In the days of walled cities, for example, killing everyone in a city that held out could encourage the next one to give up without a long siege—possibly saving lives in the long run.
What about the territorial aggression? What right did the Hebrews have to take the land of another ethnic group?
For the sake of the greater good of world peace, we all agree in these times that no one government has the right to seize the land of another; but this principle of international law arguably dates only from about 1919. Theoretically, after all, there could be issues greater than this or that nation's territorial integrity. Tony Blair, for example, has recently argued that the rest of the world has a right to intervene when the present government of any country is in flagrant violation of human rights. This was the justification for Kosovo.
Bingo. That is exactly the reasoning given in the Bible. Canaan was given over to the Hebrews not because they were entitled to it, or better than other people, but because of the wickedness of its previous inhabitants: “It is not for your righteousness or for the uprightness of your heart that you are going to possess their land, but it is because of the wickedness of these nations that the LORD your God is driving them out before you” ().
Specifically, the evil with which the Canaanites were charged was that of burning their children as sacrifices to their gods (Deuteronomy 12:31).
Given that this was indeed their common practice, and that it was so accepted as to have been central to their very religion, it does seem to be possible to argue that even the Canaanites themselves were better off if this particular culture were wiped out, as a culture. “Cultural genocide,” in other words, would have been justified, and might be elsewhere as well.
Why not? We are inclined these days, Romantically, to assert that all cultures are good and equal and worthy of preservation, albeit with the odd exception of white European male culture, which is oppressive. But Shintoist/Imperialist Japan was also a culture, with longstanding religious traditions, and we found it worth wiping out by force. Child sacrifice does sound like a reasonably serious violation of human rights.
So Yahweh and the Hebrews might indeed have been fully morally justified. After all, if God exists and indeed loves us, he must consider it of some importance to protect us from false doctrines: “Do not be afraid of those who kill the body but cannot kill the soul. Rather, be afraid of the One who can destroy both soul and body in hell” (Matthew 10:28). Simply dying by an earthquake or a Hebrew sword is trivial; we all die. But to be led astray by false doctrine—that's more serious, because the results can be eternal. So a culture that not merely practices but preaches true systemic evil may, by its nature, and in the natural run of things, be crying out to heaven for destruction. So the Old Testament supposes. If not by Hebrew armies, then by fire and brimstone falling from heaven, like Sodom and Gomorrah, the cities of the plain, guilty almost to a man of the sin of sodomy, or more precisely, of traditions of homosexual gang rape of wayfaring strangers.
Now, this leads to another thought. If the Old Testament is right about this, might it also be possible to find a pattern of cultures that overtly promote evil being wiped out, suddenly and dramatically, throughout history? If so, we would seem to have one more proof of the existence of God, just as the Old Testament envisions him. Of course, we must be cautious: as they say, history tends to be written by the winners.
It does look as though we do. The first example that naturally occurs is the neopagan and ethics-denying regimes of Germany, Italy, and Japan, within living memory; few of us would argue, surely, that their doctrines were not intrinsically evil, regardless of who writes the histories. As regimes and as a social theory, they were wiped out within a couple of dozen years at most. Within three or four years they went from the apparent inevitable wave of the future to a belief that no longer dared say its name.
Next, also within living memory, we have atheistic Stalinism, which lasted seventy years, then seemed to pop like a balloon. “Communism” still survives nominally in China and a few other countries; but this communism seems to have been able to transform itself into a rather different, and less malicious doctrine, as did the people of Nineveh after the warnings of Jonah.
Let's go further back in time, to consider other regimes and nations that have faced total or near-total destruction: what about the famous ancient example of Carthage, burned to ashes by the Romans and the ashes salted over?
Interestingly, the charge against Carthage is exactly the same as that against the Canaanites: they sacrificed children to their gods. This is not too surprising—Carthage was a branch of the same cultural tree as was Canaan. The accusations of child sacrifice against Carthage, therefore, support the historical truth of those against Canaan, and vice versa.
Another legendary collapse is that of Minoan civilization on Crete. The Minoans get a good press these days, because feminists believe they were a matriarchal society; but there is definite evidence that they, too, practiced human sacrifice. Children's bones have been discovered that seem to show they were ritually sacrificed, then eaten. The Greek legend of Theseus and the Minotaur seems to confirm this with an ethnic memory of regular human sacrifice on Crete.
The Minoans disappeared so suddenly that it is commonly believed the culture was wiped out by the eruption of the Thera volcano sometime between 1600 and 1500 BC—like Sodom, a death by fire falling from the sky. It is also often suggested that its fate is the origin of the myth of Atlantis.
Now we get really controversial: how about the many disappeared cultures of the pre-Columbian Americas? They too collapsed suddenly, spectacularly, and almost completely. Not primarily because of Europeans, either: they seem to have been devastated by disease before any fighting began, the Aztec and Inca Empires fell before a mere handful of Spaniards, and Mayan civilization collapsed before the first Europeans arrived. It all almost looks like a judgement from God.
We know that they, too, practiced human sacrifice on a large scale. We know that they practiced ritual torture and cannibalism. When Columbus first encountered the Carib Indians, his journal describes "limbs of human bodies hung up in houses as if curing for provisions," and "body parts ... roasting before the fire." Three young Carib slaves who had been captured and castrated by another tribe, fled to him and sought shelter, claiming they were soon to be eaten. His men theorized that they might have discovered Hell.
The Caribs then quickly and almost completely disappeared.
Easter Island is another famous example of a disappearing culture. Without foreign intervention, the islanders seem to have deliberately desecrated their huge religious statues over a period of just four years, and a large, rich island population rapidly dwindled to a poor and tiny remnant.
Here too, human sacrifice and cannibalism had been part of the culture, with children the preferred victims.
It is all, at least, as I said, evocative.
Note that the issue is not simply that of killing people, or of killing children. One might suppose that any culture or civilization that takes to killing its young is almost committing deliberate suicide, with or without divine intervention. But that's too obvious—the numbers killed were unlikely ever to be that great. Moreover, the exposure of unwanted infants was a common practice in many other ancient societies that saw no similar collapse: Greece, Japan, China—not to mention our own current taste for abortion. Yet it seems to be only the societies in which killing was part of the religion or ideology, part of the official culture at its deepest core, that experience this phenomenon of sudden collapse.
Exactly the cases that call for God's attention.
One might also suppose that such bloodthirsty cultures would be unlikely to generate strong affections among their followers, so that, once such a culture was threatened, by whatever means, it could command little loyalty, and so tend to sudden collapse. Yet there is evidence this is not a factor: we know from recent polls, for example, that Josef Stalin is still rather popular even in modern Russia, and contemporary Chinese still more or less revere Mao Zedong. My country, it would seem, remains my country, right or wrong; and this perhaps further justifies the Biblical need to extirpate such doctrines in dramatic fashion.
In most of these cases, too, there seems to be something uncanny, almost miraculous, about the suddenness of the collapse, beginning with the falling walls of Jericho. Generally, the fall is sudden, and we do not understand and cannot explain its causes; at most, there are conflicting theories. It is as though God has left his calling card, repeatedly, pressed between the pages of history, for those who have eyes to see.
Monday, December 21, 2009
Popular Misconceptions about Witch Hunts
1. It was not a medieval practice. The prosecution of witches began in the 15th century and peaked in the 17th, the “Age of Reason.”
2. It was not specifically a Christian practice. Witches are known and usually feared worldwide, and continue to be hunted and killed in non-Christian societies today, notably in Africa. Two hundred were killed in South Africa in the 1990s. The Catholic Church was more sceptical of witchcraft than most: according to St. Augustine and Canon Law, witches had no real power, and the prosecution and execution of witches was specifically prohibited. The common people, however, did believe in witches. The European witch trials generally took place when and where the Church's influence was weakest.
3. Witches were not pagans, and not the predecessors of today's “neo-pagans.” No witch was ever executed for worshipping a pagan deity. Conversely, the pagan Romans and Norse practiced witch-burning, and it was Christianity that ended this practice. Neopaganism was invented out of whole cloth in the 1950s.
4. Best current estimates are that 40,000 to 60,000 witches were executed throughout Europe over the centuries—not the millions often claimed. Among the main popularizers of the common false figure of nine million killed, chillingly enough, were the German Nazis. The witch trials were supposedly a Christian attempt to exterminate “Aryan womanhood.” This notion of an Aryan holocaust fit well with their anti-Christian campaign, and helped to justify their own real holocaust.
5. Contrary to feminist claims, it was not a matter of men trying to dominate women. While the accused in witch trials were primarily (80%) women, so were the accusers. Women were simply much more likely than men to believe in the reality and efficacy of witchcraft.
Saturday, December 19, 2009
Of a Climate of the Mind
It is odd that views on global warming are so consistent with political leanings: left-wingers always believe in it and want something done; right-wingers usually doubt it and want to wait and see. It should not be so; it should be based on the science. But most of us are not climate scientists; yet we must all make the judgement, since huge sums of public money are involved.
So here's my take, as someone who is emphatically not a "climate scientist." Leave aside science: here's my simple math. The value of protocols like Kyoto or Copenhagen depends on a string of unproven assumptions being true. 1) First, the world must indeed be warming. 2) This warming must indeed be unprecedented in speed and scale. 3) Its overall effects must be harmful, not helpful or neutral. 4) It must be possible for humans to do something to prevent it (whether or not they have caused it). 5) The cost of doing something to prevent it must be less than the cost of the event if it occurs. And 6) the particular solutions agreed upon in such a conference must be the most effective means to prevent it, or reasonably close to the most effective.
None of us who are not scientists have any real data to address any of these claims. So, in the absence of evidence, what are the odds that proposition 1 is true? There are three possibilities: a warming earth, a cooling earth, or an earth staying at about the same temperature. So our odds are 33.3%. Cut that in half to represent the odds of thesis 2, which seems to be a yes/no proposition. We're at 16.65%. There are three options again for the third thesis: we're at 5.55%. Halve that again for proposition 4: 2.775%. Again for proposition 5: 1.3875%. Option 6 seems to be less than a real 50/50 proposition, given the known unscientific nature of politics, but let's give it the benefit of the doubt: the overall likelihood of money spent for Kyoto or Copenhagen being money well spent stands at 0.69375%. Put another way, there is a 99.30625% chance we are wasting our money.
I don't like them odds. They'd be worth pitching in for if the prize were big enough and the price of entry cheap enough; like lightning insurance or a ticket on the lottery. But not much more.
But this is precisely the calculation that all of us who are not climate scientists should be making. Even if there were a consensus among climate scientists—and this is not at all clear—those of us who are not climate scientists could not trust it, since asserting the reality of global warming is clearly in any climate scientist's self-interest.
Indeed, that being the case, we should probably chip a bit more off those odds. Make it 0.34688%.
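For concreteness, the chain of halvings above can be run as a short calculation. This is only a sketch of my own back-of-the-envelope math: the six priors are the guesses stated above, not measured probabilities, and exact fractions are used here rather than rounded percentages, so the last decimals land a hair away from the figures in the text.

```python
# Compound the six guessed priors from the argument above.
# These are assumed, not measured: 1/3 (warming vs. cooling vs. stable),
# 1/2 (unprecedented?), 1/3 (harmful vs. helpful vs. neutral),
# 1/2 (preventable?), 1/2 (cheaper to prevent?), 1/2 (right remedy?).
priors = [1/3, 1/2, 1/3, 1/2, 1/2, 1/2]

p_well_spent = 1.0
for prior in priors:
    p_well_spent *= prior

print(round(p_well_spent * 100, 3))        # 0.694 (percent well spent)
print(round((1 - p_well_spent) * 100, 3))  # 99.306 (percent wasted)
```

The order of magnitude, not the decimals, is the point: six independent coin-flips and three-way guesses compound to well under one percent.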
On top of that, I have one further caveat. Consider, if you will, that we are in effect talking here about the weather. That's all climate really is, so far as I can see: long-range weather prediction. Now, any one of us knows from our own common experience that we are not very good at this. We can do reasonably well in forecasting a day ahead, a lot less well for a week ahead, and are notably unreliable once we get much beyond that threshold. The Old Farmer's Almanac does as well as anybody at a year's distance.
So how likely are we to be accurate in any forecast of climate a hundred years hence?
“Chaos theory” was invented by Edward Lorenz in 1961 specifically to explain our inability to predict future weather patterns.
Here's Wikipedia on Chaos theory:
"Small differences in initial conditions (such as those due to rounding errors in numerical computation) yield widely diverging outcomes for chaotic systems, _rendering long-term prediction impossible in general_."
Emphasis mine.
"... [Lorenz] wanted to see a sequence of data again and to save time he started the simulation in the middle of its course. He was able to do this by entering a printout of the data corresponding to conditions in the middle of his simulation which he had calculated last time.
To his surprise the weather that the machine began to predict was completely different from the weather calculated before. ... The computer worked with 6-digit precision, but the printout rounded variables off to a 3-digit number, so a value like 0.506127 was printed as 0.506. ... Lorenz had discovered that small changes in initial conditions produced large changes in the long-term outcome.[35] Lorenz's discovery, which gave its name to Lorenz attractors, proved that meteorology could not reasonably predict weather beyond a weekly period (at most)."
Just to double-check Wikipedia's veracity, here's "Whatis.com" on the same subject:
"Poincare proved mathematically that, even if the initial measurements could be made a million times more precise, that the uncertainty of prediction for outcomes did not shrink along with the inaccuracy of measurement, but remained huge. Unless initial measurements could be absolutely defined - an impossibility - predictability for complex - chaotic - systems performed scarcely better than if the predictions had been randomly selected from possible outcomes."
So “climate science” is perfectly positioned as a distant canvas on which we can project anything we want or can imagine: including mankind's eternal fantasy of the world coming to an end in some great catastrophe of water or fire. “Here be monsters.”
And now, we also have the intriguing evidence of “Climategate”...
The whole affair strikes me as almost laughable. It probably belongs in a newly-revised edition of “Extraordinary Popular Delusions and the Madness of Crowds.”
So here's my take, as someone who is emphatically not a "climate scientist." Leave aside science: here's my simple math. The value of protocols like Kyoto or Copenhagen depends on a string of unproven assumptions being true. 1) First, the world must indeed be warming. 2) This warming must indeed be unprecedented in speed and scale. 3) Its overall effects must be harmful, not helpful or neutral. 4) It must be possible for humans to do something to prevent it (whether or not they have caused it). 5) The cost of doing something to prevent it must be less than the cost of the event if it occurs. And 6) the particular solutions agreed upon in such a conference must be the most effective means to prevent it, or reasonably close to the most effective.
None of us who are not scientists have any real data to address any of these claims. So, in the absence of evidence, what are the odds that proposition 1 is true? There are three possibilities: a warming earth, a cooling earth, or an earth staying at about the same temperature. So our odds are one in three, 33.3%. Cut that in half to represent the odds of thesis 2, which seems to be a yes/no proposition. We're at 16.65%. There are three options again for the third thesis: we're at 5.55%. Halve that again for proposition 4: 2.775%. Again for proposition 5: 1.3875%. Option 6 seems to be less than a real 50/50 proposition, given the known unscientific nature of politics, but let's give it the benefit of the doubt: the overall likelihood of money spent for Kyoto or Copenhagen being money well spent stands at 0.69375%. Put another way, there is a 99.30625% chance we are wasting our money.
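The compounding above can be checked in a few lines of Python. The individual probabilities are, of course, this post's own guesses rather than data; note that using exact thirds instead of the rounded 33.3% gives 1/144, or about 0.694%:

```python
# Compound the guessed odds for each of the six assumptions.
# Every probability below is an a-priori guess, not a measurement.
odds = [
    1 / 3,   # 1. the world is warming (warming / cooling / steady)
    1 / 2,   # 2. the warming is unprecedented (yes / no)
    1 / 3,   # 3. its net effect is harmful (harmful / helpful / neutral)
    1 / 2,   # 4. humans can do something to prevent it
    1 / 2,   # 5. prevention costs less than the event itself
    1 / 2,   # 6. the agreed measures are close to the most effective
]

p = 1.0
for o in odds:
    p *= o

print(f"chance the money is well spent: {p:.5%}")   # 1/144, about 0.694%
print(f"chance the money is wasted:     {1 - p:.5%}")
```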
I don't like them odds. They'd be worth pitching in for if the prize were big enough and the price of entry cheap enough; like lightning insurance or a ticket on the lottery. But not much more.
But this is precisely the calculation that all of us who are not climate scientists should be making. Even if there were a consensus among climate scientists—and this is not at all clear—those of us who are not climate scientists could not trust it, since asserting the reality of global warming is clearly in any climate scientist's self-interest.
Indeed, that being the case, we should probably chip a bit more off those odds. Make it 0.34688%.
On top of that, I have one further caveat. Consider, if you will, that we are in effect talking here about the weather. That's all climate really is, so far as I can see: long-range weather prediction. Now, any one of us knows from our own common experience that we are not very good at this. We can do reasonably well in forecasting a day ahead, a lot less well for a week ahead, and are notably unreliable once we get much beyond that threshold. The Old Farmer's Almanac does as well as anybody at a year's distance.
So how likely are we to be accurate in any forecast of climate a hundred years hence?
What we now call “chaos theory” began with Edward Lorenz's discovery, in 1961, of precisely this problem: our inability to predict future weather patterns.
Here's Wikipedia on Chaos theory:
"Small differences in initial conditions (such as those due to rounding errors in numerical computation) yield widely diverging outcomes for chaotic systems, _rendering long-term prediction impossible in general_."
Emphasis mine.
"... [Lorenz] wanted to see a sequence of data again and to save time he started the simulation in the middle of its course. He was able to do this by entering a printout of the data corresponding to conditions in the middle of his simulation which he had calculated last time.
To his surprise the weather that the machine began to predict was completely different from the weather calculated before. ... The computer worked with 6-digit precision, but the printout rounded variables off to a 3-digit number, so a value like 0.506127 was printed as 0.506. ... Lorenz had discovered that small changes in initial conditions produced large changes in the long-term outcome. Lorenz's discovery, which gave its name to Lorenz attractors, proved that meteorology could not reasonably predict weather beyond a weekly period (at most)."
Just to double-check Wikipedia's veracity, here's "Whatis.com" on the same subject:
"Poincare proved mathematically that, even if the initial measurements could be made a million times more precise, that the uncertainty of prediction for outcomes did not shrink along with the inaccuracy of measurement, but remained huge. Unless initial measurements could be absolutely defined - an impossibility - predictability for complex - chaotic - systems performed scarcely better than if the predictions had been randomly selected from possible outcomes."
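Lorenz's accident is easy to re-run in miniature. The sketch below is a rough illustration, not Lorenz's original program: the parameter values, step size, and simple forward-Euler integration are my own assumptions. It integrates his three convection equations twice, once from the 6-digit value 0.506127 and once from its 3-digit rounding 0.506, and measures how far apart the two forecasts drift:

```python
# Reproduce Lorenz's 1961 rounding accident in miniature: run his
# convection equations from two initial values that differ only past
# the third decimal place, and watch the forecasts diverge.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system (textbook parameters)."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def run(x0, steps=3000):
    """Integrate from (x0, 1, 1) and record the x-coordinate at each step."""
    x, y, z = x0, 1.0, 1.0
    path = []
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
        path.append(x)
    return path

full = run(0.506127)    # the machine's "6-digit precision"
rounded = run(0.506)    # the printout's 3-digit value

# Gap between the two forecasts, early in the run and late in the run.
early_gap = max(abs(a - b) for a, b in zip(full[:100], rounded[:100]))
late_gap = max(abs(a - b) for a, b in zip(full[-500:], rounded[-500:]))
print(early_gap, late_gap)
```

Run it and the early gap stays tiny while the late gap grows to the same order as the weather variable itself, which is Lorenz's point: the rounding error does not stay small.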
So “climate science” is perfectly positioned as a distant canvas on which we can project anything we want or can imagine: including mankind's eternal fantasy of the world coming to an end in some great catastrophe of water or fire. “Here be monsters.”
And now, we also have the intriguing evidence of “Climategate”...
The whole affair strikes me as almost laughable. It probably belongs in a newly-revised edition of “Extraordinary Popular Delusions and the Madness of Crowds.”
Friday, December 18, 2009
Science is Dead, Not God
I have a strong suspicion that the current “Climategate” affair is a bigger deal than we can yet imagine. Polls are already suggesting the death of “climate change” as a marketable political issue. But I think it has implications beyond the “climate change” issue itself; it speaks to the public prestige of science. In the past several centuries, that prestige has grown so great that “Scientism” has become, overwhelmingly, our true majority religion, worldwide. “Scientism” is the belief that science has a unique claim to truth, produces final answers, and can, at least potentially, “explain everything.” From this, “scientists” and those who claim to be “scientists” have derived an odd moral prestige, as if they were a priesthood supernaturally protected not just from error, but from sin.
Bad enough; it produces a class system, and an unregulated, very powerful, elite. But “science” has also been invoked as an authority in areas where it is of no value, with absurd and harmful results: most notably, the “social sciences.” Marxism, feminism, Fascism, Freudianism, all have had a good run on an absurd claim to be “scientific,” killing and destroying lives in incalculable numbers. Most recently we have the “new atheism,” scientism reaching the apotheosis of arrogance, demanding the elimination of all competing forms of belief.
However, just as the invention of the printing press lifted the veil on the sources of Christian doctrine, letting every interested layman see it for himself, with rather dramatic results in the Reformation and Counter-Reformation, the Internet is lifting the veil on the sources of scientific doctrine, letting every interested layman get a good look. The results are likely to be similar. The united and unchallenged authority of Scientism is likely to crumble. And, just as Christianity was eventually replaced with “Scientism” as the de facto religion as a result, now Scientism in its turn is likely to be replaced as the dominant religion.
Perhaps, with luck, by a return to true religion. Scientism was not just “a religion like the others,” but a false religion: a religion of materialism, with no morality, no God, no spirit, no philosophical validity, no concept of man's inner being. It blocked the path to true religion, while offering no alternative. I doubt it will survive public scrutiny nearly as well as Christian doctrine did.
Thursday, December 17, 2009
On Classroom Discipline
A friend sends along an article from the LA Times about the difficulties teachers face. The subhead reads: “Among the top reasons why teachers are deemed unsuccessful or leave the profession is their inability to effectively manage student behavior, experts say.”
Nicely put, that: "deemed unsuccessful."
Of course, everyone today is worried about maintaining order in the schools; this is the stuff the newspapers love. Kids today have no discipline, and schools have become free-fire zones.
And, indeed, "classroom management," or running a disciplined class, seems always to be the number one thing that school administrators value in teachers. Probably many parents agree. It also seems to be the one thing teachers currently most value in themselves. It is, as the article implies, just about the essence of the teaching profession these days. But should it be? Does it have any relation to actual learning?
There is no question that an orderly, compliant class is much more comfortable for the teacher. It is also easy to observe and evaluate. Indeed, it is about the only thing it is possible to evaluate reliably in a one-hour "classroom observation." This may be one secret of its current prominence, since good teaching is, by comparison, difficult to define, difficult to evaluate, and impossible to evaluate in any one-hour classroom observation. Even granting that good teaching could be evaluated, doing so would at least require a lot more work on the part of administrators.
But what about the kids? Is it good for their education? At best, it seems tangential—a question of babysitting, not teaching. Even if it is really important, the simple and less expensive thing, surely, would be to hire a security guard, a bouncer, to handle it, and leave teachers to teach.
The studies just have not been done showing that a more orderly classroom results in better student achievement. And consider: for comparison, are the adult societies that can put up the best show of public order the most productive societies? Outstanding in this regard would be countries like North Korea, Nazi Germany, Stalin's Soviet Union... By comparison, public events in a country like the US, Britain, France, or Hong Kong, tend to look relatively chaotic. But which type of society has proven, over the long term, more productive, in either practical or intellectual terms?
Why wouldn't it be the same with classrooms?
Of course, there is a necessary minimum, with students not strangling one another and vandalizing the property. Countries that descend into true chaos, like Somalia or Afghanistan, tend to be the least successful of all. But they equate, precisely, to a classroom of children without an adult present—and that is not a likely scenario in our schools.
Short of that, it is hard to believe that the presence of an adult in a room of twenty or thirty children—not to mention an adult with the power to evaluate you, send you to the Vice Principal, or give you a detention—is not in itself sufficient to accomplish that much.
Beyond that, in the middle range, some may argue that relatively more orderly societies like Germany and Japan are preferable to relatively less orderly societies like Italy and Korea. But at best, that is a point on which reasonable people can differ, more a matter of personal preference than anything objective. Historically, both have been about equally successful economically and intellectually.
So, on the whole, the current emphasis on “class management” seems to be beside the point. Except that excessive order is almost certainly harmful; yet this is what the present system favours. How natural, or healthy, is it really for a young child to sit still and quiet at a desk for hours at a time? How educational is it? In fact, we have definite evidence that we learn better while we are physically engaged, and moving. Aristotle insisted on it, which is why his school was called the “Peripatetic”: the school that walked as it talked.
The model of the orderly class at its desks was surely designed for the convenience of the teacher, and the system, not for the best education. Some have argued that it is based on the model of the factory, and sees children as products rolling off an assembly line.
Some will probably point out, and with justice, that students do not sit still at their desks nearly so much as they used to—nowadays, they are moved around into different configurations for “pair work,” “group work,” and so on. This is true; but it is still pretty sedentary, and it maintains and even accentuates the teacher's total control over the students. Now it is not enough that they sit silently where they've been planted; they must also get up and march about efficiently at the teacher's command.
Besides not being conducive to learning, all this teaches one particular lesson above all others: conformity. This is, I submit, not a good lesson for a future citizen of a democracy, or of a pluralistic, tolerant society, to learn. Nor is it good for creativity, human progress, or for any serious later intellectual inquiry.
This need for discipline also prompts teachers, I think, to deliberately select boring material. They cannot afford to get the children too excited: excited children tend to make noises or run about. Unfortunately, we also know that maintaining interest is the one great essential for learning. Plato insisted on it as the teacher's chief duty.
It also does two more things that we probably do not want to happen. First, it weeds out of the teaching profession anyone who does not themselves highly value conformity—which would also mean, probably, that it weeds out the brightest teachers, the best scholars, the most creative teachers, those least inclined toward prejudice, those most sympathetic to children and their special needs, and those most likely to make a special effort for a student who needs it. Second, it not only gives free rein to, but positively encourages, teachers who are inclined to be bullies. The profession, by its nature, is probably already a magnet for any born bully: it's the fastest route there is to significant power over a large number of others. We badly need to set up barriers to prevent this. Instead, currently, we are virtually requiring it.
I think it is a very bad sign if, when you stroll into the teachers' lounge, the students are spoken of as adversaries.
http://www.suite101.com/article.cfm/classroom_discipline/8217
Yet that seems to be the common case in teachers' lounges everywhere.
It seems to me that anyone who does not feel a positive affection for all their students should not be teaching. This is what Don Bosco, the great Catholic educator, considered the key to all good teaching: you must love all your students, and be a friend and advocate for them at all times. Nor did he have the luxury of instructing only the well-behaved and well-bred. Just the reverse: his schools were strictly for the urban poor, abandoned street kids, the boys in the 'hood. Boys' Town, in the US, based on his principles, repeated the experiment with the same striking success. Any discipline was handled by the children themselves. St. Philip Neri, another great Catholic educator, said of his rowdy students, "I don't care if they chop wood on my back, so long as they don't sin."
Something is very wrong with teaching, if “class discipline” or “classroom management” has become or remains the linchpin of the profession. And I'm not at all sure it is the students' fault.
Tuesday, December 15, 2009
Sunday, December 13, 2009
Why They Fight
Dr. Hamid Tawfik, who became a follower of al-Qaeda while at university but has since left the movement, was asked in a recent video (http://www.youtube.com/watch?v=O2wvqDfitLY) whom the Islamists see as their real, ultimate, enemy. He answers, unhesitatingly, “Women's Rights.” They hate the West, yes, but because they see it as the promoter of “Women's Rights.” Israel? Who cares?
Dr. Tawfik is probably a good source. He was mentored by Dr. Ayman al-Zawahiri himself, the number two man in al Qaeda.
This would explain why Islamism, and a “clash of civilizations” between the Muslim world and the West, has become such a problem at this particular moment in history, when for decades or even centuries there seemed to be relative peace and quiet. As recently as the Eighties, polls showed most Palestinian Arabs were content with the existence of Israel, and any terrorism in the Middle East was Marxist, not Muslim.
Why feminism? This sounds quite likely to me, because, as I recall, it was feminism, not Islam, which first opened fire, and feminism which made the first contemporary universalist claims. Feminism was making a cause célèbre of the Middle Eastern (albeit not specifically Muslim) custom of “female circumcision” at least as early as the 1980s, when Islamism did not yet exist on the international horizon. Feminists were circulating petitions demanding the US invade Afghanistan long before 9/11, on the premise that the traditions enforced by the Taliban were violating women's rights. Feminists were objecting to Arab women wearing the hijab long before 9/11. Feminism has been loudly objecting for decades to all non-Western cultures; bad as they consider the West, they hold all other cultures to be yet more oppressive to women, and have long been demanding that something be done about them.
Accordingly, feminism insisted that “Women's rights” must take precedence over Islam, like Confucianism, Hinduism, Judaism, Christianity, and everyone else, before Islam responded by demanding that Islam should take precedence over “women's rights.” Islamism is, in this sense, very much a defensive, not an aggressive, movement.
Feminism is, in essence, a totalitarian movement. Like all totalitarian movements, it cannot abide religion. Religion stands in the way of ultimate power.
Way back when, it shocked me, as a Christian, when feminism so casually demanded that the established traditions and teachings of Christianity must be altered to fit their preferences. It seemed obvious that religion, and two thousand years of thought and experience, ought not to be upset for a new and untried political ideology. I was similarly shocked when feminism steamrolled over established Native American/First Nations traditions and treaties—I expected some cultural sensitivity.
Islam is not the first established faith to stand up and say “no—the rules of ethics do not change, and the eternal truths do not change at a whim.” Catholicism deserves some credit for refusing some of feminism's more radical demands, and so do some “Fundamentalist” Protestant groups.
Islamism might, perhaps, be faulted for being the first to resort to violence. But where did principled but peaceful resistance get the Catholics and the Biblically loyalist Protestants? Sidelined by the culture. Their few victories so far have been mostly symbolic, while the culture as a whole, including most of their own nominal followers, has moved over to the feminist experiment.
Peaceful resistance and seeking accommodation is not always terribly effective when dealing with a totalitarian movement; it did not work well for the Czechs at Munich. Possibly, then, Islamism is right, in believing that a resort to arms is necessary. It gives the rest of us, at a minimum, cause to stop and consider carefully how far we are prepared to insist on this new feminist dogma. Feminism has met precious little serious opposition up to this point; most Westerners have long not even dared to question it.
For Islam, the stakes are high. Thanks to feminism, and European women’s abandonment of their traditional role and interests, the population of Europe is about to tip into absolute decline. Islam seems poised to take over that continent, demographically, without a shot, so long as they can resist the infection themselves. Can the world be far behind? But if they succumb, as everyone else has, they too will go into decline, in numbers and in authority. The future will belong to the feminist pseudo-religion.
Take this perspective, and it is hard to fault them for putting up a fight.
Dr. Tawfik is probably a good source. He was mentored by Dr. Ayman al-Zawahiri himself, the number two man in al Qaeda.
This would explain why Islamism, and a “clash of civilizations” between the Muslim world and the West, has become such a problem at this particular moment in history, when for decades or even centuries there seemed to be relative peace and quiet. As recently as the Eighties, polls showed most Palestinian Arabs were content with the existence of Israel, and any terrorism in the Middle East was Marxist, not Muslim.
Why feminism? This sounds quite likely to me. Because, as I recall, it is feminism, not Islam, which first opened fire, and feminism which made the first contemporary universalist claims. Feminism was making a cause celebre of the Middle Eastern (albeit not specifically Muslim) custom of “female circumcision” at least as early as the 1980s, when Islamism did not yet exist on the international horizon. Feminists were circulating petitions demanding the US invade Afghanistan long before 9/11, on the premise that the traditions enforced by the Taliban were violating women's rights. Feminists were objecting to Arab women wearing the hijab long before 9/11. Feminism has been loudly objecting for decades to all non-Western cultures; bad as they consider the West, they hold all other cultures to be yet more oppressive to women, and have long been demanding that something be done about them.
Accordingly, feminism insisted that “women's rights” must take precedence over Islam, as over Confucianism, Hinduism, Judaism, Christianity, and every other faith, before Islam responded by demanding that Islam should take precedence over “women's rights.” Islamism is, in this sense, very much a defensive, not an aggressive, movement.
Feminism is, in essence, a totalitarian movement. Like all totalitarian movements, it cannot abide religion. Religion stands in the way of ultimate power.
Way back when, it shocked me, as a Christian, when feminism so casually demanded that the established traditions and teachings of Christianity must be altered to fit their preferences. It seemed obvious that religion, and two thousand years of thought and experience, ought not to be upset for a new and untried political ideology. I was similarly shocked when feminism steamrolled over established Native American/First Nations traditions and treaties—I expected some cultural sensitivity.
Islam is not the first established faith to stand up and say “no—the rules of ethics do not change, and the eternal truths do not change at a whim.” Catholicism deserves some credit for refusing some of feminism's more radical demands, and so do some “Fundamentalist” Protestant groups.
Islamism might, perhaps, be faulted for being the first to resort to violence. But where did principled but peaceful resistance get the Catholics and the Biblically loyal Protestants? Sidelined by the culture. Their few victories so far have been mostly symbolic, while the culture as a whole, including most of their own nominal followers, has moved over to the feminist experiment.
Peaceful resistance and seeking accommodation is not always terribly effective when dealing with a totalitarian movement; it did not work well for the Czechs at Munich. Possibly, then, Islamism is right, in believing that a resort to arms is necessary. It gives the rest of us, at a minimum, cause to stop and consider carefully how far we are prepared to insist on this new feminist dogma. Feminism has met precious little serious opposition up to this point; most Westerners have long not even dared to question it.
For Islam, the stakes are high. Thanks to feminism, and European women’s abandonment of their traditional role and interests, the population of Europe is about to tip into absolute decline. Islam seems poised to take over that continent, demographically, without a shot, so long as they can resist the infection themselves. Can the world be far behind? But if they succumb, as everyone else has, they too will go into decline, in numbers and in authority. The future will belong to the feminist pseudo-religion.
Take this perspective, and it is hard to fault them for putting up a fight.
Thursday, December 10, 2009
Signs You May Be Catholic
You may be Catholic if ...
- you are not offended by references to bodily functions, but by disrespectful references to the Trinity or elements of the Mass.
- your friends all call you Frank, and your middle name begins with an X.
- you know the difference between a deuterocanonical and an apocryphal book.
- you suspect Dan Brown is borderline mentally retarded.
- you always found the words to the pop song “Spirit in the Sky” (“Never been a sinner/I never sinned”) shocking and blasphemous.
- you worry more about Purgatory than Hell.
- you study the philosophers AuGUStine and St. Thomas More, instead of the philosophers AUgustine and Sir Thomas More.
- you are never certain you are saved, or that anyone else is damned.
- you don't see anything sinister about being Spanish.
- you take original sin personally.
- you do not think of Limbo as a dance.
- you're not sure what a “Roman” Catholic is.
- you know that Felicity and Perpetua are saints.
- you know that “Immaculate Conception” means conceived without sin, not without sex...
- ...and you know it was Mary, not Jesus.
- you think of the British monarchy as chintzy and nouveau riche.
- you recognize the acronyms OLPH, JMJ, and AMDG.
- you see nothing immoral in gambling, smoking, or drinking, but think of eating fish as virtuous.
- you know Jesus fell three times while carrying the cross.
- you can immediately tell whether a statue is of Mary, or of St. Thérèse of Lisieux.
- you don't think of saints, let alone priests and pastors, as sinless.
- you know the difference between mortal and venial.
- if you won the lottery, the first thing you'd do is have more children.
- you would never admit it, but in your own mind you tend to conflate Martin Luther and Lex Luthor.
- you are vicariously embarrassed by the “white hats/black hats” moral view of most Hollywood movies.
- your friends call you Ben, but your name isn't Benjamin.
- you know the names of the three kings.
- you think of Anne, Elizabeth, and Mary as New Testament saints, not English Tudor Queens.
- you think of the Middle Ages as kind of a good idea.
- you are Irish or Scottish, and your name isn't William.
- you don't believe Santa Claus is a fairy tale, or a lie you tell your children; he's a saint in heaven.
- you don't see anything unhealthy about feeling guilty.
- your friends call you Al, but your name isn't Allan or Albert.
- you have invisible friends that you talk to regularly.
- you see art as a part of religion, not something opposed to it.
- you think “mixed marriage” means marrying a non-Catholic.
- you know the difference between a substance and an accident.
- you do not believe it is possible to be “too good,” but you can be too rich or too thin.
- you imagine the architecture in heaven as being Gothic.
- you think of Mardi Gras and Carnival as religious celebrations.
- when someone calls out the number of the next hymn in church, you sometimes feel the urge to shout out “Bingo!”
- you believe in the healing power of saying you're sorry.
- you can't remember the sermon.
- you used to get your leisure reading suggestions not from Oprah, but from something called “The Index.”
- for you, Mary is queen of more than just the Scots.
The above is all copyright by me, Stephen K. Roney, but free to use for non-commercial purposes with clear acknowledgement of the source.
Tuesday, December 08, 2009
No Salvation Outside the Church?
In a debate with Dinesh D’Souza, which I recently viewed on YouTube, Christopher Hitchens made the claim that Christians are obliged to believe that all non-Christians go to hell. Making a very similar point, Richard Dawkins responded to a simple question from a Christian, “What if you are wrong?” by countering “What if you are wrong about Zeus?” (He might also have mentioned the Flying Spaghetti Monster.) You’d think Hitchens and Dawkins would at least have taken the time to learn what Christianity actually teaches before deciding to reject it; that they have not seems to reflect poorly on their sincerity. D’Souza was able to respond, quite simply, that this is not the teaching of the Catholic Church. Indeed, the claim has officially been declared heretical, and asserting it has led to actual excommunication. Similarly, Dawkins is wrong if he supposes that belief in Catholicism means believing other religions are simply untrue, or that Catholics must hold, as atheists do, that Zeus does not exist.
The Catholic belief is quite simple and lucid. Catholicism is the truth. If we did not think so, of course, we would not be Catholics. It is not, of course, the only truth; that I am wearing black socks is also true, without really being an article of Catholic faith. But any assertion that directly contradicts Catholic teaching, plainly, must be untrue.
Other religions, therefore, can be largely, indeed mostly, true. Because, overwhelmingly, they agree with Catholicism. The points on which they disagree are generally few. Atheism, on the other hand, is plainly false in its key, defining assertion.
Nor does Catholicism ask anyone to accept its assertions on anything like “blind faith.” The teachings of the Catholic Church can by and large be explained and demonstrated to the unaided human intellect through reason and evidence. Yes, there are “mysteries” that go beyond what reason can completely comprehend, but these too can be shown to be logically necessary conclusions, and certainly never to contradict either reason or evidence.
We are all perfectly aware, through unaided conscience, of an absolute moral obligation to seek truth, and, once truth is known, to embrace it. Since Catholicism is true, we are all morally obliged to be Catholics. If we are not Catholics, we have committed a sin, and in a sense the most serious sin possible, that of turning from God.
So it is perfectly reasonable, and obvious, to say that one must be Catholic to be saved. But one is guilty only if one’s ignorance of the truth of Catholicism is deliberate. Just as the present gentle reader is not guilty of lying if he insists he does not know the colour of my socks, since he cannot see them.
What is, in all circumstances, morally obligatory, is to genuinely seek to find and to follow truth; especially on the most important matters of life, the most important of all being religion. And if one does so, that road will inexorably lead in the direction of Catholicism. Nevertheless, it is entirely possible to do so in all sincerity, and yet still die without yet having come to the definite conclusion that Catholicism is true. In this case, you are innocent of any sin, and therefore still entirely likely to go to heaven.
Understand this principle, and you understand why Catholic evangelization is not terribly pushy. Non-Catholics are okay, so long as they are sincere in what they do believe. And so long as they are sincere, they are heading in the right direction without any intervention on the missionary’s part. There is no need to rush anything.
Unfortunately, Hitchens seems to have identified himself plainly as not of this camp of sincere seekers of truth—he has not made the effort to find out what Catholicism teaches. He believes what he believes not out of a commitment to truth, it appears, but for some ulterior motive. Nor can he plead lack of intelligence, a lack of the intellectual equipment to discern matters quickly and clearly. And the same can be said of Dawkins. It is not for us to judge; but it is striking just how much flat-out misinformation and deliberate distortion there is in the popular culture about what the teachings of the Catholic Church are. This deliberate falsification of Catholic teaching in itself argues strongly for the truth of that teaching. Obviously, those who oppose it do not do so out of any commitment to truth; and obviously, they fear the power of the real teaching, and secretly suspect its truth, or they would reveal it in order to plainly disprove it.
Can someone know the truth, and willfully refuse to accept it? Of course; Adam and Eve did exactly that. It seems to be a part of human nature, completely illogical as it is. T. S. Eliot observed that humankind cannot bear very much reality, and Winston Churchill suggested that most people, when they stumble upon the truth, simply pick themselves up and walk away, as though nothing had happened.
This, of course, argues as much against an easy, uncritical, facile Catholicism as against any other sort of unexamined life. Being Catholic no doubt makes things easier; but it is still up to us to test everything, as Saint Paul required.
Friday, December 04, 2009
Why Kids Always Love School
There is precious little useful data in the social sciences. However, Maryland's Abell Foundation has crunched all the available studies to determine what makes a good teacher. Their analysis, interestingly, arrives at the same conclusions I had already reached working deductively. The evidence, at a minimum, adds some reinforcement to common sense.
First point: studies show that good teachers do make a difference in student performance. Roughly, the students of a good teacher learn twice as fast as those with a bad teacher. It follows that it is indeed worthwhile to try to improve teacher quality. But how do we do this? Insist that they all have proper training in "pedagogy"?
Well, no. Obviously, the Education Schools have a vested interest in proving this is so, and access to the funding to do the relevant studies. Nevertheless, no study to date, according to the Abell survey, has shown that graduates of Ed Schools teach any better, in terms of student achievement, than those who never attended. Studies also indicate that teachers with higher degrees in education (M.Ed., Ed.D.) do not produce better student results. In fact, a table aggregating sixty studies on the issue (Abell, p. 18) seems to plainly show that, on balance, they produce worse student results: the more education a teacher has in education, the worse a teacher he or she becomes.
Since modern teacher education stands or falls on its supposedly "scientific" basis, plainly, it falls. We should be spending no more money on Education Schools, or on hiring teachers with such qualifications. Such training is at best worthless, with some indications that it is actually harming our children.
So, in choosing teachers, what criteria should we use instead?
No surprises here, at least for me. The single clearest indicator of who will be a good teacher is how well the candidate does on a test of verbal ability. The present paper directly cites 20 separate recent studies producing this same result.
That's what good teaching is, in the first instance: being able to explain things clearly (and, even better, engagingly). The traditional training for a career as a teacher in the past, in the West and in many other cultures, was in rhetoric, and the final exam was often a successful performance in a debate.
Now we need to refer to a fact cited here before, and mentioned again in the present report: those entering Ed Schools at every level regularly score lower on the verbal SAT than those majoring in almost any other subject. Leave aside why this is so; it necessarily means that selecting our teachers from Ed School graduates produces a lower quality of instruction than choosing randomly from holders of any other degree. It is not just that the schools do nothing to teach the one essential teaching skill: they also, for whatever reason, repel those who already have it.
The present study theorizes that the greater verbal ability measured in the best teachers is a proxy for greater intelligence, and that this is the relevant factor. It might well be; nothing correlates better with IQ than the size of one's working vocabulary. And studies also show that the graduates of more "selective" colleges do better as teachers than others. They also show that holders of advanced degrees in the subject taught produce better student results, at least at the high school level.
This, too, is simply common sense. After the ability to communicate well, the second logical requirement for a good teacher is knowledge of the subject taught. Intelligence is the ability to learn quickly; and those who attend a better college presumably also learn more in the same amount of time. And this always matters; you cannot teach what you do not know. Even at the elementary level, a good general knowledge, a wide grounding in the history of human thought, makes a difference. This is precisely the value a living teacher can add: fielding those unexpected questions. Otherwise, the student could do as well on their own with a good book. Moreover, someone who has spent a few years devoted to a given subject obviously has some personal enthusiasm for it; if nothing else, enthusiasm and interest show, and tend to rub off.
Unfortunately, as the authors of the Abell study point out, academic excellence of this traditional sort is given no status in Ed Schools. When I was going through the system and looked into the possibility of a teaching degree, back in the mid-seventies, there were absolutely no academic requirements for Ed School in terms of marks or schools attended: if you held the general BA, it was purely first-come, first-served.
Rumours hold that the same attitude persists in the marking of students once in. For, the authors of the present study note, the Ed Schools and the teaching profession have developed a distinctly anti-intellectual culture that resents those with other academic qualifications and backgrounds. They are presumably representatives of the oppressor class.
Rather obviously, by circumventing this system, a private school could easily achieve better results at a lower cost. And that is exactly what we find: private schools achieve consistently better student results at roughly half the cost per student; so does home schooling.
In other words: our Ed Schools are a net drag on our society.
Thursday, December 03, 2009
If You Meet the Holy Father on the Road, Kill Him
Kathy Shaidle is right only about 50% of the time, in my books; but even when she is wrong she is insightful. Often, like most great writers, she makes me pound my fist on the table in sudden recognition, hissing “Yes!” under my breath. In her blog yesterday, she mentions how she hates the sort of Catholic who says "Our Blessed Mother" and "The Holy Father" instead of "Mary" and "the Pope.” Haven't I always thought the same? That kind of “more reverent-than-thou” language offends me too. They always do things “in a special way.” They always wear their miraculous medal in full view. Their saints are made of plaster or alabaster instead of meat and blood—they consider it blasphemy to allow that this or that saint, or pope, or even priest, was ever wrong or ever sinned.
These are, plainly, the sort of people to whom the outward appearance of piety is more important than the reality.
They are, in a word, Pharisees.
Pharisaism is a central problem not just for all religions, but for all professions; false piety is always the first refuge of a scoundrel, and false wisdom of a fool. The worst thing about it is not that it can cause me that urge to regurgitate, or render so much of the content of a typical religious bookshop, say, unreadable; but that it can drive sincere and naturally pious people away from religion.
The good news is that Christianity is at least far more aware of the problem than most of the world's institutions. It is almost the first point made in the New Testament.
Tuesday, December 01, 2009
Darwin and Mein Kampf
Among the lies we are all taught in school is that bit about “Social Darwinism.” You remember it, don't you? As we were solemnly taught, the late nineteenth and early twentieth centuries were cursed with an unfortunate pseudo-scientific misapplication of Darwin's Theory of Evolution, the idea that humans and human societies continue to develop along the lines of natural selection. But of course, this is wrong, right? Evolutionary pressures no longer apply in human society, right?
In fact, it is not a misapplication at all. It is an integral part of the original theory, as argued by Darwin himself. In The Descent of Man, Darwin claimed in so many words that “social qualities” were acquired through natural selection, in a competition of tribe against tribe, race against race, nation against nation.
Darwin accordingly asserted with the perfect scientific objectivity of history's greatest biologist that the “western nations of Europe ... immeasurably surpass their former savage progenitors and stand at the summit of civilization.” He also assumed that “at some future period, not very distant ... the civilized races of man will almost certainly exterminate and replace throughout the world the savage races.”
A rather brutal concept of Empire, surely. Obviously, for Darwin himself, evolutionary pressures continued. But the picture was a bit more complicated than it might first appear. As a Briton, Darwin was no doubt at least a bit concerned by the thought that victory was not always to the most civilized per se, but to the fittest. Civilization had its obvious benefits, but it could be carried too far—in fact, Darwin argued explicitly against allowing evolutionary pressures to cease to apply to human society, on the grounds that any human society that permitted this would eventually be doomed.
“If we do not prevent the reckless, the vicious and otherwise inferior members of society from increasing at a quicker rate than the better class of men,” he warned, “the nation will retrograde.” A younger, more vital, less principled nation might supplant it. Continued advancement required that human beings “must remain subject to a severe struggle.”
The fact of Darwin's own “social Darwinism” has been suppressed, of course, as have his arguments for it, which unfortunately seem to meet in advance the arguments against natural selection applying in human society: if it does not apply to a given society, so much the worse for that society in the end!
This, of course, is because it is only too obvious that Darwin's theory leads directly to the Nazi and Fascist theories of racism, amorality, and the virtues of war and conquest. Everything they asserted, believed, and did, follows directly from Darwin. Germany, Italy and Japan saw themselves as just the sort of young and vital nations to take out the old, “decadent,” “plutocratic” England and France in the evolutionary struggle. Hitler's “Kampf,” as in “Mein Kampf,” was a Darwinian reference. The future belonged to them, of a scientific certainty—unless, that is, they let the Slavs develop far enough to jump them from behind.
Hence an obvious problem for the powers that be: it would seem that either a) Hitler was right, or b) Darwin was wrong, and not just about Social Darwinism, but Darwinism itself. Not wanting to accept either contention, they have essentially suppressed the issue, looked away, chucked the relevant evidence as much as possible down the memory hole.
It was, for example, not because of some fervent belief in a literalist interpretation of Genesis that William Jennings Bryan fought Clarence Darrow over the teaching of evolution in the Scopes “monkey trial.” Bryan, after all, was a politician, not a preacher. He feared that the general acceptance of Darwin's views would destroy liberalism, not Christian fundamentalism—though Bryan quite rightly understood that liberalism depended on a Judeo-Christian foundation.
And Bryan was proved dramatically right, within just a few years.
Unfortunately, the attempt to simply suppress and ignore the issue inevitably leads to the same deductions from Darwin being made all over again, over time. Elements resurface in such movements as feminism, postmodernism, and the new atheism; given enough time, on this path, a full-blown rerun of Nazism is probably inevitable.
But there is another way to look at it. Arguably, World War II itself tended to disprove Darwin. Certainly, there were too many variables to make the experiment truly valid, but Hitler and his allies did everything Darwin suggested they should, and the result went decisively against them.
More broadly, Darwin's fears about the future decay of developed “decadent” nations in which more or less everybody survives and breeds have been proved wrong in an important way. In the absence of survival pressures, the more so with modern social security and health care, according to Darwin, each subsequent generation in the most developed countries ought to be a little less intelligent, and a little less physically able, than the last. “The rich get richer and the poor get children.”
Yet what has actually been happening? Leaving aside the performance of schools, which has apparently been generally declining for independent reasons, we discover that the average IQ, in the most developed countries, has been increasing in recent times by about three percentage points each generation. Meantime, Olympic and other sports records have been falling at regular intervals over the past hundred years, most often to athletes from more developed countries.
This is the opposite of what Darwin's theory predicts. Without natural selection, this general tendency towards physical and mental improvement (i.e., survivability) of the species and the tribe ought to have stopped, then reversed. Darwin's theory, and Darwinian evolution, the idea of “natural selection of random mutations” has thus been demonstrated not to be predictive. In proper scientific terms, it has been proved false.
These data argue instead for an intelligent design: an evolution that is not random, and is not powered by a struggle of all against all.
In fact, it is not a misapplication at all. It is an integral part of the original theory, as argued by Darwin himself. In The Descent of Man, Darwin claimed in so many words that “social qualities” were acquired through natural selection, in a competition of tribe against tribe, race against race, nation against nation.
Darwin accordingly asserted with the perfect scientific objectivity of history's greatest biologist that the “western nations of Europe ... immeasurably surpass their former savage progenitors and stand at the summit of civilization.” He also assumed that “at some future period, not very distant ... the civilized races of man will almost certainly exterminate and replace throughout the world the savage races.”
A rather brutal concept of Empire, surely. Obviously, for Darwin himself, evolutionary pressures continued. But the picture was a bit more complicated than it might first appear. As a Briton, Darwin was no doubt at least a bit concerned by the thought that victory was not always to the most civilized per se, but to the fittest. Civilization had its obvious benefits, but it could be carried too far—in fact, Darwin argued explicitly against allowing evolutionary pressures to cease to apply to human society, on the grounds that any human society that permitted this would eventually be doomed.
“If we do not prevent the reckless, the vicious and otherwise inferior members of society from increasing at a quicker rate than the better class of men,” he warned, “the nation will retrograde.” A younger, more vital, less principled nation might supplant it. Continued advancement required that human beings “must remain subject to a severe struggle.”
The fact of Darwin's own “social Darwinism” has, of course, been suppressed, along with his arguments for it, which unfortunately seem to answer in advance the standard objections to applying natural selection to human society: if selection ceases to operate within one society, so much the worse for that society in the end!
This, of course, is because it is only too obvious that Darwin's theory leads directly to the Nazi and Fascist theories of racism, amorality, and the virtues of war and conquest. Everything they asserted, believed, and did, follows directly from Darwin. Germany, Italy and Japan saw themselves as just the sort of young and vital nations to take out the old, “decadent,” “plutocratic” England and France in the evolutionary struggle. Hitler's “Kampf,” as in “Mein Kampf,” was a Darwinian reference. The future belonged to them, of a scientific certainty—unless, that is, they let the Slavs develop far enough to jump them from behind.
Hence an obvious problem for the powers that be: it would seem that either a) Hitler was right, or b) Darwin was wrong, and not just about Social Darwinism, but Darwinism itself. Not wanting to accept either contention, they have essentially suppressed the issue, looked away, chucked the relevant evidence as much as possible down the memory hole.
It was, for example, not because of some fervent belief in a literalist interpretation of Genesis that William Jennings Bryan fought Clarence Darrow over the teaching of evolution in the Scopes “monkey trial.” Bryan, after all, was a politician, not a preacher. He feared that the general acceptance of Darwin's views would destroy liberalism, not Christian fundamentalism—though Bryan quite rightly understood that liberalism depended on a Judeo-Christian foundation.
And Bryan was proved dramatically right, within just a few years.
Unfortunately, the attempt to simply suppress and ignore the issue inevitably leads to the same deductions from Darwin being made all over again, over time. Elements resurface in such movements as feminism, postmodernism, and the new atheism; given enough time, on this path, a full-blown rerun of Nazism is probably inevitable.
But there is another way to look at it. Arguably, World War II itself tended to disprove Darwin. Certainly, there were too many variables to make the experiment truly valid, but Hitler and his allies did everything Darwin suggested they should, and the result went decisively against them.
More broadly, Darwin's fears about the future decay of developed, “decadent” nations, in which more or less everybody survives and breeds, have been proved wrong in an important way. According to Darwin, in the absence of survival pressures (all the more so with modern social security and health care), each subsequent generation in the most developed countries ought to be a little less intelligent, and a little less physically able, than the last. “The rich get richer and the poor get children.”
Yet what has actually been happening? Leaving aside the performance of schools, which has apparently been declining for independent reasons, we discover that the average IQ in the most developed countries has been rising in recent times by about three IQ points each generation. Meanwhile, Olympic and other sports records have been falling at regular intervals over the past hundred years, most often to athletes from more developed countries.
This is the opposite of what Darwin's theory predicts. Without natural selection, this general tendency towards the physical and mental improvement (i.e., survivability) of the species and the tribe ought to have stopped, then reversed. Darwin's theory, the idea of natural selection acting on random mutations, has thus been demonstrated not to be predictive. In proper scientific terms, it has been proved false.
These data argue instead for an intelligent design: an evolution that is not random, and is not powered by a struggle of all against all.