Sunday, September 25, 2005

Evolution vs. Intelligent Design: False Certainty

I'm immersed in my study of Evolution vs. Intelligent Design Theory (ID). I'm reading essays, books, critiques both ways, etc. Lots of reading. Since I'm immersed in the topic, I also notice articles in newspapers and on the internet about the subject as well.

For example, I read an article from the Associated Press on the internet last week titled:

"Genes Show Signs Brain Still Evolving"

As I was reading the article I was reminded of a very old, and very bad, joke that I heard almost thirty years ago in one of the four calculus courses that I took in college. It goes like this:

Three scientists were riding in a car on their way to a meeting in Scotland. There was an astronomer, a physicist, and a mathematician.

Looking out the window, the astronomer saw an animal standing sideways to the car out in a pasture.

"I didn't know that there were brown cows in Scotland", said the astronomer.

"Wait a minute", said the physicist. "All we can really say is that there is one brown cow in Scotland."

"Actually", smiled the mathematician, "the only thing we know precisely is that there is one cow in Scotland that appears to be brown on at least one side."

Bad joke, I know. You have to be a real science geek to get that joke. I personally love it.

The point of the joke is the level of precision and certainty common to different scientific disciplines: mathematicians are very precise, astronomers less so. What's a billion light years here or there? Over the years I have decided that Darwinian evolution advocates often fall into the less precise category.

I encountered another angle on this concept in my own scientific training in my chosen field: metrology - "the science of measurement". A practical science, but a science. In the course of my training I learned to beware of false precision. In other words, beware of stating the results of a measurement to a higher degree of precision than you could actually achieve with your measuring instruments. Saying "a quarter of an inch" is different than saying "0.250000". This is especially dangerous when you are using calculators to convert measurements and your 8-digit display fools you into thinking you are more precise than you really were.

As an example, I once watched a man measure an eight foot long piece of metal with a standard toolbox 25 foot metal tape measure - not a precise instrument by any means. Using the tape measure, which was marked off probably in 1/8th inch increments, he measured the metal rod. Then, very confidently, he recorded the measurement on a blueprint to four decimal places: 8.6250 inches. He was fooling his readers with that false sense of precision.
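The metrology lesson can be sketched in a few lines of Python. This is my own illustration, not anything from the blueprint story: the function and the readings are hypothetical, and the point is simply that a quoted result should be snapped back to what the instrument's graduations can actually support.

```python
def round_to_increment(value, increment):
    """Snap a raw reading to the nearest graduation on the instrument."""
    return round(value / increment) * increment

# Hypothetical reading from a tape marked in 1/8-inch increments:
raw = 8.6                           # the eye can do no better than the markings
snapped = round_to_increment(raw, 0.125)
print(snapped)                      # 8.625 -- the nearest tape graduation

# Writing this down as "8.6250" implies knowledge to a ten-thousandth of an
# inch; the honest statement is 8 5/8 in., plus or minus about 1/16 in.
```

The trailing zero is the whole problem: "8.625" and "8.6250" are the same number but very different claims about how well you measured.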

Unfortunately, I bring that joke and that training with me when I read scientific articles, and because of them I usually end up heckling and deconstructing the article. Why? For two reasons:

- The headlines of these articles often imply a sense of certainty in the "discovery".

- The body of the article will often start with a degree of precision that the remainder of the article either can't back up or will contradict.

Let's examine our example. Again, the title:

Genes Show Signs Brain Still Evolving

Okay. It starts out confident and promising. "Genes show signs..." Will the article back that up? Let's continue...

WASHINGTON (AP) - The human brain may still be evolving. So suggests new research that tracked changes in two genes thought to help regulate brain growth, changes that appeared well after the rise of modern humans 200,000 years ago.

Wait a minute..."may still be evolving"? My confidence is slipping. But hey, it's backed up by "new research". Okay, I'm still listening.

That the defining feature of humans - our large brains - continued to evolve as recently as 5,800 years ago, and may be doing so today, promises to surprise the average person, if not biologists.

Wait a minute...stepped off a cliff again. Twice.

"...and may be doing so today"? Really? How confident are we again?

"...continued to evolve as recently as 5,800 years ago". Really?

5,800 years ago? Not 5,700? Not 5,900? How about almost 6,000 years ago? How about 6,000 plus or minus 1000 years ago? Are you sure? Are you that sure? Does the validity of this discovery hinge on how sure you are?

Let's press on and see how sure we are.

"We, including scientists, have considered ourselves as sort of the pinnacle of evolution," noted lead researcher Bruce Lahn, a University of Chicago geneticist whose studies appear in Friday's edition of the journal Science.

"There's a sense we as humans have kind of peaked," agreed Greg Wray, director of Duke University's Center for Evolutionary Genomics. "A different way to look at is it's almost impossible for evolution not to happen."

Wow, that sounds pretty sure. So sure, that "it's almost impossible for evolution not to happen." But then again, how likely is it that a director of a "Center for Evolutionary Genomics" is going to speak up and say that genetics don't show evolution? Really now. He'd have trouble finding a seat at the center's Darwin party. Let's see if anyone disagrees with all of this certainty.

Still, the findings also are controversial, because it's far from clear what effect the genetic changes had or if they arose when Lahn's "molecular clock" suggests - at roughly the same time period as some cultural achievements, including written language and the development of cities.

Lahn and colleagues examined two genes, named microcephalin and ASPM, that are connected to brain size. If those genes don't work, babies are born with severely small brains, called microcephaly.

Using DNA samples from ethnically diverse populations, they identified a collection of variations in each gene that occurred with unusually high frequency. In fact, the variations were so common they couldn't be accidental mutations but instead were probably due to natural selection, where genetic changes that are favorable to a species quickly gain a foothold and begin to spread, the researchers report.

Lahn offers an analogy: Medieval monks would copy manuscripts and each copy would inevitably contain errors - accidental mutations. Years later, a ruler declares one of those copies the definitive manuscript, and a rush is on to make many copies of that version - so whatever changes from the original are in this presumed important copy become widely disseminated.

Scientists attempt to date genetic changes by tracing back to such spread, using a statistical model that assumes genes have a certain mutation rate over time.
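The "molecular clock" described above can be sketched as a toy calculation. Nothing here comes from Lahn's actual model: the mutation count and rates below are made-up numbers, chosen only to show how a point estimate is produced and how uncertainty in the assumed rate translates directly into a very wide age range.

```python
def clock_age(mutations, rate_per_year):
    """Point estimate from a constant-rate clock: time = mutations / rate."""
    return mutations / rate_per_year

def clock_age_range(mutations, rate_low, rate_high):
    """A faster assumed rate gives a younger age, a slower rate an older one."""
    return clock_age(mutations, rate_high), clock_age(mutations, rate_low)

m = 29                                    # hypothetical observed mutation count
best = clock_age(m, 0.005)                # best-guess rate (mutations/year)
low, high = clock_age_range(m, 0.002, 0.010)
print(round(best), round(low), round(high))   # 5800 2900 14500
```

A factor-of-a-few uncertainty in the assumed rate swamps the point estimate, which is exactly why a headline number like "5,800 years" deserves suspicion when the underlying range spans an order of magnitude.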

For the microcephalin gene, the variation arose about 37,000 years ago, about the time period when art, music and tool-making were emerging, Lahn said. For ASPM, the variation arose about 5,800 years ago, roughly correlating with the development of written language, spread of agriculture and development of cities, he said.

"The genetic evolution of humans in the very recent past might in some ways be linked to the cultural evolution," he said.

Other scientists urge great caution in interpreting the research.

That the genetic changes have anything to do with brain size or intelligence "is totally unproven and potentially dangerous territory to get into with such sketchy data," stressed Dr. Francis Collins, director of the National Human Genome Research Institute.

Aside from not knowing what the gene variants actually do, no one knows how precise the model Lahn used to date them is, Collins added.


Uh oh. Trouble in the tribe. Let's bypass for a moment the accuracy of the "molecular clock". I'm just going to relish the quote that it's "totally unproven and potentially dangerous territory to get into with such sketchy data". Delicious.

But did anyone at least press the study's author about the 5,800 years? Let's look:

Lahn's own calculations acknowledge that the microcephalin variant could have arisen anywhere from 14,000 to 60,000 years ago, and that the uncertainty about the ASPM variant ranged from 500 to 14,000 years ago.

Those criticisms are particularly important, Collins said, because Lahn's testing did find geographic differences in populations harboring the gene variants today. They were less common in sub-Saharan African populations, for example.

That does not mean one population is smarter than another, Lahn and other scientists stressed, noting that numerous other genes are key to brain development.

"There's just no correlation," said Duke's Wray, calling education and other environmental factors more important for intelligence than DNA anyway.


Uh oh. Not quite the precision that "5,800 years" implied. Not that I believe the other numbers either. 14,000 to 60,000 years ago? Really? I guess if you have better numbers, use them.

After you re-read the total article a couple of times you have to ask yourself two questions:

1. How many "probably"s and "may have"s does it take for a discovery to become just an inconsequential statement made by someone trying to justify their research?

2. How - in any way, shape, or form - does this article hold up and have any news value for the Associated Press to report?

But, I must say, I had fun heckling it.
