Wednesday, November 17, 2010

Luddites and The Evolution Machine

When Newton’s “clockwork” universe was “discovered” shortly before the 18th century, European culture had been familiar with mechanical clocks and automata of increasing sophistication for over 400 years. Once primed with their burden of potential energy and released, the ordered yet complex motions of clocks and automata continued without further human intervention - their maker could then walk away and let them run. These clever pieces of engineering provided the prototypes of a new paradigm and they have been used as an instructive model by philosophers ever since. Moreover, early anatomical studies helped the paradigm along. It was natural enough, then, for the interpreters of Newton to use the automaton as an analogy to draw conclusions about the relation of God to His creation. In particular, the notion of a Deistical God who built the universe in the manner of a human engineer and then left it to its own devices is a notion still very much with us today. The advent of Quantum randomness didn’t change the picture very much either: probability, like the tossing of a coin, is also subject to mathematical laws; moreover, everyday observations suggest that the disorder of randomness is often a sign of the absence of intelligent management. Consequently mathematical randomness is inclined to be subsumed under the heading of mechanism. Nowadays those physical mechanisms are portrayed as being so good at creating variety and form that many doubt that a Divine creator and sustainer is needed at all; the successes of mechanism in generating patterns may prompt the idea that somehow mechanism can even “self create”; such a notion may in the final analysis be unintelligible, but it is probably at the back of some people’s minds. The upshot is that the concept of the cosmos as a grand, logically self-sufficient mechanism is now so embedded in our consciousness that many effectively say of God, “I do not need that hypothesis.”

The underlying ideas driving this kind of thinking are very anthropomorphic: gone is the idea that God is so totalizing an entity that He is an environment; instead God is imagined to be in an environment - almost to an extent reminiscent of the Grecian view of gods, gods who have very human attributes and live in a very human environment. The picture is of a God who, much like a human artisan, one day creates a cosmic-sized mechanism that once running needs little sustenance, and which He can then walk out on and leave to manage itself.

Ironically the fundamentalists and anti-evolutionists share in this mindset; they have a sneaking suspicion that the atheists are right and that somehow mechanism, like the machines of the industrial revolution, is likely to put people out of work - even a cosmic designer. Anxious therefore to have a God who doesn’t put Himself out of a job, they downplay the abilities of mechanism to generate form and variety. It is no surprise then that for fundamentalists and anti-evolutionists using mechanism to explain life is bad, bad, bad, whereas using Divine fiat is good, good, good.

There is a strong common gut feeling that the “Law and Disorder” mechanisms of modern physics betoken a regime that can function apart from the presence of God. The underlying anthropomorphism inherent in this form of deism is not only at the root of atheist thinking but also, ironically, not far away in anti-evolutionist thinking. For example, in this blog entry on Uncommon Descent we are presented with an apt metaphor for this sentiment; we hear of “Darwin’s unemployed God” as if, as I have already said, God is a frustrated divine artisan in some Greek myth. It is this sort of anthropomorphic outlook which, I submit, explains why Richard Johns’ paper on self organization appeals to anti-evolutionists. (See my blog post here.) Johns’ thesis tries to show that a law and disorder package cannot effectively be the creator of complex variety and form. This (false) conclusion will find a very receptive audience amongst the anti-evolutionists, because so many of them have a subliminal deistical view that although law and disorder needs little or no divine management, it nevertheless has no right to compete with God as the creator of form and variety. They are therefore anxious to downplay the role of physical mechanism and re-employ God as the miraculous intervener by positing a world that requires large dollops of arbitrary divine fiat.

At the start of his paper Johns uses a polemical method that assists the anti-evolutionist aim of minimizing the role of mechanism. Johns hamstrings any chance that self organization might have been the cause of evolution by explicitly excising any intelligence that could be used to select a dynamical system which would favour evolution. He effectively posits a know-nothing agent who is completely blind to an overview of the situation and thus has no chance of selecting the dynamics that gives self organization a chance. He is not only tying the boxer’s hands behind his back but has blindfolded him as well. Ironically, Johns is kicking God out of His own creation.

In the anti-evilutionist and fundamentalist mind evolution is defined by the absence of contrivance, so it is no surprise that their version of evolution fails to work and that they regard themselves as the only purveyors of authentic intelligent design. They don’t believe that theistic evolutionists can be serious about intelligent design, and as I have already indicated, their subliminal deism will lead them to accuse Theistic Evolutionists of giving God His redundancy notice. That’s in spite of the fact that an evolutionist like Sir John Polkinghorne claims to be an intelligent design creationist. It’s little wonder then that people like John Polkinghorne are not sympathetic to the anti-evolutionists. Neither am I. I wouldn’t say that I’m 100% convinced by standard evolutionary theory (caveat: that may be down to my ignorance of the details of evolutionary theory), but I stand by it partly because of some of the crass philosophy one finds amongst the anti-evolutionists. As Nietzsche said:

Sometimes we remain true to a cause simply because its opponents are unfailingly tasteless. (or stupid – ed)

Sunday, November 14, 2010

YEC Starlight Travel Time: If at first you don’t succeed…


Judging from this article over on the Institute for Creation Research, Young Earth Creationist Russ Humphreys is still beavering away on his geocentric cosmology. The article is dated 1st Nov 2010, but it points back to an article here which in turn refers to some 2007 work by Humphreys where he attempts to perfect his model. I had a look at Humphreys’ ideas in my blog post here, but I haven’t seen this later work. Essentially Humphreys’ idea involves postulating a finite asymmetrical big-bang-like event with the Earth near the centre. His hope is that the resulting space-time metric generates enough time dilation in the vicinity of the Earth to slow down time to the extent that only about 6000 years pass in the Earth’s locale since creation.

One of the problems in Humphreys’ earlier work, the article claims, is that his models “did not provide enough time dilation for nearby stars and galaxies”. Humphreys’ later models, I gather, attempt to address this problem. This difficulty actually brings out just how geocentric Humphreys’ model must be: problems start, little by little, for the YECs at a mere 6000 light years from Earth. At all distances greater than that, light is hard pressed to reach us in time unless the “Humphreys effect” kicks into action in stages for astronomical objects seen beyond that distance. A distance of 6000 light years covers only a small fraction of our own galaxy, let alone the wider environs of the cosmos. Thus the Earth, according to Humphreys, must lie very precisely at the centre of his cosmic model. In fact I think that Humphreys is effectively claiming that the Pioneer gravitational anomaly may be evidence of the special status of Earth’s locale.

I suspect that Humphreys’ ideas will come to grief in time. If the “Humphreys effect” can be thought of as a concentric gravitational field with the Earth centrally placed, then the fact that the metric of that field must at some point in the past have had very steep differentials within a few thousand light years of Earth’s locale would, I guess, considerably distort the shape of our own galaxy. Off the top of my head let me just say that I’m not aware that astronomical observations support such a conclusion.

Anyway, Humphreys is at least showing some respect for science and the integrity of the created order by taking this approach; that is, he is trying to craft a creation narrative that doesn’t make recourse to large dollops of arbitrary Divine fiat. His type of approach is probably the best bet for the YECs - although many YECs may feel uneasy about the fact that according to Humphreys billions of years of history pass in most of the cosmos, implying that there are likely to be planets out there with an “old Earth” geology. As for the ideas of YEC Jason Lisle, they are best binned and forgotten about.

Friday, November 12, 2010

Some Light Relief: Trench Humour on Remembrance Day.

The video below is absolutely priceless: I would normally publish this sort of thing on my Views News and Pews blog but I thought that Quantum Non-Linearity was in need of some lightening up:


Screams of laughter with Pastor Kerney Thomas

Perhaps I have a strange sense of humour but I was legless after watching this guy. He really does seem to be real and not just a spoofer. Who would have guessed that somebody out there would come up with this one and provide us with such an unexpected idiosyncrasy to gawp and wonder at! Think of the sheer tragedy of our world with its wars of mass destruction, numerous sufferings and evils .... and yet in the midst of life’s deeply serious trench warfare it suddenly throws up this sort of peculiar, asinine, frivolous inanity which contrasts so gloriously against the backdrop gravitas of life’s affairs. For a while it breaks the tension of contention and causes roars of laughter.

“There’s nowt so queer as folk” as the saying goes. Some people say that “You can’t make this stuff up!” Well, I once had a jolly good go at making it up:

http://viewsnewsandpews.blogspot.com/2007/05/fragrant-flatulence-blessing.html
http://viewsnewsandpews.blogspot.com/2006/10/moshing-mayhem-coming-to-your-church_13.html
http://viewsnewsandpews.blogspot.com/2006/12/signs-and-blunders.html
http://viewsnewsandpews.blogspot.com/2006/10/soul-searching.html
http://viewsnewsandpews.blogspot.com/2006/11/dont-try-this-at-home.html
http://viewsnewsandpews.blogspot.com/2006/11/plain-truth.html
http://viewsnewsandpews.blogspot.com/2007/10/death-by-sermon.html

Wednesday, November 10, 2010

Self Organisation


According to my reading of Richard Johns, this complex fractal landscape can't happen

As promised in my last two posts, here is a more detailed look at Richard Johns’ paper on self organization. The central part of Johns’ paper is his “Limitative Theorem”, in which he states:

A large, maximally irregular object cannot appear by self organization in any dynamical system whose laws are local and invariant.

Understood in its most literal sense the above statement by Johns is false. But, to be fair, it does rather depend on how one interprets the terms in Johns’ formulation. If these terms are interpreted using excluded-middle “true or false” logic, then Johns’ central thesis is beguiling. However, if they are interpreted in the fuzzy terms of probabilities, which deal with typical cases rather than special cases, then Johns is seen to have a point.

***

If you toss a coin many times you end up with a sequence of heads and tails that can be conveniently represented by a long binary pattern of 1s and 0s. This pattern of 1s and 0s can be broken down into a contiguous series of segments of any chosen length of, say, n tosses. The probability of finding a segment containing k 1s is then given by Bernoulli’s formula:

P(k) = C(n, k) p^k (1 - p)^(n - k)

where p is the probability of heads being thrown and where:

C(n, k) = n! / (k! (n - k)!)

A corollary of this formula is that specific segmental configurations with equal numbers of 1s have equal probability. For example, with n = 10 a segmental configuration such as 1010101010 is just as probable as, say, 1111100000 or 1000101011.
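
As an aside, here is a minimal Python sketch of the two probabilities just discussed, assuming a fair coin. It is purely my own illustration (the function names are my own invention), not anything taken from Johns’ paper:

```python
# A minimal sketch (my own illustration) of the Bernoulli probabilities
# discussed above, assuming a fair coin (p = 0.5).
from math import comb

def prob_of_k_ones(n, k, p=0.5):
    """Bernoulli's formula: probability that a segment of n tosses contains exactly k 1s."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def prob_of_configuration(bits, p=0.5):
    """Probability of one specific segmental configuration, e.g. '1010101010'."""
    k = bits.count("1")
    return p**k * (1 - p)**(len(bits) - k)

if __name__ == "__main__":
    print(prob_of_k_ones(10, 5))                 # ~0.246: any segment with five 1s
    # Specific configurations with equal numbers of 1s are equally probable:
    for s in ("1010101010", "1111100000", "1000101011"):
        print(s, prob_of_configuration(s))       # each 1/1024, i.e. ~0.000977
```
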

Bernoulli’s formula is most likely to be thought of as a device for statistical prediction, inasmuch as it can be used to calculate the expectation frequencies of segmental configurations like 1000101011 or 1010101010 etc. The reason the formula works is that the number of possible sequences that can be constructed from 1s and 0s with segmental frequencies consistent with the formula is overwhelmingly large compared to the number of sequences that deviate from the Bernoulli expectation frequencies. Thus, given that a sequence is generated by a chance source, it is clear that a member taken from the class of sequences conforming to the Bernoulli expectation values is going to be a highly probable outcome.

But statistical prognostication is not the only way that Bernoulli’s formula can be used. It is also possible to use the formula as an indication of the configurational complexity of a sequence. To do this we take a given sequence of 1s and 0s and enumerate the frequencies of its segmental configurations. If the sequence so enumerated returns frequencies consistent with the Bernoulli values then that sequence is deemed to be maximally complex. If, given a set of enumerated segmental frequencies, the number of possible sequences consistent with that enumeration is represented by Z, then Z is maximised for a Bernoulli sequence. If, on the other hand, a sequence deviates from the Bernoulli expectation values, then correspondingly Z will deviate from its maximum value. The departure of Z from its maximum is a measure of the departure of the sequence from maximum complexity. I have dealt with these matters more rigorously elsewhere, but for the purposes of this article we can now get a handle on Richard Johns’ paper. Johns’ paper makes use of something he calls “irregularity”, a concept closely related, if not identical, to the concept of configurational complexity that I have defined above: Johns breaks his patterns up into small chunks and then defines the concept of irregularity as the number of possible configurations consistent with the frequency distribution of the segmental configurations.
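
To make the idea concrete, here is a short Python sketch of one possible reading of Z: break a binary string into chunks, enumerate the segmental frequencies, and count the number of distinct sequences (a multinomial coefficient) consistent with that enumeration. This is my own toy rendering of the concept, not code from Johns’ paper, and the segment length of 4 is an arbitrary choice:

```python
# A toy reading (my own) of the quantity Z: the number of sequences consistent
# with a given enumeration of segmental frequencies, i.e. a multinomial coefficient.
from collections import Counter
from math import factorial

def disorder_Z(bits, n):
    """Count the sequences sharing this string's length-n segmental frequencies."""
    segments = [bits[i:i + n] for i in range(0, len(bits) - n + 1, n)]
    freqs = Counter(segments)
    z = factorial(len(segments))
    for f in freqs.values():
        z //= factorial(f)
    return z

if __name__ == "__main__":
    import random
    random.seed(0)
    ordered = "01" * 40                                           # highly regular
    disordered = "".join(random.choice("01") for _ in range(80))  # coin-toss style
    print(disorder_Z(ordered, 4))     # 1: only one arrangement of identical segments
    print(disorder_Z(disordered, 4))  # vastly larger: near-Bernoulli frequencies
```
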

Johns’ paper is an exploration of the relationship between configurational complexity (or “irregularity” as he calls it) and computational complexity. Computational complexity, as Johns understands it, is measured in terms of the execution time needed to reach a pattern; in particular he is concerned with the time needed to generate configurationally complex patterns. Johns arrives at the conclusion - as stated by his Limitative Theorem above - that maximally irregular patterns cannot appear as a result of self organization. What he actually means by “cannot appear” is that maximally irregular patterns cannot be produced in what he calls a “reasonably short time”. Thus, according to Johns, complex irregular patterns are computationally complex – that is, they need large amounts of execution time.

A few moments’ reflection reveals that there is a correlation between computational complexity and configurational complexity, as Johns submits. This correlation arises simply because of the sheer size of the class of configurationally complex patterns. The value of Z rises very steeply as the segmental frequencies move toward the Bernoulli values. In fact the value of Z far outstrips the number of steps that can be made in a realistic computation. Thus, if we imagine a computation taking place, it is clear that given realistic execution times any computation, whether controlled by a deterministic algorithm or some aleatory process, is only going to be able to visit a relatively small number of the configurations that make up the huge number Z. Thus, it is fairly easy to see that the chances of a computation generating an arbitrarily selected complex configuration in a realistic time are very small. This, then, is essentially Johns’ Limitative Theorem: an arbitrarily selected irregular or complex configuration is very probably going to have a high computational complexity.
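
A back-of-envelope calculation, using illustrative numbers of my own choosing rather than anything from Johns, shows just how hopelessly Z outstrips any realistic computational budget:

```python
# Back-of-envelope sketch: even a fantastically fast computer running for the
# age of the universe visits a negligible fraction of the possible patterns.
# All figures are illustrative assumptions of my own.
seconds_since_big_bang = 4.3e17            # roughly 13.7 billion years
steps_per_second = 1e18                    # a generously fast machine
max_steps = seconds_since_big_bang * steps_per_second      # ~4e35 configurations visited

patterns_of_500_bits = 2 ** 500            # ~3e150 possible 500-bit configurations
print(max_steps / patterns_of_500_bits)    # ~1e-115: the fraction that could ever be visited
```
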

But we must note here that “very probable” doesn’t rule out the possibility of improbable cases where self organization appears to take place in a realistic time. In fact at the start of his paper Johns declares that cases where self organization depends on starting from “fine tuned initial conditions” break the rules of his game and therefore don’t classify as self-organisation. The big problem with Johns’ formulation is over the meaning of his terms; in some people’s books an example of a complex pattern that makes a surprisingly quick appearance from particular initial conditions would still classify as self organization.

Johns’ concept of irregularity, as we have seen, is closely related to the kind of patterns generated by chance processes such as the tossing of a coin. Johns’ “irregular” patterns and those patterns produced by aleatory processes both have high values of Z. That is why I actually prefer to call them “disordered” patterns and refer to the quantity Z as “disorder”. Given this equivalence between Johns’ concept of irregularity and my concept of “disorder”, it is easy to think of examples where an irregular object is generated in a reasonably short time, apparently violating Johns’ Limitative Theorem: I am thinking of cases where elementary algorithms generate pseudo-random sequences in polynomial execution times; such rapidly generated sequences show a high value of Z. But given that there is a certain amount of doubt about the way the terms of Johns’ Limitative Theorem should be interpreted, it is difficult to know whether this example can be considered to violate his thesis. Are a blank memory and a simple algorithm to be considered “fine tuned initial conditions”? In addition we know that when Johns says in his Limitative Theorem that self organization “cannot appear” he is not speaking in absolute terms but actually means that self-organization cannot appear in reasonable execution times. So perhaps we should rewrite Johns’ Limitative Theorem as follows:

A large, arbitrarily selected, maximally irregular object is unlikely to appear in reasonable time by self organization in any dynamical system whose laws are local and invariant.


This more measured form of Johns’ principle allows for the rare cases where self organization happens. For example, we know that some elementary algorithms can reach a limited subset of irregular patterns in a relatively short time, even though it is mathematically impossible for such algorithms to reach all irregular patterns in reasonable execution times. (A sketch of one such algorithm follows below.) It is precisely this kind of exception to the rule that leaves the whole issue of self organisation an open question. Therefore as far as I’m concerned Johns’ paper has not eliminated evolutionary self organization from the inquiry. In fact it’s the same old, same old contention that I have raised time and time again on this blog: complexity is not in general a conserved phenomenon; complexity can be generated from simple symmetrical starting conditions in polynomial time. I have never seen satisfactory cognizance taken of this fact in the anti-evolution community. In fact, as I have suggested before, it is ironic that their very positing of a fabulous creative intelligence subtly subverts their anti-evolutionism; for if evolution is an example of self-organization arising from the right regime of generating laws then we are dealing with a very rare case - a case that the level of creative intelligence envisaged by the anti-evolutionists is presumably more than capable of contriving. With breathtaking irony the anti-evilutionists’ attempts to show the impossibility/improbability of evolution fail to connect the evolutionary conundrum with a concept that looms large in their culture and which stands impassively behind them: namely, Divine Intelligence!
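
By way of illustration, here is the kind of elementary algorithm I have in mind: a bare-bones linear congruential generator (the constants are standard textbook values, and the whole sketch is my own, not Johns’). Started from a trivially simple state, it spins out a long bit string in linear time, and that string scores as highly disordered by the Z measure sketched earlier:

```python
# A sketch of an elementary deterministic algorithm that rapidly generates a
# highly disordered (high-Z) bit string from a trivially simple starting state.
# The constants are standard linear congruential generator values; the example
# is my own illustration, not anything from Johns' paper.
def lcg_bits(length, seed=1, a=1103515245, c=12345, m=2**31):
    """Produce a pseudo-random bit string of the given length."""
    x, bits = seed, []
    for _ in range(length):
        x = (a * x + c) % m
        bits.append(str((x >> 16) & 1))    # keep one mid-range bit per step
    return "".join(bits)

if __name__ == "__main__":
    s = lcg_bits(200)
    print(s)
    # Feeding s into the disorder_Z() sketch above returns a value close to the
    # maximum expected for 200 genuine coin tosses.
```
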

Some Notes on Complexity.
Although Johns rightly points out that biological structures such as DNA score high on the irregularity scale, irregularity (or “disorder” as I prefer to call it) does not capture an important characteristic of biological complexity and therefore fails to bring out the real reason for the relative computational complexity of these structures. Let me explain:

In one sense the class of disordered sequences is actually computationally elementary. For example, imagine being posed the problem of generating any irregular/disordered pattern. The simplest way of solving this problem is to toss a coin: using this method of “computation” we can arrive at an irregular/disordered pattern in as short a time as it takes to make the required number of tosses. So let me suggest that a better measure of complexity than the bare value of Z is given by this equation:

Complexity, C ~ Log (Z/W),

where Z is the disorder of the case sought for and W is the size of the specified class whose members all classify as a “hit”. For example, if we specify that we require any maximally disordered pattern then W ~ Z and so C ~ 0, which expresses the fact that this particular problem is of low computational complexity, because simply flipping a coin is likely to produce what we want. But, on the other hand, if we specify that the pattern generated is to be a self-perpetuating machine, it is very likely that Z will be large and W small, thus returning Log(Z/W) ~ large. Hence, the point I am trying to make is that what makes living things computationally complex is not so much their configurational complexity as measured by Z, but rather a combination of configurational complexity and their rarity.
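
A toy numerical illustration of this measure, with made-up numbers of my own (nothing here is drawn from Johns’ paper):

```python
# Toy illustration (my own numbers) of C ~ Log(Z/W): complexity as disorder
# weighted by the rarity of the specified class of "hits". Natural log used here.
from math import log

def complexity(Z, W):
    return log(Z / W)

Z = 2 ** 100                # a hypothetical, very large disorder value
print(complexity(Z, Z))     # ~0: "any maximally disordered pattern will do" - easy
print(complexity(Z, 1))     # ~69.3: a single rare target, e.g. a self-perpetuating machine - hard
```
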

But even this measure of complexity may only be apposite for typical conditions. If there is such a thing as a class of self-perpetuating complex patterns that can be arrived at in polynomial time from some basic physical algorithms, then it is likely that the number of basic physical algorithms is actually smaller than the configurational complexity of the patterns they produce; in which case the computational complexity of these patterns would be smaller than their configurational complexity! In fact, although we don’t actually get a “free lunch” here, we at least get a fairly cheap lunch!

Whether or not actually existing biological structures are a rare case of evolutionary self organization, I nevertheless suspect that there are incredibly complex self perpetuating structures out there in platonic space that cannot be reached by evolutionary self organization. Just imagine what they must be like!