Btw, if you've ever wondered what a complete idiot writes like, you could do worse than read this attack on Drysdale's review. Completely missing the basic soundness of the critique and buying into Wolfram's previously mentioned catch-all defense against academic criticism, Stephen Harris basically says nothing, and does so very angrily. In contrast to Drysdale's careful reading of all of Wolfram's book, his piece hinges on a single one of Drysdale's criticisms - and fails to say anything important even about that.
Just as I was contemplating getting into serious debunking mode myself, I found an extremely thorough Review of 'A New Kind of Science' by a guy named David Drysdale, which does much of the debunking one could want, although it leaves a few of the 'fundamental' claims that need debunking untouched.
All in all the review is good (which the book isn't), but it should point out that some of the new laws of nature Wolfram generously takes credit for are not new and maybe not even surprising - according to other reviewers this includes the main Principle of Computational Equivalence, and it certainly includes some of the claims of 'groundbreaking' new sources of randomness and complexity from simple rules.
What is particularly great about the review is that it is based on careful reading notes that follow the text with great accuracy and attention to detail. I particularly liked the dry comments in the notes on Wolfram's catch-all defense of the bloated, overstated, self-aware writing style (under the heading Clarity and Modesty).
The champion of 'humanistic intelligence', Steve Mann, has defined a new buzzword: sousveillance. As a piece of language the term is a bit contrived, but the idea is interesting. The claim is that what's bad about surveillance is not really the act of surveillance itself, but rather the secrecy and the dehumanizing anonymity of eyes in the sky.
He proposes another way to add further introspection to the commons, one that keeps society open but still makes the world smaller and safer - namely surveillance 'from below' (hence the new term: 'sur' is 'over', 'sous' is 'under'), meaning technology-enhanced one-to-one interactions as the legitimate way to information-augment society. His thinking is that the real problem is not the permanent record - being remembered and held accountable for individual actions - but rather the correlating, anonymous collection of data.
It's an interesting idea to think about, even if the idea of the 'sousveillance society' remains rather half-baked. It has a strong interplay with ideas of identity in web-space, where signed, encrypted person-to-person exchanges of identity tokens are beginning to take shape alongside more big-brotherian identity from above, pushed by governments and big corporations. (As an aside, Orwell would have loved the newspeak inherent in the name Liberty Alliance. What has emerged from this organization is nothing but big-iron, heavy-handed organizational control of identity. No freedom inside.)
The debate over whether Bjørn Lomborg's book The Skeptical Environmentalist is science has been confused for a number of reasons.
First of all, if it is science it is interdisciplinary, and different fields have different ideas about what constitutes science. Actually, calling it interdisciplinary is wrong. It is a politically charged debate about a social science problem of allocation, drawing its subject matter and its notions of cost and utility from the natural sciences.
So it is not really 'inter'-disciplinary in the sense that there are two-way connections between the three subject layers (or there shouldn't be - Lomborg would maybe claim there is, this being part of his criticism of the environmental sciences). There are three distinct layers to the book and each of these - were the book a little better - would be distinguishable and debatable separately.
It is disconcerting that neither the author nor his defenders nor his critics carry out this straightforward separation of concerns, since the failure to do so renders the debate essentially meaningless.
The high profile of the debate is due entirely to the politics of the book, not the social science or natural science material. The proponents of the book are almost exclusively proponents of the political or social science discussions, whereas the criticism from the natural sciences has been quite unanimous (of course there is argument about the findings the book draws its numbers from - that is what open scientific debate is about - but the majority vote as well as most of the arguments presented are against the book's findings).
So the debate on whether or not the book constitutes science is an unhappy one. There clearly should be no censorship of the political material or even of the allocation debates (which renders the verdict on the book's scientific standing meaningless), but Lomborg should know better than to pass off as science an enormous accumulation of numbers from an unmanageable number of sources - with a complete lack of consistency as a consequence, and with some important factual and methodological errors in the treatment of the natural science parts of the material. The methodological failures are the most damning. Lomborg's book, and to a large extent his rebuttals of criticism, completely fail to accept that there are methodological limits to what statements you can sensibly make about the numbers he quotes so very many of. Lomborg reads the material like the devil reads the Bible.
So the conclusion remains that Lomborg should take care not to pass off the natural science parts of his book as anything remotely resembling science, and the committee on scientific conduct could then simply have chosen not to comment on the book.
How nice that would have been: an open debate on the politics of choice and allocation, without accusations of inquisition from one camp and of criminal stupidity from the other. If only Bjørn Lomborg had had the character and the conscience to say what Daniel Dennett says in the preface of his admirable book on evolution:
This book is largely about science but is not itself a work of science. Science is not done by quoting authorities, however eloquent and eminent, and then evaluating their arguments. Scientists do, however, quite properly persist in holding forth, in popular and not-so-popular books and essays, putting forward their interpretations of the work in the lab and the field, and trying to influence their fellow scientists. When I quote them, rhetoric and all, I am doing what they are doing: engaging in persuasion.
In fact Lomborg is doubly guilty of not holding to Dennett's view of science. Not only does he fail to take proper care to distinguish between what he has primary knowledge of himself and what he knows only at second hand; his material itself is in most cases not the original science, but rather numbers drawn from the persuasion - not the science - of others. Much of the material he uses is secondary (UN and other organisations' reports prepared for decision-makers, i.e. lay readers, not scientists) or tertiary (newspaper commentary on such reports).
Wired News is asking the same question I did in a previous post: Why Did Google Want Blogger?
The focus here is on PageRank for RSS. One could even imagine something like PageRank extensions for RSS, i.e. letting the XML feed itself carry the kind of data that e.g. Movable Type's TrackBack enables.
The description given sounds not too far off from things that would appear in a Myers-Briggs Type Indicator test. The whole concept of the test reminds me of the famous In the Beginning was the Command Line essay by Neal Stephenson.
A short article in the latest Wired summarizes some modern positions in the environment-versus-inheritance debate, i.e. the big question of how much our lives determine our selves and how much is predetermined by our genes.
The derogative term for this kind of writing is list journalism - sacrificing both form and substance by simply hoarding facts without conclusions. But as lists go, this is quite decent. First of all there is a nice off-hand conclusion about the failure of politics in the networked world, broken down as it is into micro-discussions taking place in 'ever more precise ideological niches'. Secondly, the summaries of what some of these micro-discussions mean in terms of really basic questions like this one are well done, tying very diverse questions about freedom, sexuality, economics and technology together by their different attachments to this basic question.
My big brother, who is a sociologist, would probably - taking cues both from the history of philosophy and from some empirical work he has done on Bourdieu's concept of lifestyle - claim that the rift in the debate is an even deeper one, with factions who are philosophically materialist on one side and factions who are philosophically idealist on the other. The idealists would come out in favour of the genetic explanation as a kind of 'destiny' - an externalized source of meaning - whereas the materialists would favour the environmental theory as a source of rationalized behaviourism.
The list Johnson puts forward indicates how the everyday politics of the issues sometimes turn these things upside down: religious conservatives have a hard time with the science of genetics and say 'nature' - but surely their 'nature' is really much more akin to the genetic explanation, with plenty of predetermined moral behaviour, only determined not by DNA but by divine virtue. On the other hand, gay-rights activists trying to give homosexuality a grounding deeper than mere behaviour (which one could then debate as more or less 'natural') embrace genetic predetermination, even though you would expect them to favour the argument from personal freedom.
Economists, environmentalists and feminists on the other hand are right on track in their lifestyle niches.
A memo by an ex-Microsoft program manager on Microsoft and the Commoditization of Software sounds a lot like classic disruption speak.
The gist of the essay is that David Stutz (for that is his name) believes Microsoft is doing exactly what the textbook says: Enhancing margins by focusing ruthlessly on the moneymakers of the company: The rich client that is the OS and MS Office combined.
Of course MS is betting on just about everything - branded phones, PDAs, and self-invented laptop replacements, i.e. the Tablet PC. But all of these efforts are exactly that: extensions of the desktop brand, according to Mr. Stutz. The question remains, of course, whether people will stay mainly tethered to the desktop and thus continue to value the connections to it from the new devices. Looking at the mobile market, where jobless teenagers are key drivers of tech, it sounds like a hard sell.
They couldn't care less about Office and MS Exchange integration.
Gibson claims that apophenia - the spontaneous perception of connections and meaningfulness between unrelated phenomena - is what I'm suffering from when I insist on the Cayce-Case comparison (if it's not by explicit design, I would suspect some unconscious reuse on Gibson's part before accepting that, though).
This sounds like a diagnosis ready to replace currently common but physically hard-to-quantify diagnoses such as fibromyalgia, chronic fatigue syndrome, and whiplash. Certainly there's a tendency to connect too many unrelated phenomena - a natural reaction to an increasingly complex world - and the fact that this is now a diagnosis, and not just a description of a particular culture-creating agency of the brain, is an interesting indication of the disconnect between the modern rationality embodied in the information society and the pre-modern religious brain that keeps manifesting itself in new ways all the time.
In this Salon interview, William Gibson claims there is no connection between the Case of Neuromancer and the Cayce of Pattern Recognition, as I previously discussed there might be.
I can only repeat that the characters share more than their name. Both are gifted, i.e. supernaturally in sync with the futuristic environment the action takes place in; both are fragile and protected by a strong guardian; and, as previously remarked, the flow of the action and the parts played by secondary characters are remarkably analogous. So Gibson's claim is not quite believable - or, if it is true, Gibson has once again found the very same story that he found in a distant future in Neuromancer.
I'll just repeat and rephrase the story analogues: in both stories the action is spun around a prodigy, a protector and a contractor.
The contractor is almost omnipotent - but lacks of course the powers of the prodigy.
In both stories the contractor is - in the end - met by an opposing equally potent power, and the action is moved physically to the territory of this contrary power.
Whether these analogies are merely broad storytelling conventions (the idea of physically contracting the space the action takes place in to heighten the suspense is used in many places, e.g. every James Bond movie in existence, and the prodigy at work between duelling omnipotent forces we of course recognise in the story of Jesus), or very specifically made identical, is anyone's guess, but the spin given to the character of the prodigy is remarkably identical.
While gifted, the prodigy ends up as a kind of side character in the final resolution of the action (in which the omnipotent powers step onto the scene on their own instead of just through other side characters), defined throughout by the combination of fragility and a mental gift.
Google is showing more signs of Yahoo envy - as if Froogle wasn't enough - with the purchase of Blogger, or Pyra Labs as the company behind Blogger is known.
Exactly which part of Google this ties into is not that easy to say: weblogs are similar to threaded dialog (so groups.google.com comes to mind), they are mostly commentary on news of some kind (i.e. news themselves), and they are of course simply webpages. It seems most likely that it is the news angle and aggregation that will get play as a valuable new source of information on the web. Consistently applying PageRank or collaborative filtering methods to trackbacks and the like would be a valuable addition to the blogosphere, and if Google maintains its "you don't have to be evil" approach and drives open standards for indexable metadata on weblogs instead of a proprietary Blogger standard, that would be just great.
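Just to make the idea concrete, here is a minimal sketch of how PageRank-style scoring could run over a graph of weblogs connected by trackbacks. It is a toy power iteration, not Google's actual algorithm, and the weblog names and link data are invented for illustration.

```python
# Toy PageRank over a hypothetical trackback graph - not Google's real method.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each weblog to the weblogs it trackbacks/links to."""
    nodes = set(links) | {t for targets in links.values() for t in targets}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new_rank = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for source, targets in links.items():
            if not targets:
                continue
            share = damping * rank[source] / len(targets)
            for target in targets:
                new_rank[target] += share
        # Dangling nodes (no outgoing links) redistribute their rank evenly.
        dangling = damping * sum(rank[n] for n in nodes if not links.get(n))
        for n in nodes:
            new_rank[n] += dangling / len(nodes)
        rank = new_rank
    return rank

if __name__ == "__main__":
    trackbacks = {  # made-up trackback graph
        "weblog-a.example": ["weblog-b.example", "weblog-c.example"],
        "weblog-b.example": ["weblog-c.example"],
        "weblog-c.example": ["weblog-a.example"],
    }
    for blog, score in sorted(pagerank(trackbacks).items(), key=lambda kv: -kv[1]):
        print(f"{score:.3f}  {blog}")
```

Run over a real crawl of trackback or RSS link data, the same scoring would surface the most-referenced weblogs, which is roughly the kind of open, indexable metadata alluded to above.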
Alternatively, as a way to build community around Google's own news and search pages (the Google cache is of course a favourite source of permalinks to off-blog resources), this move is also much more meaningful than some kind of my.google.com idea...
If it's not that, it looks a little too much like simply buying a valuable community of home page builders who are used to paying for premium service, even if the home pages are of a modern, more meaningful kind than the usual Geocities "Hello World" page of the Mahir kind.
So on my day off, while contemplating the fate of NATO, I went looking for some links to acidic rhetoric (or is rhetoric base - I forget). The best page on the Simpsons-invented insult 'cheese eating surrender monkeys' must be this one.
In fact this guy is funny all the way. Among the good stuff he's dug up is a reality-show golddigger who freudian-slips and confuses 'merciful' or 'missionary' with 'mercenary'...
Interesting things are happening in identity space through the use of FOAF. It's identity done the way it should be, namely first and foremost as assertions by an individual - not assertions of an individual's relationship to government or corporate entities.
What is on the table is a person-to-person identification service for the fastest growing two-way web there is, namely the space of weblogs. An implementation by the always-on Ben Trott is already available. So now there is a way to handshake when using a weblog, which of course makes your digital space (your weblog) much more lived-in. While I work for a company with technology to do something like this, I think it can only happen the way it is happening now: from the ground up, driven by the rightful owners of the information. The organizational push of identity from above is a completely different matter, of course.
The second the information is published, privacy concerns of course kick in.
And these privacy concerns are not so easy to put to rest. FOAF affords signing and encryption of content with PGP, and that works nicely for point-to-point connections, but encryption completely screws up your ability to casually meet new people and thus the ability to establish public space. To establish a public space you need (in the digital world) indexing.
And that means you're left with a dilemma: You want to publish information about yourself to establish a public persona, but you want to keep some information private (for obvious reasons).
The technical fix that applies is simple: establish the notion of a persona (a role distinct from the identity) explicitly, and provide barriers of privacy by establishing a layered presence around that persona. To simplify your interactions you want these personas to interrelate: the private You can of course act directly on behalf of the amazon-shopping You and can sign on behalf of the amazon-shopping You, but not vice versa. The amazon-shopping You and the google-searching You don't know anything about one another, nor should they (or sinister behind-the-scenes information aggregators will be able to correlate way too many of your online behaviours).
Note that the PGP signing approach is too simplified for this kind of thing. Your PGP public key is as dangerous as a publicly known social security number when it comes to learning things about you by aggregating information from various sources that should have been kept private.
A model like that would mirror the way we organize our physical world. Different relationships you have with different entities afford different rights to know stuff about you and to act on your behalf.
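To make the layering concrete, here is a toy sketch of the persona model described above: the private persona can derive narrower personas and sign on their behalf, but never the other way around, and sibling personas know nothing about each other. The class names and the hash-based 'signing' are purely illustrative - this is not FOAF, PGP, or any existing identity protocol, just the shape of the idea.

```python
# Toy model of layered personas with one-way delegation (illustrative only).
import hashlib

class Persona:
    def __init__(self, name, parent=None):
        self.name = name
        self._parent = parent  # only the persona itself knows its parent
        self._secret = hashlib.sha256(name.encode()).hexdigest()  # stand-in for a real key

    def derive(self, name):
        """Create a narrower persona (e.g. 'amazon-shopper') from this one."""
        return Persona(f"{self.name}/{name}", parent=self)

    def sign(self, message, as_persona=None):
        """A persona may sign as itself or as any persona derived from it -
        never for an ancestor or a sibling."""
        target = as_persona or self
        if not target._is_descendant_of(self):
            raise PermissionError(f"{self.name} may not sign for {target.name}")
        digest = hashlib.sha256((target._secret + message).encode()).hexdigest()
        return (target.name, digest)

    def _is_descendant_of(self, other):
        node = self
        while node is not None:
            if node is other:
                return True
            node = node._parent
        return False

if __name__ == "__main__":
    private = Persona("private-me")
    shopper = private.derive("amazon-shopper")
    searcher = private.derive("google-searcher")

    print(private.sign("order #42", as_persona=shopper))  # allowed: parent acts for child
    try:
        shopper.sign("anything", as_persona=private)       # forbidden: child cannot act as parent
    except PermissionError as e:
        print("blocked:", e)
    try:
        shopper.sign("anything", as_persona=searcher)      # siblings know nothing of each other
    except PermissionError as e:
        print("blocked:", e)
```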
Interestingly, for any kind of privacy to remain you need quite oblique namespaces to hold these personas, or the namespaces themselves will give the private person away, just like the PGP key did. The humanly accessible, rememberable space for this kind of technology must be rights-based and strictly point-to-point.
What that means is that the namespace of today cannot be used as a safe public space for individuals. You need new dynamic services with a rights system built in. They can be aggregated in the very loose-knit fashion of DNS (in fact the notion of zones makes a lot of sense as something akin to personas), but the descent through zones would have to occur at the client and could not safely be aggregated via a server. What that means for the network intensity of this interaction is unclear to me.
If you're not following Warblogging you should be. It is a very good review of the dangers to American freedom in the response to 9/11 - dangers America is failing entirely to deal with. All the freedom-invading legislation, and the practices in the face of that legislation, are reviewed, as is the current NATO crisis, which is of course a sad case of no one doing the right thing. The decision by France and Germany to up the ante and force the decommissioning of NATO is the most terrible abuse of an international organization for local ends I can think of. The idea that the veto is anything but symbolic is ridiculous: France and Germany must know that what is going to happen will happen regardless of the veto; they have only assured the destruction of NATO along with the destruction of Iraq. It looks mainly like domestic-policy use of NATO run amok.
In contrast - even if you do disagree with Bush - you can't really accuse him of doing this as a purely domestic policy initiative, since it might very well be political suicide. If the war drags on into 2004, Bush will not look good as the creator of the next Vietnam, i.e. an unclear conflict on unfriendly soil against an unclear group of enemies without a clear goal. Clearly a war that cannot be won.
The changes in legislation to curb terrorism are a lot scarier than this, since they erode the democracy of the world's only superpower - 'our former ally', as we may soon have to say. Arguments for the legislation and the limitation of rights invariably invoke an image of martial law, i.e. a time-limited, voluntary restriction of rights to defeat the enemy. But the war on terrorism will not be won in any clear sense of the word. That's not to say it shouldn't be fought - I think by and large it should - it's just that it will never end, or at least it will not end as a result of American actions. So this legislation will never go away again. And that is a very real threat to democracy. It is not really important whether it is to protect the personal safety of a dictator or the public safety of most of the population; if a society is based on a basic and widespread fear of the anonymous citizen, it can never be a democracy. The new legislation being proposed seems very much to be that.
The politics of George Paine (anti-Bush and anti-war) are clear, but even if you disagree the summary of the situation he offers is sober and worthwhile.
And regarding this idiocy debate (see below): DR has unfortunately decided, in an attempt to resemble TV2 to the point of nausea, to throw the integrity of a few more programmes overboard and ask 'What do you think?' instead. The standing idiocy programme is '19 Direkte', of course. It is 100% 'What do you think?'. Sometimes that is reasonable, because pure opinions are being discussed, but often the questions are ones where it is foolish to just ask for an opinion.
DR2 - the channel that is trying to live down its commercially lethal seriousness - is well aboard the bandwagon, however. It has run series on alternative medicine that were uncritical sales pitches for whatever form of therapy a given programme happened to be about. The other day even the news desk (Udefra) succumbed to the Mission Capricorn-inspired nonsense about NASA never having been on the moon. And right now a series is running about getting ahead in your career, in which a test subject tries NLP, Body SDS, and career counselling.
When I see these alternative suggestions, I always wonder about the good, well-proven methods that are left out. Among the more old-fashioned ways of getting ahead - traditional in Europe, but not mentioned - are
Normally I don't review what my big brother writes in the media, but this column is really good, and it touches on some of the things that are really important for getting out of the stagnant pool of opinion Denmark has ended up in.
Neo-idiocy is Henrik's term for the tendency for all sorts of foolish stories about everyday life - with no grounding in reality, and without the systematics and coherence of the sciences or the intent and coherence of the political sphere (note the recurring word coherence) - to be brought to market as respectable alternatives to the standard answers of the sciences or of civil society. It happens more and more, and media Danes apparently couldn't care less that they are burying all rationality by running any old nonsense as news because there happens, for one reason or another, to be an interest in it. They should be ashamed of themselves.
It is of course legitimate to discuss alternatives to the views produced by society's knowledge systems, but it is not legitimate to replace the weighing of explanations against evidence and against their coherence with other reasonable explanations with what is now so universal in modern journalism: 'What do you think?'. To put it bluntly, it is completely irrelevant what viewers think about the beneficial effect of eating carrots on lung cancer, because they don't know anything about it.
'What do you think?' has been the answer to the rising complexity the journalist is asked to convey. It is a total surrender, within the media apparatus itself, in the face of the task of enabling the reader/viewer/listener to make a reasonable judgment - and, not least, a judgment of the right kind.
One comment on the article says: good start, but 'who then are the right experts?' - and that is getting the question completely wrong. The point is not that some special truth-witnesses should be doing the talking; quite the contrary. Truth-witnesses are in every case the worst informants you can have on a subject, which is in itself a reason not to believe clairvoyants and psychics - none of us has a special gift of sight. There are, however, people who are informed (through hard work, not divine light) about the content and form of explanation that long and careful investigation has shown to yield useful insight into a given subject, and that is exactly what this is about. The explanatory situation has to be the right one.
Incidentally, the whole discussion - both its emergence and the attitude to it I am expressing - can be read straight out of my favourite book of the moment, Det Hyperkomplekse Samfund. There the discussion of forms of knowledge and the explanatory situation can also be found, better and more thoroughly described.
Interestingly, the site called StagNation does not contain gay porn as suggested by the name, but rather a criticism of the failure of invention in aerospace since the 70s.
William Gibson's new book looked to be really great judging from press coverage, and the topic of the book seemed completely in tune with the times, so I hurried up and read it. It was less fantastic than I had hoped, although I did enjoy myself while reading it. There are plenty of nice touches in the cultural observations and the language, but in the end the story is too trivial to be interesting and comes apart at the end, as in other Gibson novels.
And in fact even some of the language seems stale. The mentions of Google are so frequent you'd think he was getting paid by the word.
The notion of brand allergy is great fun though, and of course the invention of 'Naomi Klein Clothing' - that particular brand of clothing that is brand-less - completely captures the trap that Naomi Klein and cohorts simply can't get out of: it's not the brands that are making us idea-conscious. It's our idea-consciousness that makes brands a possibility. Or, to paraphrase Frank Zappa playing to a room full of a previous generation of left-wing anti-establishment types: "Don't kid yourself - everyone in this room is wearing a uniform".
If you want a second opinion - read the Wired review.
Seeing as steganography (the hiding of information invisibly within another information source) is a theme in the book, one can't help but notice the many similarities to Neuromancer, Gibson's first book.
Our protagonist is called Case (albeit spelled differently this time). The plot takes the form of a global search for information involving strange and shady characters. Behind the scenes, mighty conspiracies of vast power juggle for supremacy. Our hero is physically insignificant but is provided by the powers running the action with a physically superior watchdog. And in both novels the powers running the show emerge successful in the end and continue to run the show, even stronger than before.
So much for narrative pattern recognition, as for actual steganography, i.e. formal differences between the novels revealing new meaning, I doubt there is any.
If this guy doesn't deserve a little free advertising, nobody does.
For the tiny sum of $20 you can - if you're stupid enough - buy a short-time-only drawing of a slogan of your choice on a not too manly looking chest. Available as a 640x400 JPEG image.
It's not that I don't use IE, but Microsoft seem intent on being big enough assholes that one would go out of one's way to stop using it.
Recently they've once again deliberately broken the browsing experience for other browsers for no reason at all, by sending bogus data to you if you're using Opera. What's next? A patriotism defence for The Great American Browser?
Movable Type's standard page style is a very nice implementation of a "publish or perish" algorithm: links aren't shown if they are too old, and the right-aligned menu disappears below the main content if the main content doesn't take up enough space.
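I don't know how the actual templates implement it, but the link-aging half of the idea is easy to sketch; the cutoff and the sample blogroll below are made up.

```python
# A sketch of the 'publish or perish' effect: only links updated recently
# enough survive onto the page. Cutoff and sample data are invented.
from datetime import date, timedelta

def links_to_show(links, today, max_age_days=30):
    """Keep only links whose last update is within max_age_days."""
    cutoff = today - timedelta(days=max_age_days)
    return [(name, updated) for name, updated in links if updated >= cutoff]

if __name__ == "__main__":
    today = date(2003, 2, 20)
    blogroll = [
        ("fresh-weblog.example", date(2003, 2, 18)),
        ("stale-weblog.example", date(2002, 11, 2)),
    ]
    for name, updated in links_to_show(blogroll, today):
        print(name, updated)  # only the fresh link survives
```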
Disappointed as I was by Small Pieces Loosely Joined - in particular by the naive descriptions of the 1-1 society and the 'global village' approach to networked life - I was happy to find a much heavier (in content, not pages) book that addressed the problem of information saturation in the networked society within the first five pages! I am talking about The Hypercomplex Society, which is just about in print in an English edition but came out in Danish in 1998.
This is a brilliant book by a Danish professor of multimedia, Lars Qvortrup, and it is good to see it being translated into English. The Danish edition that I have is extremely well written and kept in a light-hearted style, in a mock dialogue with Qvortrup's daughters, who are supposedly preparing an essay for school on the information society. The text makes no excuses about explaining the content with reference to a host of philosophers and social scientists, with quotes dating from 1486 to the present.
The book does what Small Pieces fails at in trying - namely present the nature of the networked society from a human perspective as opposed to a technological perspective. It is refreshing to get a view on the impact of technology in the networked society that explains this society entirely in terms of the ideas and developments driving people and society, and not in terms of the enabling technology. The concept driving the book - the hypercomplex society - is a real eye-opener for me, in pointing out the fundamentally changed role people play in this new society. This is not just mysticism but a workable rational theory about people, communication, and networks. It manages to be very simple to explain and still to have great explanatory power in describing our present society.
The idea makes perfect sense, and furthermore it also makes sense of the development in our understanding of meaning since the ideas about safe universal meaning finally collapsed completely in the 1920s. It's easy to feel (or at least it was during the postmodern wave of the 80s) that the collapse of universal meaning has left no meaning at all, but this book does a fine job of delineating what boundaries we have to accept for meaning while still recovering the meaning that is left for us to share.
If you're conversant with the literature it's based on, this should not be shocking, but the ideas are presented in a very straightforward and understandable style, considering the depth of the material. You're greatly helped in your reading if you know your history of ideas up until 1900, but for later developments the text is fairly self-contained.
A word of warning for the casual reader: the English edition promises to be much more scholarly and probably more condensed, being a reworking of three Danish-language titles into one monograph.
In a previous post I hyperlinked a short text by Clayton M. Christensen - remarking then that the article may have just been a rehash of the ideas from his bestselling book The Innovator's Dilemma. It was.
It is a very interesting book and it is well researched, with precise accounts of cases of disruptive and sustaining innovation in a number of different markets.
The ideas presented are clear, and clearly backed up by the cases given. After having read it, though, one can't help being left with two feelings:
As to the first point: the description of Christensen's discovery is quite adequate in the Tech Review article. What is not there are the accounts of how to deal with disruptive change, as well, of course, as the detailed case studies.
As to the second point: there is only a passing reference to the almost complete analogy between Christensen's ideas and the theory of evolution.
We can give his findings in biological terms, cluetrain style. First, in Christensen's terms:
The analogy is perfect and explains Christensen's findings. The fact that companies fail to see the disruptions as competition may be seen simply as a case of the disruptors utilizing different resources. Eventually the disruptors migrate towards the resources consumed by the incumbents, but not at the onset.
Clearly the evolutionary pressure does not pit the disruptors against the incumbents, since they occupy different niches, so the incumbents do not evolve to fight the disruptors.
Christensen does mention ideas like this that apparently appear in the literature, but he dismisses them, even though they perfectly explain his findings. His dismissal comes from an unwillingness to accept the determinism inherent in this biological mapping, but even his alternative theory of how organisms, er, companies, adapt to disruption has biological interest.
Let's just review the two main competing ideas of evolution from when that theory was new. The idea that environmental pressure forces change has been discovered by several scientists through history. Before Darwin there was Lamarck. Lamarck is mainly famous for his theory that behavioural change forced by environmental change is actually hereditary, which is obviously false in the gene-based theory of evolution that has been so successful.
The determinism in the economic uses of evolutionary theory comes about as a Darwinian consequence of the mapping to biological terms.
While evolution in Darwinian terms means that the species adapts, no individual organism can change to adapt; organisms merely fail or succeed. But companies can adapt. Indeed, the distinguishing characteristic of cultural, knowledge-based systems - as opposed to natural, gene-based systems - is that culture is Lamarckian.
Cultural evolution is based on adapting individuals that are changed by the changes in their ecological niche.
The other reference that comes to mind in relation to this book is Thomas Kuhn's 'The Structure of Scientific Revolutions'. This too presents a mapping, an even more obvious one, in which the disruptions constitute paradigm shifts in the field of industry. This analogy is also absent from the book.
Reading the discussions about the design of perl 6 - which seem to have come to a screeching halt, bogged down by arcane debates over excessively promoted new features - one is reminded of the old 'worse is better' slogan, meaning 'usable' is better than 'perfect'.
I would however like to add a personal sentiment: It is not that much better. A language has to emerge as an end result. If that language is more difficult than perl to explain or - even worse - to understand, then it doesn't really matter what nifty capabilities it has. It will not be used. Simplicity wins.
Thinking about this, people's fondness for php and python, a previous post about - among other things - average developer skills, and having just read The Innovator's Dilemma, I think it is safe to say that perl is experiencing a disruption. And even worse, the perl community is reacting exactly like an incumbent champion of industry would, adding features - and cost - and spending endless amounts of time on sophistication and 'getting it right' to tweak the mileage the language offers.
Perl in itself was disruptive when it appeared. It is a remarkable unifying improvement on the Unix toolchain, replacing shell scripts, awk, grep, etc. with a unified, extensible tool. This made new things possible - like building a lot of the web - and was arguably the start of the rise of scripting languages as first-class citizens of the software world.
The disruption perl is facing is the attack of the average programmer. People think perl is hard. They're probably right. So they turn to tools that may not enjoy the advantage of CPAN - with which you can do literally everything - or of the best build system in the world (CPAN again, plus the completely standard modules), but that get the job done. And more people can learn how to use them, so there is no question perl is losing mind-share.
Reading Lisp discussion lists will give you a sense of what I'm talking about. And reading Peter Norvig's 1999 summary of the state of Lisp is a lot like reading the 'State of the Onion' that set perl 6 in motion. Incidentally, as far as I understand, the examples in the 3rd edition of Norvig's AI book, which used to be in Lisp, are now written in Python.