Review of “Life Ascending”, Nick Lane, Eurotimes July 2009

This fine book on evolution was well reviewed at the time and won the 2010 Royal Society prize for science books. Here is my review from Eurotimes. Or rather this is a draft, and readers will note one paragraph just trails off… I cannot find the final version online or in my email so I am not sure what followed! This review is focused on the ophthalmological aspects of the book, though not to the exclusion of the wider issues:

Life Ascending
Nick Lane

There are ten great inventions of evolution discussed in Nick Lane’s lucid, stimulating book – life’s origin,
DNA, photosynthesis, the complex cell, sex, movement, sight, hot blood, consciousness, and death. Lane
makes it clear from the outset that invention does not mean a conscious agency purposefully steered the
process, rather he is referring to the ten great innovations that have transformed life that were created
through natural selection. Readers of this journal will have particular interest in the chapter on sight, which
I will therefore focus on in this review, but the whole book is superbly written and extremely enjoyable.

The eye has long been a favourite topic of anti-evolutionists. In 1802, the English utilitarian philosopher William Paley
argued in his Natural Theology that the eye is an organ of such complexity that it is absurd to suppose
that the purposeless blunderings of evolution (evolutionary ideas pre-dated Darwin, of course) could have
produced it. He used the analogy of a blind watchmaker producing a timepiece, which later gave Richard
Dawkins the title of one of his books. Darwin himself is frequently misquoted by creationists and affiliated
persons in this context – he seemed to admit that “To suppose that the eye, with all its inimitable
contrivances for adjusting the focus to different distances, for admitting different amounts of light, and for
the correction of spherical and chromatic aberration, could have been formed by natural selection, seems
… absurd in the highest possible degree.” Darwin went on to write, however, that “if numerous
gradations from a perfect and complex eye to one very imperfect and simple, each being useful to its
possessor, can be shown to exist” the problem is solved.

In fact, we now have models of the evolution of the eye that exceed those of other organs in explanatory
power. The Swedish researchers Dans Eric Nilsson and Susanne Pelger have modelled this succession
of steps, which is each generation is taken as one year, requires somewhat less than half a million years.

The eye does seem, at first glance, to pose a problem to evolutionary explanations of its origin. What’s
more, the human eye, with its rods and cones located behind an array of nerves and with its blind spot
where the optic nerve leaves the eye, does not, at first cynical glance, appear to be especially well designed.
Furthermore, the cant charge of anti-evolutionists has been “what use is half an eye?”, and answering the
question of how a retina could have evolved, separate from the rest of the optic apparatus, is at first
glance difficult. “Evolution is cleverer than you are” is a famous dictum of the evolutionary biologist Leslie
Orgel, and Lane goes on to show not only that the eye is well adapted to its purpose, but that (I am not sure what I said subsequently)

His approach begins, entertainingly for readers of this publication, with the observation that “anyone who
has been to a conference of ophthalmologists will appreciate that they fall into two great tribes: those who
work at the front of the eye … and those who work at the back … the two tribes interact reluctantly, and at
times barely seem to speak the same language.” This divide, ironically, reflects the half-an-eye
distinction and allows us to consider the evolution of both halves of the eye.

For the retinal part of the answer, Lane travels (literarily speaking – it was the marine biologist Cindy Lee
Van Dover who did the actual exploring) to the most hostile and extreme habitat on earth – black-smoker
vents on the deep ocean floor that support an ecosystem of hardy survivors. Among these is the
ironically named eyeless reef shrimp (Rimicaris exoculata), which as a larva has fully formed eyes.
These are not of use to the adult shrimp, so they are reabsorbed and replaced with a literal half an eye
– a naked retina.

Most doctors will remember rhodopsin, perhaps rather dimly. It is the light-sensitive protein at the heart of
the visual process, being involved in photoreceptor synthesis as well as the initial perception of light.
Rhodopsin evolved from an algal ancestor, where it is used to calibrate light levels in photosynthesis, and some bacteria still use rhodopsin for a form of photosynthesis.
Lane synthesises the evolution of all the aspects of the eye, although one of the ophthalmological tribes
may feel their area of interest is dealt with in slightly less detail than their retinal brethren. The naked
retina was the first step on the journey. As different organisms’ sheets of light-sensitive cells were arrayed in
different ways, some recessing into pits that allowed shadows to be cast and thus the direction of
incoming light to be assessed, the trade-off between resolving power and light sensitivity began to
tip the balance in favour of lens formation.

Writers in this field must be tired of having to handle the creationist/intelligent design issue. Lane’s book is
not aimed at this debate, although in the footnotes he refers the reader to “The Flagellum Unspun” by
Catholic biochemist Kenneth Miller which attacks the creationist idea of irreducible complexity, as
exemplified by the development of a flagellum. Lane quotes Miller on intelligent design advocates as
double failures, “rejected by science because they do not fit the facts, and having failed religion because
they think too little of God,” and discusses Pope John Paul II’s views of evolution and the mind (made in
the course of his 1996 pronouncement recognising that evolution was more than a hypothesis) with
respect and sensitivity. Lane is clearly that wonderful thing, an enthusiast able to explain and inform.


Silence and the limits of language

“That for which we find words is something already dead in our hearts. There is always a kind of contempt in the act of speaking.”

Nietzsche, The Twilight of the Idols

Some popular sayings about education and mastery reflect an intuition that what is remembered is not always the full truth, or even the essential. There is the saw that “education is what remains when you have forgotten what you learned in school”, one of those quotes ascribed to multiple authors from Einstein to Lord Halifax. There is the near-taunt that, on topic X, “I have forgotten more than you will ever know”.

The Nietzsche aphorism above is the epigraph of Harold Bloom’s “Shakespeare: The Invention of the Human”. In another translation, the full paragraph is as follows:

We no longer have sufficiently high esteem for ourselves when we communicate. Our true experiences are not at all garrulous. They could not communicate themselves even if they tried: they lack the right words. We have already gone beyond whatever we have words for. In all talk there is a grain of contempt. Language, it seems, was invented only for what is average, medium, communicable. By speaking the speaker immediately vulgarizes himself. — Out of a morality for deaf-mutes and other philosophers.

Over the years, I have often essayed (in the sense of attempt) a philosophy of silence:

A few years ago I realised that silence is a thing in itself. It is not just the absence of sound, or the absence of noise.

There are anthropological texts on silence in different cultures, social history texts on the invention of silence and the constructed nature of concepts such as silence, sound and noise, and audiological and acoustic texts on sound and how it is created in our brains.

All seem beside the point. Silence is.

Silence is a force, a power. A philosophy of silence will, after all, always be expressed in language, and always trap itself in language.

We are told that absolute silence is unattainable, and in our modern world even relative silence is close to impossible to find. Still, silence is free, and silence is everywhere, in the gaps.

Silence is the punchline of every unspoken joke, the conclusion of every unformulated argument, the summation of all unspeeched thoughts. In the beginning was the word and in the end there is silence.

Through various crooked paths, I have tried to explore this through quotes and passages. Much of this has related to nature, to religion, to mysticism, to reflection.

Perhaps a common thread through all this is this sense of silence surrounding and pervading all our noise. The idea that forgetting can be a marker of the richness of original knowledge, as hinted at in the pseudo-Einstein quote and the forgot-more-than-you’ll-ever-know rhetoric, also implies the vastness of what we do not know. Another near-cliché is that the more one knows, the more one knows what one doesn’t know.

Perhaps my thoughts on silence from nthposition some years ago could be better expressed as silence is not merely an absence, but the positive presence of all that we do not know, do not perceive, cannot find words for.

Nietzsche is a powerful thinker, though I have always found it necessary to “divide through” his rhetoric a little. In the passage from the Twilight of the Idols one can see why Bloom felt he was among the greatest of psychologists, a precursor of Freud, as he expresses the vast domain of the inexpressible that underlies our motivations and actions, and for which we often devise plausible reasons after the fact.

At the An Enduring Romantic blog, we find other Nietzsche thoughts on this, and the contrasting thought of Auden. As An Enduring Romantic concludes:

To try and gather up these scattered remarks into some kind of conclusion: I suppose that we can either view language as the eternal, futile reaching-forth towards an inaccessible essence, doomed to perpetual failure; or we can view it as a mode of creation, creating and evoking a different kind of response from a deeply private, personal sense of awe. On this view, language isn’t partial or incomplete, always falling short of – shall we say – the ideal. It is simply a different manner of response. As Auden says, both kinds of imagination are necessary. The imaginative awe, on its own, will not and cannot give us the forms of beauty that are so integral to the aesthetic experience, because the imaginative awe doesn’t exist through those forms. And so, it is not the case, as Heine says, that “where words leave off, music begins“; and nor is it the case that “the only valuable thing in art is that which you cannot explain.”

The media landscape so many of us inhabit (and of which this blog is a tiny part, but a part nevertheless) is one that militates against reflection and the silence and space necessary for reflection. Silence has become a countercultural force, possibly the one true countercultural force in a culture in which rebellion and self-conscious individuality are co-opted by corporate interests. Both of the kinds of imagination described by Auden are under threat from this radical undermining of any space for reflection and silence.

“I think there is a world market for maybe five computers”

There are quotes – like “Let them eat cake” and an awful lot of things supposedly said by Mark Twain – which are indestructibly associated with the wrong person, or the completely wrong context. This post on the blog Engage the Fox is an interesting reflection on some reasons why quotes are misattributed. However, the post is focused on why wise or witty sayings are misattributed to celebrities, or better-known figures in general (something like this happened with the Mary Schmich column that became the Baz Luhrmann Sunscreen Song, which was falsely reported to be a speech by Kurt Vonnegut).

There is another species of misattributed quote – the one that, rather than reflecting the supposed wisdom of the person falsely cited, makes them look foolish or hopelessly out of touch. And one specific subspecies is the False Prediction – the boldly confident claim that, with the benefit of hindsight, looks totally absurd.

Seven supposed predictions from the world of technology are collected here in a PC World article. My confidence in this article, as will become clear, is pretty low. However, it is a useful example of the kind of “prediction” that gets mocked in later years. We allow ourselves a rather self-congratulatory chuckle at the fools of the past, with their nuclear-powered vacuum cleaners and their failure to see why anyone would want to own a home computer. Of course, our turn will come.

The very first “Foolish Tech Prediction” highlighted in the PC World article is this:


Foolish Tech Prediction 1

“I think there is a world market for maybe five computers.”
Thomas Watson, president of IBM, 1943

At the dawn of the computer industry, nobody really knew where this new technology would take us. But the explosion of desktop computing that put a PC in nearly every American home within 50 years seems to have eluded the imagination of most mid-century futurists.
After all, when IBM’s Thomas Watson said “computer,” he meant “vacuum-tube-powered adding machine that’s as big as a house.” It’s fair to say that few people ever wanted one of those, regardless of the size of their desk.

(IBM did stay in the business, of course.)

This, of course, does acknowledge that predicting that devices as big as a house would ever have popular appeal would not have seemed reasonable when Watson made his statement.

Except, Watson said no such thing. From Wikipedia:

“I think there is a world market for maybe five computers” is often attributed to Thomas Watson; Senior in 1943 and Junior at several dates in the 1950s. This misquote is from the 1953 IBM annual stockholders’ meeting. Thomas Watson, Jr. was describing the market acceptance of the IBM 701 computer. Before production began, Watson visited with 20 companies that were potential customers. This is what he said at the stockholders’ meeting, “as a result of our trip, on which we expected to get orders for five machines, we came home with orders for 18.”

Aviation Week for 11 May 1953 says the 701 rental charge was about $12,000 a month; American Aviation 9 Nov 1953 says “$15,000 a month per 40-hour shift. A second 40-hour shift ups the rental to $20,000 a month.”

So there you go – something quite different, and in context entirely reasonable, was conflated with various other speculative comments by others (there is more on the Wikipedia page on Thomas Watson). One wonders how many of the rest of PC World’s “foolish tech predictions” were quite so foolish after all.