Quantum Entanglement

Quantum entanglement has been in the news (again), and in response to several puzzled enquiries, here is my attempt to disentangle the subject.  Adapted from various bits of my previous writings, including Q is for Quantum, Schrödinger’s Kittens, and Science: A History in 100 Experiments.  I also recommend George Musser’s book Spooky Action at a Distance.

Common sense tells us that if I hit a cricket ball on a playing field in England, this has no effect on a cricket ball in Australia, even if the two balls were manufactured in the same batch in the same factory and once nestled together in the same box.  But does the same common sense apply to things in the quantum world, like photons and electrons? Bizarre though it may seem, in the twentieth century quantum physics proved by experiment that the answer is “no”.
It all started in 1935, when Albert Einstein and his colleagues Boris Podolsky and Nathan Rosen presented a puzzle (sometimes known as the “EPR Paradox”) in the form of a thought experiment.

Faster than light?
By 1935, Einstein was settled in Princeton, at the Institute for Advanced Study.  He had been working with two younger colleagues, Boris Podolsky (1896-1966) and Nathan Rosen (1909-1995), and together (led by Podolsky on this occasion) they had come up with what seemed to them an unarguable refutation of the nonsense (as they saw it) inherent in the idea of collapsing wave functions and the Copenhagen Interpretation.  This is the bizarre idea (still widely, but incorrectly, taught as the best way to understand the quantum world) that nothing is real until it is measured.  An electron, for example, exists (according to the Copenhagen Interpretation) as an indeterminate superposition of waves until someone looks at it, when it “collapses” into a point, before spreading out as waves again as soon as you stop looking.  Their paper describing what became known as the “EPR Paradox”, even though it is not really a paradox, appeared under the title “Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?” in the journal Physical Review in May 1935.  They described the puzzle in terms of measurement of position and momentum, but I shall use what seems to me a simpler example involving electron spin.
Imagine a situation in which two electrons are ejected from a quantum system (such as an atomic nucleus) in different directions, but required by the laws of symmetry to have opposite spin.  According to the Copenhagen Interpretation (largely devised by the Dane Niels Bohr, hence the name), neither of the electrons possesses a definite spin until it is measured; each exists in a 50:50 superposition of spin up and spin down states.  Then, and only then, when a measurement is made, does the wave function collapse into one or the other state.  But in this example the laws of symmetry require the other electron to have the opposite spin.  This is fine while both electrons are in the superposition of states, but it means that at the instant one electron is measured, the other electron, which might by now be far away (in principle, on the other side of the Universe), collapses into the opposite state at the same instant.  How does it know to do this?  It seems that what Einstein called a “spooky action at a distance” links the two particles, which communicate with one another faster than light.  And all quantum entities (which means everything) must be linked in the same way.
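For anyone who likes to see the bookkeeping, the state described above is usually written (in modern notation, which is my addition here, not something taken from the EPR paper) as a superposition in which neither electron has a definite spin of its own, but the two are perfectly anti-correlated:

\[
|\Psi\rangle \;=\; \frac{1}{\sqrt{2}}\Bigl(\,|\uparrow\rangle_{1}\,|\downarrow\rangle_{2} \;-\; |\downarrow\rangle_{1}\,|\uparrow\rangle_{2}\,\Bigr)
\]

Measuring electron 1 and finding spin up leaves only the first term in play, so electron 2 is then certain to be found spin down, however far away it is; and vice versa.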
It is a key tenet of the theory of relativity, which has passed every test ever applied to it, that no signal can travel faster than light, so Einstein, in particular, saw this as a complete refutation of Bohr’s ideas.  The EPR paper concluded that this makes the reality of properties of the second system “depend upon the process of measurement carried out on the first system, which does not disturb the second system in any way.  No reasonable definition of reality could be expected to permit this.”
The alternative that Einstein favoured is that there is some kind of underlying reality, an invisible clockwork which controls the workings of the Universe and gives the appearance of uncertainty, collapsing wave functions and so on, even though “in reality” each of the electrons, in this example, always has a well-defined spin.  In other words, things are “real”, not in a superposition of states, even when we are not looking at them.  The idea that the Universe is composed, even at the quantum level, of real things that exist whether or not we observe them, and that no communication can travel faster than light, is known as “local reality”.
It is, perhaps, just as well Einstein did not live to see a series of beautiful experiments carried out in the 1980s which proved that local reality is not a good description of the Universe; more of this later, but the implication is that we are forced to abandon either the local bit (allowing communication faster than light) or reality (invoking instead collapsing wave functions).  But nobody knew this in 1935, and Erwin Schrödinger in particular was delighted when he saw the EPR paper.  He wrote at once to Einstein, commenting that “my interpretation is that we do not have a q.m. that is consistent with relativity theory, i.e., with a finite transmission speed of all influences”, and in a paper published in the Proceedings of the Cambridge Philosophical Society later that year said “it is rather discomforting that the theory should allow a system to be steered or piloted into one or the other type of state at the experimenter’s mercy in spite of his having no access to it.”  This was the genesis of Schrödinger’s famous cat, and also introduced the term “entanglement” into the quantum story.

The truth about the cat in the box
The ideas encapsulated in the famous “thought experiment” involving Schrödinger’s cat actually came in no small measure from Einstein, in the extended correspondence between the two, triggered by the EPR paper, and preserved in the Einstein Archive at Princeton University.  Einstein introduced the idea of two closed boxes and a single ball, “which can be found in one or the other of the two boxes when an observation is made” by looking inside the box.  Common sense says that the ball is always in one of the boxes but not the other; the Copenhagen Interpretation says that before either box is opened a 50:50 wave function fills each of the boxes (but not the space in between!), and when one of the boxes is opened the wave function collapses so that now the ball is in one box or the other.  Einstein continued “I bring in the separation principle.  The second box is independent of anything that happens to the first box.”
In a later letter, Einstein came up with another reductio ad absurdum.  He suggested to Schrödinger the idea of a heap of gunpowder that would “probably” explode some time in the course of a year.  During that year, the wave function of the gunpowder would consist of a mixture of states, a superposition of the wave function for unexploded gunpowder and the wave function for exploded gunpowder:
In the beginning the ψ-function characterises a reasonably well-defined macroscopic state.  But, according to your equation, after the course of a year this is no longer the case at all.  Rather, the ψ-function then describes a sort of blend of not-yet and of already-exploded systems.  Through no art of interpretation can this ψ-function be turned into an adequate description of a real state of affairs  .  .  .  in reality there is just no intermediary between exploded and not-exploded.
Stimulated by the EPR paper and his correspondence with Einstein, Schrödinger wrote a long paper, published in three parts in the journal Die Naturwissenschaften later in 1935, summing up his understanding of the theory he had helped to invent.  It was titled “The Present Situation in Quantum Mechanics”, and it introduced to the world both the term entanglement and the cat “paradox” that (like the EPR “paradox”) is not really a paradox at all.  An excellent English translation of the paper, by John Trimmer, appeared in the Proceedings of the American Philosophical Society in 1980, and can also be found in the volume Quantum Theory and Measurement edited by John Wheeler and Wojciech Zurek.  Many garbled accounts of the cat in the box “experiment” have appeared over the years, but it is best to go back to this source and Schrödinger’s own words (as interpreted by Trimmer) to get the puzzle clear:
One can even set up quite ridiculous cases.  A cat is penned up in a steel chamber, along with the following diabolical device (which must be secured against direct interference by the cat): in a Geiger counter there is a tiny bit of radioactive substance, so small, that perhaps in the course of one hour one of the atoms decays, but also, with equal probability, perhaps none; if it happens, the counter tube discharges and through a relay releases a hammer which shatters a small flask of hydrocyanic acid.  If one has left this entire system to itself for an hour, one would say that the cat still lives if meanwhile no atom has decayed.  The first atomic decay would have poisoned it.  The ψ-function of the entire system would express this by having in it the living and the dead cat (pardon the expression) mixed or smeared out in equal parts.
It is typical of these cases that an indeterminacy originally restricted to the atomic domain becomes transformed into macroscopic indeterminacy, which can then be resolved by direct observation.
In other words, according to the version of quantum mechanics that was generally taught and widely (but not universally) accepted for the rest of the twentieth century, the cat is both dead and alive (or if you prefer, neither dead nor alive) until somebody looks inside the chamber and by the act of observation “collapses the wave function”.  But there is nothing in the equations about collapsing wave functions.  This collapse business is an entirely ad hoc idea, introduced by Bohr, with no basis in reality.  That is the single most important message to take away from Schrödinger’s thought experiment (which, I stress, is indeed “all in the mind”; nobody has ever done anything like this to a real cat).  Although the “cat-in-the-box” idea did not generate widespread interest in 1935, Einstein at least fully appreciated the importance of Schrödinger’s puzzle; Schrödinger described the idea to him in a letter, before his paper was published, and Einstein replied:
Your cat shows that we are in complete agreement concerning our assessment of the character of the current theory.  A [wave]-function that contains the living as well as the dead cat just cannot be taken as a description of a real state of affairs.
Schrödinger was right to point out the nonsensical nature of the concept of the collapse of the wave function, and there are much better ways to understand the workings of the quantum world – the most intriguing of which Schrödinger himself later came close to developing (my own preferred explanation, which involves the Many Worlds Interpretation; but that is another story).
The EPR puzzle itself was later refined by David Bohm, and later still by John Bell.  In its later form, the puzzle concerns the behaviour of two photons (particles of light) ejected from an atom in opposite directions.  The photons have a property called polarisation, which can be thought of as like carrying a spear pointing either up, down or at any angle across the direction of travel; the key feature of the puzzle is that the two photons must have different polarisations, correlated in a certain way.  For simplicity, imagine that if one photon is vertically polarised the other must be horizontally polarised.
Now comes the twist.  Quantum physics tells us that the polarisation of the photon is not determined – it does not become “real” – until it is measured.  The act of measurement forces it to “choose” a particular polarisation, and it is possible (indeed, straightforward) to set up an experiment which measures a photon to decide whether it is vertically or horizontally polarised.  The essence of the EPR “paradox” is that according to all this, measuring one of the pair of photons and forcing it to become, say, vertically polarised instantaneously forces the other photon, far away and untouched, to become horizontally polarised.  Einstein and his colleagues said that this is ridiculous, defying common sense, so quantum mechanics must be wrong.
After John Bell presented the puzzle in a particularly clear form in the 1960s, the challenge of testing the prediction was taken up by several teams of experimenters, leading up to a comprehensive and complete experiment carried out by Alain Aspect and his colleagues in Paris in the early 1980s.  Although such experiments have since been refined and improved, they always give the same results that emerged from the Aspect experiment itself.
The key feature of the experiment is that the choice of which polarisation will be measured is made automatically, and at random, by the experiment after the photons have left the atom.  By the time each photon arrives at its polariser, there has not been enough time for any signal about that choice, even one travelling at the speed of light, to have reached the other side of the experiment.  So there is no way that the detector used to measure the second photon can “know” what is being measured on the first.
It would be very difficult (just about impossible with present technology) to do the experiment literally with pairs of photons, two at a time; but in the Aspect experiment and its successors very many pairs of photons are studied, with more than two angles of polarisation being investigated, and the results analysed statistically.  John Bell’s great contribution was to show that in this kind of analysis if one particular number that emerges from the statistics is bigger than another specific number, common sense prevails and there is no trace of what Einstein used to call “spooky action at a distance”.  This is what Bell expected to happen, and it is known as Bell’s Inequality.  But the experiments show that Bell’s Inequality is violated.  The first number is smaller than the second number.  Experiments are somehow particularly convincing when they prove the opposite of what the experimenters set out to find – it certainly shows that they were not cheating, or unconsciously biased by their preconceptions!  And as Richard Feynman pithily summed up the essence of science, “if it disagrees with experiment, then it is wrong”.  So Einstein was wrong.  But what does it mean?
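For the record, the two numbers can be made explicit. In the version of Bell’s argument usually tested in the laboratory (the CHSH form, named after Clauser, Horne, Shimony and Holt, who sharpened Bell’s original inequality), four correlation measurements are combined into a single quantity, S. Any local-reality theory requires the size of S to be no more than 2, while quantum mechanics allows values up to 2√2, roughly 2.83, and that is what the experiments find. Here is a minimal sketch in Python of the arithmetic, using the textbook quantum correlation for the two-spin example discussed earlier rather than the details of any real photon experiment:

import numpy as np

def E(a, b):
    """Quantum correlation for two spin-half particles in the singlet state,
    measured along directions at angles a and b (in radians)."""
    return -np.cos(a - b)

# A standard choice of measurement angles for the CHSH test.
a, a_prime = 0.0, np.pi / 2              # the two settings on one side
b, b_prime = np.pi / 4, 3 * np.pi / 4    # the two settings on the other

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)

print(abs(S))        # about 2.83, i.e. 2 times the square root of 2
print(abs(S) > 2)    # True: the "local reality" limit of 2 is violated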
The pairs of photons really are linked by spooky action at a distance, confounding “common sense”, in the state quantum physicists call entanglement.  What happens to photon A really does affect photon B, instantaneously, no matter how far apart they are.  This is called “non-locality”, because the effect is not “local” (specifically, it occurs faster than light).  But (and it is not just a big “but” but an absolutely crucial “but”) it turns out that no useful information, such as the result of the 3.30 race at Newmarket, can be transmitted faster than light by this or any other means.  The polarisations of the photons are determined at random.  Measuring the polarisation of photon A determines the polarisation of photon B, but someone who can only detect photon B still sees a random choice of polarisation.  It might be a different random choice from the one that would have occurred if photon A had been measured differently, but it is still random.  The only way to extract useful information from studying photon B is to send a message (slower than light) from A to B informing the observer what was done to photon A.  Nevertheless, the Aspect experiment and its successors show that the world is non-local.  And this strange property even has practical implications, in the rapidly developing world of quantum computing.  This may not be a “reasonable definition of reality”, but it is the way the world works.
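The point about no useful information getting through can be illustrated with a toy simulation. The Python sketch below is emphatically not a model of how nature does it (nobody knows that); it simply reproduces the statistics for pairs of photons that are always found with perpendicular polarisations, using the naive “collapse” picture in which the random result at A fixes what B will do. The thing to notice when it runs is that however polariser A is set, the list of results at B, taken on its own, is a featureless 50:50 string; the influence only shows up in the correlations, which can only be checked once the two lists of results are brought together by ordinary, slower-than-light means.

import numpy as np

rng = np.random.default_rng(42)

def run(angle_a, angle_b, n=200_000):
    # Simulate n photon pairs that are always found with perpendicular
    # polarisations (a simplified "collapse" picture of one standard
    # entangled state; an illustration of the statistics, not a mechanism).
    a_pass = rng.random(n) < 0.5                  # A's result is purely random
    delta = angle_b - angle_a
    # B then behaves as if polarised perpendicular to A's outcome, so its
    # chance of passing depends only on the relative angle of the polarisers.
    p_b = np.where(a_pass, np.sin(delta) ** 2, np.cos(delta) ** 2)
    b_pass = rng.random(n) < p_b
    corr = np.corrcoef(a_pass.astype(float), b_pass.astype(float))[0, 1]
    return b_pass.mean(), corr

for angle_a in (0.0, np.pi / 6, np.pi / 3):       # whatever is done at A...
    frac_b, corr = run(angle_a, angle_b=0.0)
    print(f"A at {np.degrees(angle_a):4.0f} deg: B passes {frac_b:.3f} of the time, "
          f"A-B correlation {corr:+.2f}")
# ...the fraction at B stays at about 0.5, so no message gets through, even
# though the joint correlation between the two lists changes with the angle.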
Feynman was particularly delighted by this definitive experimental evidence of the way the quantum world works.  “I’ve entertained myself always,” he said, “by squeezing the difficulty of quantum mechanics into a smaller and smaller place, so as to get more and more worried about this particular item.  It seems to be almost ridiculous that you can squeeze it to a numerical question that one thing is bigger than another.  But there you are.”
Don’t worry, though, if you do not understand how the world can be like that.  As Feynman also wrote, in The Character of Physical Law, “I think I can safely say that nobody understands quantum mechanics  .  .  .  Do not keep saying to yourself, if you can possibly avoid it, ‘But how can it be like that?’ because you will go ‘down the drain’ into a blind alley from which nobody has yet escaped.  Nobody knows how it can be like that.”

Fulsome accommodationism in the journal Nature

Couldn’t put it better.

Why Evolution Is True

I don’t know what’s going on with Science and Nature—perhaps the two most prestigious science journals in the world—but both are increasingly catering, if not pandering, to religion. Science and its sponsoring organization the AAAS have a program, funded by Templeton, to increase dialogue between science and religion, and the AAAS has faith-themed events at its annual meeting. Nature publishes editorials and pieces speaking positively about religion, claiming that science and religion both depend on “faith”, and arguing that science and religion are compatible (see here and here, for example).

Now Nature has jumped the shark even farther with a new article by Kathryn Prichard, who works for the Church of England, called “Religion and science can have a true dialogue.” This short piece is still too long, for Prichard simply claims that science and religion can have a fruitful dialogue because religious people are avid followers of—indeed, are hungering for—science…


Inferno of Rock Report – SEPTEMBER 2016

I am now officially a musicologist!

Hard Rock Daddy

Inferno Of Rock Report
By Ian Liberman


On August 20, 2016, the loyal and deep-rooted fans of the Tragically Hip had the chance to listen to them perform one of their final concerts as part of the last tour of their 30-year career.  Gord Downie (lead singer/songwriter) of the band was diagnosed with terminal brain cancer in May.  Playing famous songs like “Three Pistols” and “Bobcaygeon,” and their latest album entitled Man Machine Poem, the band wove a tapestry of Canadian culture and history as they played to tens of thousands of fans.  Played in their hometown of Kingston, Ontario, at the Rogers K-Rock Centre, the concert (which was aired live on CBC networks) had fans of all ages in attendance.  Although the band will eventually break up, it will always be part of our Canadian culture (like maple syrup without the calories).


In the…


The Woman Who Studied the Sun

Posting this in response to a question I was asked.  It is essentially an extract from my book 13.8.


Cecilia Payne won a scholarship to Newnham College, Cambridge (the only way she could have afforded a university education) in 1919. She studied botany, physics and chemistry, but also attended a talk by Arthur Eddington about the eclipse expedition on which he had famously “proved Einstein right” by measuring the way light from distant stars was bent by the Sun. This fired her interest in astronomy, and she visited the university’s observatory on an open night, plying the staff with so many questions that Eddington took an interest, and offered her the run of the observatory library, where she read about the latest developments in the astronomical journals.

After completing her studies (as a woman, she was allowed to complete a degree course, but could not be awarded a degree; Cambridge did not award any degrees to women until 1948) she looked for a way to pursue this interest. There was no chance of a career in research in England, where the only job opportunities for women scientists were in teaching, but through Eddington she had met Harlow Shapley, from Harvard, on a visit to England. He offered her the chance to work for a PhD on a graduate fellowship (even though, technically, she was not a graduate) and in 1923 she left for the United States. Just two years later, she produced a brilliant thesis and became the first person to be awarded a PhD in astronomy by Radcliffe College (also the first for work carried out at Harvard College Observatory). In it, she established that the Sun is mainly made of hydrogen. But, in a sign of the times, the idea was not fully accepted until two male astronomers independently came to the same conclusion.

Payne’s study of the solar spectrum made use of the then-recent discovery by the Indian physicist Meghnad Saha that part of the complication of the pattern of lines in a stellar spectrum (or the Sun’s Fraunhofer lines) was a result of different physical conditions in different parts of the atmosphere of a star. By the 1920s, physicists knew (as, of course, Bunsen and Kirchhoff had not) that atoms are composed of a tiny central nucleus, with one or more electrons at a distance from the nucleus. Dark lines in a spectrum are produced when an electron absorbs a specific wavelength of light, moving to a higher energy level within the atom, and bright lines are produced when an electron drops down from one energy level to another and emits radiation (in the form, we would now say, of a photon). An atom which has lost one or more of its electrons is called an ion, and the spectra of ions are correspondingly different (in a way which can be calculated) from those of the “parent” atoms. Payne measured the absorption lines in stellar spectra and showed how the temperature (in particular) and pressure in the atmosphere of a star affects the ionisation of the atoms there. This makes for a more complicated pattern of lines than if all the atoms were in their un-ionised state.[1] The spectra of stars differ from one another not because they are made of different things, but because of different amounts of ionisation in their atmospheres.
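For completeness, the tool Payne took over from Saha is what is now called the Saha equation, which gives the balance between successive ionisation states of an element at a given temperature and electron density. Written in its standard modern form (my addition for orientation, not a formula from Payne’s thesis), it reads:

\[
\frac{n_{i+1}\,n_{e}}{n_{i}} \;=\; \frac{2g_{i+1}}{g_{i}}\left(\frac{2\pi m_{e} k T}{h^{2}}\right)^{3/2} e^{-\chi_{i}/kT}
\]

Here n_i and n_{i+1} are the numbers of atoms in successive ionisation states, n_e is the electron density, the g factors are statistical weights, χ_i is the energy needed to remove the next electron, and T is the temperature. The exponential makes the balance exquisitely sensitive to temperature, which is why the same mixture of elements produces such different-looking spectra in stars of different temperatures; that is exactly the effect Payne had to disentangle.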

Payne’s great achievement was to unravel this complicated pattern of hundreds of Fraunhofer lines and work out what proportion of different elements in different stages of ionisation had to be present to account for the observations. Some idea of the difficulty of her task can be gleaned from the fact that her thesis was later described by the astronomer Otto Struve as “the most brilliant Ph.D. thesis ever written in astronomy”. She worked out the proportions of eighteen elements in the Sun and stars, discovering that they all had nearly the same composition. But the big surprise was that according to her analysis the Sun and stars are made almost entirely of hydrogen and helium. If she was correct, everything else put together made up only two per cent of the composition of our nearest star, and of all stars. Most of the matter in the Universe was in the form of the two lightest elements, hydrogen and helium. This was almost literally unbelievable in 1925. Payne believed her results were correct, but when Shapley sent a draft of her thesis to Henry Norris Russell at Princeton for a second opinion he replied that the result was “clearly impossible.” On Shapley’s advice, she added a sentence to the thesis saying that “the enormous abundance derived for these elements [hydrogen and helium] in the stellar atmospheres is almost certainly not real”. But with the thesis accepted and her doctorate awarded, she wrote a book, Stellar Atmospheres, which began to persuade astronomers that the results were, in fact, almost certainly real.

The change of mind was aided by the independent confirmation of Payne’s results by other astrophysicists. In 1928, the German astronomer Albrecht Unsöld carried out a detailed spectroscopic analysis of the light from the Sun; he found that the strength of the hydrogen lines implied that there are roughly a million hydrogen atoms in the Sun for every atom of anything else. A year later, the Irish astronomer William McCrea confirmed these results using a different spectroscopic technique.[2] What this shows, more than anything, is that although Cecilia Payne was a brilliant researcher who got there first, this was a discovery whose time had come; given the technology of the 1920s it was inevitable that the discovery would be made sooner rather than later. In 1929, having carried out a similar analysis using a different technique, Russell himself published a paper confirming these results, and giving due credit to Payne’s priority; unfortunately, because of Russell’s established position in the astronomical community, for some time he was often cited as the discoverer by people who should have known better (or at least, read his paper properly).

Payne went on to a distinguished career in astronomy; in 1934 she married the Russian-born astrophysicist Sergei Gaposchkin, and became known as Cecilia Payne-Gaposchkin. She remained at Harvard throughout her career, in spite of the low status and low pay she received as a woman. For many years, her official title was “technical assistant”, even though she carried out all the research and teaching duties expected of a Professor. It was not until 1956 that she was promoted to become a full Professor, the first woman to be promoted to a full professorship from within Harvard’s Faculty of Arts and Sciences. But, like most scientists, she was not primarily motivated by status or salary. In 1976, three years before her death, she was awarded the prestigious Henry Norris Russell Prize by the American Astronomical Society. No doubt she appreciated the irony. In her acceptance lecture, she said, clearly referring to her early work on stellar spectra, “The reward of the young scientist is the emotional thrill of being the first person in the history of the world to see something or to understand something.” Even if someone else tells you it is “clearly impossible”.


[1] I am always careful to use the hyphen in the word “un-ionised” since Isaac Asimov once pointed out to me that the way to distinguish between a scientist and a politician is to ask them how to pronounce the word “unionised”.

[2] Much later, McCrea was my PhD examiner.

From Here to Infinity

A slightly adapted version of a review I wrote for the Wall Street Journal:

Eyes on the Sky

Francis Graham-Smith

Oxford UP
Eyes on the Sky is an intriguing and unusual book.  Popular books about astronomy usually focus on the wonders of the Universe – black holes, quasars, the Big Bang, and so on.  But radio astronomer Francis Graham-Smith focuses on the terrestrial wonders that have revealed these exciting objects – the telescopes.  Not just optical telescopes, but telescopes peering out at the Universe across all the wave bands of the electromagnetic spectrum, from radio waves to gamma rays.  The story begins with Galileo, who first turned a telescope upwards almost exactly four hundred years ago, and ends with the radio telescopes that combine observations made by different antennae at widely spaced locations to mimic a dish the size of the Earth.  But the author could not have been aware when he wrote the book of how neatly this story rounds off an era in astronomy, since as of the fall of 2015 we have a completely new way of looking at the Universe, using gravitational waves, not electromagnetic radiation.
This book-ending is particularly appropriate, since Graham-Smith was one of the pioneers of radio astronomy who opened up the new investigation of the sky in the 1940s, using radar technology developed during World War Two.  As he says, before 1947 “the only real astronomers were those who actually looked through [optical] telescopes”.  Radio astronomy was the first of a clutch of new techniques which opened up new windows on the Universe.  And it has all happened within a single human lifetime.
The approach of the book is both thematic and historical.  The themes address each part of the spectrum, starting with Galileo and “conventional” telescopes, then moving on to ultraviolet, infrared, X-ray and gamma ray astronomy, before winding up with Graham-Smith’s real love, radio astronomy.  Within each theme he covers developments historically, including the use of rockets, satellites and high-altitude balloons to make observations at wavelengths blocked by the Earth’s atmosphere.  The techniques have developed as much as the technology – interferometry and aperture synthesis, the methods used to make a virtual giant telescope out of an array of smaller instruments, are explained as clearly here as anywhere you are likely to find.  But the real fascination of the book is the way it almost incidentally highlights the way science has changed in little more than fifty years.  Galileo might have been amazed by the technology, but he would surely have understood the principles on which the 200-inch Hale telescope on Mount Palomar, the biggest one around in 1947, operated.  But how could he have comprehended the significance of a satellite such as Planck, an observatory in space designed to measure the strength of radio waves from the dawn of time?  Come to that, very few astronomers in 1947 would have taken the idea of such a satellite seriously.
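The essential idea behind aperture synthesis can be boiled down to a rule of thumb (my shorthand, not a formula from the book): the finest angular detail a telescope can resolve is roughly the observing wavelength divided by the aperture, or, for an array, by the longest baseline between its antennae,

\[
\theta \approx \frac{\lambda}{D}
\]

so although radio waves are many thousands of times longer than light waves, linking antennae thousands of kilometres apart makes D so large that a radio interferometer can pick out finer detail than any single optical telescope.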
Astronomers of 1947 would also have been astonished at the size of present day astronomical projects, both in terms of the physical dimensions of the equipment involved and the number of people – and nations – collaborating on the observations.  The latest radio telescope to come online is the Five Hundred Metre Aperture Spherical Telescope (known as FAST), in Guizhou Province, southwest China.  As the name suggests, it is a dish 500 metres across (for comparison, the famous Arecibo radio telescope that featured in the movie Contact is 1,000 feet, or just over 300 metres, across).  But the dish of FAST is made up of 4,600 separate triangular panels, connected by flexible joints and a network of steel cables to 2,300 computer-controlled electric motors, which continually adjust the exact angle of each panel to tilt, in effect, the dish so that it tracks objects across the sky as the Earth rotates.  This “active surface” keeps each panel positioned to an accuracy of five millimetres.
The international nature of astronomy is highlighted by a telescope known (for obscure reasons; astronomers love acronyms) as VISTA.  This has a mirror 4 metres across and is located at the European Southern Observatory, on a mountain top in Chile.  The mirror was designed in the UK, cast in Germany, and smoothed to its precise shape in Russia, before being shipped to South America, where it is now used by astronomers of many nations; there are fifteen member states of ESO.  Not that anyone “looks through” such a telescope; it captures light onto an array of Charge Coupled Devices (CCDs) – sixteen detectors, each 2,048 by 2,048 pixels, making a total of some 67 million pixels.  The resulting images appear on monitor screens, or the data is sucked off by a computer for detailed analysis.
Since the time of Galileo, Graham-Smith points out, the light collecting area of individual telescopes has increased by a factor of more than half a million, while over the past fifty years there has also been a dramatic increase in the performance of detectors, such as CCDs.  And as he emphasises, progress has not yet come to a halt.  Where next?  Well, we now have the ability to image planets orbiting other stars.  The next generation of telescopes may be able to detect signs of life on those planets.

The Story of Epilepsy

Here’s my latest from the Literary Review:


A Smell of Burning:

The story of epilepsy

Colin Grant; Cape


Colin Grant begins his story of epilepsy by explaining that he was drawn to write about the subject because of his brother, an epilepsy sufferer. As it happens, Grant is also a trained doctor, as well as being a writer and broadcaster, so he brings a core of expert knowledge and a perhaps even greater degree of skill as a communicator to the story, as well as his special interest. I cannot claim any medical skill, but I also have a special interest, since I too have a family member who suffers from epilepsy. But you don’t need any special connection to appreciate the absorbing and sometimes horrifying story he has to tell.

The horror comes from the grim treatment meted out to epileptics in the past – and not always the distant past. Grant tells the story more or less chronologically, interleaving it with episodes from his brother’s life, and focusing on different aspects of “treatment”, if that is not too polite a term for some of the techniques described here. The result is curiously contrapuntal. The broad story is one of things slowly getting better; the personal story is one of things slowly getting worse. This is so clearly telegraphed from early on in the book that no spoiler alert is really needed if I tell you that it does not have a happy ending. Equally clearly, writing the book represented a catharsis for the author. All of which lifts it above the level of the kind of professional history of the subject that might have been written by an author with equal skill but no personal involvement.

It is the recent history that is the most startling aspect of the story. We can hardly blame the Ancient Greeks or Romans for having a superstitious attitude to seizures. But 22-year-old Graham Greene seriously contemplated suicide when told of the diagnosis in 1926, and the story is now well-known that the youngest son of King George V, Prince John, was hidden away in the country because of the “stigma”. Right up until 1970 in the United Kingdom a marriage could be declared void “if either party was, at the time of marriage, of unsound mind, mentally defective, or subject to recurrent fits of insanity or epilepsy.” It was three years after the UK decriminalised homosexual acts in private between two men that epilepsy was removed from this list. In some countries, the law is less enlightened, even today. Until 2010, the official Chinese term for epilepsy translated as “crazy seizure disorder”; only then was it changed to “brain seizure disorder”. And as a cricket lover and fan of Test Match Special, I did not need Grant to remind me of Henry Blofeld’s disgraceful assertion that Tony Greig’s “defection” to Kerry Packer’s World Series Cricket was because “he’s an epileptic and that may be one reason why he’s made this ridiculous decision”.

But I confess that I had not previously made the connection between epilepsy and the storyline of the Powell and Pressburger film A Matter of Life and Death. Obvious, once it is pointed out! And the film partly gives Grant the title of his book – the main character, played by David Niven, is tantalised by the smell of fried onions, an example of what the experts call “auras”, sensory precursors to seizures. But the title also has a personal resonance for the author. On one occasion, Grant’s brother said “Can you smell burning?” immediately before he “crashed into a fit”.

Along with the broader history, the usual roll-call of famous epileptics makes an appearance in A Smell of Burning – including Julius Caesar, Joan of Arc, Fyodor Dostoevsky, Vladimir Lenin, Edward Lear, Vincent Van Gogh, and Neil Young. This raises the (unanswered) question of whether sufferers from epilepsy are more likely to be creative (in the broadest sense of the term), or whether, just as in the general population, some epilepsy sufferers are inevitably creative and some are not. But Grant highlights an interesting point. Perhaps the threat of seizures encourages some people to make the most of things in between them. Van Gogh, for example, wrote that “it drives me to work and to seriousness, as a coal-miner who is always in danger makes haste in what he does.”

The danger today is greatly reduced by the development, since the 1930s and ongoing, of anti-epileptic drugs. The downside, for some people, is that these take the edge off alertness and intelligence. Grant quotes one person who accepts the situation and has been free from seizures for two decades, but says when you wake up in the morning “the first thing you’ve got to do is push your way through that thickness of cotton wool to get to where you can operate but actually that bit there [the sharpness] that’s gone.” Others, including Neil Young, don’t take the medication and live with the consequences. Grant’s brother, Christopher, was one of those. A few hundred of those people each year, including Christopher in 2008, die from a condition that is rare, but not rare enough to avoid having its own acronym – SUDEP, for sudden unexpected death in epilepsy. A powerful reason to keep taking the medicine.






John Gribbin is a Visiting Fellow in astronomy at the University of Sussex and co-author of Being Human

Why The Imitation Game is a disaster for historians.

This saves me the bother. A superb summary.

The Renaissance Mathematicus

I made the mistake, as a former professional historian of logic and meta-mathematics and, as a consequence, an amateur historian of the computer, of going to the cinema to watch the Alan Turing biopic The Imitation Game. I knew that it wouldn’t be historically accurate but that it would be a total historical disaster and, as I said on leaving the cinema, an insult to the memory of both Alan Turing and the others who worked in Bletchley Park surprised even me, a dyed in the wool, life-long cynic.

As I ventilated my disgust over the next few days on Twitter some, quite correctly, took me to task, informing me that it is a film and not a history book and therefore one shouldn’t criticise it for any inaccuracies that it contains. This attitude is of course perfectly correct and I would accept it, if only the people who…
