The Woman Who Studied the Sun

Posting this in response to a question I was asked.  It is essentially an extract from my book 13.8.


Cecilia Payne won a scholarship to Newnham College, Cambridge (the only way she could have afforded a university education) in 1919. She studied botany, physics and chemistry, but also attended a talk by Arthur Eddington about the eclipse expedition on which he had famously “proved Einstein right” by measuring the way light from distant stars was bent by the Sun. This fired her interest in astronomy, and she visited the university’s observatory on an open night, plying the staff with so many questions that Eddington took an interest, and offered her the run of the observatory library, where she read about the latest developments in the astronomical journals.

After completing her studies (as a woman, she was allowed to complete a degree course, but could not be awarded a degree; Cambridge did not award any degrees to women until 1948) she looked for a way to pursue this interest. There was no chance of a career in research in England, where the only job opportunities for women scientists were in teaching, but through Eddington she had met Harlow Shapley, from Harvard, on a visit to England. He offered her the chance to work for a PhD on a graduate fellowship (even though, technically, she was not a graduate) and in 1923 she left for the United States. Just two years later, she produced a brilliant thesis and became the first person to be awarded a PhD by Radcliffe College (also the first for work carried out at Harvard College Observatory). In it, she established that the Sun is mainly made of hydrogen. But, in a sign of the times, the idea was not fully accepted until two male astronomers independently came to the same conclusion.

Payne’s study of the solar spectrum made use of the then-recent discovery by the Indian physicist Meghnad Saha that part of the complication of the pattern of lines in a stellar spectrum (or the Sun’s Fraunhofer lines) was a result of different physical conditions in different parts of the atmosphere of a star. By the 1920s, physicists knew (as, of course, Bunsen and Kirchhoff had not) that atoms are composed of a tiny central nucleus, with one or more electrons at a distance from the nucleus. Dark lines in a spectrum are produced when an electron absorbs a specific wavelength of light, moving to a higher energy level within the atom, and bright lines are produced when an electron drops down from one energy level to another and emits radiation (in the form, we would now say, of a photon). An atom which has lost one or more of its electrons is called an ion, and the spectra of ions are correspondingly different (in a way which can be calculated) from those of the “parent” atoms. Payne measured the absorption lines in stellar spectra and showed how the temperature (in particular) and pressure in the atmosphere of a star affect the ionisation of the atoms there. This makes for a more complicated pattern of lines than if all the atoms were in their un-ionised state.[1] The spectra of stars differ from one another not because they are made of different things, but because of different amounts of ionisation in their atmospheres.
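For readers who want the quantitative version (this equation is not part of the original extract, but it is the standard form of Saha’s result), the relative populations of successive ionisation states of an element depend on the temperature and the electron density like this:

```latex
\frac{n_{i+1}\, n_e}{n_i} = \frac{2 g_{i+1}}{g_i} \left( \frac{2 \pi m_e k T}{h^2} \right)^{3/2} e^{-\chi_i / k T}
```

Here n_i and n_{i+1} are the number densities of atoms in successive ionisation states, n_e is the electron density, the g terms are statistical weights, χ_i is the ionisation energy, and T is the temperature. The exponential dependence on temperature is the key point: it is why stellar spectra differ so dramatically from star to star even though, as Payne showed, their compositions are nearly the same.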

Payne’s great achievement was to unravel this complicated pattern of hundreds of Fraunhofer lines and work out what proportion of different elements in different stages of ionisation had to be present to account for the observations. Some idea of the difficulty of her task can be gleaned from the fact that her thesis was later described by the astronomer Otto Struve as “the most brilliant Ph.D. thesis ever written in astronomy”. She worked out the proportions of eighteen elements in the Sun and stars, discovering that they all had nearly the same composition. But the big surprise was that according to her analysis the Sun and stars are made almost entirely of hydrogen and helium. If she was correct, everything else put together made up only two per cent of the composition of our nearest star, and of all stars. Most of the matter in the Universe was in the form of the two lightest elements, hydrogen and helium. This was almost literally unbelievable in 1925. Payne believed her results were correct, but when Shapley sent a draft of her thesis to Henry Norris Russell at Princeton for a second opinion he replied that the result was “clearly impossible.” On Shapley’s advice, she added a sentence to the thesis saying that “the enormous abundance derived for these elements [hydrogen and helium] in the stellar atmospheres is almost certainly not real”. But with the thesis accepted and her doctorate awarded, she wrote a book, Stellar Atmospheres, which began to persuade astronomers that the results were, in fact, almost certainly real.

The change of mind was aided by the independent confirmation of Payne’s results by other astrophysicists. In 1928, the German astronomer Albrecht Unsöld carried out a detailed spectroscopic analysis of the light from the Sun; he found that the strength of the hydrogen lines implied that there are roughly a million hydrogen atoms in the Sun for every atom of anything else. A year later, the Irish astronomer William McCrea confirmed these results using a different spectroscopic technique.[2] What this shows, more than anything, is that although Cecilia Payne was a brilliant researcher who got there first, this was a discovery whose time had come; given the technology of the 1920s it was inevitable that the discovery would be made sooner rather than later. In 1929, having carried out a similar analysis using a different technique, Russell himself published a paper confirming these results, and giving due credit to Payne’s priority; unfortunately, because of Russell’s established position in the astronomical community, for some time he was often cited as the discoverer by people who should have known better (or at least, read his paper properly).

Payne went on to a distinguished career in astronomy; in 1934 she married the Russian-born astrophysicist Sergei Gaposchkin, and became known as Cecilia Payne-Gaposchkin. She remained at Harvard throughout her career, in spite of the low status and low pay she received as a woman. For many years, her official title was “technical assistant”, even though she carried out all the research and teaching duties expected of a Professor. It was not until 1956 that she was promoted to become a full Professor — the first woman promoted to a full professorship from within the faculty at Harvard. But, like most scientists, she was not primarily motivated by status or salary. In 1976, three years before her death, she was awarded the prestigious Henry Norris Russell Prize by the American Astronomical Society. No doubt she appreciated the irony. In her acceptance lecture, she said, clearly referring to her early work on stellar spectra, “The reward of the young scientist is the emotional thrill of being the first person in the history of the world to see something or to understand something.” Even if someone else tells you it is “clearly impossible”.


[1] I am always careful to use the hyphen in the word “un-ionised”, since Isaac Asimov once pointed out to me that the way to distinguish between a scientist and a politician is to ask them how to pronounce the word “unionised”.

[2] Much later, McCrea was my PhD examiner.

From Here to Infinity

A slightly adapted version of a review I wrote for the Wall Street Journal:

Eyes on the Sky

Francis Graham-Smith

Oxford UP
Eyes on the Sky is an intriguing and unusual book.  Popular books about astronomy usually focus on the wonders of the Universe – black holes, quasars, the Big Bang, and so on.  But radio astronomer Francis Graham-Smith focuses on the terrestrial wonders that have revealed these exciting objects – the telescopes.  Not just optical telescopes, but telescopes peering out at the Universe across all the wave bands of the electromagnetic spectrum, from radio waves to gamma rays.  The story begins with Galileo, who first turned a telescope upwards almost exactly four hundred years ago, and ends with the radio telescopes that combine observations made by different antennae at widely spaced locations to mimic a dish the size of the Earth.  But the author could not have been aware when he wrote the book how neatly this story rounds off an era in astronomy, since as of the fall of 2015 we have a completely new way of looking at the Universe, using gravitational waves, not electromagnetic radiation.
This book-ending is particularly appropriate, since Graham-Smith was one of the pioneers of radio astronomy who opened up the new investigation of the sky in the 1940s, using radar technology developed during World War Two.  As he says, before 1947 “the only real astronomers were those who actually looked through [optical] telescopes”.  Radio astronomy was the first of a clutch of new techniques which opened up new windows on the Universe.  And it has all happened within a single human lifetime.
The approach of the book is both thematic and historical.  The themes address each part of the spectrum, starting with Galileo and “conventional” telescopes then moving on to ultraviolet, infrared, X-ray and gamma ray astronomy, before winding up with Graham-Smith’s real love, radio astronomy.  Within each theme he covers developments historically, including developments such as the use of rockets, satellites and high-altitude balloons to make observations at wavelengths blocked by the Earth’s atmosphere.  The techniques have developed as much as the technology – interferometry and aperture synthesis, the methods used to make a virtual giant telescope out of an array of smaller instruments, are explained as clearly here as anywhere you are likely to find.  But the real fascination of the book is the way it almost incidentally highlights the way science has changed in little more than fifty years.  Galileo might have been amazed by the technology, but he would surely have understood the principles on which the 200-inch Hale telescope on Mount Palomar, the biggest one around in 1947, operated.  But how could he have comprehended the significance of a satellite such as Planck, an observatory orbiting the Earth in order to measure the strength of radio waves from the dawn of time?  Come to that, very few astronomers in 1947 would have taken the idea of such a satellite seriously.
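To give a feel for why astronomers go to the trouble of aperture synthesis (my own back-of-the-envelope illustration, not a calculation from the book): the angular resolution of any telescope or interferometer is set by the diffraction limit, roughly the observing wavelength divided by the aperture or baseline.  For an Earth-sized baseline observing the famous 21 cm hydrogen line:

```python
# Diffraction limit: theta ~ wavelength / baseline (in radians).
# Illustrative figures: the 21 cm hydrogen line and an Earth-diameter baseline.
wavelength = 0.21        # metres (21 cm hydrogen line)
baseline = 12.742e6      # metres (roughly the diameter of the Earth)

theta_rad = wavelength / baseline
theta_arcsec = theta_rad * 206265   # convert radians to arcseconds

print(f"{theta_arcsec * 1000:.1f} milliarcseconds")  # roughly 3.4 mas
```

A few milliarcseconds is thousands of times sharper than any single optical telescope working through the Earth’s atmosphere can manage, which is why combining antennae across continents is worth the effort.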
Astronomers of 1947 would also have been astonished at the size of present day astronomical projects, both in terms of the physical dimensions of the equipment involved and the number of people – and nations — collaborating on the observations.  The latest radio telescope to come online is the Five-hundred-metre Aperture Spherical Telescope (known as FAST), in Guizhou Province, southwest China.  As the name suggests, it is a dish 500 metres across (for comparison, the famous Arecibo radio telescope that featured in the movie Contact is 1,000 feet, or just over 300 metres, across).  But the dish of FAST is made up of 4,600 separate triangular panels, connected by flexible joints and a network of steel cables to 2,300 computer-controlled electric motors, which continually adjust the exact angle of each panel to, in effect, tilt the dish to track objects across the sky as the Earth rotates.  The accuracy of this “active surface” keeps each triangle positioned to an accuracy of five millimetres.
The international nature of astronomy is highlighted by a telescope known (for obscure reasons; astronomers love acronyms) as VISTA.  This has a mirror 4 metres across and is located at the European Southern Observatory, on a mountain top in Chile.  The mirror was designed in the UK, cast in Germany, and smoothed to its precise shape in Russia, before being shipped to South America, where it is now used by astronomers of many nations; there are fifteen member states of ESO.  Not that anyone “looks through” such a telescope; it captures light onto an array of Charge Coupled Devices (CCDs) — sixteen arrays, each 2,048 by 2,048 pixels, making a total of some 67 million pixels.  The resulting images appear on monitor screens, or the data is sucked off by a computer for detailed analysis.
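The pixel count is easy to check for yourself (a trivial sanity check, not anything from the book):

```python
# Sixteen CCD arrays, each 2,048 x 2,048 pixels.
total_pixels = 16 * 2048 * 2048
print(f"{total_pixels:,}")  # 67,108,864 -- "some 67 million pixels"
```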
Since the time of Galileo, Graham-Smith points out, the light collecting area of individual telescopes has increased by a factor of more than half a million, while over the past fifty years there has also been a dramatic increase in the performance of detectors, such as CCDs.  And as he emphasises, progress has not yet come to a halt.  Where next?  Well, we now have the ability to image planets orbiting other stars.  The next generation of telescopes may be able to detect signs of life on those planets.

The Story of Epilepsy

Here’s my latest from the Literary Review:


A Smell of Burning:

The story of epilepsy

Colin Grant; Cape


Colin Grant begins his story of epilepsy by explaining that he was drawn to write about the subject because of his brother, an epilepsy sufferer. As it happens, Grant is also a trained doctor, as well as being a writer and broadcaster, so he brings a core of expert knowledge and a perhaps even greater degree of skill as a communicator to the story, as well as his special interest. I cannot claim any medical skill, but I also have a special interest, since I too have a family member who suffers from epilepsy. But you don’t need any special connection to appreciate the absorbing and sometimes horrifying story he has to tell.

The horror comes from the grim treatment meted out to epileptics in the past – and not always the distant past. Grant tells the story more or less chronologically, interleaving it with episodes from his brother’s life, and focusing on different aspects of “treatment”, if that is not too polite a term for some of the techniques described here. The result is curiously contrapuntal. The broad story is one of things slowly getting better; the personal story is one of things slowly getting worse. This is so clearly telegraphed from early on in the book that no spoiler alert is really needed if I tell you that it does not have a happy ending. Equally clearly, writing the book represented a catharsis for the author. All of which lifts it above the level of the kind of professional history of the subject that might have been written by an author with equal skill but no personal involvement.

It is the recent history that is the most startling aspect of the story. We can hardly blame the Ancient Greeks or Romans for having a superstitious attitude to seizures. But 22-year-old Graham Greene seriously contemplated suicide when told of the diagnosis in 1926, and it is now well known that the youngest son of King George V, Prince John, was hidden away in the country because of the “stigma”. Right up until 1970 in the United Kingdom a marriage could be declared void “if either party was, at the time of marriage, of unsound mind, mentally defective, or subject to recurrent fits of insanity or epilepsy.” It was three years after the UK decriminalised homosexual acts in private between two men that epilepsy was removed from this list. In some countries, the law is less enlightened, even today. Until 2010, the official Chinese term for epilepsy translated as “crazy seizure disorder”; only then was it changed to “brain seizure disorder”. And as a cricket lover and fan of Test Match Special, I did not need Grant to remind me of Henry Blofeld’s disgraceful assertion that Tony Greig’s “defection” to Kerry Packer’s World Series Cricket was because “he’s an epileptic and that may be one reason why he’s made this ridiculous decision”.

But I confess that I had not previously made the connection between epilepsy and the storyline of the Powell and Pressburger film A Matter of Life and Death. Obvious, once it is pointed out! And the film partly gives Grant the title of his book – the main character, played by David Niven, is tantalised by the smell of fried onions, an example of what the experts call “auras”, sensory precursors to seizures. But the title also has a personal resonance for the author. On one occasion, Grant’s brother said “Can you smell burning?” immediately before he “crashed into a fit”.

Along with the broader history, the usual roll-call of famous epileptics makes an appearance in A Smell of Burning – including Julius Caesar, Joan of Arc, Fyodor Dostoevsky, Vladimir Lenin, Edward Lear, Vincent Van Gogh, and Neil Young. This raises the (unanswered) question of whether sufferers from epilepsy are more likely to be creative (in the broadest sense of the term), or whether, just as in the general population, some epilepsy sufferers are inevitably creative and some are not. But Grant highlights an interesting point. Perhaps the threat of seizures encourages some people to make the most of things in between them. Van Gogh, for example, wrote that “it drives me to work and to seriousness, as a coal-miner who is always in danger makes haste in what he does.”

The danger today is greatly reduced by the development, since the 1930s and ongoing, of anti-epileptic drugs. The downside, for some people, is that these take the edge off alertness and intelligence. Grant quotes one person who accepts the situation and has been free from seizures for two decades, but says when you wake up in the morning “the first thing you’ve got to do is push your way through that thickness of cotton wool to get to where you can operate but actually that bit there [the sharpness] that’s gone.” Others, including Neil Young, don’t take the medication and live with the consequences. Grant’s brother, Christopher, was one of those. A few hundred of those people each year, including Christopher in 2008, die from a condition that is rare, but not rare enough to avoid having its own acronym – SUDEP, for sudden unexpected death in epilepsy. A powerful reason to keep taking the medicine.






John Gribbin is a Visiting Fellow in astronomy at the University of Sussex and co-author of Being Human

Why The Imitation Game is a disaster for historians.

This saves me the bother. A superb summary.

The Renaissance Mathematicus

I made the mistake, as a former professional historian of logic and meta-mathematics and, as a consequence, an amateur historian of the computer, of going to the cinema to watch the Alan Turing biopic The Imitation Game. I knew that it wouldn’t be historically accurate, but that it would be a total historical disaster and, as I said on leaving the cinema, an insult to the memory of both Alan Turing and the others who worked at Bletchley Park surprised even me, a dyed-in-the-wool, life-long cynic.

As I ventilated my disgust over the next few days on Twitter some, quite correctly, took me to task, informing me that it is a film and not a history book and therefore one shouldn’t criticise it for any inaccuracies that it contains. This attitude is of course perfectly correct and I would accept it, if only the people who…
