The Second Book of General Ignorance
Authors: John Lloyd, John Mitchinson
They were silver, not ruby.
L. Frank Baum’s novel The Wonderful Wizard of Oz was the best-selling book for children in the USA for two years after its publication in 1900. Since translated into more than forty languages, it created one of the most successful publishing franchises of all time: Baum produced a series of thirteen sequels set in the land of Oz and many more were published after his death. He also wrote the script for a musical version, which ran almost continuously on Broadway between 1903 and 1904, and was the first adaptation to use the shortened title, The Wizard of Oz.
Under the new name, in 1939 MGM took the famous book and the famous musical and turned them into an even more famous film, directed by Victor Fleming and starring Judy Garland as Dorothy. In 2009 it was named the ‘most watched’ film of all time by the US Library of Congress. Although it did well at the box office, its huge budget meant it made only a small profit and it was beaten to the Best Picture Oscar by that year’s other blockbuster, Gone with the Wind.
But the immortality of The Wizard of Oz was assured by its annual Christmas screening on US television, which began in 1956. It’s the most repeated movie on TV of all time.
Dorothy’s slippers were changed to red in the film because the producer, Mervyn LeRoy, wanted them to stand out.
The Wizard of Oz was only the second film made in Technicolor and the new process made some colours easier to render than others. It took the art department over a week to come up with a yellow for the Yellow Brick Road that didn’t look green on screen.
The new technology made the six-month shoot hazardous for the actors. The lights heated the set to a stifling 38 °C and eventually caused a fire in which Margaret Hamilton (the Wicked Witch of the West) was badly burned. The cast had to eat liquidised food through straws because their thick colour face make-up was so toxic. The original Tin Man, Buddy Ebsen, nearly died from inhaling the aluminium powder it contained and had to leave the film.
Lyman Frank Baum died in 1919, long before his book made it to the screen, although he ended his days in Hollywood as a film producer. This was the last in a long line of careers – as a breeder of fancy poultry, a newspaper editor, a theatrical impresario, the proprietor of a general store, a travelling salesman and a writer of over fifty books, many under female pseudonyms such as Edith van Dyne and Laura Metcalf. In 1900, the same year he published The Wonderful Wizard of Oz, he also brought out The Art of Decorating Dry Goods Windows and Interiors, which listed the many marketing advantages of using shop-window mannequins.
But it is Oz he will be remembered for and, despite all attempts at interpreting his novel as a political allegory or a feminist tract, it is best read the way he intended, as a home-grown American version of the fairy tales of the Brothers Grimm and Hans Christian Andersen that he had loved as a child.
No, it doesn’t glow in the dark.
Radioactivity isn’t detectable as visible light. If it were, the whole earth would glow in the dark, as well as every plant and animal on it. Rocks, soil and living tissue all contain traces of radioactive material.
Radioactivity is not the same as radiation. Radiation is the means by which energy – radio waves, light, heat and X-rays – travels in space. These are all made of photons that spread out (or ‘radiate’) in waves moving at the speed of light. Though they are all made of the same stuff and travel at the same speed, their waves have different distances between the peaks and troughs, graded along a scale known as the electromagnetic spectrum. At one end are low-frequency waves (with long wavelengths) like radio waves; at the other, high-frequency waves with short wavelengths like X-rays. In the middle is ‘visible light’, the narrow band of electromagnetic energy that we can see.
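The ordering of the spectrum described above follows from a single relation: wavelength = speed of light ÷ frequency, so the higher the frequency, the shorter the wavelength. A minimal sketch; the example frequencies are rough, illustrative values chosen for this note, not figures from the book:

```python
# Wavelength from frequency along the electromagnetic spectrum.
# Higher-frequency waves (X-rays) come out with far shorter
# wavelengths than low-frequency waves (radio), as the text says.
C = 299_792_458  # speed of light in a vacuum, m/s

examples = {
    "AM radio":      1e6,      # ~1 MHz (illustrative)
    "microwave":     2.45e9,   # ~2.45 GHz (illustrative)
    "visible light": 5.5e14,   # ~550 THz, green light (illustrative)
    "X-ray":         3e18,     # ~3 EHz (illustrative)
}

for name, freq_hz in examples.items():
    wavelength_m = C / freq_hz  # wavelength = c / frequency
    print(f"{name:>13}: {freq_hz:9.2e} Hz  ->  {wavelength_m:.2e} m")
```

Running this puts radio waves at hundreds of metres and X-rays at fractions of a nanometre, with the narrow visible band in between.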
All radiation is harmful if we’re exposed to too much of it for too long. Sunshine – a mix of wavelengths from infrared (heat), through visible light, to ultraviolet – causes sunburn. At the high-frequency end of the spectrum, the energy is so intense it can knock electrons out of orbit, giving a previously neutral atom a positive electric charge. This charged atom is called an ion (Greek for ‘going’). One ion creates another in a rapid chain reaction. This can cause terrible damage by changing the molecules in our cells, causing skin ‘burns’, cancerous tumours and mutations in our DNA.
Substances that do this are called ‘radioactive’, a term coined by the Polish chemist Marie Curie (1867–1934) in 1898. Although she invented the word, the French physicist Henri Becquerel (1852–1908) had accidentally discovered the actual process two years earlier, while working with uranium. Following in his footsteps, Marie discovered something a million times more radioactive than uranium: a new chemical element she called ‘radium’.
Becquerel, Marie and her husband Pierre shared the 1903 Nobel Prize for their discovery and the ‘invigorating’ effects of radium salts were soon being hailed as a cure for ailments from blindness to depression and rheumatism. Radium was added to mineral water, toothpaste, face-creams and chocolate and there was a craze for ‘radium cocktails’. Adding radium to paint made it luminous, a novelty effect that was used to decorate clock and watch faces.
This is the origin of the radioactive ‘green glow’. It wasn’t the radium glowing, but its reaction with the copper and zinc in the paint, creating a phenomenon called ‘radioluminescence’. The phrase ‘radium glow’ stuck in the public mind. When the true consequences of exposure to radioactivity were revealed in the early 1930s, glowing and radioactivity had become inseparably linked.
Hundreds of ‘radium girls’, who had worked in factories applying paint containing glow-in-the-dark radium to watch-faces (and licking the brushes as they did so), were to die from painful and disfiguring facial cancers. And in 1934 Marie Curie herself died of anaemia, caused by years of handling the ‘magic’ substance she had discovered.
Microwave ovens don’t cook food ‘from the inside out’.
Microwaves are a form of electromagnetic radiation that sits on the spectrum between radio waves and infrared light. They are called ‘micro’ waves because they have much shorter wavelengths than radio waves. They have a wide variety of uses: mobile phone networks, wireless connections like Bluetooth, Global Positioning Systems (GPS), radio telescopes and radar all rely on microwaves at differing frequencies. Although they carry more energy than radio waves, they’re a long way from the dangerous end of the electromagnetic spectrum where X-rays and gamma rays reside.
Microwave ovens don’t directly cook food; what they do is heat water. The frequency of microwaves happens to be just right for exciting water molecules. By spreading their energy evenly through food, the microwaves heat the water in it and the hot water cooks the food. Nearly all food contains water, but microwaves won’t cook completely dry food like cornflakes, rice or pasta.
The molecules in the centre of your soup aren’t heated any quicker than those on the outside. In fact, the opposite is true. If the food is the same consistency all the way through, the water nearest the surface will absorb most of the energy. In this regard, microwave cookery is similar to heating food in a normal oven, except that the microwaves penetrate deeper and more quickly. The reason why it sometimes appears that the middle of microwaved food has ‘cooked first’ is to do with the type of food. Jacket potatoes, for instance, and apple pies, are drier on the outside than the inside; so the moist centre will be hotter than the outside skin or crust.
Because microwaves work by exciting the water molecules, it also means that the food rarely gets much hotter than the 100 °C temperature at which water boils. Meat cooked in a microwave can be tender, but it is more like poaching than roasting. To break down protein and carbohydrate molecules rapidly and form a caramelised crust as in pork crackling (or to get the crisp exterior of a chip) requires temperatures of 240 °C or higher.
Microwave ovens are a by-product of the invention of radar in 1940. In 1945 Percy Spencer, a US engineer working for the defence systems company Raytheon, was building a magnetron (the device at the core of radar that converts electricity to microwaves) when he noticed that a chocolate peanut bar in his pocket had completely melted. Guessing it was caused by the magnetron, he built a metal box and fed in microwave radiation. The first food he cooked in his improvised oven was popcorn; his second experiment, with a whole egg, ended in an explosion. The water in the egg had rapidly vaporised.
Raytheon was quick to introduce the first commercial microwave oven in 1947 and, by the late 1960s, smaller domestic versions had started appearing in American homes. Despite the various myths they have gathered down the years, they now occupy pride of place in 90 per cent of US kitchens.
Yorkshire.
In 1953 British scientists seriously considered detonating a nuclear weapon next to the tiny village of Skipsea, on the East Yorkshire coast road between Bridlington and Hornsea. Home to just over 630 people, it has a medieval church and the remains of a Norman castle but not much else.
It was exactly this isolated, sleepy character – plus its convenient proximity to the RAF base at Hull – that commended the village to the scientists at the Atomic Research Establishment at Aldermaston. They were looking at various coastal sites in the UK for an above-ground atomic bomb explosion following their successful test detonation under the sea off the Monte Bello Islands, north-west of Australia, in 1952. Skipsea ticked all the boxes.
Unsurprisingly, the local community leaders were unanimously opposed to the idea, pointing out that the test site was dangerously close to bungalows and beach huts and that a public right of way ran through it. The Aldermaston team eventually relented and switched their plans back to Australia.
The results of the Maralinga tests in South Australia, in which seven above-ground atomic devices were detonated between 1956 and 1957, show just how close Skipsea – and the rest of the UK – came to total disaster. The interior of the whole Australian continent was severely contaminated, with testing stations 3,200 kilometres (2,000 miles) apart reporting a hundredfold increase in radioactivity. Significant fallout even reached Melbourne and Adelaide.
Maralinga was a site of great spiritual importance to the local Pitjantjatjara and Yankunytjatjara peoples (its name means ‘Place of Thunder’) and their evacuation was incompetently managed. After the detonations, there was little attempt to enforce site security and all the warning signs were in English. As a result, many Aboriginals returned to their homeland soon afterwards.
Even more shocking, British and Australian servicemen were intentionally sent to work on the site to gauge the effect of radioactivity on active troops. It is estimated that 30 per cent of the 7,000 servicemen who worked at the location died from various cancers before they turned sixty. The effect on the Aboriginal inhabitants has been even worse – with blindness, deformity and high levels of cancer reported across the local population.
After pressure from the troops’ veterans’ association and Aboriginal groups, the McClelland Royal Commission was set up in 1984. It concluded that all seven tests had been carried out ‘under inappropriate conditions’ and ordered a comprehensive clean-up of the site, which was eventually completed in 2000. In 1994 a compensation fund of $13.5 million was set up for the local people and limited payments have been made to Australian veterans.
At the time of writing, the UK government has produced no formal compensation scheme for British survivors of its nuclear testing programme.
STEPHEN Where did Britain originally plan to test their atomic bombs?
SANDI TOKSVIG Was it Paris?
Neither the Yorkists nor the Lancastrians were based in the counties that bear their names, and neither side called the conflict ‘the Wars of the Roses’.
The Houses of York and Lancaster were branches of the House of Plantagenet, which had ruled England for 300 years. They were unconnected with either Yorkshire or Lancashire. If anything, more Lancastrians than Yorkists came from Yorkshire and the remainder of the Duke of Lancaster’s estates were in Cheshire, Gloucestershire and North Wales.
Most Yorkist supporters were from the Midlands, not from Yorkshire, and the Duke of York’s estates were mainly concentrated along the Welsh borders and down into south Wales.
The ‘Wars of the Roses’ weren’t wars in the traditional sense. The people involved certainly didn’t think of them as such. They were really just an extended bout of infighting between two branches of the royal family. The event that provoked this rivalry was the overthrow of Richard II by Henry Bolingbroke, Duke of Lancaster, who was crowned Henry IV in 1399. There followed half a century of intrigue, treachery and murder, peppered with minor skirmishes, but it wasn’t until 1455 that the first real battle was fought. And, even though the throne changed hands between the two sides three times over the period – with Edward IV (York) and Henry VI (Lancaster) getting two goes each – most of England was unaffected by the strife.
After the murder of Henry VI in 1471, there were three Yorkist kings in a row: Edward IV (again), Edward V and Richard III. Although Henry Tudor, the man who wrested the throne from Richard III to become Henry VII, was nominally a Lancastrian, his real intention was to start a new dynasty named after himself. The creation of the red-and-white Tudor rose was a brilliant bit of marketing on his part, supposedly merging the white rose of York and the red rose of Lancaster to symbolise a new united kingdom. In fact, until then, the roses had been just two of many livery signs used by either side. Most of the troops were conscripts or mercenaries who tended to sport the badge of their immediate feudal lord or employer. Even at Bosworth Field in 1485, the climactic battle that finally ended the conflict, the Lancastrian Henry fought under the red dragon of Wales, and the Yorkist Richard III under his personal symbol of a white boar.
But Henry’s image manipulation was so successful that, when Shakespeare wrote Henry VI Part I in 1601, he included a scene where supporters of each faction pick different coloured roses. This so inspired Sir Walter Scott that – in Ivanhoe (1823) – he named the period ‘the Wars of the Roses’. So it was 338 years after the conflict ended that the phrase was used for the very first time.
Even if they weren’t really wars, or much to do with roses, and didn’t involve inter-county rivalries, they were neither romantic nor trivial. The Yorkists’ crushing victory at Towton in 1461 remains the largest and bloodiest battle ever fought on British soil. Some 80,000 soldiers took part (including twenty-eight lords, almost half the peerage at that time), and more than 28,000 men died – roughly 3 per cent of the entire adult male population of England.