A Buzz in the Meadow
Author: Dave Goulson
I was at university in the mid-1980s, and there I was taught that the gold standard of crop protection was called Integrated Pest Management, or IPM. The basic principle was that farmers should monitor the numbers of pests on their crop, and should only deploy control agents if and when there was a problem. They should try to manage their land to maximise the numbers of natural enemies, boosting populations of predatory insects such as lacewings by providing them with healthy non-crop habitat, such as hedgerows or strips of permanent tussocky grass – places in which they could spend the winter and find food when there were no pests on the crop. Cultural controls, such as crop rotations, can greatly help to reduce pest numbers, as the crops favoured by a particular pest are constantly moving from field to field and are therefore harder for the pest to keep track of. We were taught that chemicals should only be used as a last resort, and then only compounds that rapidly break down, so as to minimise the impact on beneficial creatures.
More broadly, it had become clear that farmland biodiversity was important, both ecologically and economically. Hedges are not just boundaries between fields, but are reservoirs for beneficial insects, and provide flowers and nesting sites for bees. Changes to the Common Agricultural Policy were introduced that enabled farmers to be paid to promote biodiversity. Schemes were introduced to pay farmers to replant hedges and copses, to sow strips of wild flowers, even to create special 'beetle banks' from tussocky grasses. By the 1990s little good habitat was being lost, and lots of money was being poured into protecting and promoting wildlife on farms. The conservationists had won, and could finally put their feet up and watch the flowers grow.
Depressingly, that isn't quite how it panned out. The benefits to wildlife simply don't seem to have materialised, despite the injection of billions of pounds of taxpayers' money – currently running at about £500 million per year. After thirty years or so of agri-environment schemes, the data suggest that most farmland birds, moths and butterflies remain on a downwards trajectory, with populations dwindling year-on-year. There really are fewer butterflies on the buddleia bush – it isn't just my imagination. So what went wrong?
Partly the answer may lie with failures in the design of agri-environment schemes. Many were introduced as best-guesses, without much evidence that they would work. The entry-level schemes, whereby farmers get small payments for a range of simple measures to protect wildlife, often don't differ in any significant way from what the farmers would be doing anyway without payments. Many farmers don't really know what the agri-environment schemes are meant to achieve. They may sign up to create a 'species-rich grassland', and receive payments, but when the wild-flower seeds don't germinate or the grassland gets invaded by docks and thistles, they are often unsure what to do. The paperwork involved in entering the schemes is dauntingly complicated, and becomes more so with each successive iteration of these schemes, which are ever-changing. Farmers with small farms in marginal areas – often the ones with most wildlife on their lands – cannot afford to spare the time to fill in the forms or cannot make sense of them. Large, rich farmers pay agents to do it for them. Entry to the higher-level schemes – those that are most likely to actually benefit wildlife – is competitive, as most money goes on the entry-level scheme, so farmers who do fill in the forms may be rejected and find that they have wasted their time. Overall it is fair to say that agri-environment schemes have not been an unqualified success, and they probably provide the European taxpayer with poor value for money. Nonetheless, they are surely better than nothing. So why are farmland wildlife populations still heading steadily downwards towards oblivion?
The short answer is that we don't know for sure. But some people think they do, and I'm starting to come round to the idea that they may be at least partly right. In October of 2006 honeybee keepers in the USA found that their bees were disappearing. Whole hives that seemed perfectly healthy one day were deserted the next, the adult bees having simply vanished. There were no corpses, and no clues as to what had happened. Various names were coined, with Marie Celeste Syndrome being the most apt, but Colony Collapse Disorder is the clumsy term that stuck, often abbreviated to CCD. It didn't just affect one or two hives, but hundreds of thousands. Some beekeepers lost most of their hives and went out of business. The almond farmers of California found it almost impossible to find beekeepers who could supply hives to pollinate their crop, and the cost of hiring hives went through the ceiling.
Beekeeping in North America is very different from beekeeping in Europe. Here, many beekeepers have just a handful of hives and beekeeping is generally not big business. All but the very largest-scale beekeepers have fewer than 100 hives. Many hives remain in the same place all year round, although some beekeepers may move their hives a few kilometres to help with pollination of a particular crop, or to gather heather honey from the nearby hills in late summer. In contrast, in North America the big beekeepers have thousands of hives. So many hives cannot be kept together in one place for long, as they would soon exhaust the local food supply, so they are stacked on huge trailers and transported around the continent from crop to crop. Farmers who need pollination pay for the service, so the bees go to California for the almond blossom in March, to Florida for the flowering of the citrus orchards in April, north to New York to pollinate apples in May, up to Vermont for the blueberries in June, and then back to Florida via a few weeks in the pumpkin patches of Pennsylvania. Every year a colony may travel 18,000 kilometres. One might imagine that the bees (and the beekeepers) are exhausted; this is certainly not a natural way of life for a bee.
For both the beekeepers and the farmers the bees are vital to their livelihoods, so CCD was a disaster. It caused widespread panic and an urgent hunt for the culprit and a cure. Beekeepers elsewhere in the world started to look for signs of CCD, and the following year there were reports of similar problems in Europe, although on a smaller scale. Sometimes the symptoms were slightly different, but the panic spread, and the media published dramatic tales of a worldwide scourge, which threatened the very survival of bees (they rarely mentioned that CCD is seemingly confined to honeybees, just one of thousands of bee species).
Seven years on, and millions of pounds' worth of scientific research later, we are still not certain what the answer is. Many consider the Varroa mite to be the prime suspect. This parasitic mite has spread from Asia throughout the globe in recent years, accidentally transported by humans, and is certainly a major threat to honeybee health. The mite sucks the blood of adult bees and the developing brood, spreading viral diseases from bee to bee as it goes, and it is hard to control. However, Varroa was around for quite a while before CCD, so it cannot be as simple as that. Others blame mobile phones, claiming that the signals interfere with bee navigation, causing them to get lost. Interesting though this theory is, there is not a shred of evidence to support it, and mobile phones were also widespread long before the arrival of CCD. Others blame genetically modified crops, but again this doesn't seem to stand up to scrutiny. A more plausible theory is that the diet of honeybees has become very narrow. Naturally honeybees feed on a huge range of wild flowers through the year, but in intensive agricultural landscapes they may get most of their food from just a few different crops, and for weeks on end they may be feeding on a single crop. This issue is particularly acute in North America, where bees are transported from one intensive agricultural landscape to another throughout the year. Just as humans require a balanced diet, so it may be that feeding on just a handful of different foodstuffs does not supply honeybees with all of the nutrients they require. Imagine if you were forced to eat only Brussels sprouts in December, bacon in January and chocolate in February; you might end up feeling more than a little off-colour. Finally, many suspect that pesticides may be to blame, and this brings me back to my French neighbour's sunflowers.
In the mid-1990s a new class of insecticide was introduced. Known as neonicotinoids, they are synthetic variants of nicotine. They block open insect nerve-receptors, thus attacking the insect nervous system and brain, and are phenomenally toxic in tiny amounts. Of course insecticides wouldn't be much use if they weren't toxic to insects, so this might be regarded as a good thing. Neonics (they sound a little more friendly when abbreviated) have a major advantage over most of the insecticides that went before, in that they are systemic. They can be applied as a seed dressing before the crop is sown, and the germinating seedling absorbs the chemical, which spreads throughout the plant. Any herbivorous insect that eats any part of the crop dies. This is a wonderfully neat system. Previously insecticides had to be sprayed on to the crop from a tractor-mounted boom. Much of the spray landed on the soil, and in even a slight breeze it would blow into the hedgerows. Only the parts of the crop that were directly coated with the spray were protected, so the lower leaves and roots would be vulnerable to herbivores. As the crop grew, further applications were needed on the new leaves. Overall, more chemical was needed to protect the crop, and the farmer had to spend both time and diesel applying it. Many of the chemicals used were pretty nasty – for example, the organophosphate insecticides were derivatives of nerve agents developed during the war to kill people – and so spraying them around posed a direct threat to the farm worker. All in all, it is easy to see why the neonics proved to be hugely popular, and they quickly became one of the most widely used classes of insecticide in the world. They now comprise about one-quarter of all insecticides used globally, and one type of neonic, known as imidacloprid, is the second most widely used agrochemical (after the herbicide glyphosate). In the UK, agricultural use of it has risen steadily, reaching about 80 tonnes per year at the last count.
The systemic nature of neonics is both their great strength and, perhaps, their Achilles heel. They spread to all parts of the plant, and that inevitably includes the nectar and pollen. If the crop is visited by pollinating insects, then they consume small amounts of these chemicals. Not long after the introduction of neonics in the 1990s, French beekeepers started to claim that they were causing their honeybee colonies to die. Their campaigns led to partial bans on some types of neonics on some flowering crops, but to the concern of French beekeepers, this largely meant that farmers simply used different types of neonics. This controversy rumbled on for some years until an incident in Germany in 2008. A batch of maize seeds had been coated with an incorrect formulation of neonic. The chemical was not properly stuck to the seed, and when the seeds were drilled, much of the coating blew away as a fine powder. Hundreds of honeybee hives in the area were wiped out more or less instantly. There was an uproar, and neonics were banned in Germany pending an investigation, but the ban was subsequently rescinded when it became clear that the problem lay primarily with the incorrect formulation. Nonetheless, this incident raised awareness among beekeepers and environmentalists as to the acute toxicity of these compounds to insects such as bees, and prompted further investigations.
Of course the chemical manufacturers were aware that neonics would get into the pollen and nectar of crops from the start. Agrochemicals go through various safety tests before they are licensed for use, including an evaluation of their toxicity to bees and other beneficial insects. Typically, groups of lab animals are fed on varying doses of a chemical and then monitored to see if they expire. This enables calculation of the 'Lethal Dose 50%' or LD50 – the dose that causes half of the test animals to die. The LD50s for the various different neonics in honeybees are all very low, just four- or five-billionths of a gram for the common types such as imidacloprid and clothianidin. To put that in context, one gram (not much more than the contents of a sachet of salt) is enough to give an LD50 to 250 million honeybees, or about twenty-five tonnes of bees.
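For readers who like to see the arithmetic, here is a minimal sketch that reproduces the figures above. It assumes an LD50 of four billionths of a gram (the lower end of the quoted range) and a worker honeybee mass of about 0.1 grams; the bee mass is my own assumption, implied by the twenty-five-tonne total rather than stated explicitly.

```python
# Back-of-the-envelope check of the LD50 arithmetic in the passage above.
# Assumptions: LD50 of ~4 billionths of a gram per honeybee (the text quotes
# 4-5 ng for imidacloprid and clothianidin) and a worker bee mass of ~0.1 g,
# which is an assumed figure implied by the quoted twenty-five-tonne total.

ld50_grams = 4e-9        # lethal dose for 50% of test bees, in grams per bee
bee_mass_grams = 0.1     # assumed mass of a single worker honeybee

bees_per_gram = 1 / ld50_grams                         # bees given an LD50 by 1 g
total_mass_tonnes = bees_per_gram * bee_mass_grams / 1e6

print(f"One gram supplies an LD50 to {bees_per_gram:,.0f} bees")
print(f"Combined mass of those bees: ~{total_mass_tonnes:,.0f} tonnes")
# -> 250,000,000 bees, ~25 tonnes, matching the figures in the text
```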
How does the LD50 compare to the amounts found in nectar and pollen? Typically, the pollen of treated crops such as oilseed rape contains concentrations of neonics in a range from one to ten parts per billion – not much, but then these are very toxic chemicals. The amounts in nectar are usually even less, commonly below one part per billion. The big question then is: are these concentrations sufficient to harm bees? At ten parts per billion, a honeybee would need to consume about half a gram of pollen – about five times its own body weight – to receive an LD50. A bee would certainly not consume this much in a short period, although it could easily do so during its life. However, typical lab toxicity tests don't look at long-term effects; most last just a couple of days, and over this period these sorts of concentrations do not kill bees. As a result, the compounds were deemed to be safe for pollinators, and licences were granted for their use all over the world. Whenever beekeepers claimed that these compounds posed a threat to their bees, the agrochemical companies pointed to the data and argued that the amounts bees consume are not enough to kill them.
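The exposure calculation can be checked in the same way. This sketch takes the upper figure of five billionths of a gram for the LD50, a pollen concentration of ten parts per billion, and the same assumed bee mass of 0.1 grams as before.

```python
# Rough check of the exposure arithmetic: how much pollen at 10 parts per
# billion a bee would need to eat to take in an LD50. The 5 ng LD50 and the
# 0.1 g bee mass are assumptions consistent with the figures quoted above.

ld50_grams = 5e-9            # assumed LD50 per bee, in grams
concentration_ppb = 10       # neonic concentration in pollen, parts per billion
bee_mass_grams = 0.1         # assumed worker bee mass

grams_neonic_per_gram_pollen = concentration_ppb * 1e-9
pollen_needed_grams = ld50_grams / grams_neonic_per_gram_pollen
multiple_of_body_weight = pollen_needed_grams / bee_mass_grams

print(f"Pollen needed for an LD50: {pollen_needed_grams:.2f} g")
print(f"That is about {multiple_of_body_weight:.0f}x the bee's body weight")
# -> 0.50 g of pollen, about 5x body weight, as the passage states
```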
The arrival of CCD provided new impetus to investigations into bee health, and a re-examination of the hazards that bees face in the modern agricultural landscape. It led to prominent campaigns by beekeepers and environmentalists around the world, many of them targeted at getting neonics banned. In the UK the invertebrate conservation charity Buglife produced a report on neonics that argued for a ban, supported by their counterpart in the USA, the Xerces Society. As someone involved in bumblebee conservation and research, I was regularly asked to support these campaigns, but I was reluctant to do so. I wasn't aware of any compelling evidence that pinned either CCD or bumblebee decline to neonic use. It is the job of scientists, so far as is humanly possible, to be impartial and to provide the evidence that informs the decisions of others, not to become environmental lobbyists, although sometimes the distinction becomes blurred (such as when writing this book). Nonetheless this issue seemed to be one that wasn't going to go away, so in 2011 I decided to do some research of my own.