A few years ago, I started looking online to fill in chapters of my family history that no one had ever spoken of. I registered on Ancestry.com, plugged in the little I knew, and soon was found by a cousin whom I had not known existed, the granddaughter of my grandfather’s older sister. We started exchanging documents: a copy of a birth certificate, a photo from an old wedding album. After a few months, she sent me something disturbing.

It was a black-and-white scan of an article clipped from the long-gone Argus of Rockaway Beach, New York. In the scan, the type was faded and there were ragged gaps where the soft newsprint had worn through. The clipping must have been cut out and folded and carried around a long time before it was pasted back together and put away.

The article was about my great-uncle Joe, the youngest brother of my cousin’s grandmother and my grandfather. In a family that never talked much about the past, he had been discussed even less than the rest. I knew he had been a fireman in New York City and died young, and that his death scarred his family with a grief they never recovered from. I knew that my father, a small child when his uncle died, was thought to resemble him. I also knew that when my father made his Catholic confirmation a few years afterward, he chose as his spiritual guardian the saint that his uncle had been named for: St. Joseph, the patron of a good death.
I had always heard Joe had been injured at work: not burned, but bruised and cut when a heavy brass hose nozzle fell on him. The article revealed what happened next. Through one of the scrapes, an infection set in. After a few days, he developed an ache in one shoulder; two days later, a fever. His wife and the neighborhood doctor struggled for two weeks to take care of him, then flagged down a taxi and drove him fifteen miles to the hospital in my grandparents’ town. He was there one more week, shaking with chills and muttering through hallucinations, and then sinking into a coma as his organs failed. Desperate to save his life, the men from his firehouse lined up to give blood. Nothing worked. He was thirty when he died, in March 1938.
The date is important. Five years after my great-uncle’s death, penicillin changed medicine forever. Infections that had been death sentences—from battlefield wounds, industrial accidents, childbirth—suddenly could be cured in a few days. So when I first read the story of his death, it lit up for me what life must have been like before antibiotics started saving us.
Lately, though, I read it differently. In Joe’s story, I see what life might become if we did not have antibiotics any more.
Alexander Fleming, who discovered penicillin, foresaw this danger. Accepting the Nobel Prize in 1945, he warned:

“It is not difficult to make microbes resistant to penicillin in the laboratory by exposing them to concentrations not sufficient to kill them… There is the danger that the ignorant man may easily underdose himself and by exposing his microbes to non-lethal quantities of the drug make them resistant.”
As a biologist, Fleming knew that evolution was inevitable: sooner or later, bacteria would develop defenses against the compounds the nascent pharmaceutical industry was aiming at them. But what worried him was the possibility that misuse would speed the process up. Every inappropriate prescription and insufficient dose given in medicine would kill weak bacteria but let the strong survive. (As would the micro-dose “growth promoters” given in agriculture, which were invented a few years after Fleming spoke.) Bacteria can produce another generation in as little as twenty minutes; with tens of thousands of generations a year working out survival strategies, the organisms would soon overwhelm the potent new drugs.
Fleming’s prediction was correct. Penicillin-resistant staph emerged in 1940, while the drug was still being given to only a few patients. Tetracycline was introduced in 1950, and tetracycline-resistant Shigella emerged in 1959; erythromycin came on the market in 1953, and erythromycin-resistant strep appeared in 1968. As antibiotics became more affordable and their use increased, bacteria developed defenses more quickly. Methicillin arrived in 1960 and methicillin resistance in 1962; levofloxacin in 1996 and the first resistant cases the same year; linezolid in 2000 and resistance to it in 2001; daptomycin in 2003 and the first signs of resistance in 2004.
With antibiotics losing usefulness so quickly — and thus not making back the estimated $1 billion per drug it costs to create them — the pharmaceutical industry lost enthusiasm for making more. In 2004, there were only five new antibiotics in development, compared to more than 500 chronic-disease drugs for which resistance is not an issue — and which, unlike antibiotics, are taken for years, not days. Since then, resistant bugs have grown more numerous and, by sharing DNA with each other, have become even tougher to treat with the few drugs that remain. In 2009, and again this year, researchers in Europe and the United States sounded the alarm over an ominous form of resistance known as CRE, for which only one antibiotic still works.
Health authorities have struggled to convince the public that this is a crisis. In September, Dr. Thomas Frieden, the director of the U.S. Centers for Disease Control and Prevention, issued a blunt warning: “If we’re not careful, we will soon be in a post-antibiotic era. For some patients and some microbes, we are already there.” The chief medical officer of the United Kingdom, Dame Sally Davies — who calls antibiotic resistance as serious a threat as terrorism — recently published a book in which she imagines what might come next. She sketches a world where infection is so dangerous that anyone with even minor symptoms would be locked in confinement until they recover or die. It is a dark vision, meant to disturb. But it may actually underplay what the loss of antibiotics would mean.
What would a post-antibiotic era look like? It isn’t hard to imagine what would happen first: infected patients would die. In fact, they already do.
Next to go: surgery, especially on sites that harbor large populations of bacteria such as the intestines and the urinary tract. Those bacteria are benign in their regular homes in the body, but introduce them into the blood, as surgery can, and infections are practically guaranteed. And then implantable devices, because bacteria can form sticky films of infection on the devices’ surfaces that can be broken down only by antibiotics.
Dr. Donald Fry, a member of the American College of Surgeons who finished medical school in 1972, says: “In my professional life, it has been breathtaking to watch what can be done with synthetic prosthetic materials: joints, vessels, heart valves. But in these operations, infection is a catastrophe.” British health economists with similar concerns recently calculated the costs of antibiotic resistance. To examine how it would affect surgery, they picked hip replacements, a common procedure in once-athletic Baby Boomers. They estimated that without antibiotics, one out of every six recipients of new hip joints would die.
“The post-antibiotic future means, in effect, the end to modern medicine as we know it,” said Dr. Margaret Chan, Director General of the World Health Organization. “Things as common as strep throat or a child’s scratched knee could once again kill.”
A growing body of scientific research links antibiotic use in animals to the emergence of antibiotic-resistant bacteria: in the animals’ own guts, in the manure that farmers use on crops or store on their land, and in human illnesses as well. Resistant bacteria move from animals to humans in groundwater and dust, on flies, and via the meat those animals get turned into.
An annual survey of retail meat conducted by the Food and Drug Administration—part of a larger project involving the CDC and the U.S. Department of Agriculture that examines animals, meat, and human illness—finds resistant organisms every year. In its 2011 report, published last February, the FDA found (among many other results) that 65 percent of chicken breasts and 44 percent of ground beef carried bacteria resistant to tetracycline, and 11 percent of pork chops carried bacteria resistant to five classes of drugs. Meat transports those bacteria into your kitchen, if you do not handle it very carefully, and into your body if it is not thoroughly cooked—and resistant infections result.
Researchers and activists have tried for decades to get the FDA to rein in farm overuse of antibiotics, mostly without success. The agency attempted in the 1970s to control agricultural use by revoking authorization for penicillin and tetracycline to be used as “growth promoters,” but that effort never moved forward. Agriculture and the veterinary pharmaceutical industry pushed back, alleging that agricultural antibiotics have no demonstrable effect on human health.
Few, though, have asked what multi-drug–resistant bacteria might mean for farm animals. Yet a post-antibiotic era imperils agriculture as much as it does medicine. In addition to growth promoters, livestock raising uses antibiotics to treat individual animals, as well as in routine dosing called “prevention and control” that protects whole herds. If antibiotics became useless, then animals would suffer: individual illnesses could not be treated, and if the crowded conditions in which most meat animals are raised were not changed, more diseases would spread.
But if the loss of antibiotics changed how livestock are raised, then farmers might be the ones to suffer. Other methods for protecting animals from disease—enlarging barns, cutting down on crowding, and delaying weaning so that immune systems have more time to develop—would be expensive to implement, and agriculture’s profit margins are already thin. In 2002, economists for the National Pork Producers Council estimated that removing antibiotics from hog raising would force farmers to spend $4.50 more per pig, a cost that would be passed on to consumers.
H. Morgan Scott, a veterinary epidemiologist at Kansas State University, unpacked for me how antibiotics are used to control a major cattle illness, bovine respiratory disease. “If a rancher decides to wean their calves right off the cow in the fall and ship them, that’s a risky process for the calf, and one of the things that permits that to continue is antibiotics,” he said, adding: “If those antibiotics weren’t available, either people would pay a much lower price for those same calves, or the rancher might retain them through the winter” while paying extra to feed them. That is, without antibiotics, those farmers would face either lower revenues or higher costs.
Livestock raising isn’t the only aspect of food production that relies on antibiotics, or that would be threatened if the drugs no longer worked. The drugs are routinely used in fish and shrimp farming, particularly in Asia, to protect against bacteria that spread in the pools where seafood is raised—and as a result, the aquaculture industry is struggling with antibiotic-resistant fish diseases and searching for alternatives. In the United States, antibiotics are used to control fruit diseases, but those protections are breaking down too. Last year, streptomycin-resistant fire blight, which in 2000 nearly destroyed Michigan’s apple and pear industry, appeared for the first time in orchards in upstate New York, which is (after Michigan) one of the most important apple-growing states. “Our growers have never seen this, and they aren’t prepared for it,” says Herb Aldwinckle, a professor of plant pathology at Cornell University. “Our understanding is that there is one useful antibiotic left.”
Is a post-antibiotic era inevitable? Possibly not — but not without change.
In countries such as Denmark, Norway, and the Netherlands, government regulation of medical and agricultural antibiotic use has helped curb bacteria’s rapid evolution toward untreatability. But the U.S. has never been willing to institute such controls, and the free-market alternative of asking physicians and consumers to use antibiotics conservatively has been tried for decades without much success. As has the long effort to reduce farm antibiotic use; the FDA will soon issue new rules for agriculture, but they will be contained in a voluntary “guidance to industry,” not a regulation with the force of law.
What might hold off the apocalypse, for a while, is more antibiotics—but first pharmaceutical companies will have to be lured back into a marketplace they already deemed unrewarding. The need for new compounds could force the federal government to create drug-development incentives: patent extensions, for instance, or changes in the requirements for clinical trials. But whenever drug research revives, achieving a new compound takes at least 10 years from concept to drugstore shelf. There will be no new drug to solve the problem soon—and given the relentlessness of bacterial evolution, none that can solve the problem forever. In the meantime, the medical industry is reviving the old-fashioned solution of rigorous hospital cleaning, and also trying new ideas: building automatic scrutiny of prescriptions into computerized medical records, and developing rapid tests to ensure the drugs aren’t prescribed when they are not needed. The threat of the end of antibiotics might even impel a reconsideration of phages, the individually brewed cocktails of viruses that were a mainstay of Soviet Union medical care during the Cold War. So far, the FDA has allowed them into the U.S. market only as food-safety preparations, not as treatments for infections.
But for any of that to happen, the prospect of a post-antibiotic era has to be taken seriously, and those staring down the trend say that still seems unlikely. “Nobody relates to themselves lying in an ICU bed on a ventilator,” says Dr. Louis Rice of Brown University. “And after it happens, they generally want to forget it.”
When I think of preventing this possible future, I re-read my great-uncle’s obit, weighing its old-fashioned language freighted with a small town’s grief.
The world is made up of “average” people, and that is probably why editorials are not written about any one of them. Yet among these average people, who are not “great” in political, social, religious, economic or other specialized fields, there are sometimes those who stand out above the rest: stand out for qualities that are intangible, that we can’t put our finger on.
Such a man was Joe McKenna, who died in the prime of life Friday. Joe was not one of the “greats.” Yet few men, probably, have been mourned by more of their neighbors — mourned sincerely, and sorrowfully — than this red-haired young man.
I run my cursor over the image of the tattered newsprint, the frayed creases betraying the years that someone carried the clipping with them. I picture my cousin’s grandmother flattening the fragile scrap as gently as if she were stroking her brother’s hot forehead, and reading the praise she must have known by heart, and folding it closed again. I remember the few stories I heard from my father, of how Joe’s death shattered his family, embittering my grandfather and turning their mother angry and cold.
I imagine what he might have thought — thirty years old, newly married, adored by his siblings, thrilled for the excitement of his job — if he had known that a few years later, his life could have been saved in hours. I think he would have marveled at antibiotics, and longed for them, and found our disrespect of them an enormous waste. As I do.