Three decades ago, the June 5, 1981, issue of Morbidity and Mortality Weekly Report (MMWR) reported on five previously healthy young gay men in Los Angeles diagnosed with Pneumocystis carinii pneumonia (PCP), an infectious disease usually seen only in people with profoundly impaired immune function. As a specialist in infectious diseases and immunology, I had cared for several people with PCP whose immune systems had been weakened by cancer chemotherapy. I was puzzled about why otherwise healthy young men would acquire this infection. And why gay men? I was concerned, but mentally filed away the report as a curiosity.
One month later, the MMWR wrote about 26 cases in previously healthy gay men from Los Angeles, San Francisco and New York, who had developed PCP as well as an unusual form of cancer called Kaposi’s sarcoma. Their immune systems were severely compromised. This mysterious syndrome was acting like an infectious disease that probably was sexually transmitted. My colleagues and I never had seen anything like it. The idea that we could be dealing with a brand-new infectious microbe seemed like something for science fiction movies.
Little did we know what lay ahead.
Soon, cases appeared in many groups: injection-drug users, hemophiliacs and other recipients of blood and blood products, heterosexual men and women, and children born to infected mothers. The era of AIDS had begun.
I changed the direction of my career to study this disease — to the chagrin of my mentors and many colleagues — and began a 30-year journey through this extraordinary global health saga. The early years of AIDS were unquestionably the darkest of my career, characterized by frustration about how little I could do for my patients. At hospitals nationwide, patients were usually close to death when they were admitted. Their survival usually was measured in months; the care we provided was mostly palliative. Trained as a healer, I was healing no one.
In the first couple of years, few scientists were involved in AIDS research, and there was very little funding to study the disease. Initially, we did not know the infectious agent — if indeed there was one — so researchers had no precise direction in which to search.
The first major research breakthrough came in 1983 with the discovery of the human immunodeficiency virus, or HIV, and then in 1984, with proof that it caused AIDS. Our knowledge of HIV/AIDS rapidly grew with the development of a diagnostic test in 1985 that revealed the frightening scope of the pandemic. Our desperately ill patients were just the tip of the iceberg.
The first drug that slowed the progression of HIV/AIDS — zidovudine, initially called AZT — was licensed by the Food and Drug Administration in 1987. For those in the field, this was a major high point. Finally, we could treat the disease instead of just its complications. Soon, however, we learned that the benefits of AZT as a stand-alone treatment waned within months as HIV developed resistance to the drug. The disease relentlessly progressed. The realization that we were in for the long haul began to set in.
In 1984, I became director of the National Institute of Allergy and Infectious Diseases and soon established a distinct AIDS research program. This met considerable opposition from some senior figures in medicine, who believed that I was overreacting and that focusing on AIDS would divert resources from other important infectious diseases. Despite our intensive efforts to find solutions to this emerging global plague, federal scientists — including me — and my colleagues at the Centers for Disease Control and the FDA were vilified by growing numbers of AIDS activists, who thought that the government was not moving fast enough to fight the epidemic and should modify its research agenda and drug approval procedures to meet the special circumstances of the pandemic. In many respects, the activists were correct. Looking back, one of the most productive decisions I made in the 1980s was to fully engage with the activists. Clinical trials were soon modified to be more flexible and user-friendly, and through activist engagement with the FDA, the drug approval process was markedly accelerated while retaining proper attention to safety.
There is a stunning contrast between how I felt as a physician-scientist in the 1980s and the optimism I feel today as more infections are prevented and lifesaving drugs increasingly become available throughout the world. Annual funding for HIV/AIDS research at the National Institutes of Health exceeds $3 billion, thanks to consistent support from Congress and each successive administration. Thousands of researchers globally are intensively studying HIV, developing therapies, and designing and implementing prevention modalities — including a thus-far-elusive vaccine. The surge in research efforts has enabled enormous medical advances, especially in therapeutics. More than 30 anti-HIV drugs have been developed and licensed; in combinations of three or more these medications have proved extremely effective since the mid-1990s in slowing and even halting HIV’s progression. In the 1980s, patients received a prognosis of months. Today, a 20-year-old who is newly diagnosed and receives combination anti-HIV drugs according to established guidelines can expect to live 50 more years. Furthermore, HIV treatment not only benefits the infected individual but can reduce the risk of transmitting the virus to others.
In 2002, President George W. Bush sent a team to southern Africa on an HIV/AIDS fact-finding mission. Upon our return, the president asked me to help design a plan for providing HIV-related services on a large scale in low-income countries. Eventually, this became the President’s Emergency Plan for AIDS Relief (PEPFAR). Visiting African hospitals and seeing scores of HIV-infected people, I noted that the physicians were experiencing the frustration that I and so many of my colleagues in rich countries had felt 20 years earlier, as we saw people die because of our inability to treat the disease. In Africa in 2002, the effective treatments that had transformed HIV/AIDS care in wealthy countries were available only to the privileged. The developing world clearly needed PEPFAR, and President Bush made it happen.
The implementation of PEPFAR — as well as programs such as the Global Fund to Fight AIDS, Tuberculosis, and Malaria; the Bill & Melinda Gates Foundation; the Clinton Foundation; Doctors Without Borders; and others — has changed the landscape of global AIDS. PEPFAR alone has provided anti-HIV drugs to more than 3.2 million infected people in the developing world, predominantly in southern Africa and the Caribbean, and it has offered HIV care, counseling, testing, prevention services and support to millions more. In 2010, PEPFAR’s support of antiretroviral prophylaxis to prevent mother-to-child transmission allowed more than 114,000 infants to be born HIV-free.
With most diseases, these results would sound like an unqualified success story. The HIV story, however, is far from over. There have been more than 60 million HIV infections throughout the world, with at least 30 million deaths. In 2009, 2.6 million people became infected with HIV and 1.8 million died; more than 90 percent of cases occurred in the developing world, two-thirds in sub-Saharan Africa. For every infected person put on lifesaving therapy, two to three people are newly infected. To control and ultimately end the pandemic, we will need to treat many more HIV-infected people, for their health and to reduce the risk of their sexual partners becoming infected. We also must accelerate implementation of other prevention approaches, as well as research toward a cure.
We cannot lose sight of the fact that lifesaving HIV/AIDS programs at home and abroad must be strengthened despite global constraints on resources. Enormous challenges remain and must be met by the next generation of scientists, public health officials and politicians throughout the world. History will judge us as a global society by how well we address the challenges in the next few decades of HIV/AIDS.
The writer is director of the National Institute of Allergy and Infectious Diseases at the National Institutes of Health.