Philip Alcabes discusses myths of health, disease and risk.

A Blog Worth Following

If you haven’t already, put Crawford Kilian’s H5N1 blog on your regular reading list.  There, while you’ll still get updates on the H5N1 avian flu virus and occasional pieces on H1N1 flu (and you can see a multitude of archived posts from 2009  filled with international material on the progress of last year’s flu — and the reaction to it), you now get a much-expanded scope, including news and commentary on the spread of infectious diseases of different sorts.

What I value about H5N1 is the tracking of the mosquito-borne viral diseases, like dengue and chikungunya as well as H1N1, that reveal the effects of the elision of ecosystem boundaries; the close attention to outbreaks that stem from changes in human-animal interactions — like the recent outbreak of plague in Tibet and, of course, H5N1; and the watch it keeps on the vaccine trade, as in yesterday’s post picking up a report in The Nation on the purchase of flu vaccine from France and one last week on a US tech company’s trials of a new flu vaccine (which won’t help the public but is, apparently, already helping the company to get richer).

The kind of close attention that H5N1 pays to the details of complex interactions amongst humans, animals, and both the natural environment and the economic one is indispensable.  It should spur more interest in wresting public health away from the simple-minded mass-vaccination schemes of medical officials in the U.S. and other wealthy countries — the point of which is usually to transfer public monies into the hands of pharmaceutical companies.  And it should move us toward a more complex and inclusive view of the nature of health.

Public Health Priorities: Follow the Money

Thanks to Crof at H5N1 for bringing to our attention a strong editorial in yesterday’s Bangkok Post.   The editorialists note that H1N1 preparedness efforts were not always successful and that WHO, fresh from announcing that the H1N1 pandemic is over, is now promoting fears of renewed outbreaks of H5N1 (avian) flu.  The editorial continues:

While it would be foolish to dismiss such warnings as this latest one on bird flu, it is important we keep a sense of proportion and not let them distract us from countering the unfashionable but widespread potential killers such as tuberculosis, HIV/Aids, diabetes, cancer, dengue and malaria. These are the diseases already causing widespread illness and economic harm….

Rather than competing for cash, the threat from newer diseases should serve as a catalyst to combat existing epidemics.

Competing for cash is key.

Funding for TB languishes, dengue incidence expands, more people with the AIDS virus are getting treated but new infections continue to occur, water scarcity (and displacement because of wars and natural disasters) makes diarrheal illness a persistent problem, and malaria transmission continues to threaten billions of people who live in tropical and subtropical regions — but flu preparedness dominates the public health scene.   Why?

Here’s the infernal logic of WHO and the public health officers of wealthy countries (U.S., U.K., etc.):  (a) At the start of the H1N1 outbreak in 2009, a sensible worst-case forecast was about a million deaths worldwide; the more likely scenario was well under 500,000 deaths.  (b) TB + malaria + diarrhea + AIDS together kill 6 or 7 million people a year.   (c) Immunization against flu is notoriously variable in its effectiveness and mass immunization is almost never effective (except if instituted in an isolated population well before the flu virus makes inroads into the population).

Sounds like it would be worth it to pump lots of resources into reducing the incidence of malaria, TB, AIDS, and diarrhea.  But that’s hard.  It takes political will.  Whereas immunizing against flu is easy: it just takes money.  And national health officials were eager (it turned out) to transfer billions of dollars, pounds, and euros into the hands of vaccine manufacturers in order to be able to immunize their populations against H1N1 flu.

To an official whose job is to watch out for the needs of the economic machine, immunization pays.

One flu vaccine manufacturer estimates that in the U.S., employers lose $2.1 billion each year in productivity because of flu-related absences from work.  Let’s be skeptical about this estimate, coming as it does from one of the beneficiaries of federal largesse in response to flu fears.  But the point is clear enough:  it was a great boon to the private sector to have the federal government spend $1.6 billion of taxpayer money on flu vaccine in 2009 even though the outbreak was mild and vaccine did virtually nothing to stop it.  Because with the feds footing the bill, the burden on corporations was slight, whereas the private sector would have lost a lot of money if many Americans had fallen ill with flu.

It’s not just the vaccine manufacturers and pharmaceutical companies who stand to capitalize on the absurd calculus of protecting American businesses instead of poor people’s lives:  scientists do, too.

Robert Webster is an eminent virologist who has become dean of those American scientists who purport to be able to foresee a future flu catastrophe.  Perhaps he’s right, but of course nobody knows.  So when Webster says

We may think we can relax and influenza is no longer a problem. I want to assure you that that is not the case,

as he just did in a meeting in Hong Kong, it’s a good sign that the preparedness crusaders are worried about their funding.  They should be.

The preparedness crusaders have been unmasked as shameless shills for the private sector,  even if the vaccine and antiviral manufacturers aren’t paying them directly.  And the ones who are scientists have been revealed as self-important promoters of their own research — so fiercely protective of their own turf that they might use their prestige and the imprimatur of science to hoodwink officials into ignoring the more serious, and more certain, problems of the developing world.

Let’s hope that more opinion makers take the stand that the editors in Bangkok just did.

Media Culture: Beyond Fat and Salt?

Over at Media, Culture & Health, Steven Gorelick notes that a story on salt and the food industry, which appeared on page A1 of the print NY Times on Sunday, would not have made the front page in the past.

What has changed?  How does the story of wrangling over the sodium content of American food merit space in the main news sections of the most influential media — even the front pages of the NY Times or LA Times?

1.  One answer is that health occupies much of the American conversation today.  A visitor from another planet watching our TV news shows or reading the main newspapers could be forgiven for thinking that Americans are dying from a multitude of irrepressible disease threats.  We can’t seem to stop talking about how to improve our health.

(In fact, as Michael Haines notes at the Economic History Association website, U.S. life expectancy almost doubled between 1850 and 1960, from 39.5 years to 70.7 years; since then it has increased slowly, and is now estimated to be about 78.2 years.  In other words, health wasn’t a matter of news much during the time when longevity was improving dramatically, in the late 19th century and first half of the 20th.  By the time health became a cultural preoccupation, the majority of Americans were living well past middle age.)

2.  Another answer, perhaps more important, is that when we talk about health today we mean personal responsibility.

When I began studying epidemiology, in the late 1970s, public health essentially meant disease control.  Yes, lip service was paid to so-called health promotion — much was made of the World Health Organization’s definition of health, promulgated in 1946:

Health is a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity.

But no metric for complete well-being was widely recognized.  And the usual epidemiologic measures of incidence and mortality rates, life expectancy, and so forth seemed to work just fine as ways of understanding why some groups of people lived longer and more capable lives, while others lived miserably and died young.

Sometime since then, the health sector, including public health, has turned to individual responsibility as the key to well-being.

If each of us is responsible for his or her own health, then it’s our own fault if we get sick.  Naturally, advice abounds:  buckle up, use a condom, eat less fat, know your cholesterol level, wash your hands, use mosquito repellent containing DEET, wear sunblock, eat fresh fruit and vegetables every day, lower your stress.

The advice adds up to this:  know your limits.  Federally sponsored research tells us that self-control is contagious.

The personal-responsibility view of health says, “control your appetites.”

3.  But let’s think about another change:  more people are concerned about the American diet.  As noted last week, the food movement has given us ways to think about eating that go beyond the tiresome story of obesity and hypertension — Beyond Fat and Salt, you could say.

Of course, the main media outlets still tell the food story in Fat-and-Salt language, as the news articles in the NY Times, LA Times, and others show.  It’s the food industry vs. the foodies, or the food industry vs. public health, or the food industry and public health vs. appetites — anyway, somebody against somebody in the name of health.

The media aren’t quite past obesity and hypertension yet.  But as the culture moves beyond obsessive self-inspection in the name of health, no doubt media will, too.


AIDS Goes to Ground

This week, Donald McNeil, Jr. continues his praiseworthy efforts to highlight the sad reality of AIDS among the world’s poor.

In an article posted on the NY Times website Sunday (and published in the print edition Monday), McNeil reports on the inability of treatment programs in parts of Africa (this piece focuses on Uganda) to keep up with the need for AIDS medication as funding falls.   A very compelling video report accompanies the online version of the article.

An accompanying article explains the decline in funding, starting with the fall in the U.S. administration’s request on behalf of PEPFAR, as a Times graphic shows.

The number of new infections with the AIDS virus is estimated to be about 2 million per year now.  Some observers think annual incidence will rise as the population expands; even if not, the annual number of new AIDS virus infections is unlikely to fall in the near future, given present circumstances.

At the same time, the Times reports, anticipated PEPFAR funding is essentially flat to 2013, at $5 to $5.5 billion per year.  Financing for AIDS medications through the Global Fund to Fight AIDS, Tuberculosis and Malaria is in dire straits.

In terms of people, not dollars:  of the 33 million or so individuals who are infected with the AIDS virus worldwide, only about 4 million get regular antiretroviral therapy.

A few years ago, I wondered why,  after a quarter-century of AIDS and with the availability of effective treatment (at least in wealthy countries), Americans still didn’t see AIDS as an ordinary illness.

Now I have an answer:  we do see AIDS as ordinary… for poor countries.  To us, AIDS is no longer an epidemic problem worth our getting worked up over, or so it would seem judging by PEPFAR.  AIDS is like malaria, tuberculosis, or schistosomiasis.  It’s like diarrhea.  The Bill and Melinda Gates Foundation will put money into research or specific programs but we as a country will not need to care anymore.  We shift the funding away from the people in Africa, who are going to die young anyway, and put it into the hands of institutions (often, pharmaceutical companies) that can give us the promise of immunity from disaster.

The U.S. put less funding last year into PEPFAR than it did into preparations for H1N1 flu ($7.6 billion) or the school lunch program ($14.9 billion, according to the Robert Wood Johnson Foundation’s Center to Prevent Childhood Obesity), a battleground in the war against childhood obesity.

Flu and obesity are epidemic.  They threaten American assumptions about ourselves.  “Epidemic” means:  crisis in our society.  Our epidemiologists say that malaria, diarrhea, and the other problems that collectively kill 20,000 or 25,000 people (mostly children) every day are endemic.

“Endemic” means:  not our problem.

AIDS is endemic too, now.  It has gone to ground, gone the route of other once-dreaded infections that caused calamity in America and triggered heated debate (yellow fever, cholera, typhoid, TB) but have disappeared from our scene.  It’s their problem, now.

Early Onset of AIDS Therapy

Late last week, the NY Times reported that the city of San Francisco’s Department of Public Health is going to begin advising people with HIV to begin antiretroviral treatment (ART) immediately, rather than waiting for the CD4 count to decline.

The policy seems to be based primarily on a secondary analysis of longitudinal data from a multi-center study of HIV-infected people in the U.S. and Canada, the NA-ACCORD study.  The results were reported in the New England Journal of Medicine a year ago.  In that analysis, people with HIV whose CD4 counts were between 351 and 500 who began ART immediately were compared to those who deferred ART until CD4 count was 350 or less.  The deferred-ART group was found to have a 69% higher risk of mortality (from any cause) than those who began ART before CD4 count fell below 350.  Similarly, among HIV-infected people with CD4 counts above 500, those who began ART only after CD4 count fell below 500 had a 94% higher risk of mortality compared to those who began ART immediately.
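As a side note on what those percentages mean: in analyses like this, a "69% higher risk" is a hazard ratio of 1.69, and under the proportional-hazards assumption it acts multiplicatively on log-survival, not on the raw death rate. A minimal sketch of that arithmetic (the 95% baseline survival figure below is invented for illustration and is not from the NA-ACCORD paper):

```python
def survival_with_hazard_ratio(baseline_survival: float, hazard_ratio: float) -> float:
    """Under the proportional-hazards assumption, if a fraction S of the
    reference group survives an interval, the comparison group's survival
    over the same interval is S ** HR: hazards are proportional at every
    instant, so cumulative hazards (and hence log-survival) scale by HR."""
    return baseline_survival ** hazard_ratio

# Illustrative numbers only: suppose 95% of the immediate-ART group
# survives the follow-up interval.  A hazard ratio of 1.69 for the
# deferred group then implies about 91.7% survival, so mortality rises
# from 5.0% to roughly 8.3% -- slightly less than the 8.45% a naive
# reading of "69% higher risk" (5% x 1.69) would suggest.
deferred_survival = survival_with_hazard_ratio(0.95, 1.69)
```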

But is this a good basis for across-the-board policy for a city the size of San Francisco?  Some physicians worry about the development of drug resistance among viral strains.  Others are concerned about toxicity, always a problem worth considering.  Paul E. Sax tracks the history of the idea and includes a few quotes in his blog post yesterday.

Some commentators wonder whether the new policy is meant to be a boon to pharmaceutical companies.  That’s not a crazy concern:  the Bay Area Reporter noted a couple of weeks ago that San Francisco plans to shift to a “status awareness” policy, increasing HIV testing by 70,000 people per year in an effort to halve the rate of new infections by 2015.  If successful, the increase in testing combined with an increase in recommendations for early ART would expand the market for antiviral medications substantially.

A few aspects of the April 2009 report on NA-ACCORD raise worries about whether it should be the basis for broad-based policy.  First, people who deferred therapy were observed very briefly (median 1.3 years, many of them for only 6 months), so any advantage to early therapy appears to apply to the period immediately after therapy begins.  That’s important because toxicity and/or resistance might not be evident right away.  Second, looking only at people with an initial CD4 count above 500 and holding constant self-reported history of drug injection, there was only weak evidence for a slight effect of early treatment on mortality: the relative mortality hazard was 1.28, with a 95% confidence interval of 0.85 to 1.83, which includes 1.  Drug users had a higher mortality risk, and this finding, on which the authors of the New England Journal paper do not comment, suggests that early ART did not reduce the hazard of death for drug users.

Also, the authors of the NEJM paper could not possibly account for some of the hard-to-regiment aspects of HIV infection.  These would include variations in cause of death, treatment adherence, and monitoring of treatment effects — all of which would either not be evident in a cohort study or could not be controlled in a secondary analysis.

Finally, the authors are slightly cagey about the effect of drug-injection history in the above-500-CD4-count group, reporting a twofold increase in death hazard for those who delay ART after excluding people with a drug-injection history, but never reporting the effect of ART delay among drug injectors alone.

Most important, observations on people who transitioned to the next-lower CD4 compartment (i.e., from above 500 to <500, or from 351-500 to <350) were censored after 6 months if the individual had not yet initiated ART.  Therefore, the real comparison the NEJM authors are making is between immediate-onset ART while CD4 count remains in the same CD4 compartment vs. immediate-onset ART after CD4 count has dropped to the next lower compartment.  It’s not really a study of immediate versus delayed onset ART.
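The mechanical effect of that censoring rule on follow-up time can be seen with a toy calculation. All the numbers below are invented; the point is only how the rule works, cutting off a deferring subject's observation 6 months after the compartment transition:

```python
# Hypothetical deferred-ART subjects, as (time of CD4-compartment
# transition, planned follow-up), both in years from enrollment.
# These values are invented for illustration.
subjects = [(0.25, 3.0), (0.75, 2.5), (1.25, 4.0), (0.0, 1.0)]

CENSOR_WINDOW = 0.5  # observation ends 6 months after the transition
                     # if ART has not yet been started

# Observed follow-up is truncated at transition + 6 months.
observed = [min(followup, transition + CENSOR_WINDOW)
            for transition, followup in subjects]

# Planned follow-up averages about 2.6 years here, but observed
# follow-up averages only about 1.06 years -- the censoring rule
# mechanically produces the kind of short follow-up (median 1.3 years
# in the actual cohort) that limits what the comparison can show.
```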

There’s plenty of reason (including the 2009 NEJM paper) to think that suppressing HIV early rather than late should be helpful, and some reason to think that the reduction in viral load produced by ART will lower infectivity in a way that makes transmission to uninfected sexual or drug-sharing partners less likely.  That in turn could be of public-health value.

Of course, nobody is being forced to start ART before he or she wants to, no matter the policy recommendation. Still, it’s worth wondering whether the expansion of testing and extension of early treatment will substantially improve the public’s health in a way that makes the cost, and self-evident advantages to pharmaceutical (and test-kit) manufacturers, worthwhile.