Biosense or Biononsense?
Years of development and hundreds of millions of dollars later, what has the CDC's syndromic surveillance program accomplished?
It was two days before Thanksgiving 2004 when an epidemiologist in New Jersey's public health agency picked up a heart-stopping electronic message. The "Sentinel Infection Alert," from a computer program run by the Centers for Disease Control and Prevention, had detected what appeared to be the beginning of a smallpox outbreak in the state. And there New Jersey's public health officials were stuck. The alert, generated from deep within the CDC's headquarters in Atlanta, didn't say where the data had come from, what had triggered the warning, or how state officials should respond.
Calling the CDC two days before Thanksgiving didn't help. New Jersey's frantic health officials found the CDC's staffing and pace reminiscent of a sleepy, preterrorist agency. No one was in at the CDC's Biointelligence Center, which is supposed to support the computer program. Even the CDC's emergency operations center seemed unable to shed light on the warning. "We certainly were concerned," says Christina Tan, deputy state epidemiologist for the New Jersey Department of Health and Senior Services, recalling how they called the CDC's technical help desk and left a message.
It took several days for New Jersey officials to establish that the alert wasn't the beginning of a deadly bioterror attack, but had been triggered by someone's allergic reaction to a smallpox vaccine at a local military facility. The false alert came from the government-funded computer program Biosense. The complex program, which culls electronic health data from 350 of the nation's urban hospitals as well as veterans' hospitals and Defense Department facilities, follows a string of costly, never fully realized computer ventures. But three years into its development, with a price tag of around $230 million (on top of millions more spent on unsuccessful systems before it), it is unclear exactly what the program can accomplish.
CDC officials acknowledge Biosense's shortcomings. The program was initially designed to detect the earliest possible symptoms from a bioterror attack. Instead, it lacks real-time capability and has issued a stream of false alarms that would be comical were the stakes not so high. Due to a hospital coding error, for example, congestive heart failure (CHF) became an alert for the frequently fatal Crimean-Congo Hemorrhagic Fever (CCHF). Tan, in New Jersey, recalls a Biosense alert of plague that, in fact, was cholesterol plaque.
Barry Rhodes, deputy director for the CDC's division of emergency preparedness and response, says the system is moving away from false alerts - but also from early event detection. Instead, Biosense provides "situational awareness" that can help health officials with "flu preparedness, and understanding from a remote location what's going on in a city without having to send people there," says Rhodes. "Policy makers will be able to look at this to understand where schools might need to be closed, resources deployed."
Last year, in testimony before a Senate subcommittee overseeing the national response to bioterror, two experts urged senators not to spend "another penny" on Biosense until the program's goals, and how to achieve them, became clear. Nicole Lurie, codirector for Public Health at the Rand Corporation's Center for Domestic and International Health Security, stated that among local public health departments, the program has another name: Biononsense.
A SURFEIT OF SURVEILLANCE SYSTEMS
Typically, the nation's public health officials fear the worst and hope for the best, but lack real-time data to guide them in responding to emergencies. "One of the reasons there is so much debate about health policy is because it's all based on nothing," says William Yasnoff, describing the woeful condition of the nation's health-data collection efforts. Yasnoff is a physician and computer scientist formerly with the CDC and the federal Department of Health and Human Services, and is now an independent consultant.
After the September 11, 2001, attacks and deadly anthrax letters, the government embraced syndromic surveillance - once an obscure epidemiologic tool that zoos used to safeguard their rarest animals - as the best hope for detecting a bioterror attack or pandemic outbreak at the earliest possible stage.
Since then, efforts to monitor the nation's symptoms electronically have been costly and chaotic. More than 30 city, state, and federal syndromic surveillance programs are now in operation, all with vastly different and often contradictory approaches and little ability to share information. If anything unites these programs, it is the central question for syndromic surveillance planners: What symptoms, at what intensity, should make the system go "ding"? And if the system does create an alert, how can one be sure it is accurate?
Systems sensitive enough to detect the beginnings of a real event are also likely to detect and flag the beginnings of a nonevent, sending overworked, underpaid public health officials down countless irrelevant alleys. With "federal systems pumping out alerts, states pumping out alerts, it's sort of like Three-Mile Islands, with a cacophony of alerts," says Ivan Gotham, director of the Bureau of Healthcom Network Systems Management with the New York State health department.
This problem - the signal-to-noise ratio - is critical. Where should a system set its alert threshold? A rise in the purchase of antidiarrheal medicine? Maybe it's an outbreak, or maybe a discount is being offered. A case of gangrene in a New York emergency room? Maybe it's an intentional release of the plague, or maybe it's a tourist from New Mexico who caught the plague from flea-bitten rats.
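The threshold question can be made concrete with a toy detector. This is only a hedged sketch - not the algorithm Biosense or any of these systems actually uses - that flags a day's symptom count when it rises more than k standard deviations above a recent baseline; the data are invented.

```python
# Toy anomaly detector (illustrative only, not the CDC's actual algorithm):
# flag today's count when it exceeds the recent mean by k standard deviations.
from statistics import mean, stdev

def alert(history, today, k=3.0):
    """Return True if today's count is more than k std devs above baseline."""
    mu = mean(history)
    sigma = stdev(history)
    return today > mu + k * sigma

# Hypothetical daily counts of antidiarrheal purchases at area pharmacies.
baseline = [41, 38, 45, 40, 43, 39, 44]
print(alert(baseline, 47))   # False - ordinary fluctuation
print(alert(baseline, 120))  # True - a spike worth investigating
```

Lowering k catches smaller events but multiplies false alarms, which is exactly the trade-off the planners above are wrestling with.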
Can a data-mining system such as Biosense successfully isolate a "signal" of a true event amid all the background "noise" of routine events? "To be frank, we have admitted since the beginning that the utility of syndromic surveillance has yet to be proven," says Jim Seligman, the CDC's chief information officer. "Until we sharpen the algorithms, there will always be noise in the data." The anthrax letters were likely too small an event for syndromic surveillance to pick up, says Seth Foldy, a public health informatics consultant. Instead, an astute clinician discovered it. For an attack big enough for Biosense to pick up, you might not need technology at all, says Dan Desmond, president of the SIMI Group, a California information technology firm. You'd learn as much from hanging out at the local morgue.
Foldy, formerly Milwaukee's health commissioner, knows better than almost anyone that a public health department can miss an iceberg - a huge catastrophe unfolding before its very eyes. In 1993, over the course of a single week, 400,000 people in the greater Milwaukee area - one third of the entire county - became ill with severe diarrhea. It was the result of an outbreak of cryptosporidium in the drinking water, and more than 100 people died.
The public health department was blindsided for good reasons. It was an unusual disease, and back then, clinicians were generally not required to report emerging symptoms; they reported confirmed diagnoses instead. Two sets of phone calls finally "rang the bell" for public health officials: One came from pharmacies reporting that antidiarrheal medicine was flying off the shelves. The other came from laboratories, which reported that they couldn't keep their testing reagents in stock. (Iceberg is an apt metaphor, because if a single patient surfaces with alarming symptoms, public health officials know that the bulk of cases are likely submerged and will explode into view shortly, depending on the disease's incubation period. One study has shown that an aerial anthrax attack of a large American city could cost hundreds of millions of dollars for each hour of delay in preventive treatment, Foldy points out.)
A FEDERAL MANDATE, MIXED RESULTS
Last year, Congress passed the Pandemic and All Hazards Preparedness Act, which mandates the use of information technology to improve public health awareness. The need is clear, say some experts. "Surveillance capacity is no longer optional," says Farzad Mostashari, assistant commissioner of New York City's Department of Health and Mental Hygiene, who oversees the city's highly praised syndromic surveillance efforts. "It's necessary and critical for state and local health departments," he says, but notes that a centralized federal data system thousands of miles away from an incident cannot be a substitute for local officials having up-to-the-minute local information. "You wouldn't do that with police. You wouldn't do that with firemen," he adds.
Congress has responded with funding. In 2002, it passed the Public Health Security and Bioterrorism Preparedness and Response Act, which authorized $1 billion for the CDC to distribute to states for public health preparedness. Included was a mandate that states develop some kind of electronic means of monitoring emerging symptoms, but there were few guidelines and no national strategy.
As the states set to work, many encountered difficulty. They struggled to capture data without asking for a new investment of time by harried physicians. They strove to design systems sensitive enough to detect the beginnings of an attack or outbreak, but not so sensitive that they would set off false alarms or trigger panic.
Some programs worked well. New York City developed a successful program that every day crunches the electronic visitor logs from 50 city emergency rooms. Emergency rooms routinely generate these records, which capture a patient's chief complaint. The city's surveillance system clearly shows swells and dips in flu, asthma, and gastroenteritis.
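The daily "crunching" of those visitor logs typically begins by binning free-text chief complaints into broad syndrome categories. The sketch below illustrates that general technique with made-up keyword lists; it is not New York City's actual classifier.

```python
# Hedged sketch of syndrome binning: map a free-text ER chief complaint to
# broad syndrome categories by keyword. Keyword lists here are illustrative.
SYNDROMES = {
    "respiratory": ["cough", "shortness of breath", "wheez"],
    "gastrointestinal": ["diarrhea", "vomit", "nausea"],
    "fever": ["fever", "chills"],
}

def classify(complaint):
    """Return every syndrome category whose keywords appear in the text."""
    text = complaint.lower()
    return [name for name, words in SYNDROMES.items()
            if any(w in text for w in words)]

print(classify("Fever and productive cough x3 days"))
# ['respiratory', 'fever']
```

Counting the daily totals per category is what lets a system show the "swells and dips" in flu, asthma, and gastroenteritis described above.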
Pittsburgh has a system called RODS (Realtime Outbreak and Disease Surveillance) that tracks emergency room visits and over-the-counter drugstore purchases. Milwaukee took a slightly different approach, routing varying streams of data through what they call "surveillance dashboards." They use data from RODS, as well as from ambulance runs, and have built-in capability to query emergency rooms.
Alongside the success stories, however, have been some grand failures. California still lacks a syndromic surveillance system, and the effort to develop one has bogged down amid a Balkanized system of 61 local health departments that are accustomed to acting autonomously. After a public commission found California to be woefully unprepared to detect and respond to a bioterror attack, the state senate passed legislation to test three different electronic surveillance systems with the goal of choosing one. The plan is now dead, as Governor Schwarzenegger vetoed it last year.
Without strong leadership, many of the systems that have evolved across the country don't communicate with each other. Standards and progress vary widely, with cities such as New York way ahead while other regions and places have no connectivity at all. Some local health departments still lack Internet connections and use fax machines instead.
Nonetheless, the problem is "not just technology," says Yasnoff. It's the mentality and resources of low-performing organizations. "Not only do public health agencies not know anything," he says, "but they don't have the resources they need if they did."
In 2003, the CDC directed every local health department to have a well-publicized emergency phone number and someone around the clock to respond to urgent calls within 30 minutes. Alerting a responsible human being would appear to be a basic requirement in any emergency, let alone when confronting suspected bioterror. In 2005, RAND researchers tested this by calling 19 public health agencies in 18 states with urgent reports of suspected smallpox or bubonic plague. They found that only two responded within 30 minutes, and some didn't return calls for days. Three agencies didn't respond to any of the first five calls. In one case, a health official told the caller not to worry about symptoms of bubonic plague and to "go back to sleep."
"An alarm system is only as good as what happens when the alarm goes off," says Michael Stoto, associate director for public health at the RAND Center for Domestic and International Health Security.
In December 2003, after two years of sinking money and effort into developing various surveillance systems, the states got a new and unexpected message from the CDC. Their work would now be slotted into a new national $230 million system called Biosense.
Biosense was born soon after the 2003 State of the Union address, in which President Bush announced Bioshield, a $6 billion initiative to jumpstart the development and production of vaccines and treatments for bioterror agents. This torrent of money was just the beginning. Biowatch - a series of environmental sensors to detect the release of bioterror agents - and Biosense followed.
Bioshield has foundered publicly. Last December, the Bush administration cancelled an $877 million contract with VaxGen, a company slated to produce 75 million doses of an anthrax vaccine, after it had failed to make critical progress. The company had been selected even though it had never successfully produced a single drug.
Comparatively, Biowatch and Biosense appear to be making progress. The CDC projects that by year's end, Biosense will be receiving and analyzing data from 1,400 of the nation's hospitals. Some local health officials say they believe the program is "very effective" and should continue to be expanded. But others question what, if anything, the program actually does.
A HIGH-NOISE TANGENT
When first launched, Biosense was supposed to be a data-mining Goliath that would gather real-time electronic data from clinical laboratories, hospital systems, clinics, health insurance plans, military treatment facilities and pharmacy chains. The data would then be crunched and retransmitted to local hospitals and other federal health agencies.
Over a year later, the program was beset by problems ranging from privacy concerns to questions of jurisdiction. State officials complained that the program was an inscrutable black box: It took their data but did little to return it in usable form or to the correct officials.
The biggest problem, say public health experts, is that the CDC lacked the technological expertise for the undertaking - a fact that CDC officials acknowledge, and one that led to the establishment of an information center in 2005 to address the deficit. But as Elin Gursky, principal deputy for biodefense and public health programs at the ANSER Institute for Homeland Security, asks, "Do you give a Maserati to someone who just got their driver's license?"
States that had already expended time and money developing their own systems resented the intrusion. "We already have these reporting streams from our hospital to us," says Ivan J. Gotham, who oversees the system that culls data from 256 hospitals in New York State. (After 9/11, New York's hospitals had been bombarded with demands for information and had no idea who was entitled to what. They had asked for the system.) "Why invest additional money to circumvent a system we already have?" Lisa Hines, a CDC spokesperson, says the agency is not reinventing the wheel, but rather is gathering data from existing networks, thereby making Biosense the "network of networks."
Tigi Ward, the public health coordinator for surveillance in Lubbock, Texas, recalled that the demonstration of the Biosense program she saw didn't make epidemiologic sense. Why upload tons of raw data to the CDC and wait hours, if not days, for them to contact you with something of local importance, she asks, when local physicians can flag it for you immediately? And since Biosense sifts massive amounts of data from events that have already occurred, it risks sending already-harried local public health officials on an irrelevant "high-noise tangent," she says, as shown by the New Jersey smallpox alert.
GETTING REAL-TIME SITUATIONAL AWARENESS
Initially, the CDC appealed to state and local officials for data from big urban hospitals. Officials in west Texas say they were glad not to be asked to participate. Instead they chose a Web-based computer program called SYRIS, the Syndrome Reporting Information System. SYRIS is a stripped-down, clinician-driven surveillance program that its designer, Alan Zellicoff, says was intended to capture in electronic format what doctors do every day when treating patients: distinguish urgent or bizarre cases from routine maladies.
The clinicians using SYRIS log the gender, age range, zip code, and symptoms of any patient with an unusual condition, a process that takes about 30 seconds per case. The program costs about 15 cents per capita to use, or roughly $225,000 for the 1.5 million people it covers in West Texas. Health officials say that the simple, real-time program has already allowed them to detect an early outbreak of flu, rule out a bioterror attack, and manage the evacuees from Hurricane Katrina.
In Katrina's wake, about 800 refugees from New Orleans poured into the Reese Air Force base outside Lubbock, Texas. Since they had trudged through toxic brown water for days and arrived with symptoms of skin rash, fever, and diarrhea, health officials braced for a cholera outbreak. As local clinicians mobilized, they downloaded the SYRIS program, already in routine use in the surrounding 41 counties, and entered patient cases.
Sixteen miles away at an emergency operations center, Lubbock health officials monitored a radar-like screen on which blips appeared, grouped by zip code, each representing a patient seen by an area doctor. These showed that the cluster of symptoms at the Air Force base had not spread. The SYRIS program had given health officials real-time "situational awareness." They did not need to waste resources fighting a phantom outbreak.
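The zip-code display implies a simple aggregation step: group clinician-entered reports by location and flag clusters of matching symptoms. A minimal sketch, with illustrative field names rather than SYRIS's real schema:

```python
# Hedged sketch of cluster aggregation (field names are hypothetical, not
# SYRIS's actual data model): count reports of one symptom per zip code and
# return the zips at or above a small threshold.
from collections import Counter

def clusters(reports, symptom, threshold=3):
    """Return {zip: count} for zips with >= threshold reports of a symptom."""
    counts = Counter(r["zip"] for r in reports if r["symptom"] == symptom)
    return {z: n for z, n in counts.items() if n >= threshold}

# Invented case reports of the kind clinicians might log in 30 seconds each.
reports = [
    {"zip": "79416", "symptom": "rash"},
    {"zip": "79416", "symptom": "rash"},
    {"zip": "79416", "symptom": "rash"},
    {"zip": "79424", "symptom": "fever"},
]
print(clusters(reports, "rash"))  # {'79416': 3}
```

Because every record is a clinician's deliberate judgment rather than raw data exhaust, a display built this way shows only what is happening, not what might be.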
Because trained physicians enter only critical data, SYRIS has a good signal-to-noise ratio, says Ward. It signals only what is happening, not what might be happening. But in New Orleans after Katrina, as hospitals flooded, patients died, and trapped doctors made life-and-death decisions, local public health officials and those arriving from the CDC were flying blind. They knew painfully little about the symptoms emerging from the chaos or what was happening before their very eyes.
Of all the failings during Hurricane Katrina, public health experts view this situational blindness as inexcusable, given the available technology. There was no reason not to know which of the hospitals needed the resources, which were under water, says Gursky of the ANSER Institute. "There is enough technology to look down and see it," she says. "It's the same technology in New York that catches you when you make an illegal left turn."
Zellicoff, SYRIS' inventor, says he would love to see a double-blinded study comparing his program's results with those of Biosense. Despite his urging on three national teleconferences with Julie Gerberding, the CDC's director, no one has yet initiated such a study.
A SQUANDERED INVESTMENT?
Electronic monitoring of diagnoses and symptoms has emerged as both a freight train and a gravy train. The states must scramble to meet the costly and frequently changing electronic standards from the CDC. Failing to do so means they may lose grant money or get negative write-ups in ever-proliferating preparedness report cards. Last year, one report by the Trust for America's Health found that 12 states and Washington, DC, did not have technology compatible with the CDC's National Electronic Disease Surveillance System (NEDSS), a standard that would allow them to share electronic data on diagnoses.
But it is not clear whether the lack of compliance actually means less preparedness. "It might mean they are more prepared," says Yasnoff, pointing out that a portion of the software is largely defunct but has generated multimillion-dollar contracts for private firms. The CDC is "doing an excellent job spending money," says Dan Desmond of the SIMI Group.
With a whistleblower-fueled investigation underway in the Senate Finance Committee into the CDC's oversight of at least $6 billion in grants for emergency preparedness, many public health experts and oversight officials fear that this unprecedented investment has been squandered. As Senator Chuck Grassley (R-Iowa), the committee's ranking member, says, "The bottom line is we don't know what exactly has been purchased with the grant money. Like the saying goes, 'A billion here, a billion there pretty soon adds up to real money.'"
Katherine Eban's reporting was funded in part by The Nation Institute. Eban was a 2006 Alicia Patterson Fellow reporting on public health and homeland security.