Thursday, January 22, 2009
The LATimes reports the results of a rating system for California hospitals. Lisa Girion writes,
Some hospitals are better than others. But for many years all patients had to go on was reputation, doctors' advice, word of mouth and advertising. Today, California follows some other states, the federal government and a few private groups in offering a window on hospital quality.
The study by state officials of hospital death rates shows that for eight common conditions and procedures -- including stroke, hip fracture and brain surgery -- the rates vary widely. The study looked at mortality rates for 2007 and 2006. It found that, in 2007, 25 hospitals had death rates that were significantly better than the state average on at least one indicator, while 94 were significantly worse in at least one area.
In 2006, 33 hospitals had mortality rates that were significantly better on at least one indicator, while 98 hospitals rated significantly worse on at least one indicator. Los Angeles County hospitals fared especially well in acute stroke care, based on mortality statistics in 2007. Of 97 hospitals in the county, 13 had significantly better than average mortality ratings for stroke, while only one was worse than average on the indicator. . . .
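The article doesn't say which statistical method OSHPD used to decide that a hospital is "significantly better" or "significantly worse" than the state average. As a rough sketch of the underlying idea, one could compare a hospital's observed death count to what the state-average rate would predict with an exact binomial test; all numbers below are hypothetical, not from the study:

```python
from math import comb

def binomial_two_sided_p(deaths, n, p0):
    """Exact two-sided binomial test: the probability, assuming the
    state-average death rate p0, of an outcome at least as unlikely
    as the observed death count among n cases."""
    probs = [comb(n, k) * p0**k * (1 - p0)**(n - k) for k in range(n + 1)]
    observed = probs[deaths]
    # Sum the probabilities of all outcomes no more likely than the observed one.
    return sum(p for p in probs if p <= observed + 1e-12)

# Hypothetical hospital: 8 deaths in 200 stroke cases, against an
# assumed 10% state-average mortality rate for that indicator.
p = binomial_two_sided_p(deaths=8, n=200, p0=0.10)
print(f"p = {p:.4f}")  # small p: the hospital looks better than average
```

With many hospitals tested on eight indicators each, some will cross a significance threshold by chance alone, which is one reason counts like "25 better, 94 worse" need careful reading.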
Officials plan to post the study today at www.oshpd.ca.gov and said they hoped it would help improve care. "It is our hope that the timely release of these new indicators will encourage California's hospitals to examine their practices and improve their quality of care and help inform consumers and patients about their healthcare choices," said David Carlisle, director of the Office of Statewide Health Planning and Development.
But the study was immediately criticized. Torrance Memorial Medical Center, which received a worse than average mortality rating for gastrointestinal hemorrhage, said the information was badly flawed. The hospital's own review of the 40 deaths in 883 gastrointestinal hemorrhage cases during the two-year study period "revealed a startling result: 15 of the 40 patients did not expire at Torrance Memorial," the hospital said in a statement. "In fact, many of the patients listed by OSHPD as deceased are still known to us to be alive." The hospital said it discovered a programming error in the electronic data transfer from its medical record system to the state. A recalculation without the 15 cases inadvertently classified as deaths would result in a mortality rate well within the state average, the hospital said. . . .
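Torrance Memorial's claim is easy to check arithmetically: dropping the 15 cases it says were misclassified as deaths cuts its gastrointestinal hemorrhage mortality rate from about 4.5% to about 2.8% over the two-year period:

```python
# Figures from the article: 40 recorded deaths among 883 gastrointestinal
# hemorrhage cases, of which the hospital says 15 were misclassified as
# deaths by a programming error in the data transfer to the state.
recorded_deaths, cases, misclassified = 40, 883, 15

reported_rate = recorded_deaths / cases
corrected_rate = (recorded_deaths - misclassified) / cases
print(f"reported: {reported_rate:.1%}, corrected: {corrected_rate:.1%}")
```

Whether 2.8% is "well within the state average," as the hospital asserts, depends on the statewide rate for that indicator, which the article does not report.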
Neil Romanoff, vice president for medical affairs at Cedars, said the study offered a limited view of hospital care because it failed to take into account deaths that occurred shortly after hospitalization. "If a hospital . . . transfers their patients out alive earlier and they die at the next level of care, what does that tell you?" Romanoff said. "These are complicated questions that are not clearly answered by one measure of quality."
Joseph Parker, director of the statewide health office's Health Outcomes Center, said a study that took into account deaths after hospitalization would be less timely. "There's a trade-off here," he said. "We wanted to get information here that is more recent and actionable."
The state plans to update the study annually and to expand the categories. The federal Centers for Medicare and Medicaid and about 15 states publicly report various hospital quality indicators. Some report how well hospitals adhere to model practice standards, while others look at mortality and other outcomes.