Monday, June 2, 2014
PCAST, Big Data, and Privacy
The President’s Council of Advisors on Science and Technology (PCAST) has issued a report intended to be a technological complement to the recent White House report on big data. This PCAST report, however, is far more than a technological analysis—although as a description of technological developments it is wonderfully accessible, clear and informative. It also contains policy recommendations of sweeping significance about how technology should be used and developed. PCAST’s recommendations carry the imprimatur of scientific expertise—and lawyers interested in health policy should be alert to the normative approach of PCAST to big data.
Here, in PCAST’s own words, is the basic approach: “In light of the continuing proliferation of ways to collect and use information about people, PCAST recommends that policy focus primarily on whether specific uses of information about people affect privacy adversely. It also recommends that policy focus on outcomes, on the “what” rather than the “how,” to avoid becoming obsolete as technology advances. The policy framework should accelerate the development and commercialization of technologies that can help to contain adverse impacts on privacy, including research into new technological options. By using technology more effectively, the Nation can lead internationally in making the most of big data’s benefits while limiting the concerns it poses for privacy. Finally, PCAST calls for efforts to assure that there is enough talent available with the expertise needed to develop and use big data in a privacy-sensitive way.” In other words: assume the importance of continuing to collect and analyze big data, identify potential harms and fixes on a case-by-case basis possibly after the fact, and enlist the help of the commercial sector to develop profitable privacy technologies.
The report begins with an extremely useful (and particularly frightening if you aren’t familiar with the internet of things) description of big data possibilities, now and in the near-term future. The description emphasizes the distinction between data “born digital” (created in digital form) and data “born analog” (arising from the characteristics of the physical world and only later made accessible in digital form). Data born analog are highly likely to contain more information than whatever is of particular interest; surveillance cameras, for example, record everything occurring in a location, not just the acts that are the target of surveillance. And with analytics that allow data fusion, combining data sources may reveal new meanings, for example by enabling profiles of individuals. Big data are high volume, high velocity, and high variety, an intersection that presents serious privacy challenges.
PCAST then attempts to anticipate the privacy harms that might be associated with big data collection and analysis. The harms are presented mainly as byproducts of the benefits conferred by particular technological developments. The list is impressive, but it may miss additional harms associated with the emergence of a big data world. Here’s a table listing developments, benefits, and harms; I’ve marked with an asterisk benefits that I’ve reconstructed from what PCAST says but that PCAST does not state explicitly.
Technological development | Benefit | Associated Harm
Digital communication | Social networking across geographical boundaries; social and political participation on a far larger scale | Shared pipelines and the possibility of interception
Virtual home | Ability to store, organize, and share personal records, e.g. cloud storage of photographs | “Home as one’s castle” should extend to a “castle in the cloud,” which is not currently protected
Inferred facts about individuals | Delivery of desired or needed services, e.g. targeted marketing | Inferences may be drawn about highly sensitive facts about the individual (e.g. sexual orientation), including facts of which the individual may not even be aware (e.g. early dementia)
Locational identification | Services such as navigation or routing, finding people or services nearby, avoiding hazards | Stalking and tracking
Personal profiles | Benefits of the use of statistically valid algorithms | False conclusions about individuals may be drawn
Discovery of special cases that apply to individuals within a population | May allow tailoring of services to special cases, e.g. personalized medicine or instruction linked to learning styles* | Foreclosure of autonomy: individuals may not want to take the predicted path
Identification of individuals | May allow individuals to be warned, protected, or otherwise benefited* | Loss of desired anonymity
PCAST intentionally omitted from this list the desire that information be used fairly and the desire that individuals know what others know about them or are doing with their information. In PCAST’s view, neither of these “harms” can be defined precisely enough to support policy recommendations. Also omitted are more overarching concerns such as effects on identity, security, stigmatization of groups, freedom of expression, or political liberty.
PCAST’s discussion of the current technologies of privacy protection is highly informative, and readers with interests in this area would do well to read the report; I won’t summarize it here. The report also debunks several standard methods of privacy protection: notice and choice (a “fantasy”), de-identification (ineffective in light of the development of analytics enabling re-identification), and non-retention or deletion (hopeless given the potential for copying, including the creation of multiple copies at the moment analog data become digital).
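To make the re-identification point concrete, here is a minimal sketch (not from the report) of a classic linkage attack: a “de-identified” dataset stripped of names can often be re-joined to a public record, such as a voter roll, on quasi-identifiers like ZIP code, birth date, and sex, the combination Latanya Sweeney famously showed uniquely identifies most Americans. All of the data, field names, and people below are hypothetical.

```python
# Minimal sketch of a linkage (re-identification) attack.
# All records, names, and field names are hypothetical illustrations.

# A "de-identified" health dataset: names removed, quasi-identifiers kept.
deidentified_health = [
    {"zip": "02138", "birth_date": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "birth_date": "1962-01-15", "sex": "M", "diagnosis": "diabetes"},
]

# A public dataset (e.g., a voter roll) with names and the same quasi-identifiers.
public_records = [
    {"name": "Jane Doe", "zip": "02138", "birth_date": "1945-07-31", "sex": "F"},
    {"name": "John Roe", "zip": "02139", "birth_date": "1962-01-15", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birth_date", "sex")

def link(health_rows, public_rows):
    """Join the two datasets on quasi-identifiers, re-attaching names to diagnoses."""
    index = {tuple(r[k] for k in QUASI_IDENTIFIERS): r["name"] for r in public_rows}
    matches = []
    for row in health_rows:
        key = tuple(row[k] for k in QUASI_IDENTIFIERS)
        if key in index:  # a unique combination of attributes re-identifies the person
            matches.append({"name": index[key], "diagnosis": row["diagnosis"]})
    return matches

print(link(deidentified_health, public_records))
# [{'name': 'Jane Doe', 'diagnosis': 'hypertension'}, {'name': 'John Roe', 'diagnosis': 'diabetes'}]
```

The point of the sketch is simply that removing names accomplishes little when the remaining combination of attributes is unique.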
Instead, the report suggests several different approaches for protection against data misuse. As a successor to notice/consent, PCAST recommends the development of “privacy preference profiles,” perhaps by third parties such as the ACLU or Consumer Reports; apps or other internet entities could then indicate whether their privacy policies comport with a profile specified by the consumer. Or the profile developers might offer the service of vetting apps. Ideally, technologies could be developed to perform the vetting automatically. PCAST also recommends developing use controls that attach to data as they are collected, used, and subsequently transmitted or re-used; metadata might serve this purpose, but further development is clearly needed. Another suggested strategy is audit capability as a deterrent to misuse. Finally, PCAST suggests implementing the Consumer Privacy Bill of Rights through recognition of potentially harmful uses of data. Emphasis should be placed on the development of best practices to prevent inappropriate data use throughout the data life cycle.
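As a rough illustration of what automated vetting against a privacy preference profile might look like, here is a toy sketch. PCAST describes the idea only in general terms, so the policy fields, the profile values, and the checking logic below are my own assumptions rather than anything specified in the report.

```python
# Toy sketch of vetting an app's declared privacy practices against a
# user-selected preference profile (e.g., one published by a third party).
# The fields and values are assumptions, not a real standard.

# A preference profile a third party might publish and a consumer might adopt.
strict_profile = {
    "sells_data_to_third_parties": False,  # profile forbids resale of personal data
    "retains_data_days_max": 30,           # data must be deleted within 30 days
    "collects_location": False,            # no location collection allowed
}

# Machine-readable privacy declarations an app might ship.
app_policy = {
    "sells_data_to_third_parties": False,
    "retains_data_days": 365,
    "collects_location": True,
}

def vet(policy, profile):
    """Return a list of ways the app's declared policy violates the chosen profile."""
    violations = []
    if policy["sells_data_to_third_parties"] and not profile["sells_data_to_third_parties"]:
        violations.append("sells personal data to third parties")
    if policy["retains_data_days"] > profile["retains_data_days_max"]:
        violations.append(f"retains data for {policy['retains_data_days']} days")
    if policy["collects_location"] and not profile["collects_location"]:
        violations.append("collects location data")
    return violations

print(vet(app_policy, strict_profile) or "conforms to profile")
# ['retains data for 365 days', 'collects location data']
```

The comparison itself is the easy step; the hard parts are getting apps to publish accurate, machine-readable declarations and getting trusted third parties to maintain the profiles.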
Five major policy approaches (they are called recommendations, but they are far better characterized as general directions rather than specific recommendations) conclude the report. They are:
--attention should focus on uses of big data rather than collection and analysis
--policies should not be stated in terms of technical solutions but in terms of intended outcomes
--the US should strengthen privacy-related research, including relevant social science informing successful application of technologies
--the US Office of Science and Technology Policy should increase education and training efforts
--the US should take international leadership by adopting policies that stimulate the development of privacy protective technologies.
These recommendations seem remarkably anodyne after the detailed discussion of technologies that precedes them. They are also preceded by some other, less anodyne policy observations (I found these quite troubling, for reasons I just begin to suggest parenthetically below):
--basing policy on data collection is unlikely to succeed, except in very limited contexts (such as health information) where there may be possibilities for meaningful notice and consent. (Why, I ask, is notice/consent the only way to approach collection practices? What about other sorts of restrictions on collection? Or, is the thought that getting the data is both inevitable and desirable, no matter what the context?)
--regulating at the moment individuals are particularized by analytics might be technically possible—but even so, it’s preferable to focus on harms downstream (Doesn’t this expose people to risks of harm, correctable only after the fact? Shouldn’t we consider building ways to detect and deter re-identification that could intervene before the harm occurs?)
--drafting savvy model legislation on cyber-torts might help improve the current patch-work of liability rules for privacy violations (Why not a public law approach to violations rather than placing the onus on individual litigation?)
--forbidding the government from certain classes of uses might be desirable, even if these uses remain available in the private sector (So is the government the only or even primary problem with big data use???)
June 2, 2014 in Innovation, Obama Administration, Policy, Politics, privacy
Monday, February 10, 2014
The AOL Babies: Our Healthcare Crisis in a Nut
Where does one start with AOL CEO Armstrong's ridiculous and unfeeling justifications for changes in his company’s 401(k) plan? Cable TV and Twitter came out of the blocks fast with the obvious critiques. And the outrage only increased after novelist Deanna Fei took to Slate to identify her daughter as one of the subjects of Armstrong’s implied criticism. Armstrong has now apologized and reversed his earlier decision.
As the corporate spin doctors contain the damage, Armstrong’s statements likely will recede from memory, although I am still hoping The Onion will memorialize Armstrong’s entry into the healthcare debate (suggested headline, "CEO Discovers Nation's Healthcare Crisis Caused by 25 Ounce Baby”). But supposing (just supposing) your health law students ask about the story in class this week. What sort of journey can you take them on?
February 10, 2014 in Affordable Care Act, Cost, Coverage, Employer-Sponsored Insurance, Health Care, Health Care Costs, Health Care Reform, Health Economics, Health Law, HIPAA, privacy
Monday, August 26, 2013
Of Data Challenges
Challenges designed to spur innovative uses of data are springing up frequently. These are contests, sponsored by a mix of government agencies, industry, foundations, not-for-profit groups, and even individuals. They offer prize money or other incentives for people or teams to come up with solutions to a wide range of problems. In addition to grand prizes, they often offer many smaller prizes or networking opportunities. The latest such challenge to come to my attention was announced August 19 by the Knight Foundation: $2 million for answers to the question "how can we harness data and information for the health of communities?" Companion prizes of up to $200,000 are also being offered by the Robert Wood Johnson Foundation and the California Healthcare Foundation.
Such challenges are also a favorite of the Obama administration. From promoting Obamacare among younger Americans (over 100 prizes of up to $30,000)--now entered by Karl Rove's Crossroads group--to arms control and identification of sewer overflows, the federal government has gone in for challenges big time. Check out challenge.gov to see the impressive list. Use of information and technological innovation feature prominently in the challenges, but there is also a challenge for "innovative communications strategies to target individuals who experience high levels of involuntary breaks ("churn") in health insurance coverage" (from SAMHSA), a challenge to design posters to educate kids about concussions (from CDC), a challenge to develop a robot that can retrieve samples (from NASA), and a challenge to use technology for atrocity prevention (from USAID and Humanity United). All in all, some 285 challenges sponsored by the federal government are currently active, although for some the submission period has closed.
These challenges are entertaining, call on crowdsourcing for knowledge production, find new sources of expertise way beyond the Beltway or even US borders, encourage private sector groups rather than government to bear costs and risks of development (or failure), and may bring novel and highly useful ideas to light. So what's not to like? I may be just grumpy today, but I have some serious worries about the rush to challenges as a way to solve persistent or apparently intractable problems.
Challenges may be more hype than achievement, more heat than ultimate light. They may emphasize the quick and clever--the nifty over the difficult or profound. They may substitute the excitement of awarding and winning a prize for making real progress on a problem. Most troubling to me, however, is the challenge strategy's potential to skew what government finds interesting and what it is willing to do. Many challenges have private partners in industry, appear likely to result in for-profit products, or set aside values that may be more difficult to quantify or instantiate.
Take the HHS Datapalooza, for example. Now entering its fifth year, the Datapalooza is an annual celebration of innovations designed to make use of health data available from a wide variety of sources, including government health data. "Data liberation" is the watchword, with periodic but limited references to data protection, security and privacy. A look at the 2013 agenda reveals a planning committee representing start-ups and venture capital. It also reveals a $500,000 prize awarded by Heritage Provider Network, a managed care organization originally located in Southern California but now expanding in markets in Arizona and New York and serving many Medicare Advantage patients. The prize was for a model to predict hospitalizations accurately and in advance--so that they could be avoided. The winning team, powerdot, didn't reach the benchmark needed to win the full $3m prize. So . . . Heritage is continuing the competition, making more (and apparently no longer deidentified) data available to a select set of leading competitors in the original competition in order to improve the accuracy of the modeling. (A description of deidentification methods for the data made available to all entrants in the original competition is available here.) There are of course real advantages in developing a good predictive model--for patients in avoiding hospitalizations, and for Heritage in saving money in patient care. This is potentially a "win win"--as Mark Wagar, the executive awarding the prize stated, "it's not just about the money; it's personal." But "it's not just about the money" is telling: the risk of these challenges is that they are about the money, and that the money will come to dominate personal or other values unless we are careful.
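For readers curious what the underlying prediction task looks like, here is a deliberately tiny sketch of hospitalization-risk modeling from claims-style features. It is not the Heritage data or any competitor's method; the features, numbers, and choice of a simple logistic regression are illustrative assumptions only.

```python
# Toy illustration of predicting next-year hospitalization from claims-style features.
# All features, data, and the model choice are hypothetical; actual Heritage Health
# Prize entries used far larger datasets and more sophisticated methods.
from sklearn.linear_model import LogisticRegression

# Each row: [age, prior-year claims count, prior-year ER visits, chronic conditions]
X = [
    [72, 14, 2, 3],
    [34,  1, 0, 0],
    [58,  6, 1, 1],
    [81, 20, 3, 4],
    [45,  2, 0, 0],
    [67,  9, 1, 2],
]
# Label: 1 if the member was hospitalized the following year, else 0.
y = [1, 0, 0, 1, 0, 1]

model = LogisticRegression().fit(X, y)

# Predicted probability of hospitalization for a new member's claims profile.
new_member = [[70, 12, 2, 2]]
print(round(model.predict_proba(new_member)[0][1], 2))
```

The competition's benchmark was about doing this accurately at scale on real, messy claims histories, a very different problem from fitting a handful of toy rows, which is why the accuracy threshold for the full prize proved so hard to reach.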
Solutions, if my concerns are well-founded? Trying to turn back the disruptive clock and fight the appeal of challenges is probably futile--although perhaps some of the initial enthusiasm may wane. One solution is to join in--after all, challenges are infectious and potentially innovative--encouraging more challenges aimed at different problems--say, challenges for privacy or security protection alongside challenges for data liberation and use. Or, challenges for improving patient understanding of their health conditions and informed consent to strategies for managing them--as some of the challenges aimed at patients with diabetes illustrate. Another solution is to watch very carefully what challenges are offered, who funds them, who wins them, and what is ultimately achieved by them.
[LPF]
August 26, 2013 in Bioethics, Biotech, Competition, Health Care Costs, Health Care Reform, Health IT, Health Reform, Obama Administration, privacy, Reform, Technology
Thursday, May 30, 2013
Are Health Care Providers Deliberately Misunderstanding HIPAA--And if So, What Can You Do About It?
Ever since HIPAA was implemented in 2002, it has been used by health care providers to make life more difficult for patients by preventing their family members from being with them in care areas and by refusing to share information with those the patient wants to be kept informed. This has caught the interest of the U.S. House Energy and Commerce Subcommittee, which has been holding hearings on various consumer issues regarding HIPAA. I think this review is long overdue--and that HHS is well aware of how providers are misusing HIPAA. The problems are so prevalent that the HHS website actually has a myth-buster section.
So what can a family do? A good step is to be skeptical when told that something is being done "because of the law." As I have explained again and again to health care providers and lawyers, “If you think something that a) the patient or his family wants or b) is in the best interests of his care is prohibited by HIPAA, you don’t understand HIPAA.” Ask for clarification from the hospital's lawyer--will you get that person on first call? Maybe not, but be persistent.
Inform yourself on the HHS website, which begins with an overall statement of principle: “The Privacy Rule provides federal protections for personal health information held by covered entities, and gives patients an array of rights with respect to that information. At the same time, the Privacy Rule is balanced so that it permits the disclosure of personal health information needed for patient care and other important purposes.” There is also a very useful FAQ section that addresses questions like whether it is illegal for a family member to pick up a prescription for a patient (no).
Moreover, there is no provision in HIPAA that requires or allows a health provider to step in as “guardian of privacy” for a patient who is not conscious or competent. When there is an identifiable surrogate decision maker, that person can make any decision about disclosure that the patient could have made himself. And, no, there need not be any written document expressly allowing sharing of information with the surrogate. Finally, if a person has legal authority to make medical decisions, then he is entitled to review the medical records so that the decision can be an informed one.
At this point, more than ten years later, it’s reasonable to wonder whether some of these “misunderstandings” persist because they make things easier for providers by limiting time-consuming questions, like “why isn’t my mother receiving pain medication?” or “what are our options for Mom’s care?” Certainly the posts in this nursing blog suggest that’s the case. Here’s my favorite, from Ortho-RN: “We usually don't allow family in the recovery room... I don't feel it's a place where family belongs.. No privacy, totally in HIPPA violations. Families like to be nosey and watch other things, and things do not always go smoothley...” (sic) Here’s another insight into how providers see their obligations.
A brief foray into common sense should demonstrate the absurdity of these restrictive interpretations. The premise should be that, outside a few narrow health and safety exceptions, no one is in a better position than the patient to decide who can and cannot have information about the medical care being provided. And keeping family members away because of the risk they will see "other patients" is an absurdity. None of us has the right to receive health care in complete seclusion. Maybe high-profile patients can pay for private wings, but all of us are stuck with the reality that in going to receive health care we may well be seen by other people, including those who know us. Finally, health care providers themselves are not entitled to protection from the observations and questions of family and friends about the care of their loved ones. Could there be times during an emergency when the team can't stop and talk? Sure. But if these would be reasonable requests for information from the patient, then they are reasonable from the people the patient trusts most to protect his interests.
May 30, 2013 in Health Law, HHS, HIPAA, Hospitals, Nurses, Physicians, Policy, privacy