During the 1960s and 1970s, the individual rights revolution that swept through American society remade much of the nation's health law in its image. Sick people acquired the right to be told of the risks and benefits of proposed treatments and then to give thumbs-up or thumbs-down to their doctors' decisions. Successful suits for medical negligence went from rare to commonplace. Elderly and poor Americans achieved statutory rights of access to publicly funded healthcare, and courts burnished these rights with myriad procedural protections. The critically ill and their families won the right to refuse aggressive, life-sustaining treatments. Psychiatric patients acquired new veto power over hospital confinement and drug therapy, and biomedical research subjects gained myriad safeguards grounded in the principle of informed consent.

By the early 1980s, the law governing American medicine embodied, in form if not in practice, the ideal of the individual as author of his or her own clinical fate. This ideal portrayed patients as sovereign clinical consumers, entitled to make decisions about their care without regard for the financial consequences borne by others. So long as the assorted others, mainly employer-sponsored health insurance plans and taxpayer-supported federal and state programs, paid more or less uncomplainingly, this ideal seemed immune to challenge. It appealed, diversely, to liberal proponents of the individual rights revolution and to conservatives inclined toward pursuit of efficiency through deference to consumer choice. It disregarded the fact that consumers of healthcare often do not pay for what they choose.

Publication Citation

Hum. Rts., Fall 1998, at 19–20.