Document Type

Article

Publication Date

May 2022

Abstract

Face surveillance is animated by deep-rooted demographic and deployment biases that endanger marginalized communities and threaten the privacy of all. But current approaches have not prevented its adoption by law enforcement. Some companies have offered voluntary moratoria on selling the technology, leaving many others to fill in the gaps. Legislators have enacted regulatory oversight at the state and city levels, but a federal ban remains elusive. Both approaches require vast shifts in practical and political will, each with drawbacks. While we wait, face surveillance persists. This Article suggests a new possibility: face surveillance is fueled by unauthorized copies and reproductions of photographs, and resisting face surveillance compels us to consider countering it with copyright law.

So why haven't face surveillance companies been overwhelmed with copyright infringement litigation? Fair use. This Article lays out the litigation landscape before analyzing the recent Supreme Court decision in Google v. Oracle, alongside other key fair use cases, to examine why this complex doctrine may permit many uses of machine learning without allowing face surveillance to copy and reproduce online profile pictures. Some face surveillance companies claim to be transformative search engines, but their business models are more like private subscription services that are rarely found to be fair use. And scraping profile pictures harms the unique licensing market for these photographs, which grows as companies and researchers increasingly reject scraped photos as sources of face analysis training data. This Article concludes that copyright law could curtail face surveillance without waiting for companies or Congress to catch up--and we ought to use it.

Publication Citation

North Carolina Law Review, Vol. 100, Issue 4, 1015-1071.
