Document Type
Article
Publication Date
2025
Abstract
Scientific and technological advances in the latter part of the twentieth century transformed the field of biometrics. Carleton Simon, for instance, first postulated using retinal vasculature for biometric identification in 1935, but it was not until forty years later that an Eyedentify patent brought the idea to fruition. In 1937, John Henry Wigmore anticipated using oscilloscopes to identify individuals by speech patterns. Decades later, digitization and speech processors made voiceprint identification possible. In the 1970s, biological discoveries similarly led to the development of deoxyribonucleic acid (DNA) sequencing. And while Alphonse Bertillon in the late nineteenth century postulated that irises were distinct, it was only in 1991 that John Daugman patented a means of extracting and encoding their unique patterns.
In this century, as algorithmic sciences, big data analytics, and artificial intelligence (AI) have gained ground, the biometric landscape has again been radically altered. The range of collectable Physiological Biometric Characteristics (PBCs), which measure innate human traits, has exploded. The legal literature lags far behind, with almost every treatment of biometrics limited to a few PBCs, such as fingerprinting, facial recognition technology (FRT), or DNA. Nor have scholars considered the rapid expansion in Behavioral Biometric Characteristics (BBCs)—biologically grounded habits and proclivities, such as voice prints, eye movement, or gait signatures. Instead, just a handful of pieces focus on one or two BBCs. Yet thousands of scientific articles over the past fifteen years have focused on how to collect, analyze, and use PBCs and BBCs. Hundreds of thousands of patent applications have kept pace. Looking at just six of the most prominent companies, the numbers are staggering: between 2012 and 2022, they collectively applied for or obtained 12,000 to 19,000 biometric-related patents per year.
Legal scholarship has not only missed the depth and breadth of information that can be collected, analyzed, and deployed, but it also has largely overlooked a concerning new practice: biomanipulation, which I define as the use of biometric data to identify, analyze, predict, and manipulate a person’s beliefs, desires, emotions, cognitive processes, and/or behavior. Books and articles on consumer and market manipulation, of course, have been around for decades; but the role of biometric data in presenting an immediate, more personalized, and more concerning form of insight and potential control has gone largely unnoticed.
For the past fifteen years, companies have plunged headlong into this realm, pushing the boundaries and looking for ways to capitalize on biometrically enabled inventions. Paralleled by scientific and technological advances, these efforts have produced a fundamentally different world. Early on, emphasis was placed on consumer behavior. Meta, for example, has patented a system to extract linguistic data (words, word stems, and communication patterns) and facial markers, and pair them with demographic and social network information. The system considers the level of influence wielded by a node in a network, the number of connections, and engagement patterns, as well as biographic data (e.g., affinities, work experience, education, hobbies, location, and preferences), for news feeds, ranking, advertising, and other activities.
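To make the mechanics of such profiling easier to picture, the short Python sketch below combines linguistic, facial, network, and biographic signals into a single ranking score for a feed or ad item. It is purely illustrative: the field names, weights, and functions are hypothetical and are not drawn from Meta's patent.

```python
# Hypothetical illustration of scoring content against a biometric-plus-biographic
# profile; none of these fields, weights, or formulas come from the patent.
from dataclasses import dataclass, field

@dataclass
class Profile:
    word_stems: list[str]             # extracted linguistic data
    facial_markers: dict[str, float]  # e.g., expression intensities
    connections: int                  # size of the user's network
    engagement_rate: float            # interactions per item seen
    affinities: set[str] = field(default_factory=set)  # hobbies, preferences

def influence_score(p: Profile) -> float:
    """Toy measure of how much weight a node carries in the network."""
    return p.connections * p.engagement_rate

def rank_item(p: Profile, item_topics: set[str]) -> float:
    """Score one item for one user from affinity overlap, linguistic cues,
    a facial-expression signal, and the user's network influence."""
    topic_overlap = len(p.affinities & item_topics)
    linguistic_match = sum(1 for stem in p.word_stems if stem in item_topics)
    expression_boost = p.facial_markers.get("smile", 0.0)
    return topic_overlap * 2.0 + linguistic_match + expression_boost + 0.01 * influence_score(p)

user = Profile(
    word_stems=["hike", "trail"],
    facial_markers={"smile": 0.8},
    connections=350,
    engagement_rate=0.12,
    affinities={"hiking", "travel"},
)
print(rank_item(user, {"hiking", "gear", "trail"}))
```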
What is at stake, though, is more than just purchasing patterns. Biometric data can be used to generate insight into an individual’s beliefs, desires, emotions, and fears—and then to alter them. In 2022, for instance, Amazon secured a patent to analyze an individual’s emotional state, set a new target state, deliver content to get the individual to hit that goal, evaluate the impact of stimuli delivered, and continue to shape the individual’s emotions until the desired emotional state has been reached. The company explained,
[I]f a content provider intends to scare a user playing a game, the system may select content known to be scary, such as monsters or zombies, or may present video or audio (e.g., dark colors, scary sounds, or the like) to present in the game to the user. . . . The system may modify content based on a target or desired emotion to cause. For example, additional zombies may be added to an existing scene, or the tone or pitch of audio may be adjusted without causing an interruption to the presentation of the content.
Prior systems fell short; they failed to “account for a user’s current emotional state and how significant the transition from the user’s current emotional state to a target emotional state at a given time may be.” The proposed system selected and customized content to elicit the most direct emotional impact for each user, allowing it to obtain the “desired change to the user’s emotional state” within time limits. It employed “cameras, microphones, heartrate monitors, biometric sensors, [and] other . . . devices . . . to analyze and identify a user’s emotional state at a given time.” It could take into account body, arm, and hand position, heartrate, and other indicators, such as “fingerprints, face recognition, blood flow, retinal data, voice data, scents, and other data” to determine the user’s precise emotional state. The information could yield insight into “which content is associated with causing certain emotions, how often, how long it takes a user to transition from one emotion to another emotion, and other data.” The aim was to develop a system which could manipulate a target’s future emotions.
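The logic the patent describes is, at bottom, a feedback loop: measure the user's emotional state from biometric sensors, compare it to a target state, adjust the content, and repeat within a time limit. The Python sketch below is a minimal illustration of that loop only; every function, sensor value, and threshold is hypothetical rather than taken from the patent.

```python
# Hypothetical sketch of a closed-loop emotional-targeting system; the patent
# discloses the concept, not this code.
import random
import time

def read_sensors():
    """Stand-in for the cameras, microphones, heartrate monitors, and other
    biometric sensors the patent lists; here it just fabricates readings."""
    return {
        "heart_rate": random.randint(60, 120),
        "facial_tension": random.random(),
    }

def estimate_emotional_state(readings):
    """Toy classifier mapping sensor readings to a coarse emotional label."""
    if readings["heart_rate"] > 100 and readings["facial_tension"] > 0.6:
        return "scared"
    if readings["heart_rate"] > 90:
        return "tense"
    return "calm"

def adjust_content(current, target):
    """Select or modify stimuli meant to move the user toward the target state;
    a larger gap between current and target gets a stronger stimulus (more
    zombies, darker palette), echoing the patent's transition language."""
    if target == "scared":
        intensity = 3 if current == "calm" else 1
        return {"added_zombies": intensity, "palette": "dark", "audio_pitch": "low"}
    return {"added_zombies": 0, "palette": "bright", "audio_pitch": "neutral"}

def run_loop(target_state, time_limit_s=10.0):
    """Measure, compare against the target, deliver adjusted content, and
    repeat until the target state is reached or the time limit expires."""
    start = time.monotonic()
    current = estimate_emotional_state(read_sensors())
    while current != target_state and time.monotonic() - start < time_limit_s:
        stimuli = adjust_content(current, target_state)
        print(f"current={current}, delivering {stimuli}")
        time.sleep(0.5)  # wait for the stimuli to take effect, then re-measure
        current = estimate_emotional_state(read_sensors())
    return current

if __name__ == "__main__":
    run_loop("scared")
```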
Publication Citation
Georgetown Law Journal, Forthcoming.
Scholarly Commons Citation
Donohue, Laura K., "Biomanipulation" (2025). Georgetown Law Faculty Publications and Other Works. 2622.
https://scholarship.law.georgetown.edu/facpub/2622
Included in
Business Organizations Law Commons, Intellectual Property Law Commons, Law and Economics Commons, Science and Technology Law Commons, State and Local Government Law Commons