Abstract
This article updates the traditional discussion of privacy and technology, focused since the days of Warren and Brandeis on the capacity of technology to manipulate information. It proposes a novel dimension to that discussion: the impact of anthropomorphic or social design on privacy.
Technologies designed to imitate people through voice, animation, and natural language are increasingly commonplace, showing up in our cars, computers, phones, and homes. A rich literature in communications and psychology suggests that we are hardwired to react to such technology as though a person were actually present. Social interfaces accordingly capture our attention, improve interactivity, and can free up our hands for other tasks.
At the same time, technologies that imitate people have the potential to implicate long-standing privacy values. One well-documented effect of interfaces and devices that emulate people is the sensation in users of being observed and evaluated. Their presence can alter our attitude, behavior, and physiological state. Widespread adoption of such technology may accordingly lessen opportunities for solitude and chill curiosity and self-development. These effects are all the more dangerous in that they cannot be addressed through traditional privacy protections such as encryption or anonymization. Yet the unique properties of social technology also present an opportunity to improve privacy, particularly online.
Recommended Citation
M. R. Calo, People Can Be So Fake: A New Dimension to Privacy and Technology Scholarship, 114 Dick. L. Rev. 809 (2010).
Available at: https://ideas.dickinsonlaw.psu.edu/dlra/vol114/iss3/3