When Apple announced it would combat the growth of child sexual abuse material (CSAM) on its platform by scanning all its users’ devices without their consent, many of its loyal customers felt betrayed. With tech companies such as Google and Facebook arranging their business models around selling their customers’ personal information, Apple customers saw the company’s focus on privacy as a refreshing alternative. However, as Apple itself privately acknowledged, this emphasis on privacy had made it a haven for CSAM. Despite the reputational damage it would incur with its customers, Apple resolved to confront CSAM on its platform in an unprecedented manner. Until Apple’s announcement, no major tech company had resolved to install a hashing algorithm directly onto its devices to search for CSAM. Apple’s move places it at the center of a legal firestorm, with the protections of the Fourth Amendment squaring off against the public demand to eradicate CSAM and protect the nation’s children from abuse. In deciding CSAM cases, courts have often focused on the application of the private search doctrine. Tech companies implementing anti-CSAM hashing protocols have sometimes run afoul of this doctrine and other aspects of Fourth Amendment jurisprudence. This Comment argues that Apple’s move not only complies with the constitutional standards expressed by circuit courts but exceeds those standards. In addition, a strong public policy justification exists for Apple’s initiative. Congress has repeatedly expressed its intent to combat CSAM and protect children from sexual abuse, and by complying with this congressional intent, Apple aligns with public policy. Finally, this Comment recommends that the U.S. Supreme Court resolve the circuit split regarding Fourth Amendment-implicated CSAM cases by adopting a new rule.
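For readers unfamiliar with hash matching, the following is a minimal, purely illustrative sketch of the general technique the abstract references: a file is reduced to a fixed-length digest, which is then checked against a database of digests of known illegal images. Apple's actual system uses a proprietary perceptual hash (NeuralHash) designed to survive image transformations; this sketch substitutes an exact cryptographic hash (SHA-256) for simplicity, and the hash set and function names are hypothetical.

```python
import hashlib

# Hypothetical set of digests of known prohibited images, analogous to
# the hash lists providers match against. Placeholder value only.
KNOWN_HASHES = {"a" * 64}

def file_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_material(data: bytes) -> bool:
    """Flag a file if its digest appears in the known-hash set."""
    return file_digest(data) in KNOWN_HASHES
```

Note that an exact hash like this misses any re-encoded or resized copy of an image, which is why deployed systems use perceptual hashing instead; the legal analysis is the same either way, since both approaches compare user files against a fixed list of known material rather than inspecting content generally.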
Scanning iPhones to Save Children: Apple’s On-Device Hashing Algorithm Should Survive a Fourth Amendment Challenge, 127 Dick. L. Rev.
Available at: https://ideas.dickinsonlaw.psu.edu/dlr/vol127/iss1/9