Abstract and Keywords
The internet would seem to be an ideal platform for fostering norm diversity. The very structure of the internet resists centralized governance, while the opportunities it provides for the “long tail” of expression mean that even voices with extremely small audiences can find a home. In reality, however, the governance of online speech looks much more monolithic. This is largely a result of private “lawmaking” activity by internet intermediaries. Increasingly, social media companies like Facebook and Twitter are developing what David Kaye, UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, has called “platform law.” Through a combination of community standards, contract, technological design, and case-specific practice, social media companies are developing “Facebook law” and “Twitter law,” displacing the laws of national jurisdictions. Using the example of content moderation, this chapter makes several contributions to the literature. First, it expands upon the idea of “platform law” to consider the broad array of mechanisms that companies use to control user behavior and mediate conflicts. Second, using human rights law as a foundation, the chapter makes the case for meaningful technological design choices that enable user autonomy. Users should be able to make explicit choices about who and what they want to hear online. It also frames user choice in terms of the right to hear, not the right to speak, as a way of navigating the tension presented by hate speech and human rights without resorting to platform law that sanitizes speech for everyone.