
IGF 2018 WS #395
Internet Platforms, Sexual Content, and Child Protection

    Organizer 1: Lucio Nerea Vega, Prostasia Foundation
    Organizer 2: Edmon Chung, Dot Asia Registry
    Organizer 3: Andrew Puddephatt, Internet Watch Foundation

    Speaker 1: Maryant Fernández, Civil Society, Western European and Others Group (WEOG)
    Speaker 2: Edmon Chung, Technical Community, Western European and Others Group (WEOG)
    Speaker 3: Andrew Puddephatt, Civil Society, Western European and Others Group (WEOG)
    Speaker 4: Jeremy Malcolm, Civil Society, Western European and Others Group (WEOG)

    Moderator

    Nerea Vera Lucio

    Online Moderator

    Jeremy Malcolm

    Rapporteur

    Andrew Puddephatt

    Format

    Round Table - 90 Min

    Interventions

    A key aspect of this workshop is to connect Internet platforms (we have established contact with several platforms about participating) with subject matter experts (including mental health professionals, sex industry experts, CSA survivors, and human rights advocates) in order to promote a more evidence-based approach to the question of what platforms can do to help reduce child sexual abuse, beyond the removal of manifestly illegal content.

    The intent is to help industry participants fulfil the Guiding Principles on Business and Human Rights, which require companies to “Conduct due diligence that identifies, addresses and accounts for actual and potential human rights impacts of their activities, including through regular risk and impact assessments, meaningful consultation with potentially affected groups and other stakeholders, and appropriate follow-up action that mitigates or prevents these impacts.”

    Therefore, the purpose of this workshop is to ensure that these affected groups (such as sex workers and CSA survivors) and experts (such as mental health professionals and criminal justice experts) can provide their views and feedback directly to platforms as the draft model terms of service for Internet platforms with respect to child protection are developed.

    Diversity

    Prostasia Foundation's approach is to give a voice to those who are usually excluded from discussions around child protection, or who are even wrongly held responsible for child sexual abuse. For example, our Advisory Council includes representatives of child sexual abuse survivors, sex workers and the adult entertainment industry, and criminal justice reform campaigners. Although this does not specifically represent diversity along lines of race or nationality, it does represent a diversity of background that is missing from the Internet Governance Forum. Further, a majority of our Board of Directors is composed of women, one of whom also has a disability.

    2018 is the year of the techlash, in which public opinion has hardened towards Internet platforms that are seen as having failed to adequately address the online manifestations of a range of social problems. Platforms are increasingly being asked to take proactive measures to prevent misinformation, hateful speech, terrorist content, and copyright-infringing content from appearing online in the first place.

    The paradigmatic case of such proactive content filtering is one with which larger platforms, in particular, already have good experience: the automated filtering and removal of child sexual abuse (CSA) material by reference to hashes of known illegal images. But can platforms do more to prevent child sexual abuse than can be accomplished through such automated means?
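    As a rough illustration of the hash-list approach described above, the sketch below checks an uploaded file against a list of known hashes. This is a simplified, hypothetical example: the hash value, function name, and workflow are illustrative only, and real deployments (such as systems based on PhotoDNA) rely on perceptual hashes that survive resizing and re-encoding, rather than the exact cryptographic hash used here.

```python
import hashlib

# Hypothetical list of hashes of known illegal images, as might be supplied by
# a clearinghouse. In practice these would be perceptual hashes, not the raw
# SHA-256 digests shown here; the value below is a placeholder.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if the uploaded bytes exactly match an entry in the known-hash list."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES


# A platform could run this check at upload time and route any match to its
# review and reporting workflow instead of publishing the content.
if matches_known_hash(b"...uploaded image bytes..."):
    print("Match against known-hash list; hold for review and report.")
else:
    print("No match; content proceeds through normal moderation.")
```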

    In the United States, an answer to this question has been forced by the passage of FOSTA/SESTA, which narrows platforms’ safe harbor protection from liability for users’ content. Although originally touted as a narrow measure targeting child sex traffickers, the final law also makes platforms liable for promoting or facilitating consensual adult sex work, and in practice some content that does not relate to sex work of any kind has also been removed.

    We propose to promote a more evidence-based approach to the question of what platforms can do to help reduce child sexual abuse, beyond the removal of manifestly illegal content, by convening a two-part multi-stakeholder dialogue on this topic, with the objective of suggesting a set of model terms of service for Internet platforms with respect to child protection.

    Currently, many platforms already have child protection policies as part of their content policies or community standards; however, these can be vague and unpredictable in their application even within a single platform, let alone between platforms. Smaller platforms may not have well-developed policies on this topic at all. Even in mid-size platforms, trust and safety teams are typically composed of members who deal with other forms of abusive content, such as spam and fraud, but who lack dedicated expertise in child protection.

    Although referring to policies on sexual content more generally, rather than to child protection policies specifically, U.N. Special Rapporteur David Kaye notes in his 2018 report that the application of such policies has resulted in the removal of resources for members of sexual minorities and of depictions of nudity with historical, cultural or educational value. CSA prevention resources have also been removed in some cases.

    This workshop will build on an earlier private convening at which stakeholders including mental health professionals, representatives of the sex industries, child protection workers, human rights experts, and survivors of child sexual abuse (CSA) met with platform representatives to discuss and suggest best practices for policies that would protect children while avoiding unforeseen impacts that would infringe on the human rights of children or others.

    A document prepared on the basis of the discussions held at the first convening will be presented at this workshop for broader community feedback. The anticipated outcome of this workshop will be the publication of a set of model terms of service for Internet platforms with respect to child protection.

    Apart from an initial briefing on the discussions to date for those who did not participate in earlier phases of this work, there will be no prepared presentations in this session, which will be wholly devoted to discussion around the draft model terms of service for Internet platforms with respect to child protection. Interventions by participants will be timed to ensure that the discussion is not monopolized by the most confident speakers, and those who have not intervened will be prompted for their thoughts by the moderator.

    The precise policy question to be addressed is: can platforms do more to prevent child sexual abuse than can be accomplished through the automated filtering and removal of child sexual abuse (CSA) material by reference to hashes of known illegal images? This discussion will build on prior online discussions and on a half-day workshop to be held the month before the IGF at the Annual Conference of the Association for the Treatment of Sexual Abusers (ATSA).

    Online Participation

    This workshop will build on previous online discussions undertaken in the course of developing the workshop's draft recommendations. Remote participants who contributed to these online discussions will be invited to participate in the workshop via WebEx. Online attendees will have a separate queue and microphone, which will rotate equally with the microphones in the room; the workshop moderator will keep the online participation session open and will be in close communication with the workshop’s trained online moderator to make any adaptations necessary as they arise.