CSEA
Table of Contents
- Definition
- Related Terms
- Background
- Why We Care
- Spotting CSEA: What to Look For
- Before You Act: Common Pitfalls & Nuances
- Managing Suspected CSEA: Key Steps
- Example Community Guidance
- Further Reading
Definition
Child Sexual Exploitation and Abuse – A broad category that encompasses the sharing of material depicting child sexual abuse, other sexualised content depicting children, and grooming.
For guidance and requirements regarding Child Sexual Abuse Material, see CSAM.
The top three online CSEA harms are: producing, sharing, and/or viewing CSAM; online sexual solicitation; and online grooming.
Related Terms
Online Child Grooming, Child Enticement, Predatory Behaviour, Online Child Endangerment, Luring, Sexual Coercion of Minors.
Background
For the purposes of this guidance (and distinct from the handling of CSAM), Child Sexual Exploitation and Abuse (CSEA) primarily encompasses predatory behaviours such as online grooming – where an adult builds a relationship with a child to gain their trust with the ultimate aim of sexual abuse or exploitation – and the creation or sharing of other forms of sexualised content depicting children that may not meet the CSAM definition but are clearly exploitative or inappropriate. It can also include attempts to coerce or entice children into sexualised conversations or activities.
Perpetrators may exploit platform features such as direct messaging to identify and target children. Grooming is a process that can occur over time, making its early stages sometimes difficult for observers to detect without specific awareness. The focus of this page is on identifying these behaviours and non-CSAM exploitative content to ensure immediate safeguarding actions, including mandatory reporting to authorities.
Why We Care
Addressing all forms of CSEA is an absolute and non-negotiable priority. These behaviours cause profound, lifelong harm to children. There is a moral, ethical, and often legal imperative to protect children from sexual exploitation and abuse in all its forms. A zero-tolerance approach is the only acceptable stance.
Failure to act decisively against CSEA not only fails to protect children but also makes the community complicit in enabling harm, and can have severe legal consequences for individuals and service providers. The safety and well-being of children supersede all other considerations.
Spotting CSEA: What to Look For
Identification of CSEA, particularly grooming, requires vigilance for patterns of inappropriate interaction and specific predatory behaviours. This section focuses on indicators other than the presence of CSAM itself.
Grooming Indicators:
- Inappropriate Relationship Building: An adult account attempting to establish a private, secretive, or overly familiar/personal relationship with an account known or appearing to be a child. This might include excessive flattery, gift-giving (virtual gifts or promises of real ones), or sharing personal adult problems to elicit sympathy.
- Targeting Vulnerabilities: Exploiting a child’s insecurities, loneliness, or desire for attention/validation.
- Isolating the Child: Attempting to drive a wedge between the child and their parents, friends, or other trusted adults; encouraging secret-keeping; or moving communication to more private, unmoderated channels.
- Normalising Sexual Talk / Boundary Pushing: Gradually introducing sexual themes into conversations, making sexualised jokes or comments, asking intrusive questions about a child’s private life or body, or testing boundaries to see what the child will tolerate.
- Requesting Inappropriate Images/Information: Pressuring a child to send suggestive (but not necessarily explicit by local CSAM definitions) photos of themselves, or detailed personal information about their routines or location.
- Coercion or Threats: Using manipulation, emotional blackmail, or threats to get a child to comply with requests or to keep the relationship secret.
- Attempting to Meet Offline: Suggesting or arranging to meet a child in person.
Other Exploitative Content (Non-CSAM context):
- Discussions or sharing of content (e.g., drawings, fictional stories, AI-generated images that are not CSAM) that sexualise children or promote/normalise sexual attraction to children.
- Accounts seeking or offering to connect adults with children for sexual purposes.
- Content that provides instructions or encouragement for child sexual exploitation.
Key Questions for Assessment (a "yes" to any requires IMMEDIATE ESCALATION to Service Admins & Authorities; a minimal routing sketch follows the list):
- “Is an adult account displaying grooming behaviours towards an account identified or suspected to be a child?”
- “Is an account attempting to sexually coerce or solicit a child?”
- “Is content being shared that, while not CSAM, clearly sexualises children in an exploitative manner or promotes harm to children?”
- “Are there any indicators suggesting a child is at immediate risk of harm?”
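Where moderation tooling supports triage checklists, the rule above can be encoded directly: any single affirmative answer routes the case straight to the service administrator and authorities, with no further weighing. Here is a minimal sketch in Python; the flag names are hypothetical and do not reflect any specific platform's API.

```python
# A minimal sketch of the "any yes means escalate" rule. Flag names are
# hypothetical; the assessment itself is a human judgment, and this only
# encodes the routing decision.
ASSESSMENT_FLAGS = (
    "grooming_behaviour_towards_child",
    "sexual_coercion_or_solicitation_of_child",
    "exploitative_sexualised_content_of_children",
    "child_at_immediate_risk",
)

def requires_immediate_escalation(answers: dict[str, bool]) -> bool:
    """Return True if any single assessment question was answered 'yes';
    that alone triggers escalation to the service administrator and,
    through them, to the authorities."""
    return any(answers.get(flag, False) for flag in ASSESSMENT_FLAGS)

# Example: one affirmative answer is sufficient.
assert requires_immediate_escalation({"child_at_immediate_risk": True})
```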
Before You Act: Common Pitfalls & Nuances
Do NOT Investigate Independently: Moderators should never attempt to “investigate” suspected grooming or CSEA themselves. This can alert perpetrators, destroy evidence, or re-traumatise victims. Your role is to identify red flags and immediately report to your service administrator and, through them or established channels, to law enforcement and/or specialist child protection agencies (e.g., NCMEC, Internet Watch Foundation, CEOP).
Confidentiality of Reporter/Victim: Maintain strict confidentiality if a child or concerned party reports such activity.
Preservation of Evidence is for Experts: While noting account names and relevant posts is crucial for your report to the service admin, the detailed forensic preservation of evidence is for law enforcement.
Common Gotchas (avoided by following correct procedure):
- Delaying reporting to authorities: Any credible suspicion requires immediate next steps towards official reporting.
- Confronting the suspected perpetrator: This can be dangerous and counter-productive.
- Making assumptions about a child’s age or maturity: All children are vulnerable.
Key Point: Any suspicion of grooming or CSEA involving a child requires immediate escalation through your Service Administrator to law enforcement and/or designated child protection agencies. There is no room for independent moderator action beyond this critical reporting chain.
Managing Suspected CSEA: Key Steps
The primary “management” by moderators is reporting and preserving initial information for escalation.
- IMMEDIATE REPORT TO SERVICE ADMINISTRATOR: If you encounter any account or content that raises suspicion of CSEA as defined here (grooming, non-CSAM sexualised content of children), immediately report all details to your service administrator or a designated safety officer. This is your first and most crucial step.
- Document Initial Observations: Securely and confidentially note the username(s) involved, relevant post URLs (if applicable), dates, times, and a brief description of why you are concerned. Provide this information to your administrator (a minimal example record is sketched after this list). Do not download or store any potentially illegal or harmful material yourself.
- Service Administrator Actions (Essential): The Service Administrator (or designated safety personnel) MUST:
  - Preserve evidence according to legal best practices (often involving server-side data).
  - Report the incident to the appropriate national law enforcement agency specialising in child exploitation (e.g., National Center for Missing and Exploited Children – NCMEC in the US, Internet Watch Foundation – IWF in the UK, or national police cybercrime units) and any other legally mandated bodies. This is often a legal requirement. Use the CSAM Reporting Requirements page to find the appropriate entity.
  - Take steps to restrict the offending account(s) from the platform to prevent further harm, in consultation with law enforcement if an investigation is active.
- Ban Offending Accounts (Following Due Process/LE Guidance): Once confirmed by internal review and/or as guided by law enforcement, permanently ban accounts involved in CSEA.
- Cooperate Fully with Law Enforcement: Ensure all relevant information is provided to investigators.
- Support a Reporting Child (with extreme care): If a child reports directly, ensure they are listened to respectfully, assured it’s not their fault, and that steps are being taken to help. Immediately get specialist child protection services involved through your service administrator’s report to authorities. Do not try to counsel or interview the child yourself beyond initial information gathering for the report.
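To make the documentation step concrete, here is a minimal sketch of a confidential hand-off record, assuming a hypothetical structure (field names are illustrative, not a real reporting API). It captures only references and a short rationale, consistent with the rule above: never download or store the material itself.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CSEAEscalationRecord:
    """Confidential hand-off record from moderator to service administrator.
    Stores only references (usernames, URLs, timestamps) and a short
    rationale; never the content itself."""
    reporting_moderator: str        # who is escalating
    accounts_involved: list[str]    # usernames only, no profile copies
    post_urls: list[str]            # links to content, not the content
    observed_at: datetime           # when the behaviour was observed
    concern_summary: str            # brief description of the red flags
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Example: filled in by the moderator, then passed directly to the
# service administrator / designated safety officer.
record = CSEAEscalationRecord(
    reporting_moderator="mod_jane",
    accounts_involved=["suspect_account"],
    post_urls=["https://example.org/post/12345"],
    observed_at=datetime(2024, 5, 1, 14, 30, tzinfo=timezone.utc),
    concern_summary="Adult account pressing a suspected child account "
                    "into private, secret conversations.",
)
```

Keeping the record reference-only reflects the division of labour above: detailed evidence preservation belongs to the service administrator and law enforcement, not the moderator.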
Example Community Guidance
Strike System: “Any activity related to Child Sexual Exploitation and Abuse, including grooming or sharing sexualised content of children, bypasses all warning or strike systems and will result in immediate, permanent bans and reporting to law enforcement.”
General Prohibition: “The safety of children is our highest priority. Any form of Child Sexual Exploitation and Abuse (CSEA), including but not limited to online grooming, attempting to solicit or coerce children into sexual activity, or the sharing of any content that sexually exploits or endangers a child, is absolutely prohibited and will be reported to law enforcement and relevant child protection agencies.”
Strict Enforcement: “We maintain a zero-tolerance policy for CSEA. Confirmed instances will result in immediate permanent bans, preservation of evidence for, and reporting to, national and international law enforcement and child protection agencies. We are legally and morally bound to take all necessary actions to protect children and cooperate with authorities.”