Safety Standards Against Child Sexual Abuse and Exploitation

Last Updated: January 2026

Our Commitment

At Nukoko, we are absolutely committed to maintaining a safe environment for all users. We have zero tolerance for child sexual abuse and exploitation (CSAE) material and take our responsibility to protect children extremely seriously.

This document outlines our comprehensive standards, procedures, and systems designed to prevent, detect, and respond to any potential CSAE content or behavior on our platform.

If you encounter any content or behavior that involves or exploits children, please report it immediately through our in-app reporting system or email us at safety@nukoko.com.


Zero Tolerance Policy

Nukoko has an absolute zero tolerance policy for child sexual abuse and exploitation material (CSAEM).

The following are strictly prohibited and will result in immediate account termination and law enforcement notification:

  • Child sexual abuse material (CSAM) in any form, including photos, videos, drawings, or text
  • Content depicting, encouraging, or promoting the sexualization of minors
  • Content that grooms, solicits, or attempts to exploit children
  • Content that facilitates or coordinates child sexual abuse or exploitation
  • Content that normalizes or glorifies child sexual abuse
  • Any trading, buying, or selling of CSAM

Users who violate these policies will be:

  • Immediately and permanently banned from Nukoko
  • Reported to the National Center for Missing & Exploited Children (NCMEC)
  • Reported to local and international law enforcement agencies
  • Subject to criminal prosecution

Detection & Prevention Systems

Proactive protection measures

We employ multiple layers of technology and human review to prevent and detect CSAE material:

1. Automated Content Scanning

  • PhotoDNA technology: All uploaded images and videos are scanned using Microsoft's PhotoDNA hash matching against known CSAM databases
  • AI-powered detection: Machine learning models trained to identify potentially problematic content
  • Metadata analysis: Examination of file metadata for suspicious patterns

2. Upload Prevention

  • Content flagged by PhotoDNA is automatically blocked from upload
  • Users attempting to upload flagged content are immediately investigated
  • Accounts showing patterns of attempting to upload prohibited content are automatically suspended
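As an illustration only, the hash-matching gate described above can be sketched as a blocklist check at upload time. This is a hypothetical sketch, not Nukoko's actual implementation: PhotoDNA is a proprietary perceptual-hash technology licensed from Microsoft, and real blocklists are supplied by clearinghouses such as NCMEC. A plain SHA-256 digest stands in for the perceptual hash here.

```python
import hashlib

# Hypothetical blocklist of known-bad file digests. Production systems
# match perceptual hashes (e.g. PhotoDNA), which tolerate resizing and
# re-encoding; a cryptographic digest is used here purely as a stand-in.
BLOCKED_HASHES = {
    # sha256(b"test"), used as a placeholder "known" hash
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def screen_upload(data: bytes) -> bool:
    """Return True if the upload may proceed, False if it matches the blocklist."""
    digest = hashlib.sha256(data).hexdigest()
    return digest not in BLOCKED_HASHES
```

In this sketch a blocked match would simply reject the upload; as the steps above note, a real pipeline would also open an investigation and suspend accounts showing repeated match attempts.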

3. User Verification

  • All users must be 18+ to create an account
  • Age verification systems to prevent minors from accessing the platform
  • Regular verification checks for suspicious account patterns

Reporting Mechanisms

Multiple ways to report concerns

We provide multiple channels for users and the public to report CSAE concerns:

In-App Reporting

Every post, comment, and profile has a "Report" button. CSAE reports are prioritized and reviewed immediately by our Trust & Safety team.

Email Reporting

Send detailed reports to safety@nukoko.com. These reports are monitored 24/7 and receive immediate attention.

Anonymous Reporting

Users can report content anonymously without fear of retaliation or identification.

Response Time: All CSAE reports are reviewed within 1 hour of submission, 24/7/365.


Content Moderation Team

Trained professionals protecting our community

Our Trust & Safety team consists of trained professionals who:

  • Receive specialized training in identifying CSAE material and grooming behavior
  • Review flagged content immediately with priority given to CSAE reports
  • Receive mental health support to manage the psychological impact of reviewing harmful content
  • Follow strict protocols for preserving evidence and reporting to authorities
  • Operate 24/7 to ensure rapid response to threats

Moderation Process

  1. Automated systems flag potentially problematic content
  2. Human moderators review flagged content
  3. Confirmed violations are immediately removed and preserved as evidence
  4. User accounts are suspended pending investigation
  5. Reports are filed with NCMEC and law enforcement
  6. Accounts are permanently banned and devices are blocked

Law Enforcement Cooperation

Working with authorities to protect children

We work closely with law enforcement and child safety organizations:

NCMEC CyberTipline

We are registered with the National Center for Missing & Exploited Children (NCMEC) and file reports through their CyberTipline for all suspected CSAE incidents.

Data Preservation

When CSAE material is identified, we immediately preserve all relevant evidence including:

  • Content files and metadata
  • User account information
  • IP addresses and device information
  • Activity logs and communication records

Cooperation with Investigations

We respond promptly to law enforcement requests and provide full cooperation with investigations, including:

  • Responding to legal process requests
  • Providing preserved evidence
  • Testifying in court proceedings when required
  • Sharing information with international law enforcement through appropriate channels

User Education & Awareness

Empowering our community to help keep children safe

We believe in educating our community about child safety:

  • Clear Community Guidelines: Our Community Rules explicitly prohibit CSAE material
  • Reporting Education: We teach users how to identify and report concerning content
  • Safety Resources: We provide links to child safety organizations and resources
  • Regular Communications: We send safety updates and reminders to our community

Resources for Help


Questions About Our Safety Standards?

If you have questions about our child safety policies or need to report a concern, please contact our Trust & Safety team: