
How to Report Content

Help us keep Pichr safe by reporting content that violates our policies. This guide explains how to submit reports and what to expect after you report.

Last Updated: 13 November 2025


Quick Report Guide

Step 1: Find the Report Button

  • On any image page, look for the Report button (flag icon)
  • The button is located near the image title or in the action menu

Step 2: Select Report Category

Choose the category that best describes the violation:

  • CSAM (Child Sexual Abuse Material)
  • Terrorism or Violent Extremism
  • Adult Content (not age-restricted)
  • Violence or Graphic Content
  • Harassment or Bullying
  • Copyright Infringement
  • Spam or Misleading Content
  • Other (with explanation)

Step 3: Provide Details

Help us understand the issue:

  • What specific policy is being violated?
  • Why is this content harmful?
  • Any additional context we should know?

Step 4: Submit Report

  • Click Submit Report
  • You’ll receive a confirmation
  • Our team will review within 24-72 hours

Report Categories Explained

🚨 Critical Priority (24-Hour Response)

CSAM (Child Sexual Abuse Material)

What to report:

  • Images depicting minors in sexual situations
  • Sexualized images of children
  • Content that exploits or endangers children

What happens:

  • Immediate escalation to emergency team
  • Content removed within 1 hour
  • User banned permanently
  • Reported to National Crime Agency (NCA)
  • Reported to Internet Watch Foundation (IWF)

Note: Making false CSAM reports is a serious offense and may result in your account being banned.


Terrorism or Violent Extremism

What to report:

  • Content promoting terrorist organizations
  • Recruitment material for extremist groups
  • Instructions for carrying out attacks
  • Glorification of terrorist acts

What happens:

  • Immediate escalation to emergency team
  • Content removed within 1-2 hours
  • User banned permanently
  • Reported to Counter Terrorism Internet Referral Unit (CTIRU)

⚠️ High Priority (48-Hour Response)

Violence or Graphic Content

What to report:

  • Graphic violence, gore, or death
  • Animal abuse or cruelty
  • Self-harm or suicide content
  • Content depicting severe injury

What happens:

  • Review within 48 hours
  • Content likely removed
  • User warned or suspended
  • Repeated violations result in ban

Adult Content (Not Age-Restricted)

What to report:

  • Explicit sexual content not behind age gate
  • NSFW content visible to minors
  • Content with an NSFW score of 0.7 or higher that has not been age-restricted (see the sketch at the end of this section)

What happens:

  • Review within 48 hours
  • Content age-restricted or removed
  • User warned (first time) or suspended (repeat)
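
The 0.7 score is the only concrete threshold stated above, so purely as an illustrative sketch (the function and parameter names below are hypothetical, not part of Pichr), the check for this category amounts to:

    # Hypothetical sketch: when explicit content is reportable as "not age-restricted".
    NSFW_THRESHOLD = 0.7  # threshold mentioned in the policy above

    def is_reportable_adult_content(nsfw_score: float, age_restricted: bool) -> bool:
        """True if the image is explicit but not behind an age gate."""
        return nsfw_score >= NSFW_THRESHOLD and not age_restricted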

📋 Standard Priority (72-Hour Response)

Harassment or Bullying

What to report:

  • Targeted abuse or intimidation
  • Doxing (sharing private information)
  • Threats or hate speech
  • Coordinated harassment campaigns

What happens:

  • Review within 72 hours
  • Content removed if policy violation confirmed
  • User warned, suspended, or banned depending on severity
  • Multiple reports expedite review

Copyright Infringement

What to report:

  • Unauthorized use of copyrighted material
  • Images you own being used without permission
  • Trademark violations

What happens:

  • DMCA takedown process initiated
  • Copyright holder contacted for verification
  • Content removed if claim is valid
  • Three strikes = account termination

Note: For copyright claims, use our DMCA process for faster resolution.


Spam or Misleading Content

What to report:

  • Mass-uploaded promotional content
  • Phishing or scam content
  • Malware or malicious links
  • Deceptive or misleading images

What happens:

  • Review within 72 hours
  • Content removed if spam confirmed
  • Account may be suspended for repeat offenses

What Information to Include

Be Specific

Good Report:

“This image contains graphic violence showing an injured person bleeding. It should be removed as it violates the violence policy and has no educational or news value.”

Poor Report:

“This is bad and should be removed.”

Provide Context

  • URL or Image ID (if you have it)
  • Description of the violation
  • Policy being violated
  • Why it’s harmful or inappropriate

Multiple Violations

If content violates multiple policies, select the most severe category:

  • CSAM > Terrorism > Violence > Adult Content > Harassment > Spam
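
As a minimal sketch of this rule (the category names come from the ordering above; everything else is illustrative and not a Pichr API):

    # Severity ranking, most severe first, mirroring the ordering above.
    SEVERITY_ORDER = [
        "CSAM",
        "Terrorism",
        "Violence",
        "Adult Content",
        "Harassment",
        "Spam",
    ]

    def most_severe(categories: list[str]) -> str:
        """Pick the single category to select when several policies are violated."""
        return min(categories, key=SEVERITY_ORDER.index)

    # Example: content that is both spam and harassment is reported as Harassment.
    print(most_severe(["Spam", "Harassment"]))  # -> Harassment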

After You Report

What Happens Next?

  1. Confirmation

    • You’ll see a confirmation message
    • The report is logged in our moderation queue
  2. Priority Assignment

    • Critical reports: Immediate escalation
    • High priority: Flagged for review within 24 hours
    • Standard: Added to moderation queue
  3. Review Process

    • Moderators assess the content against policies
    • Context and user history considered
    • Action decided based on severity
  4. Action Taken

    • Content may be:
      • Age-restricted (NSFW content)
      • Removed (policy violation)
      • Left unchanged (no violation found)
    • User may be:
      • Warned (first-time minor violation)
      • Suspended (serious or repeat violation)
      • Banned (critical violation or repeat offender)
  5. Notification

    • You’ll be notified of the outcome
    • Reporter identity kept confidential
    • No contact with reported user

Response Times

Priority     Response Time    Examples
Critical     1-24 hours       CSAM, terrorism
High         24-48 hours      Violence, graphic content
Standard     48-72 hours      Harassment, spam

False Reports

Consequences

Making intentionally false reports is a violation of our Terms of Service:

  • First offense: Warning
  • Repeat offenses: Account suspension
  • Abuse of system: Permanent ban
  • False CSAM reports: Immediate ban + possible legal action

Good Faith Mistakes

We understand that some reports may be made in good faith but turn out to be incorrect:

  • One-off mistakes are not penalized
  • We may provide feedback to help improve future reports
  • Repeated mistakes may result in report limits

Privacy & Confidentiality

Your Identity

  • Your identity is never shared with the reported user
  • Reports are reviewed anonymously by moderators
  • No retaliation is possible through our system

What We See

When you report content, we see:

  • The reported image ID and URL
  • Your report category and description
  • Your account ID (for record-keeping only)
  • Timestamp of report

We do not share this information with anyone except:

  • Law enforcement (if legally required)
  • Regulatory authorities (e.g., Ofcom, NCA, IWF)

Emergency Contact

For immediate threats or time-sensitive reports:

Critical Content (CSAM/Terrorism)

Law Enforcement

If you believe content depicts an immediate threat to life or ongoing abuse:

  1. Call: 999 (UK emergency services)
  2. Report to Pichr: emergency@pichr.io
  3. Follow up: Our team will coordinate with authorities

Contact Support

General Safety Questions

Report Status Inquiry

  • Email: reports@pichr.io
  • Response Time: 48-72 hours
  • Include: Report ID or image ID

Appeals

If you believe your content was incorrectly removed:

  • Email: appeals@pichr.io
  • Response Time: 5 business days
  • Include: Image ID, reason for appeal, supporting evidence

Thank you for helping keep Pichr safe for everyone.