Child Safety Standards

Last updated: February 2026

1. Our Commitment

mitis is developed and operated by Neosolus, a company registered in France. We are committed to maintaining a safe environment for all users and have a zero-tolerance policy toward child sexual abuse and exploitation (CSAE) of any kind on our platform.

Company Information:
Neosolus
148 Rue du 8 Mai 1945
92000 Nanterre, France
D-U-N-S®: 269969013

2. Age Requirements

mitis is not intended for children under 13 years of age (or under 16 in certain EU jurisdictions, in accordance with local law). By creating an account, users represent that they meet the minimum age requirement.

If we become aware that a user is under the minimum required age, we will promptly suspend the account and delete all associated personal data.

3. Prohibited Content and Conduct

The following is strictly prohibited on mitis:

  • Child sexual abuse material (CSAM) of any kind, including images, illustrations, or text-based content.
  • Solicitation, grooming, or any attempt to exploit minors.
  • Sharing personal information of minors without parental consent.
  • Any content that sexualizes or endangers minors.
  • Any communication intended to lure, deceive, or manipulate a minor.

Zero-Tolerance Policy

Any account found to be in violation of these standards will be immediately and permanently banned. All associated content will be removed from the platform and preserved as evidence for law enforcement.

4. Detection and Prevention

We employ the following measures to prevent CSAE on our platform:

  • Content moderation: We review reported content and accounts to identify and remove prohibited material.
  • User reporting: All users can report concerning content or behavior directly within the app. Reports are reviewed within 24 hours.
  • Account enforcement: Accounts that violate our child safety standards are permanently banned and cannot be reinstated.
  • Limited content types: mitis is a mood-sharing app — users share emotional states, not photos, videos, or free-text posts, which significantly limits the attack surface for CSAE content.

5. Reporting to Authorities

When we identify or receive a report of potential CSAE material or activity, we take the following actions:

  • Immediately remove the content and suspend the account.
  • Preserve all relevant data and evidence for law enforcement.
  • Report the incident to the appropriate authorities, including the National Center for Missing & Exploited Children (NCMEC) via the CyberTipline, as well as applicable local law enforcement agencies.
  • Cooperate fully with law enforcement investigations.

6. How to Report

If you encounter any content or behavior on mitis that you believe involves the exploitation or endangerment of a child, please report it immediately:

  • In-app: Use the reporting feature available on any user profile or mood.
  • Website: Use our contact form and select "Report a safety concern."
  • Email: Contact us at safety@mitis.app

All reports are treated with the highest priority and confidentiality.

7. Staff Training and Responsibility

All team members involved in content moderation and user support are trained to recognize and respond to potential CSAE incidents. Our team is required to:

  • Escalate any suspected CSAE content or behavior immediately.
  • Follow established procedures for evidence preservation and reporting.
  • Maintain strict confidentiality regarding ongoing investigations.

8. Regular Review

We regularly review and update our child safety standards, moderation practices, and detection measures to ensure they remain effective and aligned with evolving best practices, legal requirements, and platform guidelines from Google Play and the Apple App Store.

9. Contact

For questions or concerns about our child safety standards, or to report an incident, please contact our designated safety team:

Neosolus — Safety Team
148 Rue du 8 Mai 1945
92000 Nanterre, France
Email: safety@mitis.app