
Child Safety Policy

Voko is committed to creating a safe, respectful, and inclusive online environment for all users, especially minors. We have a zero-tolerance policy against any form of child sexual abuse, exploitation, harm, or sexualization. Any behavior, content, or activity that violates this policy will result in permanent account deletion and may be reported to law enforcement and relevant authorities.

1. Age Restrictions, Eligibility, and Age Verification

  • Minimum Age Requirement: Voko is strictly for users aged 18 to 74. You must be at least 18 years old to create an account and use Voko. This platform is not intended for anyone under the age of 18 ("minors").
  • Prohibition of Minor Use and Impersonation: Minors are strictly prohibited from using the Voko platform in any form. Submitting false age information, impersonating an adult, or creating an account on behalf of a minor will result in immediate account suspension or permanent termination.
  • Age Verification and Monitoring: Voko employs a combination of automated systems and human review to identify minor use, potential child sexual abuse and exploitation risks, and other policy violations. Attempting to circumvent these systems is a serious policy violation.

2. Prohibited Behaviors and Content

All users are prohibited from engaging in or assisting in any of the following activities involving minors (anyone under the age of 18):

2.1 Child Sexual Abuse and Exploitation (CSAE)

  • Sharing, soliciting, producing, or distributing material involving sexual acts or abuse with minors.
  • Engaging in conversations, discussions, or "role-playing" that sexualizes children, even without explicit images or direct contact.
  • Using gestures, symbols, emojis, virtual filters, clothing, or any other features to depict, refer to, or target minors in a sexual or suggestive manner.
  • Attempting to approach, groom, solicit, or entice minors through the application.

2.2 Child Sexual Abuse Material (CSAM)

  • Uploading, storing, distributing, or linking to any type of CSAM.
  • Sharing AI-generated, deepfake, or synthetic content that sexualizes minors.
  • Sharing, creating, or distributing any form of child nudity or sexualized content, including artistic or digital depictions such as cartoons, drawings, or animations.

2.3 Other Harmful or Dangerous Behaviors

  • Threatening, psychologically manipulating, bullying, or harassing minors.
  • Inciting or depicting physical violence, neglect, abandonment, or trafficking of minors.
  • Soliciting, luring, or sharing personally identifiable information of minors.

Such content or behavior will result in immediate content removal and permanent account ban, and serious cases will be directly referred to law enforcement.

3. Safety Identification, Monitoring, and Review Mechanisms

3.1 Automated Screening and Detection

  • Technical scanning of user-uploaded images, video streams, and profiles to identify inappropriate content potentially involving minors.
  • Screening keywords and patterns in chat messages, profiles, and feeds to identify suggestive, inappropriate, or harmful language.
  • Triggering security alerts based on user behavior analysis (e.g., unusual login patterns, frequent changes of identity information, association with known risky accounts).

3.2 Manual Review and Auditing

  • The safety team conducts manual reviews of all content flagged by the system or reported by users.
  • Regular random sampling audits of publicly available platform content to ensure compliance with safety standards.
  • Enhanced monitoring and review of accounts with a confirmed history of child safety violations.
  • Once a violation is verified, we take immediate action, such as content deletion and account banning, and determine whether the case must be reported to the authorities.

4. Platform Security and Reporting Features

4.1 Dedicated Reporting Channel

Priority Reporting Category: "Underage" appears as a priority category in the in-app reporting menu so that users can identify and select it quickly. We strongly encourage every user to act responsibly and report any harmful or suspicious behavior.

Core Reporting Process:

  1. Tap the "Report" button in the chat window, on the user's profile, or in the application settings.
  2. Select "Underage" as the reporting category.
  3. Describe the issue in as much detail as possible, and attach evidence such as screenshots, chat logs, and timestamps to support our investigation.

Direct Contact: Users can also contact our security team directly by sending an email to service@voko.chat.

4.2 External Reporting Resources

If you encounter content involving child sexual abuse, we also recommend reporting it directly to:

  • National Center for Missing and Exploited Children (NCMEC)
  • Contact your local hotline through the INHOPE network

5. User Awareness, Transparency, and Education

5.1 Enhancing Security Awareness

  • Policy Accessibility: A banner is placed on the application's launch page or in a prominent location, directly linking to this policy.
  • Proactive Reminders: When users create content, the system will display warnings as needed, reminding users to comply with child protection rules.

5.2 Preventive Education

  • Running safety-themed activities within the application, displaying educational banners, and promoting child safety principles.
  • Regularly updating and promoting community safety guidelines to align with evolving global online child protection standards and best practices.

6. Penalties for Violations and Accountability

6.1 Prohibited Individuals

Anyone known to have been convicted of crimes against children (including but not limited to sexual abuse, exploitation, violence, or trafficking) is strictly prohibited from using Voko. Any such individual who is discovered will be permanently banned.

6.2 Enforcement and Cooperation

  • Platform Penalties: Voko takes swift and decisive action against any violations, including content removal and permanent account bans.
  • Legal Reporting: All verified serious violations, especially those involving CSAM, will be reported to NCMEC and/or other relevant law enforcement agencies as required by law.
  • Cooperation with Investigations: Voko will fully cooperate with relevant investigations conducted by law enforcement agencies worldwide as required by law.

7. Our Ongoing Commitment to Safety

Protecting children's online safety is a continuous and dynamic responsibility. Voko is committed to continuously investing resources to evaluate and improve our safety policies, technological safeguards, review processes, and educational initiatives. We aim not only to deter violations through punishment but also to cultivate a culture of safety awareness, collective responsibility, and shared protection throughout the community, maintaining a safer digital space for everyone.