India, with its massive digital user base, has quickly become a hub for mobile applications. In recent years, however, a disturbing pattern has emerged. A growing number of mobile applications developed by companies from across the globe are being downloaded in India, and many of them permit, and even encourage, explicit content and harmful behavior while offering users little accountability. These apps, often marketed as entertainment or social networking platforms, have become breeding grounds for a range of cybercrimes, including sextortion, harassment, and the exploitation of minors. With names such as Toui, Jelmate, Parol, Musse, Chrd, and others, these applications are gaining traction among Indian users, particularly the younger demographic, without facing significant scrutiny or regulation.
The most alarming aspect of these apps is their failure to implement basic age restrictions, their encouragement of anonymous interactions, and the option they give users to skip submitting any personal details. While anonymity can be a feature of privacy-focused apps, it becomes dangerous when it enables users to engage in harmful activities without fear of identification or consequences. The lack of proper safeguards has led to a rise in cybercrimes, many of which disproportionately affect Indian citizens.
This article delves into the nature of these apps, the companies behind them, their impact on Indian society, and the urgent need for intervention by the Indian government, tech giants like Apple and Google, and the global community.
The Dangerous Trend of International Companies Exploiting Indian Users
In a world where mobile technology has evolved at a rapid pace, India has emerged as one of the largest markets for apps. With over 600 million internet users, a large proportion of whom are young and tech-savvy, India has become a prime target for both legitimate and malicious digital companies. While some global apps provide value through social networking, entertainment, and education, others are exploiting gaps in regulation to harm users.
The apps that are spreading vulgarity and cybercrimes are often developed by companies based in various countries, including the United States, Russia, and members of the European Union, among others. Despite their geographic diversity, these companies share one disturbing characteristic: they are not adequately safeguarding Indian users, particularly minors, from exploitation.
Key Apps Contributing to the Issue
- Toui: A mobile app that enables users to engage in live video streams and chat with strangers, Toui has rapidly gained popularity in India. However, the lack of content moderation and age verification means that explicit content is easily accessible. Users can engage in explicit conversations or share indecent material without any oversight. The platform’s failure to restrict adult content has made it a haven for predatory behavior, cyberbullying, and exploitation.
- Jelmate: Originally launched as a social networking and video chatting platform, Jelmate has become notorious for its association with inappropriate behavior and explicit content. With a user base in India, Jelmate’s lack of regulation encourages users to exploit the platform for voyeurism, sexting, and harassment. Many users, particularly teenagers, report being exposed to explicit content or being manipulated into engaging in explicit chats.
- Parol: A video chat and dating app, Parol allows users to connect with strangers without providing sufficient safeguards. Although it claims to offer a space for making friends, the app is increasingly used for explicit acts. Several instances of blackmail and coercion have been reported, with users pressured into sending explicit material under threat of it being shared publicly.
- Musse: A mobile app that enables users to meet new people, Musse has drawn criticism for its lax moderation policies. Like other apps on this list, Musse’s unregulated space encourages individuals to share inappropriate images and videos. The app has become a platform for both harmless interaction and exploitative, harmful behavior.
- Chrd: Another video chat app, Chrd has gained traction in India for its ease of use and its promise of connecting people. Unfortunately, the app’s lack of age verification and content moderation has led to an increase in harmful activities such as online grooming and sextortion.
These applications, though developed by companies outside India, have largely escaped regulation in the Indian market. By failing to ensure that their apps adhere to appropriate content standards and by failing to implement necessary age-restriction mechanisms, these companies are taking advantage of India’s largely unregulated mobile app market. Furthermore, many of these apps sidestep proper vetting by offering users the option to “continue as guest,” bypassing identity verification and making it easier for people to engage in harmful activities without being traced.
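To make the missing safeguard concrete, the following is a minimal sketch, written in Python purely for illustration, of the kind of gate a responsible platform could place in front of chat and live-streaming features: no guest sessions, and no access without a verified date of birth. The `User` fields, the `can_access_chat` function, and the age threshold are hypothetical and are not drawn from any of the apps named above.

```python
from dataclasses import dataclass
from datetime import date

MIN_AGE = 18  # hypothetical threshold; real requirements vary by jurisdiction

@dataclass
class User:
    account_id: str | None        # None represents a "continue as guest" session
    date_of_birth: date | None    # unknown until the user submits and verifies it
    identity_verified: bool = False

def can_access_chat(user: User) -> bool:
    """Illustrative gate: require sign-in, a verified identity, and a minimum age
    before unlocking chat or live-streaming features."""
    if user.account_id is None:
        return False  # guest sessions get no access at all
    if user.date_of_birth is None or not user.identity_verified:
        return False  # unverified accounts stay restricted
    today = date.today()
    age = today.year - user.date_of_birth.year - (
        (today.month, today.day) < (user.date_of_birth.month, user.date_of_birth.day)
    )
    return age >= MIN_AGE

# Example: a guest session is refused; a verified adult account is allowed.
print(can_access_chat(User(account_id=None, date_of_birth=None)))               # False
print(can_access_chat(User("u123", date(2000, 1, 1), identity_verified=True)))  # True
```

Even a check this simple would block the anonymous, unverified access that the apps described above rely on; the point of the sketch is how little engineering effort such a safeguard actually requires.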
How These Apps Target Vulnerable Users in India
India’s youth demographic, particularly the tech-savvy teenagers and young adults, is one of the most attractive segments for app developers. The country’s rapidly increasing internet penetration, coupled with affordable smartphones, means that millions of young people are online at any given moment. This young audience is particularly vulnerable to apps that allow anonymity, lack age restrictions, and, most importantly, encourage the sharing of explicit content.
The anonymity these apps provide makes it easy for people to engage in inappropriate behavior without fear of being caught. Because users can “continue as guest,” they never have to share any personal information, which gives malicious actors a layer of protection when they set out to exploit others. Moreover, many of these platforms are designed to keep users coming back for live interactions and instant gratification, and a growing number of people now use them compulsively, often to the detriment of their mental health.
In many cases, these apps promote harmful behaviors like sexting, blackmail, and online predation, which disproportionately affect minors. Perpetrators often use the apps to manipulate or coerce vulnerable individuals into sharing explicit content, which is then used either to extort money or to shame the victims.
Cybercrimes Fueling the Growth of These Apps
As the use of these apps increases, so too do the cybercrimes they enable. Sextortion, online predation, and cyberbullying have become common occurrences on these platforms.
- Sextortion: One of the most common forms of cybercrime associated with these apps is sextortion. Victims, often unaware of the dangers, are tricked into sharing explicit images or videos of themselves. The perpetrator then threatens to release these materials publicly unless the victim pays a ransom or continues providing explicit content. Sextortion is particularly rampant on platforms where anonymity is encouraged, as it is more difficult for authorities to trace the perpetrators.
- Cyberbullying: The lack of regulation and the unfiltered nature of these apps create environments where bullying and harassment thrive. Perpetrators often target vulnerable users, especially minors, by sending explicit content or making hurtful, threatening comments. The absence of an effective reporting system means that many victims have nowhere to turn for help.
- Online Grooming and Predation: Predators use these apps to build trust with minors, often luring them into engaging in explicit video chats or sharing sexually explicit material. Once the trust is established, the predator uses the threat of releasing these materials to manipulate the victim into sending more explicit content or meeting in person.
The Role of International Companies and App Stores in Spreading Vulgarity
One of the most puzzling aspects of the proliferation of these apps is how they manage to gain access to popular platforms like Google Play and the Apple App Store. Both Google and Apple have content moderation policies for their app stores, but these policies are often not stringent enough to prevent apps with explicit content or harmful behavior from being published. There are several reasons why these apps are able to circumvent regulations and continue to spread harmful behavior:
- Inconsistent Content Moderation: Both Apple and Google rely on automated systems and user reporting to moderate content in their app stores. While these systems work in many cases, they are not foolproof. Some apps, especially newly launched ones, may go unnoticed by moderators until they have gained a significant number of users. Additionally, many apps stay under the radar by using deceptive descriptions or by publishing multiple variations of the same app to evade filters.
- Financial Incentives: Many of the companies behind these harmful apps generate significant revenue, either through in-app purchases, advertisements, or subscriptions. This financial success may incentivize the companies to take advantage of the loose regulations in certain markets, allowing them to bypass scrutiny. Both Google and Apple, as platforms, benefit from the wide range of apps they offer, so they may be reluctant to remove an app that is generating revenue, even if it promotes harmful behavior.
- Lack of Effective Reporting Mechanisms: While both Apple and Google offer users the ability to report harmful apps, the process is often slow and not always effective. Many apps remain on the store for weeks or even months after receiving reports of inappropriate content. This delay in action contributes to the widespread use of harmful apps and the continued exploitation of users.
The Role of the Indian Government and Regulatory Authorities
The Indian government has made significant strides in regulating the digital space, particularly with the introduction of laws such as the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, and more recently, the Personal Data Protection Bill, 2019. However, there is still much work to be done in addressing the issues posed by harmful mobile apps.
- Stricter App Store Regulations: The government should collaborate with Apple and Google to enforce more stringent guidelines for app approval and ensure that all apps available in the Indian market meet basic standards for privacy, safety, and age-appropriate content.
- Enhanced Reporting and Accountability: An effective system for reporting harmful apps should be established, and developers should be held accountable for any violations. Apps that promote harmful content should be swiftly removed, and the companies behind them should face severe penalties.
- Public Awareness Campaigns: The government should launch nationwide campaigns to educate users—especially young people—about the risks of using unregulated apps. Digital literacy programs should include education on identifying safe and legitimate platforms and avoiding apps that promote harmful behavior.
- Collaboration with International Agencies: Given the global nature of the companies behind these apps, India should work with international regulators to ensure that the creators of these apps are held accountable. This could involve sharing intelligence, coordinating investigations, and working together to establish global standards for digital safety.
The proliferation of apps like Toui, Jelmate, Parol, Musse, and Chrd is a growing threat to India’s digital landscape, especially for its young and vulnerable population. These apps, which exploit the anonymity of users and lack proper safeguards, have become breeding grounds for cybercrimes, exploitation, and harmful behavior. The failure of international companies to properly regulate these apps and the lax policies of app stores like Google Play and the Apple App Store have allowed this problem to persist.
It is imperative that the Indian government, tech giants, and international regulators take immediate and effective action to address this issue. Without proper regulation, the digital space in India will continue to be exploited, putting millions of innocent users at risk. The future of India’s digital landscape depends on a safer, more accountable online environment, one that prioritizes user safety and ensures that the internet remains a space for growth and opportunity, not exploitation.
*Disclaimer: We hereby declare that we have conducted a thorough investigation into the matter at hand and, based upon our findings, have compiled this report. The information contained within is a result of our independent research and analysis.