Tech Industry Group Criticizes India’s Proposed Rules Against Dark Patterns

India’s proposed guidelines to regulate dark patterns, deceptive practices used by online companies to manipulate consumers, have faced opposition from an industry group representing major U.S. tech companies. The Asia Internet Coalition, which includes Apple, Google, Meta (formerly Facebook), Amazon, and X (formerly Twitter), argues that the move would hinder the growth of India’s digital economy and result in “regulatory overlap” with existing laws.

Key Takeaway

The Asia Internet Coalition, representing major U.S. tech companies, opposes India’s proposed rules against dark patterns, warning that they would create “regulatory overlap” and hinder the growth of the digital economy. The industry group suggests relying on the existing self-regulatory framework and adopting an approach similar to the European Union’s. The Indian government acknowledges the concerns but emphasizes the need to address deceptive practices and protect consumers.

Opposition to Proposed Guidelines

The Indian government released draft guidelines last month seeking to prevent and regulate the use of dark patterns by online companies. The guidelines were open for public consultation until October 5 to gather stakeholder feedback. However, the Asia Internet Coalition has argued that the existing self-regulatory framework should remain the primary means of addressing dark patterns, since online platforms in India are already regulated under various existing laws.

In a detailed note to the consumer affairs department, which released the draft guidelines, the industry group argues that introducing a separate regulatory framework could create uncertainty and compliance challenges. It pointed out that online platforms in India are already regulated as online intermediaries under the Information Technology Act 2000, that e-commerce platforms are governed by rules under the Consumer Protection Act 2019, and that sector-agnostic obligations are covered by the recently enacted Digital Personal Data Protection Act 2023.

Recommendations and Concerns

The Asia Internet Coalition recommended that India consider adopting an approach similar to that of the European Union, which is also working on regulating dark patterns. It suggested that if a separate framework is introduced, it should be sector- and medium-agnostic and apply to both offline and online content and advertisements.

The industry group also requested a sufficient buffer period between the publication and implementation of the rules. They emphasized the need to protect the safe harbor provisions available in the Information Technology Act and suggested that online intermediary platforms, including e-commerce marketplaces, should not be held responsible for dark patterns present in third-party content and advertisements.

In their note, the Asia Internet Coalition also urged the consumer affairs department to define the term “endorser” in the rules to address disguised advertisements, including endorsements from influencers and celebrities.

The Indian Government’s Response

The Indian government acknowledged the concerns raised by the industry group but emphasized that dark patterns are a growing concern that requires proactive handling. The government had previously consulted stakeholders, including e-commerce platforms, on the issue and formed a task force to address the deceptive practices.

India has a large internet user base and is a crucial market for global online platforms. As its online population continues to grow, the government is introducing more regulations to protect consumers. The upcoming Digital India Act aims to address concerns related to dark patterns, alongside rules on cybersecurity, data management, and emerging technologies such as artificial intelligence and blockchain.
