EXPLAINED

From consent to breach alerts: An explainer on how India’s new privacy law affects you

Chinmay Chaudhuri

November 16, 2025

The DPDP Rules, 2025 strengthen user data rights with clearer consent, tighter security, and mandatory breach notifications, but their full impact will depend on enforcement once all provisions take effect

Aisha, a 28-year-old software engineer, starts her day by unlocking her phone, browsing social media, making payments through fintech apps, and checking emails. Until now, she has rarely paused to think about where all her personal data goes, whether it is being protected, or how many organisations have copies of her identity, behaviour patterns, financial trail, or location history.

With the formal notification of the Digital Personal Data Protection (DPDP) Rules, 2025, this everyday digital routine is about to change. Although only parts of the law are currently in force, its direction marks the biggest shift in how citizens’ data is treated in India since the internet became mainstream.

In August 2023, the Digital Personal Data Protection Act received presidential assent. The rules now notified place India much closer to having a fully functional privacy regime, eight years after the Supreme Court held privacy to be a fundamental right. While the Act is already on the statute book, many of the highest-impact protections — such as requiring informed consent before data processing and obliging companies to notify users of data breaches — will come into effect only after a transition period of 12 to 18 months. For individuals like Aisha, that means the privacy landscape has entered a transformation phase rather than an overnight makeover.

The first clear impact of the law is greater transparency in how personal data is collected. Companies, called “data fiduciaries” in the DPDP framework, must provide users with a clear, standalone notice before processing their information. Instead of hiding terms deep inside lengthy agreements, platforms will be expected to explain, in simple language, exactly what information they want, why they need it, and what goods, features, or services will depend on that data.

The law deliberately pushes companies to move away from vague wording like “data may be used to improve user experience”, and towards precise justification for each category of information they collect. This means Aisha should soon be able to understand exactly why a messaging app wants her contacts, why a ride-hailing app wants her location at all times, or why streaming platforms request device identifiers.

Penalty Structure

While transparency is core, security is the other big shift. Data fiduciaries must implement reasonable security safeguards to protect personal information. That includes encryption, access control, monitoring for unauthorised access, and secure backups. The penalty structure has been designed to deter negligence rather than simply punish misconduct. If a company suffers a breach and did not implement adequate safeguards, the financial consequence could go as high as ₹250 crore.

Though this will not prevent data breaches entirely, the cost of ignoring security has been raised so sharply that corporate behaviour is expected to change. Perhaps even more important is the requirement that individuals must be informed “without delay” if their data has been compromised. The notification must include what was breached, how the breach occurred, when, and what consequences might follow, along with actions the organisation is taking to mitigate risks.

For Aisha, who may have experienced anxiety during previous breaches where companies stayed silent, this promises a corrective shift in power — she can take steps to protect herself immediately rather than wait for rumours or media reporting.

The data of children receives special attention in the rules. Platforms must obtain verifiable parental consent before processing children’s data. The government has not imposed a specific verification method, leaving flexibility for companies to decide how to implement it, something tech giants had earlier argued would be necessary to make the rule workable in practice.

Behavioural tracking and targeted advertising for minors are largely prohibited. However, limited personalisation is allowed for safety and content filtering, acknowledging that a 13-year-old’s digital environment cannot be made safe without some level of content evaluation. For Aisha, this means that if her younger sibling uses a social platform, advertisements cannot be profiled based on their viewing habits, but the platform may still monitor harmful content to shield the child.

Data Fiduciaries

Another structural shift results from the classification of “significant data fiduciaries.” The government will designate some companies — typically entities handling large volumes of data or particularly sensitive data, or posing potential risks to sovereignty, democracy, or national security — into this category. Tech majors such as Google, Meta, Apple, Microsoft, and Amazon are expected to be included. These companies face additional obligations, including possible restrictions on transferring certain categories of personal and traffic data outside India.

This effectively embeds a form of data localisation, which global technology firms have consistently resisted but which the government sees as a sovereignty requirement. In practical terms, this may lead to more data belonging to users like Aisha being processed and stored on Indian soil rather than moving freely across servers outside the country.

One key institutional change has already taken effect: the establishment of the Data Protection Board of India (DPB). The Board will act as the adjudicatory body to ensure compliance, handle complaints, and impose penalties. It may also accept voluntary undertakings and enable resolution processes that do not require full-scale litigation.

What this means for individuals is that there is a dedicated grievance mechanism for data misuse or mishandling. If a company misuses Aisha’s information or fails to respond to her concerns, she will be able to escalate the matter to a statutory authority rather than having to navigate legal action or consumer forums.

Ripple Effects

Alongside these direct benefits, there are larger ripple effects. Perhaps the most immediate is increased trust in digital services. India is one of the world’s fastest-growing internet economies, but mistrust of how apps treat data has also grown. For fintech, health-tech, and ed-tech platforms, privacy concerns have sometimes slowed adoption. Stronger protections and clearly defined responsibilities could make people more confident in using online services, particularly when sharing sensitive data.

At the same time, the ecosystem faces challenges. Compliance will not be cheap, especially for smaller startups that rely heavily on data-driven analytics. As the transition unfolds, some companies may raise prices or restrict features to manage compliance costs. The innovation curve may initially slow, not because innovation becomes impossible, but because companies will have to redesign products with privacy in mind instead of collecting everything first and worrying about implications later. This is the exact cultural change the law aims to trigger — one in which privacy is not an add-on feature but the starting point of digital design.

There are also political and civic implications. The Act has drawn criticism for giving wide exemptions to government agencies. Personal data of citizens can still be processed by the State for reasons such as national security, public order, or relations with foreign states. In such cases, the usual consent architecture might not apply, and individuals may not know when their data is accessed. In addition, the amendment restricting disclosure of public officials’ personal information under the Right to Information Act has alarmed transparency advocates. While this does not directly infringe on Aisha’s data rights, it reduces her access to information when public accountability intersects with personal details of government officials, and it raises broader questions about whether privacy protections are being applied evenly to State and non-State actors.

Implementation Challenges

India now stands at a critical inflection point. On one hand, the DPDP Rules, 2025 give ordinary citizens a genuine pathway to agency over their digital lives. For people like Aisha, the benefits include clearer choices, better security, timely information after breaches, stronger grievance redressal, and eventually, meaningful consent. On the other hand, some provisions are delayed, exemptions for the government remain controversial, and enforcement will determine whether the law becomes a powerful protector or merely a symbolic one.

When the 12- to 18-month transition period ends and the full suite of protections becomes operational, India could effectively move from a market where data is freely extracted to one where data is treated as a right held by the individual.

The evolution won’t be friction-free; companies will need to change habits, regulators will need capacity, and users will need awareness. But for the first time, digital privacy in India has a legal foundation with teeth, consequences, and oversight. And that means that Aisha, and millions like her, may soon navigate the internet with both convenience and dignity — not because platforms choose to respect privacy, but because the law compels them to.