16 Nov

Privacy on the Line: What do Indians think about privacy & data protection?

By Beni Chugh, Varun Aggarwal and Malavika Raghavan, Dvara Research

We met Sulekha[1] in a village in Uttarakhand. She was talking about the information she considered most important to her: her ration card, Aadhaar card, NREGA job card and her phone number. When asked how much she would sell this information for, she visibly withdrew, saying she did not want any money for it. What would it take for her to share this information? She replied simply: a guarantee that it would not be misused.

Sulekha was one of the 50 people we spoke to as part of a small, deeply qualitative study on which the Future of Finance Initiative (FFI) at Dvara Research partnered with Dalberg Design and CGAP. We set out to understand: ‘how do ordinary citizens of India think and act on their privacy and data protection?’ Across four regions of the country (Maharashtra, Uttarakhand, Tamil Nadu and Delhi) we used the Human Centred Design (HCD) method to structure our discussions, so as to understand not just what people say, but how they think, act and feel. The final report on the study is available here.

Our conversations in the field revealed that contrary to common perception, people in India care deeply about their personal data and privacy. Respondents were surprised that service providers could share their personal information with third parties and wanted to be informed of such sharing. People were also sensitive about sharing their personal data such as photos, messages and browsing histories—even with their family—and were unwilling to sell certain types of personal data like their telephone numbers.

Even the data that they were willing to share in order to receive services came with conditions. People wanted to know how their data was handled. They also, much like Sulekha, wanted an assurance from providers that no harm would come to them through the use of their data. Many of the interviewees recognised their inability to understand standard notice clauses and wanted more visual forms of consent that they could easily understand without relying on others.

Alarmingly, most interviewees had experienced fraud (especially via phone impersonators), and did not know how to protect themselves or seek redressal. Women, in particular, were highly vulnerable to reputational harms, and self-censored (for example by not sharing phone numbers or photos) to protect themselves.

Although the government and its institutions inspired universal trust, people working in government institutions were not trusted with personal data – unless the employees came from the same community group or geographic area. Agents of banks and mobile network providers were also recognised as common perpetrators of illicit disclosures of personal data.

In cases where harm was caused to them as a result of a data breach, respondents wanted easily accessible avenues for redressal, and wanted to be compensated fully.

We heard individuals asserting their right to have their personal information treated responsibly. They indicated clear and strong preferences for a system that provides them agency and control over their data. Citizens at the grassroots want a data protection regime where providers are held accountable and are obligated to treat personal data responsibly.

You can read the full report here and watch the video on the study below.


[1] Name changed.

25 Aug

The Right to Privacy Judgment: Initial Reflections on Implications for Digital Financial Services

By Malavika Raghavan, IFMR Finance Foundation

The Supreme Court of India’s judgment on the fundamental right to privacy yesterday, 24 August 2017, speaks directly to the sweeping changes we are witnessing in the way that the State and private companies use citizens’ personal data. The collection and aggregation of individuals’ data to inform the entire chain of any welfare or commercial service provision is now de rigueur. In recent years, finance has become the poster child of this opportunity to use data: for first-time users of formal finance to be identified and diligenced; for products to be designed around their needs; for their digital and social information to stand in where they have no assets to back their promises to repay credit. Nowhere is this trend more alive than in India, and nowhere are the risks writ as large. In the last two years we have seen a billion Indian mobile subscriptions and a billion Aadhaar numbers, with over 67 crore bank accounts linked to Aadhaar numbers for Direct Benefit Transfers (DBT) among other services. We have also witnessed over 3.2 million individuals’ financial information being compromised by PoS/ATM malware, the potential for stored biometrics to be used in unauthorised authentications, and for unauthorised entities to access citizens’ personal data for eKYC purposes.

If the direction of travel is towards a more digital world, what are our protections and how should we think about regulating data in our country? The judgments in Justice K S Puttaswamy & Anr v. Union of India & Ors have laid down some touchstones to anchor how we navigate these questions in the years ahead. This post first picks out some key messages from the judgment (especially around informational privacy, which has special relevance for the use of personal data in retail finance) and then presents initial reflections on implications for financial services.

Privacy is recognised as an inalienable, natural right situated across our fundamental rights

This judgment—coming to the Court as it does, as a result of cases challenging the legality of the Aadhaar project—grounds its reasoning within the context of the world we find ourselves in today. Technology is now part of our lives in a way that could not have been imagined when the Indian republic was formed 67 years ago. However, the principles on which we have founded our republic have continued relevance precisely because they guide us towards solutions for the intractable problems of our time.[1] Taking stock of this, the Supreme Court has confirmed that privacy is a constitutionally protected right that emerges primarily from the guarantee of life and personal liberty in Article 21 of the Indian Constitution, and also from a whole raft of fundamental rights contained in Part III of the Constitution.[2]

The Court has tied the right to privacy back to the basic values to which the Constitution and Indian society aspire. These are given voice in the Preamble, among other parts of the Constitution. Across all six judgment texts delivered by the nine judges of the bench, certain values are seen as inherent in, and intertwined with, individual privacy.

Privacy is seen as a postulate of human dignity, and an essential part of individual liberty. Privacy enables individual autonomy. Indeed, it is seen as lying across the spectrum of protections—for instance, by keeping certain aspects of individuals private, it prevents the state from discriminating between citizens (and infringing the right to equality). The Court has also noted that privacy has both subjective and objective elements: subjectively, the expectation of individuals (where they so desire) to be left alone; and objectively, those constitutional values that shape a protected zone where the individual ought to be left alone.[3]

In Puttaswamy, the Court has made several important observations about the nature and content of privacy protections which will no doubt be the subject of scholarship and interpretation for years to come. But two observations in particular merit the attention of those working to improve access to finance for the underserved. Firstly, the Court refuses any notion of a trade-off between individual freedoms and development. The view in Kesavananda Bharati[4] is reiterated: Parliament cannot abrogate the essential features of the individual freedoms secured to citizens of India. Our Constitution does not take the perspective that in order to build a welfare State, it is necessary to destroy some human freedoms. Indeed, to quote: “Our constitutional plan is to eradicate poverty without destruction of individual freedoms.”[5]

Secondly, and crucially for those of us tracking the use of personal data in financial services, individuals’ informational privacy is now firmly within the protection of fundamental rights.

Informational privacy is part of our expectation of privacy as Indians

Informational privacy, i.e. the interest in limiting or controlling access to information about ourselves, is dealt with in the lead Puttaswamy judgment by Chandrachud, J, which devotes an entire section to it.[6] The Court takes note of the way in which technology has changed our lives, the digital trails we leave behind as we transact online, and the aggregation of these data points to reveal things about us that we may not expressly disclose. It notes the use of cookies to track online behaviour, the collection of users’ browsing histories, and other tools like automated content analysis of emails, which can be combined with algorithms to profile individual users. The Court notes that the use of data mining techniques, Big Data and the possibility of database linking essentially allow for aggregation of data about every single person in a manner previously not encountered.
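
To make the Court’s observation concrete, here is a minimal, hypothetical sketch of such database linking: three data trails that are innocuous in isolation are joined on a shared identifier to yield a profile that no single source disclosed. All names, fields and records below are invented for illustration.

```python
# Hypothetical illustration of database linking: joining separate data
# trails on a shared identifier to build a profile. All data is invented.
import pandas as pd

browsing = pd.DataFrame({
    "user_id": [101, 102],
    "top_search_topic": ["diabetes treatment", "holiday loans"],
})
payments = pd.DataFrame({
    "user_id": [101, 102],
    "avg_monthly_spend_inr": [4200, 18500],
})
telecom = pd.DataFrame({
    "user_id": [101, 102],
    "home_location": ["Dehradun", "Chennai"],
})

# Two joins aggregate the trails into a single profile per person.
profile = browsing.merge(payments, on="user_id").merge(telecom, on="user_id")
print(profile)
# Each row now combines health interests, spending power and location --
# more about the person than any one source held on its own.
```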

Given this context, the Court notes the important role of data protection laws in safeguarding the privacy and autonomy of an individual, and ensuring non-discrimination on the basis of racial or ethnic origin, political or religious beliefs, genetic or health status or sexual orientation. The Court has recognised that a good data protection law will need to delicately balance the complex issues between individuals’ privacy interests and legitimate concerns of the state.

Para 180 of the leading judgment by Chandrachud, J contains a three-fold prescription that acts as important guidance when considering how privacy might be safeguarded, by ensuring:

  • that there must be a law: A law is needed to justify any encroachment on privacy, to fulfil the requirement in Article 21 of our Constitution that no deprivation of liberty can be undertaken except by a procedure established by law;
  • that the law must be reasonable: Such a law must fall within the zone of reasonableness as required by Article 14, as a guarantee against arbitrary state action;
  • that the law must be proportional: Any encroachment on individual privacy must be proportionate to the object and needs sought to be fulfilled by such a law.

Kaul, J in his remarks presents the test of proportionality and legitimacy for limiting the state’s discretion: an action must be sanctioned by law, necessary for a legitimate aim, proportionate to the need for such interference, and accompanied by procedural guarantees against abuse of such interference.[7]

Reiterating the principles set out by the Government of India’s Group of Experts on Privacy in 2012, the Court takes note of the Committee of Experts chaired by Justice B N Srikrishna, which has been constituted to suggest a new data protection regime for the country. The work of ensuring that balance is achieved in law and manifested in practice lies ahead for all of us.

On the regulation of personal data and implications for financial services

The observations of the Court in Puttaswamy have direct implications for operational aspects of retail finance and for newer digital financial services provision. The use of new and alternative forms of data about consumers to target advertising and communication, and to appraise individuals is now a reality, as is the use of algorithms to mine data for use in processes like credit scoring. Negative outcomes from such processes that affect individuals’ privacy or cause discrimination will now be seen as infringements of fundamental rights, where state entities are involved. A horizontal data protection regime (applying to state and non-state actors) based on the same understanding of privacy would extend privacy protections for users against all types of entities.[8] As we debate the contours of privacy for our new data protection regulation and in existing financial sector regulations, we have an opportunity to shine a spotlight on existing data practices around consumers’ personal and financial information in financial institutions.

For those involved in the chain of financial services provision that is increasingly becoming more “digital”, this judgment has flagged up a new understanding of core issues. In particular, it forces more granular reflection on:

  • the kinds of data that can and should be collected, keeping in mind values of privacy and dignity of the individual;
  • the kind of data mining and algorithmic techniques that can be used, keeping in mind that such techniques cannot infringe privacy and liberty, autonomy and free choice, and equality of all individuals;
  • whether individuals’ reasonable expectations of privacy can vary based on categories and context of data; and
  • how a fair, just and reasonable law can help us find a way to ensure that the use of personal data is tied to legitimate proportionate objectives and interests.

This judgment has moved the gears for privacy and data protection in the country, ushering us into an era of change where data protection laws globally are being repurposed for rapidly evolving technological advancements. All of this will require a shift in our understanding of liability, and in our practices around accountability and reporting. It will need to be tackled by new data protection regulation and updates to financial sector regulation – and ultimately, in the way our day-to-day data practices evolve within government, within industry and between citizens of India.

—-

[1] Justice K S Puttaswamy & Anr v. Union of India & Ors, WP(C) No. 494 of 2012, DY Chandrachud, J at page 213 (Puttaswamy).

[2] ibid, page 262.

[3] supra n. 1, para 169, page 246.

[4] Kesavananda Bharati v. State of Kerala, (1973) 4 SCC 225.

[5] ibid, para 666, pages 486-487, cited in Puttaswamy, para 108, page 105.

[6] supra n. 1, paras 170-185, pages 246-260.

[7] supra n. 1, Kaul, J at para 71, page 27.

[8] The argument of some respondents (including the UIDAI) was that the right to privacy is a common law right. This would mean it was applicable to state and non-state actors. As noted by Bobde, J in Puttaswamy, a right can be simultaneously recognised as a common law and a constitutional law right. Bobde, J also noted that the content of privacy in both forms (common and constitutional) is identical, which gives rise to the potential for similar considerations to apply across state and non-state actors. See Puttaswamy, Bobde, J at paras 17-18, pages 15-16.

22 Aug

Big Data, Financial Inclusion and Privacy for the Poor

Guest Post by Dr Katharine Kemp, Research Fellow, UNSW Digital Financial Services Regulation Project

Financial inclusion is not good in itself.

We value financial inclusion as a means to an end. We value financial inclusion because we believe it will increase the well-being, dignity and freedom of poor people and people living in remote areas, who have never had access to savings, insurance, credit and payment services.

It is therefore important to ensure that the way in which financial services are delivered to these people does not ultimately diminish their well-being, dignity and freedom. We already do this in a number of ways – for example, by ensuring providers do not make misrepresentations to consumers, or charge exploitative or hidden rates or fees. Consumers should also be protected from harms that result from data practices tied to the provision of financial services.

Benefits of Big Data and Data-Driven Innovations for Financial Inclusion

“Big data” has become a fixture in any future-focused discussion. It refers to data captured in very large quantities, very rapidly, from numerous sources, where that data is of sufficient quality to be useful. The collected data is analysed, using increasingly sophisticated algorithms, in the hope of revealing new correlations and insights.

There is no doubt that big data analytics and other data-driven innovations can be a critical means of improving the health, prosperity and security of our societies. In financial services, new data practices have allowed providers to serve poor customers and those living in remote areas in new and better ways, including by permitting providers to:

  • extend credit to consumers who previously had to rely on expensive and sometimes exploitative informal credit, if any, because they had no formal credit history;
  • identify customers who lack formal identification documents;
  • design new products to fit the actual needs and realities of consumers, based on their behaviour and demographic information; and
  • enter new markets, increasing competition on price, quality and innovation.

But the collection, analysis and use of enormous pools of consumer data have also given rise to concerns for the protection of financial consumers’ data and privacy rights.

Potential Harms from Data-Driven Innovations

Providers now not only collect more information directly from customers, but may also track customers physically (using geo-location data from their mobile phones); track customers’ online browsing and purchases; and engage third parties to combine the provider’s detailed information on each customer with aggregated data from other sources about that customer, including their employment history, income, lifestyle, online and offline purchases, and social media activities.

Data-driven innovations create the risk of serious harms both for individuals and for society as a whole. At the individual level, these risks increase as more data is collected, linked, shared, and kept for longer periods, including the risk of:

  • inaccurate and discriminatory conclusions about a person’s creditworthiness based on insufficiently tested or inappropriate algorithms (a toy sketch after this list illustrates how this can happen);
  • unanticipated aggregation of a person’s data from various sources to draw conclusions which may be used to manipulate that person’s behaviour, or adversely affect their prospects of obtaining employment or credit;
  • identity theft and other fraudulent use of biometric data and other personal information;
  • disclosure of personal and sensitive information to governments without transparent process and/or to governments which act without regard to the rule of law; and
  • harassment and public humiliation through the publication of loan defaults and other personal information.
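
The following toy sketch, with invented data and weights, illustrates the first risk above: a scoring algorithm that never sees a protected attribute, but in which a proxy feature (here, a pincode) stands in for one, so that two applicants with identical records are scored differently.

```python
# Toy credit scorer with invented weights. The model takes no protected
# attribute, yet the pincode term acts as a proxy for community or income
# group, producing discriminatory scores.
def credit_score(monthly_income_inr: int, on_time_repayments: int, pincode: str) -> float:
    # Hypothetical "learned" area penalties reflecting historical defaults,
    # which often track community composition rather than individual conduct.
    area_penalty = {"110001": 0.0, "110093": -25.0}  # invented values
    return (0.002 * monthly_income_inr
            + 5.0 * on_time_repayments
            + area_penalty.get(pincode, -10.0))

# Two applicants with identical income and repayment behaviour:
print(credit_score(20000, 12, "110001"))  # 100.0
print(credit_score(20000, 12, "110093"))  # 75.0 -- penalised purely by address
```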

Many of these harms are known to have occurred in various jurisdictions. The reality is that data practices can sometimes lead to the erosion of trust in new financial services and the exclusion of vulnerable consumers.

Even relatively well-meaning and law-abiding providers can cause harm. Firms may “segment” customers and “personalise” the prices or interest rates a particular consumer is charged, based on their location, movements, purchase history, friends and online habits. A person could, for example, be charged higher prices or rates based on the behaviour of their friends on social media.
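
As a hypothetical illustration of that kind of personalisation, the short sketch below adjusts a loan’s interest rate by the default rate observed among a borrower’s social-media contacts; the base rate, weight and figures are all invented for this example.

```python
# Invented example of social-graph price personalisation: the borrower's
# own record is ignored, while friends' behaviour moves their price.
BASE_ANNUAL_RATE = 0.18  # assumed 18% p.a. base price

def personalised_rate(friend_default_rate: float) -> float:
    # Up to 6 percentage points added, based solely on the borrower's friends.
    return BASE_ANNUAL_RATE + 0.06 * friend_default_rate

print(personalised_rate(0.05))  # 0.183 -- friends rarely default
print(personalised_rate(0.60))  # 0.216 -- same borrower, riskier friends, higher price
```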

Data practices may also increase the risk of harm to society as a whole. Decisions may be made to the detriment of entire groups or segments of people based on inferences drawn from big data, without the knowledge or consent of these groups. Pervasive surveillance, or even the mere awareness of surveillance, is known to pose threats to freedom of thought, political activity and democracy itself, as individuals are denied the space to create, test and experiment unobserved.

These risks highlight the need for perspective and caution in the adoption of data-driven innovations, and the need for appropriate data protection regulation.

The Prevailing “Informed Consent” Approach to Data Privacy

Internationally, many data privacy standards and regulations are based, at least in part, on the “informed consent” – or “notice” and “choice” – approach to informational privacy. This approach can be seen in the Fair Information Practice Principles that originated in the US in the 1970s; the 1980 OECD Privacy Guidelines; the 1995 EU Data Protection Directive; and the Council of Europe Convention 108.

Each of these instruments recognises consumer consent as a justification for the collection, use, processing and sharing of personal data. The underlying rationale for this approach is based on principles of individual freedom and autonomy. Each individual should be free to decide how much or how little of their information they wish to share in exchange for a given “price” or benefit. The data collector gives notice of how an individual’s data will be treated and the individual chooses whether to consent to that treatment.

This approach has been increasingly criticised as artificial and ineffectual. The central criticisms are that, for consumers, there is no real notice and there is no real choice.

In today’s world of invisible and pervasive data collection and surveillance capabilities, data aggregation, complex data analytics and indefinite storage, consumers no longer know or understand when data is collected, what data is collected, by whom and for what purposes, let alone how it is then linked and shared. Consumers do not read the dense and opaque privacy notices that supposedly explain these matters, and could not read them, given the hundreds of hours this would take. Nor can they understand, compare or negotiate on these privacy terms.

These problems are exacerbated for poor consumers who often have more limited literacy, even less experience with modern uses of data, and less ability to negotiate, object or seek redress. Yet we still rely on firms to give notice to consumers of their broad, and often open-ended, plans for the use of consumer data and on the fact that consumers supposedly consented, either by ticking “I agree” or proceeding with a certain product.

The premises of existing regulation are therefore doubtful. At the same time, some commentators question the relevance and priority of data privacy in developing countries and emerging markets.

Is data privacy regulation a “Western” concept that has less relevance in developing countries and emerging markets?

Some have argued that the individualistic philosophy inherent in concepts of privacy has less relevance in countries that favour a “communitarian” philosophy of life. For example, in a number of African countries, “ubuntu” is a guiding philosophy. According to ubuntu, “a person is a person through other persons”. This philosophy values openness, sharing, group identity and solidarity. Is privacy relevant in the context of such a worldview?

Privacy, and data privacy, serve values beyond individual autonomy and control. Data privacy serves values which are at the very heart of “communitarian” philosophies, including compassion, inclusion, face-saving, dignity, and the humane treatment of family and neighbours. The protection of financial consumers’ personal data is entirely consistent with, and frequently critical to, upholding values such as these, particularly in light of the alternative risks and harms.

Should consumer data protection be given a low priority in light of the more pressing need for financial inclusion?

Some have argued that, while consumer data protection is the ideal, this protection should not have priority over more pressing goals, such as financial inclusion. Providers should not be overburdened with data protection compliance costs that might dissuade them from introducing innovative products to under-served consumers.

Here it is important to remember how we began: financial inclusion is not an end in itself but a means to other ends, including permitting the poor and those living in remote areas to support their families, prosper, gain control over their financial destinies, and feel a sense of pride and belonging in their broader communities. The harms caused by unregulated data practices work against each of these goals.

If we are in fact permanently jeopardising these goals by permitting providers to collect personal data at will, financial inclusion is not serving its purpose.

Solutions

There will be no panacea, no simple answer to the question of how to regulate for data protection. A good starting place is recognising that consumers’ “informed consent” is most often fictional. Sensible solutions will need to draw on the full “toolkit” of privacy governance tools (Bennett and Raab, 2006), such as appropriate regulators, advocacy groups, self-regulation and regulation (including substantive rules and privacy by design). The solution in any given jurisdiction will require a combination of tools best suited to the context of that jurisdiction and the values at stake in that society.

Contrary to the approach advocated by some, it will not be sufficient to regulate only the use and sharing of data. Limitations on the collection of data must be a key focus, especially in light of new data storage capabilities, the likelihood that de-identified data will be re-identified, and the growing opportunities for harmful and unauthorised access that arise the more data is collected and the longer it is kept.
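
A minimal sketch with invented records shows why stored “de-identified” data remains risky: quasi-identifiers left in a released dataset can be joined against a public dataset to restore names, the classic linkage attack.

```python
# Linkage attack on "anonymised" data, using invented records. The released
# dataset drops names but keeps quasi-identifiers, which a public dataset
# (e.g. an electoral roll or social profile) can re-attach to names.
import pandas as pd

released = pd.DataFrame({
    "pincode": ["600001", "110001"],
    "birth_year": [1985, 1992],
    "loan_default": [True, False],   # the sensitive column
})
public = pd.DataFrame({
    "name": ["A. Kumar", "S. Devi"],
    "pincode": ["600001", "110001"],
    "birth_year": [1985, 1992],
})

# Joining on the quasi-identifiers re-identifies every matching individual.
reidentified = released.merge(public, on=["pincode", "birth_year"])
print(reidentified[["name", "loan_default"]])
```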

Big data offers undoubted and important benefits in serving those who have never had access to financial services. But it is not a harmless curiosity to be mined and manipulated at the will of those who collect and share it. Personal information should be treated with restraint and respect, and protected, in keeping with the fundamental values of the relevant society.

—-

References:

Colin J Bennett and Charles Raab, The Governance of Privacy (MIT Press, 2006)

Gordon Hull, “Successful Failure: What Foucault Can Teach Us About Privacy Self-Management in a World of Facebook and Big Data” (2015) 17 Ethics and Information Technology Journal 89

Debbie VS Kasper, “Privacy as a Social Good” (2007) 28 Social Thought & Research 165

Katharine Kemp and Ross P Buckley, “Protecting Financial Consumer Data in Developing Countries: An Alternative to the Flawed Consent Model” (2017) Georgetown Journal of International Affairs (forthcoming)

Alex B Makulilo, “The Context of Data Privacy in Africa,” in Alex B Makulilo (ed), African Data Privacy Laws (Springer International Publishing, 2016)

David Medine, “Making the Case for Privacy for the Poor” (CGAP Blog, 15 November 2016)

Lokke Moerel and Corien Prins, “Privacy for the Homo Digitalis: Proposal for a New Regulatory Framework for Data Protection in the Light of Big Data and the Internet of Things” (25 May 2016)

Office of the Privacy Commissioner of Canada, Consent and Privacy: A Discussion Paper Exploring Potential Enhancements to Consent Under the Personal Information Protection and Electronic Documents Act (2016)

Omri Ben-Shahar and Carl E Schneider, More Than You Wanted to Know: The Failure of Mandated Disclosure (Princeton University Press, 2016)

Productivity Commission, Australian Government, “Data Availability and Use” (Productivity Commission Inquiry Report No 82, 31 March 2017)

Bruce Schneier, Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World (WW Norton & Co, 2015)

Daniel J Solove, “Introduction: Privacy Self-Management and the Consent Dilemma” (2013) 126 Harvard Law Review 1880

23 Dec

Electronic Financial Data and Privacy in India

By Bhusan Jatania, IFMR Finance Foundation

Earlier this week, the Secretary for the Ministry of Electronics and Information Technology (MeitY) confirmed that MeitY is set to review the legal framework for digital payments and cybersecurity[1]. This is an important move, and one that needs to take note of important blind spots in a key piece of legislation that governs the handling of personal financial information – the Information Technology Act, 2000 (IT Act). This post draws from our work as part of the Future of Finance Initiative and flags some blind spots in the IT Act that must be addressed in an environment where retail finance is seeing increasing digitisation.

Looking back at 2016, the push towards the digitisation of financial services has been one of the defining themes of the year. As more and more Indians make digital payments, we are creating digital footprints of our financial behaviour on a scale the country has never seen before. Meanwhile, India remains one of the world’s largest economies without a law on the privacy rights of citizens. This has prompted the Supreme Court to consider – in the context of making Aadhaar mandatory for availing governmental benefits[2] – whether our Constitution provides for a fundamental right to privacy, although it makes no express mention of one. As it currently stands, we have retrofitted the IT Act, originally enacted to give legal sanctity to electronic governance, to provide minimum safeguards in this regard.

This raises the question: who collects the data from this trail, and what are the general obligations that bind them to keep it confidential?

Part of the answer to this question lies in the IT Act – the overarching law governing the collection and use of personal information in electronic form.[3]

1. Requirements

The IT Act applies to these types of entities set up in India and engaging in commercial/professional activities (Body Corporates):

(a) company,
(b) firm,
(c) sole proprietorship, or
(d) other association of individuals.

A Body Corporate which collects, processes, stores, transfers or accesses any sensitive personal data or information (Sensitive Data) in a computer resource has certain compliance requirements[4]. Financial information, defined as “bank account or credit card or debit card or other payment instrument details”, is classified as Sensitive Data.

The Body Corporate must obtain the prior written consent of the data subject for collecting Sensitive Data, adopt a privacy policy and appoint a grievance officer for resolving complaints within 30 days. The Body Corporate must also inform the data subject (i.e. the person whose data is being collected) of:

(a) the fact that Sensitive Data is being collected,
(b) the purpose for which Sensitive Data is collected,
(c) the intended recipients of Sensitive Data,
(d) the name and address of the entity collecting Sensitive Data, and
(e) the entity retaining Sensitive Data.

The Body Corporate must also:

  • provide options to the data subject to decline providing Sensitive Data for availing a service and to withdraw consent already given,
  • allow data subjects to review their Sensitive Data and modify/ update/ correct it (if found outdated/ incorrect), and
  • ensure that Sensitive Data is used as per specified purpose and not retained for a period longer than required for its lawful use (or as required by any other law).
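
Taken together, these obligations lend themselves to being recorded as a simple data structure. The sketch below is our own minimal illustration of such a consent record, not something prescribed by the IT Act; all field names and values are invented.

```python
# Minimal, hypothetical consent record covering the disclosures and options
# listed above. Field names are illustrative, not statutory language.
from dataclasses import dataclass
from datetime import date

@dataclass
class SensitiveDataConsent:
    data_subject: str
    purpose: str                    # purpose for which Sensitive Data is collected
    intended_recipients: list[str]  # who will receive the data
    collector_name_address: str     # entity collecting the data
    retaining_entity: str           # entity retaining the data
    consent_given_on: date
    withdrawn: bool = False         # the data subject may withdraw consent

    def withdraw(self) -> None:
        # Withdrawal must stop further use of the data, though the provider
        # may then decline to provide the service tied to it.
        self.withdrawn = True

record = SensitiveDataConsent(
    data_subject="(name of customer)",
    purpose="credit appraisal",
    intended_recipients=["credit bureau"],
    collector_name_address="Example Lender Pvt Ltd, New Delhi",
    retaining_entity="Example Lender Pvt Ltd",
    consent_given_on=date(2016, 12, 23),
)
```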

2. What are the blind-spots?

Transaction records: For starters, it remains unclear if ‘financial information’ includes individuals’ transaction records as well, such as, say, credit card spending patterns or utility bill payments.

Newer forms of data: Newer forms of personal data that may be of a sensitive nature, such as browsing history, call records and social media behaviour, are increasingly finding use in underwriting for financial services, yet do not enjoy the protections that sensitive personal data or information has.

Data retention and collection: Moreover, while a Body Corporate cannot hold Sensitive Data beyond the purpose for which the information was collected, there are no bright-line rules (such as a requirement to purge the information within 30 days of purpose expiry). Market practice has also evolved in the direction of taking all-encompassing consents, making purpose limitation difficult to enforce.
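
For illustration, a bright-line purging rule of the kind the framework lacks could be as mechanical as the sketch below; the 30-day window is this post’s own hypothetical figure, not a statutory one.

```python
# Hypothetical bright-line retention rule: purge Sensitive Data a fixed
# number of days after the purpose for collecting it expires.
from datetime import date, timedelta

PURGE_WINDOW = timedelta(days=30)  # illustrative, not a statutory period

def is_due_for_purging(purpose_expired_on: date, today: date) -> bool:
    return today - purpose_expired_on > PURGE_WINDOW

print(is_due_for_purging(date(2016, 11, 1), date(2016, 12, 23)))   # True: past the window
print(is_due_for_purging(date(2016, 12, 10), date(2016, 12, 23)))  # False: still within it
```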

Foreign banks, government departments and non-Body Corporates: The IT Act will likely not apply to foreign bank branches operating in India (of which there were 325 as of 31 December 2015[5]) where they have not set up Indian subsidiaries. The IT Act will also not apply to non-profit organisations, banking business correspondents, or individual chartered accountants, mutual fund distributors, investment advisors, insurance brokers and the like. Significantly, there is no right to privacy under the IT Act for data collected by a government department, authority, commission or board, as these will not be regarded as Body Corporates.

3. What happens if the IT Act is violated?

In India, we lack a dedicated data protection authority to supervise breaches of the IT Act. These are generally dealt with by the Secretary of the Department of Information Technology at the state level, who can impose up to 3 years of imprisonment or a fine of up to Rs. 500,000. Appeals from such decisions are heard by the country’s only Cyber Appellate Tribunal in New Delhi, which has decided a total of 17 matters since inception and had 66 appeals pending as of March 2016 (due to the continuing absence of a Chairperson since mid-2011). There has also been a long-standing proposal to set up a bench of the Cyber Appellate Tribunal in Bengaluru[6].

In theory, an individual whose data has been mishandled under the IT Act can get up to Rs. 5 crore as compensation for negligent handling of their Sensitive Data by a Body Corporate, if they suffer a wrongful loss or a third party makes a wrongful gain.

4. Way Forward

While India deserves a stand-alone privacy statute, the IT Act framework can be extended to all non-public personal information[7] handled by a financial service provider in the interim.

To strengthen the current regime, financial service providers could be required to have nodal privacy officers to oversee compliance with privacy requirements and act as a single point of contact for addressing customer complaints. Filings with financial regulators could also include a section on the status of such compliance, with built-in consequences for violations.

Financial service providers should also be required to provide a privacy notice (in model form) to each customer at the point of first engagement and on an annual basis subsequently. The notice can set out the provider’s privacy policy in plain language, the details of customer information collected, the entities with which it can share that information, and an accessible opt-out option to prevent information sharing (other than for compulsory purposes such as credit reporting).

Overall, electronic financial data protection in India rests on rudimentary regulations, with limited enforcement and a lack of distinct treatment by financial sector regulators. Major upgrades to the data protection regime are essential given the size, scale and detail of electronic data collection in the financial space.

About the Future of Finance Initiative:

The Future of Finance Initiative (FFI) is housed within IFMR Finance Foundation and aims to promote policy and regulatory strategies that protect citizens accessing finance given the sweeping changes that are reshaping retail financial services in India – including those driven by Indiastack, Payments Banks, mobile usage and the growing P2P market.



1 – See: http://www.thehindu.com/business/Economy/Reviewing-legal-framework-for-securing-digital-payments/article16896971.ece and http://www.livemint.com/Industry/VcLcVc6huMHGloWSSfe2EK/Govt-plans-tighter-privacy-rules-for-electronic-payments.html. Note that the Information Technology Act, 2000 is administered by MeitY.
2 – In the matter of Justice K.S. Puttaswamy v. Union of India, order dated 11 August 2015.
3 – While we focus on the IT Act, we do note that sector-specific regulators have developed codes of conduct which impose an obligation of customer data confidentiality. However, there is currently no clear mechanism for tracking/reporting of privacy violations (under, say, the Reserve Bank of India’s banking ombudsman scheme or the Securities and Exchange Board of India’s SCORES system) and also no specific penalty implications for such conduct.
4 – There is a safe harbour provision for Body Corporates handling customer data under outsourcing contracts and not dealing directly with data subjects.
5 – See: https://www.rbi.org.in/commonman/upload/english/content/pdfs/71207.pdf.
6 – See: http://www.thehindu.com/news/cities/bangalore/Proposal-to-set-up-Bangalore-bench-of-Cyber-Appellate-Tribunal/article14948497.ece.
7 – The IT Act defines ‘personal information’ as “any information that relates to a natural person, which, either directly or indirectly, in combination with other information available or likely to be available with a body corporate, is capable of identifying such person.”