Curating Innovative Ideas through I-Innovate

By Nikhil John, IFMR Holdings

In our endeavor to achieve our mission of ensuring that every individual and every enterprise has complete access to financial services, we recognize the crucial role that innovative ideas can play in this journey. These innovations, whether big-impact breakthroughs or incremental steps forward, play a critical role in aligning our efforts to serve our customers better, and are a constant source of inspiration that nudges us to reflect and adapt to future challenges.

To systematically channel these ideas from our colleagues across the board, from our KGFS branches to our corporate offices, we launched “I-Innovate”, an organization-wide effort aimed at inspiring a spirit of constant innovation by facilitating the generation of the next wave of innovative ideas. Each idea received through the platform is funneled through a screening process, and the selected ones are allocated resources, incubated and taken to scale.

To qualify, an idea must achieve two outcomes:

  • Solve a fundamental client problem – where the client can be a household, financial service provider, regulator, or internal clients like IFMR employees.
  • Accelerate the mission of IFMR Trust.

Since the call for proposals opened in the first cycle, submissions from the KGFSes have covered a wide spectrum of areas: customer engagement, product design, infrastructure and system design, human resources management and administrative processes. The ideas included loan management system improvements, installing Everywhere Teller Machines (ETMs), combining equipment such as biometric-embedded thermal printers to save costs, a rewards and recognition program for employees, inter-branch exchange of best practices, business intelligence improvements, and product improvements in credit and non-credit products.

Submissions from the Chennai office included a rewards program for KGFS customers, incentivizing saving-up behaviour, digitizing payments to gain insights on the migration from cash to cashless solutions, creating a repository of the occurrence of natural disasters and their effects on originators, a customer self-service IVR system, and a social game designed to immerse the user in the decisions faced by Indian low-income households.

After careful deliberation we have shortlisted the following ideas from the first cycle for the final stage. The innovators behind them are developing the ideas further and are in the process of structuring relevant pilots where necessary. With successful pilots and continued development appetite, these ideas will be implemented in the current cycle:

Informing customer consent at KGFSes – Rachit Khaitan

Objective: More responsible provision of financial services at KGFS – through point-of-sale engagement with the client – facilitating better customer outcomes and documenting the challenges and lessons for advocacy towards a stronger customer protection regime in India.

Description: There are often instances when customers of retail financial products are inadequately informed and caught by surprise about the financial contracts they enter into, leading to unplanned and adverse financial outcomes such as over-indebtedness and delinquency. While there is an important role for norms on transparency and product disclosure in bridging the information gap between the provider and the customer, it is not clear whether the point of sale processes of Indian retail financial services providers today take into account a customer’s truly informed consent. This innovation seeks to design and pilot a small but key process at KGFS to better ensure the informed nature of customer consent at the point of sale of a particular financial product (which could be determined based on mutual stakeholder agreement), based on a customer’s demonstrated understanding of its key aspects. The process entails administering a short verbal quiz to customers, at the point of sale, on the key aspects of a financial services product such as specific features, customer obligations, and potential risks and benefits in a manner that is understandable to the customer, to form the basis of more meaningful informed consent.

FAQ and answer database platform for originators and investors – Arjun Subramanian

Objective: Web-platform/Mobile App that serves as an answer database for the key questions asked by originators, investors, partners and management to serve as a best practices sharing tool for IFMR Capital.

Description: The idea is an interactive web platform that collects the top questions and best practices from originators, investors and senior management, and sources answers from stakeholders in the context of specific clients. For an organization with multiple points of contact with clients, and relationship teams that change constantly, this innovation will make the organization’s journey more inclusive.

KGFS rewards card – Deepa Anand

Objective: A rewards card to enhance customer experience and enable higher customer retention and stronger customer referrals, incentivize uptake of particular products, and encourage timely repayment and other good financial behavior.

Description: The KGFS reward program is intended to be offered to all KGFS customers. Every time a customer avails a KGFS product or repays on time for the life of a loan, or exhibits any other financial behavior that is to be encouraged, the customer earns reward points that accumulate on her rewards card.  The points can then be redeemed at the end of the year or at product closure date/renewal date (for example at loan closure date or insurance renewal date) in exchange for discounts on the next product availed (for example discounts on the EMI or processing fees of the next loan availed, or discounts on remittance charges etc.), or free mobile recharges.

Inculcating saving-up behavior – Aditi Kumar

Objective: To deploy a program at the KGFSes designed to inculcate saving-up behaviour.

Description: Saving up requires discipline and offers delayed gratification. The idea is to incentivize saving-up behavior through a program at the KGFSes that could be designed in any of the following ways:

  • Mobility of services – Local Agent model for savings transactions, which could include:
    • Door step collection of savings amounts
    • Transactions through smart cards and point of transaction machines with local agents (present in the village)
  • Piggy bank method – Give the customer a KGFS piggy bank in which she deposits what she can, and have her bring it to the branch at the end of the month to unlock it and deposit her savings. An activity could be run around this in a group, rewarding the customer with the highest savings balance – this is purely to inculcate savings behavior. Gradually, instead of the piggy bank, the customer can be moved to a savings bank account under a mobility model.
  • Using a JLG group structure to encourage savings in different ways (individually or as a group) – This could involve saving as a group, with members getting access to the pool periodically and cyclically, more like a reverse of the SHG bank linkage model. It would also involve examining whether we currently tap into the SHG network for lending and whether we can mobilize their savings.
  • Via a micro-savings platform delivered through an investment/savings planning tool that determines how much each customer should save periodically and synchronizes this with their repayment schedules.

Development Impact Bond (DIB) to ensure suitability of MFI products – Nikhil John

Objective: A product that affords MFIs a source of cheaper debt, incentivizing them to ensure their employees offer suitable financial products to customers – an outcome that social impact investors and grant funders are willing to fund.

Description: The intent is to launch a Development Impact Bond in which IFMR Capital invests in the equity tranche, with DFIs/financial institutions interested in suitability investing in the mezzanine tranche. The objective is to fund the outcome of MFI employees being tested to ensure the products they offer are suitable, with financial investors participating in the senior tranches and earning an additional return if the suitability outcome is achieved.

Through I-Innovate we aspire to engender new grassroots thinking, and we believe that over time it will pave the way for systematic innovation that propels us further along the path towards our mission.


Building Natural Catastrophe Protection for Low-income Households – Notes from the Joint Workshop hosted by Asian Development Bank and IFMR Holdings


By Vipul Sekhsaria & Nikhil John, IFMR Holdings

Natural catastrophes, whether in the form of the severe drought that regions like Bundelkhand are currently witnessing, or floods like the one that deluged Chennai in 2015, leave behind them a tale of destruction that is both unparalleled and deeply disturbing. Natural disasters cost India $3.3 billion in 2015[1], a figure that does not account for a crucial factor: loss of livelihood.

Among those hit by a natural catastrophe, the impact is most severe on low-income households and those living below the poverty line. Natural disasters such as floods, droughts, cyclones and earthquakes keep pushing a majority of these households back, curtailing their attempts to move onto a stronger financial footing every time such a catastrophe strikes.

According to a report, “an annual global investment of $6 billion in disaster risk management strategies would generate total benefits in terms of risk reduction of $360 billion”[2] – a 1:60 ratio of cost to benefit. Sadly, the report adds that this is equivalent to just a 20% reduction in new and additional annual economic losses due to natural disasters. In the Indian context, as a nation we will spend close to $9 billion over the next five years on disaster risk management, but the question remains: is that enough?

While government spending on disaster management will surely improve preparedness and mitigate the effects of disasters, how does one go beyond insured assets and insured lives, particularly given that insurance penetration is very poor, especially among low-income groups?[3] Solutions for low-income households and businesses should not only insure hitherto uninsurable assets and uninsured lives, but also protect livelihoods and income streams.

In an endeavour to think through the challenges and opportunities that catastrophic risk entails, and to take the first steps towards a solution to protect low-income households in India from the aftermath of natural catastrophes, the Asian Development Bank and IFMR Holdings organised a joint workshop on June 2-3, 2016. The workshop brought together high-quality originators, data scientists, climatologists, insurers, reinsurers, social impact investors and the regulator under one roof.


Finding a solution to protect customer households and businesses, as well as the originators that have a relationship with them, was the central theme of this workshop. “What we can think of, we can do”, as Sucharita Mukherjee, CEO of IFMR Holdings, put it, was the underlying spirit that drove the participants. Delivering the keynote address, P.J. Joseph, Member, Non-Life, Insurance Regulatory and Development Authority of India (IRDAI), noted how insurance penetration continues to be extremely low – only 0.9% of GDP is protected by non-life insurance. He added that only 8% of economic losses were insured, and that there was thus a need for out-of-the-box solutions for catastrophe risks. The premium collected by the non-life industry for natural catastrophes amounts to about INR 4,500 crore annually, whereas the loss from the Chennai floods alone stood at about INR 14,000 crore, said R. Chandrasekaran, Secretary General, General Insurance Council, in his opening address. He added that although tax rebates initially helped give impetus to product take-up, today they are not the foremost reason why people buy health insurance; a similar approach might be worth considering for catastrophe protection covers.

Udaya Kumar, MD and CEO of Grameen Koota, highlighted in his talk the difference an ideal natural catastrophe protection solution can make to the lives of the low-income customers his organisation serves. He expounded on how natural catastrophes push the already poor into vicious cycles of deeper poverty, and how a safety net provided by catastrophe insurance can make clients more resilient and aid the organisation’s financial inclusion efforts.

Bama Balakrishnan, CRO of IFMR Capital, later shared an analysis showing that some originators carry medium to very high catastrophe-risk exposure relative to their net worth, and that this continues to be a barrier to financial inclusion, keeping originators away from high-risk-prone geographies.

In addition to specialists presenting their insights on key issues and participants engaging in group discussions on key themes, the workshop also provided for a couple of panel discussions.


The first panel discussion covered the different types of catastrophe risk protection products that exist globally and in India. The panel also deliberated on the data, risk models and loss curves available, and how these can be improved. The panel consisted of Dr. Murthy Bachu, Principal Hydrologist at AON Benfield Analytics; Alex Chen, CEO of Asia Risk Transfer Solutions (ARTS); Ulrich Hess, Senior Advisor, Sector Project Insurance at GIZ; Pushpendra Johri, VP of Risk and Insurance at RMSI; and Vineet Kumar, Head Cat Perils Asia at Swiss Re, in a discussion moderated by Arup Chatterjee, Principal Financial Sector Specialist, Sustainable Development and Climate Change Department at the Asian Development Bank (ADB). Key learnings from the panel:

  • Natural catastrophe risk has been under discussion in India for more than two decades; a lack of robust data, event curves and loss models has prevented the development of holistic solutions, said Arup Chatterjee. The wait for perfect data may never end, but data scientists and climatologists already possess an understanding that is more than sufficient for natural catastrophe-specific products in India.
  • Pushpendra Johri noted that RMSI has a flood model based on 50 years of river flow data and 109 years of rainfall data. He added that there is enough data to begin, but that data reporting going forward will be a key game changer in making products more affordable in the long run.
  • Affordability is crucial, but one should also examine the incentive alignment of various stakeholders within a value chain to arrive at innovative ways of structuring the premium payment. In one such example, while the cotton farmer is the end recipient within the value chain, it is the cotton buyer who pays the premium: an increase in cotton produce directly benefits the buyer’s business, and a more resilient farmer means higher profit for the buyer entities.
  • Role of technology in e-delivery needs to be explored against best practices around the world and new innovations waiting to emerge.
  • A good starting point appeared to be a parametric insurance product that goes beyond exposure to assets alone, and treats items like loss of income and livelihood as important factors in deciding the amount of insurance cover.
  • With a deep understanding of their customers’ needs, geographically specialised originators – microfinance institutions, small business lenders, affordable housing finance institutions and similar institutions in financial inclusion – are aware of the impact of natural disasters on the lives and livelihoods of their customers in the absence of any financial protection.
  • Catastrophe risk affects credit markets, including interest rates, and losses on loans cause capital erosion for financial institutions. Re-capitalising and de-leveraging are two options in the aftermath of a disaster, each with its drawbacks: recapitalisation is not always forthcoming, and deleveraging affects the financial institution’s capacity to lend. Residual risk management can be supplemented by catastrophe risk insurance products. Insurance and reinsurance play an important role in covering the underlying catastrophe risk; linking credit, risk and savings can result in appropriate risk financing strategies, said Christine Engstrom, Director, Private Sector Financial Institutions, Private Sector Operations Department of ADB.
  • Originators acknowledged the geographical risk they carry at an organisational level for operating in such markets and lending to customers highly exposed to natural disasters.


Another star panel, moderated by Sucharita Mukherjee and comprising Brahmanand Hegde, MD and CEO of Vistaar; Udaya Kumar; Vaibhav Anand, Head of Risk Analytics and Modelling at IFMR Capital; Easwar Narayanan, COO of Future Generali; K. Venkatesh, CEO of IFMR Rural Channels; and Ulrich Hess, deliberated on the informal ways in which low-income households manage risk, such as income diversification, investing in gold (especially in South India), and investing in other assets like land. Key learnings from the panel:

  • The poor are much more risk-resilient than one might imagine, and their ability to bounce back is very high if supported by the right tools, said Udaya Kumar. A catastrophe insurance product could be just that tool.
  • Greater concentration on a quick loss assessment and claim settlement is important.
  • The simplicity of the product cannot be over-emphasised; an element of instant relief to affected households will make it better received.
  • Customer contact post disaster is very important as the customer’s trust in the entity that serves them is critical.
  • One of the insurer panellists asked, “Why do people insure motor and health?” The group unanimously agreed that a large part of the reason is that people feel the immediacy of the loss.
  • Idiosyncratic risks are much easier to perceive; the world over, customers avoid planning for more systemic risks like disasters.
  • Awareness programs can make a difference. Examples cited suggest that campaigns sustained over multiple years saw much higher uptake in other insurance schemes, as signs of losses emerged and more customers realised their value.
  • The use of technology for loss estimation – innovation in delivery and claim settlement – can be a game changer for faster and more accurate claim settlement. For example, if parametric products are more cost-effective than indemnity products, how can technology magnify the intelligence of the indices that parametric products depend on? One panellist suggested that technology could categorise income classes and the duration of income loss by studying these models, and that adding these to the index could be one such way. Left to choose a cover on their own, customers will always choose the cheapest product; hence intelligent indexing can be of good use.
  • Can the government and the CSR activities of institutions contribute to a risk fund? Participating Development Finance Institutions (DFIs) committed their support to running pilots, building better risk and loss models, and normalising prices in the short run, with the goal of making the product affordable.
  • One of the points that came to the fore is that financial institutions are reluctant to sell these kinds of products. The panel agreed, however, that profitability cannot be measured for each product line: as long as the product enhances customer resilience, the indirect beneficiary is the institution the customer borrows from.

Armed with all the inputs from the speakers and the panellists, the entire group spent the first half of the second day working in multiple small groups to come up with product pilot ideas. It was as if all the participants had taken it upon themselves to forge a definite way forward – and so it was.

The group agreed upon critical design parameters for the pilot:

  • Simple to buy (could cover multiple perils, with simple options, structure could be parametric / indexed).
  • Affordable – capturing the pricing benefit of risk-pooling between different entities and different household profiles.
  • The magnitude of the cover should provide for the loss of assets, loss of livelihood, and a buffer at an organisational level to meet unplanned exigencies or provide for households who were affected but couldn’t get compensation since parametric solutions will carry basis risk (may not cover all actual damages).
  • The loan amount can be used as a proxy to determine the magnitude of cover.
  • A vulnerability index should be designed to determine the value of cover (a pre-survey may be needed to arrive at such an index), with pre-defined hazard and loss triggers.

The group, including originators, unanimously agreed to build protection for catastrophe risk, and participants were ready to be part of joint initiatives to bring some of these solutions to light.

[1] http://www.firstpost.com/india/natural-disasters-cost-india-3-30bn-in-2015-heres-why-we-should-be-very-worried-2622940.html
[2] http://timesofindia.indiatimes.com/india/Disasters-cost-India-10bn-per-year-UN-report/articleshow/46522526.cms
[3] http://timesofindia.indiatimes.com/business/india-business/Insurance-penetration-in-India-at-3-9-percent-below-world-average/articleshow/46518607.cms


Response to the Reserve Bank of India’s Consultation Paper on Peer To Peer Lending

By Linda George, IFMR Finance Foundation

Recently the Reserve Bank of India released a Consultation Paper on Peer to Peer Lending, aimed at defining the contours of regulating Peer to Peer (P2P) lending in India. Below we lay out our responses to some of the themes covered in the paper.

Theme: Systemic risks, risk of contagion
Potential risks: Currently, platforms do not create any threat of systemic risk, but there is a chance that they will grow to become systemically important.
Mitigatory steps considered by RBI: Treating these entities as NBFCs; applying a leverage ratio if needed.
Our response: If the platform does not take any credit exposure to its borrowers (through guarantees) or co-lend along with its lenders, it will be a pure marketplace and would not hold any credit risk on its balance sheet. In this regard, the approach taken by the UK’s Financial Conduct Authority (FCA) is worth exploring: prudential standards are based on the total value of the firm’s loaned funds outstanding, rather than the cumulative loans the firm has provided over its lifetime. The volume-based financial resources requirement calibration is provided in the Annex.

Theme: Information disclosures
Potential risks: A lack of information and/or other signals about the performance of the platforms will make it hard for participants as well as regulators to gauge their performance, quality of underwriting and size.
Mitigatory steps considered by RBI: Submission of regular reports to RBI.
Our response: Detailed reporting requirements should be established, covering at a minimum book size, default statistics, returns performance, and complaints and their resolution. The UK Peer to Peer Finance Association’s reporting requirements can be emulated[1] (these include disclosures on cumulative lending, outstanding loan book, net lending, number of lenders and borrowers, and so on).

Theme: Quality of underwriting, suitability assessments
Potential risks: Where information asymmetry[2] exists, the lender’s understanding of the risk involved in lending needs to be robust; the platform’s ability to assess the credit risk involved, and its implications for the lender, is therefore important.
Mitigatory steps considered by RBI: RBI has left this to the markets to decide.
Our response: P2P platforms must be required to uphold RBI’s Charter of Customer Rights, just as other credit providers are, and within this they should be required to undertake borrower debt-servicing capability analysis before borrowers are connected to lenders. Similarly, the platform must undertake an assessment to ensure that the unsophisticated lender (akin to a debt investor) is entering into a contract that is suitable for his/her financial condition, needs and goals. The UK’s FCA is considering introducing rules on suitability of advice, which will oblige firms to ensure their recommendations are suitable for their clients[3]. P2P platforms can be given access to credit bureau reports on their borrowers, provided the Credit Information Companies Regulations 2005 are suitably amended and robust data-sharing guidelines are followed.

Theme: Customer protection concerns for borrowers
Potential risks: Unfair treatment of the borrower, strong-arm tactics, and lack of information to the borrower about the redressal mechanism.
Mitigatory steps considered by RBI: RBI’s NBFC Fair Practice Code is to be applicable.
Our response: This is a welcome move, and it would ensure that both lenders and borrowers have access to clear grievance redressal processes. However, the Fair Practice Code itself needs to evolve into a code for all formal credit providers, so that it can take into account new types of intermediaries like P2P platforms.

Theme: Customer protection concerns for unsophisticated lenders
Potential risks: When individual lenders get involved, there is a concern that they may not be fully aware of the risks involved. Unclear processes could lead to opacity, with lenders being misled about defaulting borrowers.
Mitigatory steps considered by RBI: Not considered.
Our response: Individual unsophisticated lenders should have sufficient information to make informed decisions about whether or not to participate on the P2P platform. This includes obligations to inform lenders about any delinquencies in a timely manner, and to have a clear, disclosed process in place for delinquency management.

Theme: Risk of exclusion
Potential risks: Discrimination based on race, caste or gender might arise, leading to the exclusion of certain customers from correctly priced credit.
Mitigatory steps considered by RBI: Not considered.
Our response: It is very necessary from a regulatory point of view to ensure that these forms of discrimination do not exist in the system. This is, however, a problem that exists with all other forms of lending, and it could be compounded by the fact that the regulator has limited information on underwriting methodologies.

Theme: Privacy and data security risk
Potential risks: Privacy and data protection concerns with respect to personal and sensitive information of lenders and borrowers.
Mitigatory steps considered by RBI: RBI places this responsibility on the platform.
Our response: The platform needs to take steps to adhere to all existing data privacy norms, as well as comply with the Right to Privacy covered by the RBI Charter of Customer Rights.

Theme: Unsecured lending
Potential risks: Usurious and unfair practices may be adopted by P2P platforms if secured lending is allowed.
Mitigatory steps considered by RBI: RBI clarifies that P2P platforms will engage in only unsecured lending.
Our response: It is unclear why this restriction is necessary: as long as existing norms are not violated and customer protection requirements are met, if furnishing security can bring down the cost of borrowing, it should be permitted.

Regulating peer-to-peer lending platforms is necessary, as justified in the Consultation Paper, and it will enable a more trustworthy environment for both lenders and borrowers. Regulatory intervention should focus on creating a level playing field between P2P platforms and all other market players, to ensure better competition and more efficient allocation of resources across the system.


Annex: In the UK context, the volume-based financial resources requirement calibration placed by the FCA on P2P platforms[4] is the sum of:

  1. 0.2% of the first £50 million of the total value of loaned funds outstanding
  2. 0.15% of the next £200 million of the total value of loaned funds outstanding
  3. 0.1% of the next £250 million of the total value of loaned funds outstanding
  4. 0.05% of any remaining balance of the total value of loaned funds outstanding above £500 million
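As a sanity check, the tiered calculation above can be sketched in a few lines of Python (a hypothetical helper, not part of the FCA rules themselves; figures are in £ millions):

```python
def fca_volume_requirement(outstanding_gbp_m: float) -> float:
    """Volume-based financial resources requirement (in GBP millions)
    for a P2P platform, applying the tiered rates listed above to the
    total value of loaned funds outstanding."""
    tiers = [
        (50.0, 0.002),            # 0.2% of the first £50m
        (200.0, 0.0015),          # 0.15% of the next £200m
        (250.0, 0.001),           # 0.1% of the next £250m
        (float("inf"), 0.0005),   # 0.05% of any balance above £500m
    ]
    requirement, remaining = 0.0, outstanding_gbp_m
    for band_width, rate in tiers:
        band_amount = min(remaining, band_width)
        requirement += band_amount * rate
        remaining -= band_amount
        if remaining <= 0:
            break
    return requirement

# A platform with £300m of loaned funds outstanding would hold
# 0.2% x 50 + 0.15% x 200 + 0.1% x 50 = £0.45m, i.e. £450,000.
```

Marginal rates fall as the book grows, so the requirement scales sub-linearly with platform size.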

[1] http://p2pfa.info/wp-content/uploads/2015/09/Operating-Principals-vfinal.pdf

[2] Learning by doing with asymmetric information: Evidence from prosper.com. Freedman (2011). http://www.nber.org/papers/w16855.pdf

[3] https://www.fca.org.uk/news/innovative-finance-and-peer-to-peer-discussion-paper

[4] http://www.fca.org.uk/static/documents/policy-statements/ps14-04.pdf


Comments on the Indian Banking Sector at the Stanford India Conference

By Bindu Ananth

I had the opportunity to participate in the excellent conference organised by the Stanford Centre for International Development (SCID) on Indian Economic Policy, as a discussant for a presentation by Dr. Rakesh Mohan. My comments drew heavily from a forthcoming paper, “Modernising Indian Banking”, by my colleague Deepti George. The transcript is below; I would welcome thoughts/feedback:

  1. I want to start by thanking SCID for inviting me to this conference and for an opportunity to discuss this excellent, data-rich presentation on India’s financial sector reforms by Dr. Rakesh Mohan and Prof. Partha Ray. I will focus my comments on the banking sector and financial inclusion aspects.
  2. From all accounts, it appears that the traditional banking business model will face some threat going forward, starting with increased competition on the liabilities side. Low-cost deposits are a significant factor contributing to the risk absorption capacity of banks’ balance sheets, a hidden capital buffer of sorts. Raghuram Rajan, in his Committee’s Report of 2009, calls this the ‘grand bargain’[i]: cheap deposits were available to banks in an environment marked by low competitive intensity, in exchange for financing Governments and the Government’s priority sectors. This is unravelling, with significantly more competition expected for the CASA business. The competition is likely to be particularly sharp from the newly licensed category of Payments Banks, whose sole focus will be deposits and payments. Many of these new banks are expected to exploit adjacencies with their telecom businesses and significantly increase the outreach of the banking sector and the ease of depositing small amounts frequently. Even over the past few years, data shows that there has been a “flight of CASA” to a few banks that are perceived as strong. This “sorting” is happening even within Public Sector Banks and represents an important nuance to the authors’ observation on the reversal of convergence.
  3. As the presentation notes, a number of measures have been launched by the RBI and the Government, including the “Indradhanush” package announced in August 2015[ii]. However, most of these measures stop short of addressing the root cause of troubles in the banking sector. NPAs reflect the outcome of decisions made by banks several years ago, and while the current debates on provisioning levels are important for assessing the fair value of banks, they do not examine the conditions under which these assets were originated and monitored, nor how we ensure that these issues do not become recurrent themes in the Indian banking sector. A lot is attributed to bank ownership. However, ownership notwithstanding, there are several pragmatic measures that can improve the management of banks and oversight by the Board – these have not received much attention. Specifically, the adoption of three building blocks – risk-based pricing of loans, activity-based costing, and matched funds transfer pricing – will ensure more rational pricing of assets by banks even as the ownership issues get resolved.
  4. In addition to actions that can be taken by bank management, the supervisory regime has an important role to play in enabling banks to reveal more information on an ongoing basis about their true performance, and in creating an environment where more capabilities are built within banks. Over the past few years, there has been a definitive shift towards risk-based supervisory approaches. This has been driven by Basel II requirements on banking regulators to undertake the Supervisory Review and Evaluation Process (SREP) of supervised banks, which includes reviewing and evaluating the bank’s Internal Capital Adequacy Assessment Plan (ICAAP)[iii], conducting an independent assessment of the bank’s risk profile, and, if necessary, taking appropriate prudential measures and other supervisory actions, such as requiring banks, based on the severity of risks, to follow through on a prescribed Monitorable Action Plan (MAP)[iv]. However, the RBS framework is still backward-looking and partial in its approach: in looking at credit risk, for example, while it does for the first time extend to the performing book, it only examines rating migrations that have already taken place, and it uses only the top twenty assets as a measure of concentration risk.
  1. The Ministry of Corporate Affairs and the RBI have recognised the urgent need to converge Indian Accounting Standards with International Financial Reporting Standards (IFRS) – the RBI has recommended that Ind AS be adopted by scheduled commercial banks from April 1, 2018 onwards. IFRS 9 represents a significant move in that it will require provisioning to be computed on a forward-looking Expected Credit Loss (ECL) impairment model, even for the performing book. This is likely to result in significantly higher impairment provisions and therefore higher capital requirements. This will be a watershed moment for Indian banking.
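The mechanics of an ECL computation can be illustrated with a minimal sketch. The PD, LGD, EAD, and discount-rate figures below are invented for illustration; an actual IFRS 9 model would derive them from forward-looking, scenario-weighted data:

```python
# Minimal sketch of an expected-credit-loss (ECL) computation of the kind
# IFRS 9 requires even for performing loans. All parameters are hypothetical.

def expected_credit_loss(pd, lgd, ead, discount_rate, years=1):
    """12-month ECL (years=1) or a simplified lifetime ECL (years>1)."""
    ecl = 0.0
    survival = 1.0                   # probability the loan is still performing
    for t in range(1, years + 1):
        marginal_pd = survival * pd  # chance of first default in year t
        annual_loss = marginal_pd * lgd * ead
        ecl += annual_loss / (1 + discount_rate) ** t  # discount to present
        survival *= (1 - pd)
    return ecl

# Stage 1 (12-month) versus a 3-year lifetime ECL for the same exposure:
ecl_12m = expected_credit_loss(pd=0.02, lgd=0.45, ead=1_000_000,
                               discount_rate=0.10, years=1)
ecl_life = expected_credit_loss(pd=0.02, lgd=0.45, ead=1_000_000,
                                discount_rate=0.10, years=3)
print(round(ecl_12m), round(ecl_life))
```

Even in this toy version, the lifetime figure is materially larger than the 12-month one, which is why the move to ECL-based provisioning implies higher impairment charges.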
  1. The RBI, through its Framework for Revitalising Distressed Assets in the Economy[v], the Guidelines on the Joint Lenders’ Forum (JLF) and Corrective Action Plan (CAP)[vi], and the Strategic Debt Restructuring Scheme[vii], has put in place machinery for the rectification and restructuring of stressed assets on banks’ books[viii]. However, just as with standard assets, there is a need to recognise and incorporate expected losses into loss recognition for restructured assets.
  1. The RBI has also indicated its interest in moving to a dynamic provisioning framework, in which banks will need to make dynamic provisions equal to the difference between the long-run average expected loss of the portfolio for one year and the specific provisions[ix] made during the year. This will reduce P&L volatility[x].
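The arithmetic of this framework is simple enough to show directly. The figures below are hypothetical, but the structure follows the description above: the dynamic provision tops specific provisions up to the long-run average expected loss, building a buffer in benign years that is drawn down in stress years:

```python
# Hypothetical illustration of dynamic provisioning: the dynamic provision
# is the long-run average expected loss of the portfolio for one year minus
# the specific provisions made during the year, floored at zero.

def dynamic_provision(long_run_expected_loss, specific_provisions):
    """Dynamic provision to be made for the year (cannot be negative)."""
    return max(long_run_expected_loss - specific_provisions, 0.0)

# Benign year: few NPAs, so specific provisions are low and the
# dynamic buffer is built up.
assert dynamic_provision(100.0, 30.0) == 70.0
# Stress year: specific provisions exceed the long-run average; no new
# dynamic provision is made, and the buffer built earlier is drawn down.
assert dynamic_provision(100.0, 130.0) == 0.0
```

Total provisioning in both years is thus pulled towards the long-run average of 100, which is the source of the P&L smoothing.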
  1. With the accounting for financial assets moving towards better capturing the effects of potential impairment over the remaining life of the asset, through mark-to-market and expected-loss approaches, there is broad consensus that these measures will ensure adequate cover for expected losses in the form of better provisioning, while unexpected losses are to be covered by capitalisation[xi] and by more efficient use of banks’ capital. Overall, a move from the current equilibrium of high regulatory forbearance and low bank competencies to one of low forbearance and high competencies would be essential.
  1. I want to spend a few minutes on financial inclusion. Specifically, the differentiated banking design that was discussed has great potential for delivering sharp increases in financial inclusion while preserving stability. This category includes Payment Banks and Wholesale Banks (the discussion paper on the latter is awaited). After the last round of universal bank licensing, in which two new banks were licensed, it was clear that it is not possible to find a sufficient number of qualified candidates satisfying “fit and proper” requirements to significantly accelerate the number of banks in the system in the near term. An integrated banking regulation framework that permits differentiated banking business models appears desirable for a number of reasons:
  • There would be flexibility to approach payments, savings, and credit either independently (in a Vertically Differentiated Banking Design) or together (in a Horizontally Differentiated Banking Design) when the efficiency gains are high and the other costs are low. Concerns about finding fit-and-proper candidates would be far fewer in the case of vertically differentiated institutions, and licensing a relatively large number of them would consequently be far easier. These could, over time, also provide a pipeline for future universal banks.
  • The current fragmented regulatory structure creates far too many arbitrage and lobbying opportunities, and in the absence of a single unifying framework, measures are continually being taken to respond to them in a somewhat ad-hoc manner (such as higher capital adequacy norms for NBFCs combined with easier non-performing asset recognition norms and 100% risk weights).
  1. Even as new differentiated banking business models take root, there is a need to reimagine the role of universal banks as one in which they are no longer risk originators, particularly in high-risk segments, but rather risk aggregators, with the freedom to rebalance their portfolios based on risk profiles and diversification outcomes that each bank decides for itself.
  1. Given the progress on the JAM trinity and the emergence of differentiated banks, it may well be the case that the credit and payments strategies evolve differently within the broader financial inclusion strategy. While progress on credit would necessarily have to be much more measured and prudent, no matter what strategies are adopted, given the inherent risks and customer protection concerns, there is an urgent need to make access to payments ubiquitous, and this seems within striking distance. There is also a pressing need to create an architecture that allows information relating to customer behaviour – in particular, transaction histories with financial institutions, telecommunications companies, and utilities – to be captured and transmitted with high integrity, while simultaneously maintaining the highest standards of customer privacy. The development of this payments and information architecture will not only have enormous inherent value but could also be thought of as “highways” on which a more diverse credit intermediation system can be built. Fortunately, India already possesses the necessary tools to bring about this rapid change. We have a one-billion-plus-strong Unique ID database; a rapidly growing telecommunication network in rural areas with over a billion mobile phone users; expanding broadband connectivity that is expected to cover every village in the next 12 to 24 months; and multiple credit bureaus, all of which are very active.
  1. Getting the financial inclusion strategy on the credit side right has important consequences for the economy and the country. Despite the authors’ observation that financial depth at the national level has plateaued at roughly 60%, we continue to have fairly low levels of financial depth (credit-to-GDP) in several pockets of the country. States such as Bihar have an overall credit-to-GDP ratio of less than 16%, despite having one of the lowest levels of GDP in the country. It is arguable whether the binding constraint is the availability of credit or the opportunities available in the regional economy – but at very low absolute levels of credit availability, this may be a self-fulfilling prophecy. We need a banking sector that is capable of meeting the growth needs of all sectors and regions of the country, and for this, the sector has to exhibit a high degree of resilience and profitability.

[i] Chapter 4 from A Hundred Small Steps: Report of the Committee on Financial Sector Reforms, Government of India, 2009

[ii] Indradhanush Plan for Revamp of Public Sector Banks, Department of Financial Services, Ministry of Finance, August 2015

[iii] The ICAAP is a forward-looking risk-based process that is approved by banks’ boards and submitted to the RBI annually. It sets risk tolerance levels and lays out processes for managing and monitoring risks, stress testing and scenario analysis, and links back to a strategic plan for meeting current and future needs for capital and reserve funds given the risk tolerance levels.

[iv] RBI can require banks to modify or enhance risk management and internal control, reduce risk exposure to specific risk levels, achieve minimum CRAR levels above the minimum regulatory capital requirements, and so on.

[v] Early Recognition of Financial Distress, Prompt Steps for Resolution and Fair Recovery for Lenders: Framework for Revitalising Distressed Assets in the Economy RBI, January 30, 2014

[vi] Framework for Revitalising Distressed Assets in the Economy – Review of the Guidelines on Joint Lenders’ Forum (JLF) and Corrective Action Plan (CAP), RBI, September 24, 2015

[vii] Strategic Debt Restructuring Scheme, RBI, June 8, 2015

[viii] Master Circular on wilful defaulters, RBI, July 1, 2015

[ix] These are the provisions made for NPAs in accordance with existing RBI circulars

[x] B Mahapatra: Underlying concepts and principles of dynamic provisioning. Keynote address by B Mahapatra, ED, RBI at the Conference on “Introduction of dynamic provisioning framework for banks in India”, organised by CAFRAL, 21 September 2012

[xi] Ibid


“Open data improves the situation from a data privacy perspective” – Interview with S. Anand – Part 2

By Nishanth K, IFMR Finance Foundation

This post is a continuation of our earlier post about a conversation with S. Anand, Chief Data Scientist, Gramener. The earlier post covered the fundamentals of a good data visualisation and the nuances one has to keep in mind while undertaking such an effort. This post covers the challenges of public vs. private datasets, data privacy, and the open data movement.

In addition to working with individual organisations on data specific to them, Gramener has taken an interest in disseminating data relating to socio-economic issues such as Parliament elections, socio-economic census etc. What have been some of your personal experiences in working on public datasets? What are the challenges that you face when working with public datasets as opposed to private/organisation datasets?

Let’s talk about challenges. The good part about open and public data these days is that it is reasonably well structured. When comparing private and public datasets, there are three commonly discussed issues:

Data cleanliness/quality: There are always data collection issues, but I don’t see this as a private/corporate versus public data-source issue; I see it as a manual versus automatic data-collection issue. If I had to go to a bank in which account balances and transactions were entered manually, the data would be just as messy in a private organisation as in a public one. In institutions where data is collected through a system, it is obviously much better, irrespective of whether the institution is private or public. So it’s merely a question of the extent to which automation has entered a domain. Therefore, data cleanliness is not an issue that distinguishes public data from other kinds of data.

Availability of data: This is commonly brought up too. A lot of people say that public data is harder to get one’s hands on. In my experience, private data is no easier. When we are called in to do a project for a private organisation, we ask if particular pieces of data are readily available. The answers you get are not particularly different from the answers you get from a government agency, namely:

  • We don’t know if data is available
  • If it is available, we don’t know where it is or who has it
  • If it is available, then we don’t know what format it is in

Oftentimes, we have been asked by both government and private organisations to scrape their own websites. So availability is not an issue that distinguishes private from public data either.

Data linkages: Something that does distinguish the two is linkages. A lot of public data is not standardised by the entities that use or provide it. For example, is there a standard district code? The answer is ‘yes’ – not one, but several hundred standards. Is there a standard school code? Is there a standard village code?

Every organisational unit in the government tends to have a say in what standards it picks, and very often they pick differing standards. These differing standards can be seen even within organisations. For example, if I go to NCERT to collect information about students’ marks and information about infrastructure in a school, these two pieces of data cannot be merged because there is no common set of IDs. It is only now that the need for standardisation is being felt, because there have been several grassroots initiatives around standardisation. So the single largest problem in working with public data is that it is often difficult to link across datasets.

Many governments are moving towards an “open data” culture by making datasets publicly available in order to increase transparency (For example: data.gov.in). What are your thoughts on the impact of these movements and how crucial can visualisations be in making sense of such large volumes of publicly available government data?

Open data movements are clearly good in the sense that you now have access to more data and more can be done with it, barring privacy concerns. It also allows the government to outsource, or rather crowd-source, some of its analysis. Why should the government have to create a team that does analysis when it can get the public to do the same work? That certainly helps.

How does visualisation help? It can help us better understand things that are not obvious. To take an example: we were recently working with the Ministry of Finance on a project to help them understand the budget in a more intuitive way, in particular where departments under-spend and over-spend.

So we put together a standard tree-map visualisation made up of boxes, where each box’s size represents the size of a department’s budget and its colour represents the degree of under-spend or over-spend.


It is easy to see that one department is spending considerably more than others and that some departments spend considerably less. You can then break it down into various sub-departments to see where exactly the problem is coming from, move back up, and so on. These kinds of explorations make it easier to argue and debate, and we are no longer stuck in a situation where you have to understand raw numbers. The task is now simplified to looking at something and focusing on the conclusions. It becomes a lot easier to see what was otherwise a much more complex or intractable item. These visuals also help you explore the dataset in a much more intuitive way.

Another example: we were working on a semi-public dataset along with NCERT on the national achievement survey. The question here was: what influences the marks of students? Can we identify the social and behavioural characteristics that have an impact on a child’s marks? This was done on a reasonably large sample (100,080 children studying in class 8) across the country.

The complete analysis is available here.

Look at the table showing a variety of factors – for example gender, age, mode of education, reading books, and so on – and the influence each has on the total marks as well as marks by subject. Let’s take the number of siblings as an example. The number of siblings, it says, has a 2.4% impact on the total marks. How does that break up? What it shows us is that only children score more marks than children with one sibling, who in turn score more than children with two siblings, and so on. This does not necessarily indicate that having siblings hurts a child; it is probably just a correlation with various other economic factors. But we do know that the extent of influence that the number of siblings has is very real. We can look at the overall influence of each of these parameters to see which has a larger effect, which enables us to explore these relationships in more detail.

For instance, one of the things that we now know is that watching TV is not such a bad idea. In fact, if I look at the overall impact of watching TV across the various subjects, it shows that reading ability actually improves if children watch TV every day. On the other hand, mathematics ability drops dramatically if they watch TV every day. It tells us that watching TV roughly once a week is the sweet spot for scoring well in most subjects. If we look at how much of a difference playing games makes, it turns out to be almost the exact opposite: playing games improves mathematical ability considerably but actually hurts reading ability a bit. Of course, never playing is a bad idea. The extent to which you play has a different impact on different subjects.

This sort of analysis would not be possible if the data didn’t come out in the open. Even if this kind of data is available in the open, it requires a good visualisation for it to reach a wider audience.

With the increase in the amount of data being collected and shared by various organisations, what are your thoughts on data privacy?

My thought on this is that any data collection and capturing mechanism makes data privacy a serious issue. Open data, on the other hand, improves the situation from a data privacy perspective. I will give you an example. Take land records: who owns a particular piece of land is very useful information. Also consider data from the voter registry: who is eligible to vote is very useful information – at least for certain sets of people. If both these datasets are available to the government but not to anyone outside it, the government has more influence and power than the public. This effectively means that, to a certain extent, the incumbent party – or anyone who manages to get access to the data within the framework of the law – has more power than someone who doesn’t have access. The data has existed ever since people began writing down voter rolls with paper and pen.

Technology is what is raising the privacy question. Today we live in a world of incredible power and technology, where we have the ability to track where a person’s mobile is at any point in time, who they are calling, and so on. This information is certainly available to the ISPs and to any party that has the ability to subpoena it. Open data merely makes it available to a wider audience. So I see open data as a leveller that at least distributes the lack of privacy more uniformly.

When data is open, we have to enforce controls on it, and enforcing them in a reasonably uniform way means that the discussion is brought out into the open. What was earlier a privately debated and privately enforced policy is now going to be a publicly debated and publicly enforced policy. The fact that open data is bringing that discussion out into the open, while also making data access more uniform, is good. The privacy issues stem not from the data but from the existence of the technology itself. The case of the NSA and Edward Snowden has shown us that there exist authorities who have the ability to extract the data. The question now is how one governs these authorities. This discussion becomes easier if the same data is potentially, in some form or the other, available to anyone.

Aside from data visualisation as a method of disseminating information, you have also been recently talking about the emergence of Data Art. Can you tell us more about this?

It is certainly at a nascent stage. Data art, if I had to define it, is something that uses data to create art without any specific purpose or objective. This is relatively new, and people are dabbling in it in the same way that people have dabbled with new art forms in the past.

People who were purely focused on aesthetics are now paying more attention to data and how they could utilise it. For example, design schools such as NID and Shrishti are now talking about data visuals. We are also seeing clients who were earlier focused on aesthetics becoming interested in moving towards infographics. Media is a classic example: they were earlier focused on stories and narratives but are now moving towards infographics and, in a few cases, towards data-driven visual representations.

On the other hand, people who were focused largely on hard content and numbers are gently moving towards more visual representations as well. Financial analysts are now saying that they would like to see data in a visual representation, and companies are wondering if they can make their annual reports more pleasing purely from an aesthetic perspective. So right across the board there is clearly a growing appreciation of this intersection between aesthetics and data.

To give an example of data art, consider this visual:


This is directly drawn from data where each of these represents one song. The arc tells you the length of song and the completion of arc represent a total of six minutes. Within that, the strips represent the spectrogram of the song, effectively the frequencies and the beats of the song. So queen’s “We will rock you” has different beats in between that has a very different structure to Eric Clapton’s “Wonderful Tonight” which is remarkably uniform and homogenous. One could argue that this could be useful to understand the structure of the song but in reality it has no purpose. It was created because it could be done. In some sense, art is done because you can do it and because you feel good while you are doing it and not because there’s an audience in mind whose objective you want to satisfy.