The right to explanation of creditworthiness assessment – the first such law in Europe



Thanks to Panoptykon’s initiative, bank customers in Poland will have the right to receive an explanation of their creditworthiness assessment. It is the first right of this kind in Europe and a higher standard than the one envisioned in the GDPR.

Over 40% of Poles have a bank loan. So far, in their relations with banks, they have been cast in the role of petitioners who – seeking much-needed funds – were naturally in the weaker position. That manifested itself, among other things, in the fact that banks could demand that clients present any information connected with their situation and the purpose of the credit, and could obtain information from other sources. Apart from the generally binding principles of personal data protection, there were no other restrictions in that respect. As a result, a client who was denied a loan could only guess what the problem was: income, the form of employment, or perhaps a liability not paid on time? Owing to the amendments to the Polish banking law fought for by Panoptykon Foundation, that will change: clients of Polish banks will be able to check which factors were decisive in the assessment of their creditworthiness.

More than GDPR

In accordance with the newly added Article 70a of the banking law, a consumer will be able to obtain “information on the factors, including personal data, which affected the evaluation of their creditworthiness”. That right is vested in the consumer irrespective of whether the credit decision was automated and regardless of its outcome.

It is a precedent in Europe: no other country has thus far gone beyond the standard required by the GDPR, which guarantees transparency only for automated decisions. In credit decisions, the line between the assessment made by the algorithm and the final decision made by an analyst may be blurred. It is difficult to say in how many cases the analyst is actually able (given the time available or his or her competences) to critically evaluate what was “spat out” by the system.

Moreover, irrespective of the degree of human involvement, a credit decision is based on an advanced analysis of personal data and on the profiling of customers. From that perspective, extending the right to explanation to all decisions based on profiling and on so-called big data is a very good solution. Another argument for granting consumers the right to an explanation of their creditworthiness assessment with each credit decision – not only an automated one – is the fact that Polish entrepreneurs have enjoyed such a right for several years. Why should a Mr Smith who does not run a business be in a less favorable position than an entrepreneur?

What can be found in the explanation of creditworthiness assessment?

The right to explanation encompasses the factors – including personal data – which affected the creditworthiness assessment. The bank therefore does not need to provide a full list of factors taken into account in the process, but it has to disclose all those which had an impact on the final decision. An example explanation might read as follows:

The negative result of the creditworthiness assessment was affected by the following factors: form of employment (employment agreement for a limited time period); type of work performed (occasional employment); amount of income in combination with the number of dependants (two children and one parent declared as dependants in the household); a delay in repayment of another loan, overdue by more than 60 days and confirmed in the database of the credit information bureau; and an unauthorized debit balance in the savings and checking account.

From the clients’ perspective, it will be crucial that banks point to the specific details on the basis of which the creditworthiness assessment was carried out. It will therefore not be enough to state that the basis for the negative assessment was, for instance, income: the bank will be obliged to disclose what amount of income it took into account in its analysis. This creates room for dialogue and a chance to correct mistakes (e.g. a missing zero in the amount of income or an outdated report from a credit information bureau). In the long term, it also serves as valuable guidance for clients who wish to increase their credibility in the eyes of banks: the information received may become an impulse to repay liabilities on time or to seek another form of employment.
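To make the disclosure rule above concrete: the bank considers many factors but must disclose only those that actually affected the decision, together with the specific detail relied on. The sketch below is purely illustrative – the data structure, factor names, and figures are hypothetical, not taken from any bank’s system or from the law itself.

```python
# Hypothetical sketch: from all factors considered, disclose only
# those that affected the final decision, with the specific detail.

def build_explanation(factors):
    """Return the factors that contributed to the decision,
    each with the concrete detail the bank relied on."""
    return [f for f in factors if f["affected_decision"]]

factors = [
    {"name": "form of employment",
     "detail": "employment agreement for a limited time period",
     "affected_decision": True},
    {"name": "monthly income vs. dependants",
     "detail": "PLN 3,500 declared, three dependants in the household",
     "affected_decision": True},
    {"name": "place of residence",
     "detail": "Warsaw",
     "affected_decision": False},  # considered, but not decisive: not disclosed
    {"name": "delay in repayment of another loan",
     "detail": "overdue more than 60 days (credit information bureau record)",
     "affected_decision": True},
]

for f in build_explanation(factors):
    print(f"- {f['name']}: {f['detail']}")
```

Disclosing the concrete detail (e.g. the income amount actually used) rather than the bare factor name is what opens the room for dialogue described above, such as spotting a missing zero or an outdated bureau record.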

Translating the law into banking practice

The new regulations will undoubtedly strengthen the customer’s position vis-à-vis the bank. For each automated credit decision, the customer will have the right to request an explanation (“as to the grounds of the decision reached”), to challenge the decision and to obtain human intervention. For each decision issued with the participation of a bank employee, the client will be able to demand an explanation, including the specific personal data which affected the final verdict. These are two independent procedures, safeguarding a high standard of transparency and data protection.

According to the Article 29 Working Party’s Guidelines WP251, banks should “find a simple way to tell the data subject about the rationale behind, or the criteria relied on in reaching the decision”. We count on high standards of transparency and on good practices showing how to respond to borrowers’ requests for an explanation of the decisions issued in their cases.

The new right to explanation covers not only “personal data” but also other “factors” which affected the decision reached by the bank. Here, the actual standard of transparency will be shaped by court jurisprudence. The newly introduced provisions offer a chance to scrutinise how the algorithms used in creditworthiness assessment work. We will not discover their logic, as banks treat it as proprietary information, but – on the basis of individual cases – we will be able to check what data (not only personal) was fed into the system and what came out in the form of the final evaluation. That is a good starting point for a discussion on the models and assumptions underlying these algorithms, in particular their potential to discriminate against certain categories of clients (e.g. single mothers, persons employed under civil law agreements, or foreigners).
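One way the case-by-case transparency described above could be put to use: if explanations from many individual cases are collected, outcomes can be compared across categories of clients to spot patterns that warrant a discrimination inquiry. The sketch below is a hedged illustration only – the categories, case records, and figures are hypothetical, and a real disparate-impact analysis would need far more data and statistical care.

```python
# Hypothetical sketch: compare loan approval rates across client
# categories, using outcome records gathered from individual cases.
from collections import defaultdict

def approval_rates(cases):
    """Compute the share of approved decisions per client category."""
    totals = defaultdict(int)
    approved = defaultdict(int)
    for category, outcome in cases:
        totals[category] += 1
        if outcome == "approved":
            approved[category] += 1
    return {c: approved[c] / totals[c] for c in totals}

# Invented records: (client category, decision outcome)
cases = [
    ("employment agreement", "approved"),
    ("employment agreement", "approved"),
    ("employment agreement", "denied"),
    ("civil law agreement", "denied"),
    ("civil law agreement", "denied"),
    ("civil law agreement", "approved"),
]

rates = approval_rates(cases)
for category, rate in rates.items():
    print(f"{category}: {rate:.0%} approved")
```

A persistent, large gap between categories would not by itself prove discrimination, but it is exactly the kind of signal that the new provisions make it possible to surface from individual cases.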

Wojciech Klicki, Katarzyna Szymielewicz

Support our work! Donate to Panoptykon Foundation