Thoughts from Money 20/20: Tokenisation – the word on everyone’s lips…

2nd November 2015

Tokenisation – the word was on everyone’s lips at Money 20/20 last week in Las Vegas. But try to find consensus on what ‘tokenisation’ means and you will struggle – different players in the financial services sector all have their own interpretations.

To online merchants, tokenisation is chiefly about protecting card-on-file data against the consequential losses of a breach. The need is real: in the UK this week, a major hack of a mobile network operator 'liberated' somewhere between 1 and 4 million sets of card and bank details, and the company's CEO could not say whether the data had been encrypted. If this is typical of the way online merchants handle customer data, there is a long way to go – and of course tokenisation only minimises the impact after the event; it does not prevent breaches from happening.
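The card-on-file idea can be sketched in a few lines: the merchant's database stores only a random surrogate, so a breach yields tokens with no intrinsic value. A minimal illustration in Python (all names hypothetical – a real scheme uses a hardened token vault and format controls, not an in-memory dictionary):

```python
import secrets

class TokenVault:
    """Toy card-on-file vault: maps random surrogate tokens to real PANs."""

    def __init__(self):
        self._token_to_pan = {}

    def tokenise(self, pan: str) -> str:
        # Surrogate keeps the last four digits for display; the rest is random.
        token = secrets.token_hex(8) + pan[-4:]
        self._token_to_pan[token] = pan
        return token

    def detokenise(self, token: str) -> str:
        # Only the vault holder can recover the real PAN.
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenise("4111111111111111")
# The merchant stores only `token`; a breach of its database exposes no PANs.
assert vault.detokenise(token) == "4111111111111111"
```

The point the sketch makes is the one in the paragraph above: stolen tokens are worthless without access to the vault, but the vault does nothing to stop the theft itself.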

To retail banks, tokenisation holds out the promise of consolidating customer relationships across diverse business lines into the elusive 'single sign-on': any credential the customer chooses to present resolves, subject to appropriate authentication, into a token giving access to every service the bank provides to that customer, be it checking account, loans, mortgages or credit cards. This joined-up approach could improve customer service, and therefore loyalty and retention, while also reducing costs and generating incremental business through enhanced cross-selling.
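That resolution step – many credentials mapping to one customer token that in turn unlocks all business lines – can be sketched as a pair of lookups. A toy Python illustration (identifiers and data entirely hypothetical; a real bank would authenticate against an identity provider, not a boolean flag):

```python
# Any registered credential resolves to one customer token; the token
# then grants access to every service held by that customer.
CREDENTIAL_INDEX = {
    "card:4111111111111111": "cust-001",
    "account:12-34-56/9876543": "cust-001",
    "email:jane@example.com": "cust-001",
}
SERVICES = {"cust-001": ["checking account", "mortgage", "credit card"]}

def resolve(credential: str, authenticated: bool) -> list[str]:
    # Resolution is subject to appropriate authentication.
    if not authenticated:
        raise PermissionError("authentication required")
    customer_token = CREDENTIAL_INDEX[credential]
    return SERVICES[customer_token]
```

However the customer identifies themselves, the bank sees the same customer token – which is what makes the cross-selling view possible.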

But it is in the emerging mobile payments business that tokenisation is creating the biggest buzz – and potentially the most disquiet – as issuing institutions absorb the consequences of the decisions that have brought them to today's 'x' Pay models, launched or about to launch. Teaming with the card networks for tokenisation services, coupled with app provisioning, was undoubtedly the expedient way to get these Pay models off the ground quickly, but the approach has downsides.

The implications of centralised tokenisation and provisioning take a little more thought to follow through. The greatest is what we heard called the 'de-tokenisation dependency': the flip side of tokenisation for app provisioning is de-tokenisation for transaction authorisation, a service that only the holder of the token-PAN mapping can perform. Tokenising within a card network therefore requires that every transaction from a device holding that token flow through the same network on its journey between merchant and issuer: as holder of the token's EMV keys and the token-PAN mapping, the network is the only place where authentication and de-tokenisation can take place.
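The dependency can be made concrete with a toy model of the authorisation step (all names hypothetical, and an HMAC stands in for the real EMV cryptogram): because the network alone holds both the device keys and the token-PAN mapping, only the network can verify the transaction and recover the PAN for the issuer.

```python
import hashlib
import hmac

# Held only by the token-providing network.
TOKEN_TO_PAN = {"tok-42": "4111111111111111"}
DEVICE_KEYS = {"tok-42": b"provisioned-session-key"}

def device_cryptogram(token: str, amount: int) -> str:
    # The phone signs the transaction with its provisioned key
    # (HMAC here as a stand-in for an EMV application cryptogram).
    msg = f"{token}:{amount}".encode()
    return hmac.new(DEVICE_KEYS[token], msg, hashlib.sha256).hexdigest()

def network_authorise(token: str, amount: int, cryptogram: str) -> str:
    # Only the key and mapping holder can authenticate and de-tokenise.
    expected = device_cryptogram(token, amount)
    if not hmac.compare_digest(expected, cryptogram):
        raise ValueError("authentication failed")
    return TOKEN_TO_PAN[token]  # real PAN forwarded to the issuer
```

Because nothing outside `network_authorise` can turn `tok-42` back into a PAN, every transaction on that token is locked to the network that issued it – which is the routing constraint discussed next.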

This constraint has many potential consequences for issuers. For example, transactions that would previously have flowed over a competing network must now flow through the token-providing network – no routing choice (in the spirit of Durbin) is possible, and a network fee is incurred each time. And as keeper of the keys, the network now performs the cryptographic processing for transaction authentication – a chargeable service that many issuers, having initially opted for 'on-behalf-of' processing for physical EMV card authentication, later brought in-house with significant cost savings. While initial tokenisation may be 'free for ever', the downstream costs could turn out to be considerable, particularly as mobile issuance and contactless acceptance grow.

Proxama’s view of this landscape is one of evolution, from the pragmatic but ultimately costly and constraining entry-level scenario that we have today, to a position where issuers and processors are able to perform tokenisation in-house and to provision directly to the ‘x’ Pay servers. The ‘x’ Pay providers want and need to maximise participation, and so are likely endorsers of this approach. What is needed now is for major players to confirm their intentions to follow the independent tokenisation route and to work with the industry to move us on from today’s starting position to a fully diversified provisioning environment. Independent software vendors like Proxama will be supporting this move with the in-house or cloud-based platforms needed to achieve this goal.

Nigel Beatty, VP Global Business Development, Proxama