Education

Eliminate all clear-text cardholder data from your network

By David Close
Futurex

Retailers and acquirers are always looking at better ways to secure transactions to reduce exposure of sensitive customer data. It's vital that the payment processing infrastructure combines security and performance to enable safe, secure commerce around the world, minimize audit risks, and protect customer data even in the event of a breach or hack. Two technologies, point-to-point encryption (P2PE) and tokenization—when used together—can help eliminate clear-text customer data from being anywhere on the network.

In this model, customer data remains encrypted throughout the entire payment process, thereby reducing its exposure. Data is encrypted at the initial point of capture using P2PE, decrypted within the secure boundary of a hardware security module (HSM), and re-encrypted using a transfer key for payment validation by the processor, while simultaneously having a token generated for storage and future use. This is also a sophisticated, easily implemented tool for reducing compliance scope.

Encrypting at every hop

In 2011, the PCI Security Standards Council released a specification called PCI Point-to-Point Encryption to provide governance and guidance for encryption of cardholder data. It outlined how financial services providers encrypt and decrypt cardholder data and handle key management within secure cryptographic devices.

It's called point-to-point encryption, rather than end-to-end encryption, because you typically encrypt data from one point to another, and then either decrypt it or re-encrypt it under another key for the next point, or hop. Encrypting at every hop eliminates a potential weak link. If malware were on the card reader, for example, the sensitive cardholder data would still be encrypted.

The PCI Data Security Standard, which governs the infrastructure that cardholder data traverses, contains many requirements: cardholder data cannot be stored in the clear, for example. If retailers store data in a database, it must be stored encrypted in a data protection environment. However, the data protection environment limits what you can do with the data, and it introduces risk if you're exporting that data to handle functions such as chargebacks or account lookups, because you have to decrypt the data. This is where tokenization comes in.

Combining P2PE and tokenization

Tokenization is a representation of data, using cryptographically generated substitute characters as placeholder data to preserve the data format. Tokenization protects PAN data in storage by removing it altogether, replacing it with an identifier known as a token. In a typical financial application of tokenization, a payment transaction occurs, and the merchant only retains the token. The token is linked to that cardholder account and, by itself, has no intrinsic value. The transaction token can be used by the processor to look up the PAN needed to process the appropriate transaction.

Since tokenized data is random and valueless, it's typically not subject to the same compliance requirements as clear-text payment data and can help reduce PCI DSS compliance scope. Widespread adoption of tokenization in payments ushered in substantial increases in security and an overall reduction in compliance costs.

Vaulted versus vaultless tokenization

Vaulted tokenization requires large databases mapping tokens to their corresponding clear data. In this model, detokenization requires the database to be queried with a token to retrieve the original data. Token vaults represent a high-risk target for theft since they contain clear cardholder data.

Vaultless tokenization eliminates the need for a vault or master token database, providing strong cryptography to secure data at rest. Vaultless tokenization resolves the problem of storing encrypted data by protecting it while still allowing the data to be used for everyday functions in a secure way.

When you combine P2PE and tokenization, you're minimizing the risk of exposing customer data by storing it in a tokenized format and protecting it at every point of interaction. Innovative acquirers are deploying P2PE and tokenization simultaneously; large-scale retailers are typically adopting one or the other.

The payments industry is complicated, and the current ecosystem is fairly segmented. Most financial organizations today use separate systems for encrypting and tokenizing data. Essentially, what's needed is for the entire infrastructure—terminals to HSMs—to support P2PE and tokenization, the security needed to handle billions of transactions daily.
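The combined flow described in this article—encrypt at the point of capture, decrypt and re-encrypt per hop inside an HSM boundary, and hand the merchant only a format-preserving token—can be sketched in a few lines. Everything below is illustrative: the key names, the vault dictionary, and the toy XOR "cipher" are stand-ins for the AES operations and PCI-certified hardware a real deployment would use.

```python
import hashlib
import secrets

def toy_encrypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR data with a SHA-256-derived keystream: a toy stand-in for the
    AES encryption a real P2PE terminal or HSM would perform."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

toy_decrypt = toy_encrypt  # XOR keystream ciphers decrypt by re-encrypting

# Hypothetical keys; in practice these exist only inside terminals and HSMs.
terminal_key = secrets.token_bytes(32)   # P2PE key shared with the acquirer HSM
transfer_key = secrets.token_bytes(32)   # key for the next hop to the processor

token_vault: dict[str, str] = {}         # vaulted model: token -> clear PAN

def capture(pan: str) -> tuple[bytes, bytes]:
    """Point of capture: the PAN is encrypted immediately, never stored clear."""
    nonce = secrets.token_bytes(16)
    return nonce, toy_encrypt(terminal_key, nonce, pan.encode())

def hsm_process(nonce: bytes, ciphertext: bytes) -> tuple[bytes, bytes, str]:
    """Inside the HSM boundary: decrypt, re-encrypt under the transfer key
    for the next hop, and generate a format-preserving token for storage."""
    pan = toy_decrypt(terminal_key, nonce, ciphertext).decode()
    hop_nonce = secrets.token_bytes(16)
    hop_ct = toy_encrypt(transfer_key, hop_nonce, pan.encode())
    token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
    token_vault[token] = pan             # detokenization lookup for the processor
    return hop_nonce, hop_ct, token

nonce, ct = capture("4111111111111111")
hop_nonce, hop_ct, token = hsm_process(nonce, ct)
# The merchant retains only `token`; the PAN moves on encrypted under transfer_key.
```

Note how the clear PAN appears only inside `hsm_process`, mirroring the article's point that decryption happens solely within the secure boundary; a vaultless variant would derive the token cryptographically instead of keeping the `token_vault` mapping.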
David Close is chief solutions architect at Futurex, a trusted provider of hardened enterprise data security solutions. He is a subject matter expert in enterprise key management best practices and systems architecture and infrastructure design. Contact him at linkedin.com/in/davidclose or www.futurex.com.