
What is Tokenization – and How Does It Influence PCI DSS Compliance?

February 3, 2022

For years now, tokenization has been the hot topic in the payments industry. Around the world, millions of businesses are now using tokenization to help secure payment data, sensitive transactions, and information such as credit card numbers, bank account numbers, and personally identifiable information (PII).

The revolutionary quality of modern digital tokens is that your data retains its full value to you while losing all value to anyone who steals it. You can use a token for business purposes just as you would the original piece of sensitive information, yet if a hacker or data thief steals your token, it is absolutely worthless to them.

By using tokens in place of sensitive data, your company dramatically mitigates the risk posed by data breaches, data loss, misconfigured cloud resources, computer hacking, and more. There were some huge data breaches in 2021; don’t let the same happen to your business in 2022!


What Is Tokenization?

Tokenization is the systematic process of replacing original, sensitive information with a non-sensitive placeholder value called a token. Tokens are randomly generated and have no intrinsic value on their own. Tokens may or may not preserve the format of the original data; with VGS, you can select “Format Preserving Tokens”. A token can only be “de-tokenized”, or mapped back to the original sensitive data, through the tokenization platform that created it.
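For illustration, here is a minimal, in-memory sketch of that tokenize / de-tokenize flow in Python. The class and method names are hypothetical; a production vault (VGS’s included) is a hardened, access-controlled service, not a dictionary in memory.

```python
import secrets

class ToyVault:
    """Illustrative only: maps random tokens back to the original values."""

    def __init__(self):
        self._store = {}  # token -> original sensitive value

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(16)  # random; no mathematical link to the value
        self._store[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._store[token]  # only the vault can map a token back

vault = ToyVault()
token = vault.tokenize("4111 1111 1111 1111")
print(token)                    # e.g. tok_9f2c... useless on its own
print(vault.detokenize(token))  # the original value, recoverable only via the vault
```

A format-preserving token would instead be generated to look like the original data (for a card number, sixteen digits with a valid checksum) so that downstream systems that validate formats keep working unchanged.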

A comparison with encryption helps clarify the advantages of tokenization. Encryption, historically the preferred technique for safeguarding sensitive material, is the process of transforming original “plaintext” information into unreadable “ciphertext” with an algorithm and a key; the ciphertext can be transformed back into plaintext via decryption using the corresponding key.

When you send encrypted data across the Internet, the ciphertext still contains the original data, hidden behind a specific mathematical relationship. A hacker with a stolen decryption key, or with enough computing power to crack the algorithm, can recover it. Tokens are not reversible in this way: a token does not contain the “real” data at all, but merely serves as a reference to it, while the actual data sits safely in a secure vault.
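To make the contrast concrete, here is a hedged sketch using the third-party cryptography package: the ciphertext can always be turned back into the card number by anyone holding the key, while the token below is just a random string with no mathematical relationship to the data.

```python
import secrets
from cryptography.fernet import Fernet  # pip install cryptography

card = b"4111 1111 1111 1111"

# Encryption: reversible by design for anyone who obtains the key.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(card)
print(Fernet(key).decrypt(ciphertext))  # stolen key => original card number

# Token: a random reference; there is no computation that recovers the card number from it.
token = "tok_" + secrets.token_hex(16)
print(token)  # the only mapping back lives inside the tokenization vault
```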

Ultimately, this distinction is crucial. When information such as cardholder data is encrypted, it remains within the scope of the Payment Card Industry Data Security Standard (PCI DSS), because the ciphertext still contains the cardholder data. When the same information is tokenized, it no longer contains cardholder data and falls outside PCI DSS scope. By extension, the same logic applies to an entire database.

Therefore, tokens are widely considered a safe way of storing and transmitting sensitive information, because if stolen, intercepted, or leaked, tokens have no exploitable meaning.


History of Tokenization

As a concept, tokenization is not new. Money itself is an intermediary token, or medium of exchange, representing a specific value to be traded for goods and services. It was invented to take economies “beyond barter” (say, a dozen apples for a dozen bananas) and the inconvenient “coincidence of wants” (are you tired of apples yet?). Other common examples, one step removed, are poker chips and subway tokens.

As modern information technology took off in the late 20th century, the notion of digital tokenization naturally took hold. It was an elegant solution to the problem of storing, processing, and exposing sensitive data to one or more untrusted people, databases, or systems.

In e-commerce, an early example occurred in 2001, when the “classmates.com” website adopted a clever tokenization solution to protect sensitive payment data and shield PII from the risk of a data breach. The firm TrustCommerce devised a system to process Classmates’ payments and then provide its customers with a digital token, subsequently sent to Classmates as proof of purchase.

More recently, much of the world transitioned from swiping payment cards to inserting them into a chip reader, to prevent bad actors from skimming or duplicating credit card information. Similarly, tokenization is now used to mitigate online fraud and protect against data breaches.


Benefits of Tokenization

Today, tokenization is a highly refined technology, widely used by merchants and service providers as a safe and cost-effective way to protect payments, sensitive data, and PII. Further, the PCI Security Standards Council (PCI SSC) officially recognizes tokenization.

Here are some of the primary benefits tokenization offers your company:

  • Minimal Contact: Original payment data is safely vaulted in the cloud, while third parties only receive and process tokens.

  • Data Breach Mitigation: Tokens, by themselves, are not sensitive and are worthless to data thieves if they are misplaced, stolen, or hacked.

  • Flexibility: Tokenization increases the utility of your sensitive data and may be used for single-use or multi-use cases, such as card-on-file transactions for easier checkout.

  • Compliance: Tokenization reduces your store of original, sensitive data, as well as your scope under security and privacy standards and regulations like PCI DSS, CCPA, SOC 2, and GDPR.


Tokenization in Practice

There are countless use cases for tokenization. Here are two common ways merchants and service providers use it for digital information security.

  • Cardholder Data (CHD): Because tokens cannot be reverse-engineered into the underlying data, tokenization has become a prevalent method to protect cardholder data, including debit and credit card numbers.

  • Payment Data: Banks, e-commerce retailers, and service providers worldwide have adopted tokenization as a way to process payments and store payment data for future transactions. As a result, the risk of data breach, loss, or theft is significantly reduced.
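As a rough sketch of the card-on-file pattern above, the example below uses a made-up FakePaymentsClient; the names and behavior are illustrative, not any real provider’s SDK. The point is that the merchant stores and replays only the token, never the card number.

```python
import secrets
from dataclasses import dataclass, field

@dataclass
class FakePaymentsClient:
    """Stand-in for a real payments SDK; everything here is illustrative."""
    _vault: dict = field(default_factory=dict)

    def tokenize(self, card_number: str) -> str:
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = card_number  # the real PAN stays on the provider's side
        return token

    def charge(self, token: str, amount_cents: int) -> None:
        pan = self._vault[token]  # de-tokenization happens inside the provider
        print(f"charged card ending {pan[-4:]} for {amount_cents} cents")

payments = FakePaymentsClient()

# First checkout: tokenize once, store only the token in your own database.
stored_token = payments.tokenize("4111111111111111")
payments.charge(stored_token, 2599)

# Later card-on-file purchase: replay the stored token; no card number re-enters your systems.
payments.charge(stored_token, 1299)
```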


Limits of Tokenization

It is important to understand, however, that tokenization is not a silver bullet for every business requirement or every security standard. Tokens reduce the number of system components that are in scope and to which compliance applies, but tokenization is only one facet of the larger data security program needed to achieve PCI DSS compliance.

Compliance scope applies wherever sensitive information exists in its original form, from the point of capture through transfer, processing, and exchange. Thus, if your entire solution, including the original data, resides on your corporate network, hackers may have numerous opportunities to steal it, or even to commandeer your tokenization system itself.


The VGS Zero Data™ Approach

If your business would like to descope from PCI, VGS offers an end-to-end security solution that leverages a system of aliases (a sophisticated VGS token) and proxies to ensure that original, sensitive data never touches your corporate systems.

The VGS approach to information security is called Zero Data™, and it dramatically reduces your compliance scope while mitigating the risk of data breach, loss, leakage, theft, misplacement, or misconfiguration. It lets your business offload the responsibility for securing sensitive data, along with the liability of losing it to data thieves and hackers.

When businesses implement VGS solutions to handle their sensitive information, they instantly inherit VGS’s best-in-class security posture, which enables them to fast-track compliance and certifications like PCI and SOC 2. Further, for any business purpose, our clients can still operate on VGS aliases – retaining all of the utility, portability, and value of the original data without the liability.
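As a loose illustration of the alias-plus-proxy idea (not VGS’s documented API; the proxy URL, credentials, processor endpoint, and payload fields below are all assumptions), a backend that holds only an alias can route its outbound call through a forward proxy that swaps the alias for the real value in transit:

```python
import requests  # pip install requests

ALIAS = "tok_sandbox_abc123"  # the only value your database ever holds

# Hypothetical forward-proxy URL; a real setup would also trust the proxy's CA certificate.
OUTBOUND_PROXY = "https://USERNAME:PASSWORD@your-vault-proxy.example.com:8080"

response = requests.post(
    "https://api.example-processor.com/v1/charges",  # hypothetical processor endpoint
    json={"card_number": ALIAS, "amount": 2599},     # the alias leaves your system...
    proxies={"https": OUTBOUND_PROXY},               # ...and the proxy re-injects the real value
    timeout=10,
)
print(response.status_code)
```

The design benefit is that your servers, logs, and databases only ever see the alias, which is what keeps them out of compliance scope.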

VGS, a Visa- and a16z-backed software solutions provider, is the trusted data custodian for over 1,000 clients, and we can handle 100% of your data capture, transit, vaulting, and exchange. So let VGS handle data security while you focus on your company’s growth.


Start Tokenizing Now

Create your free account to instantly gain access to our flexible Tokenization API; in under five minutes, you’ll be protecting your most critical data.


Developer Resources

The VGS Platform is built for developers. Here are quick links to our comprehensive guide to Getting Started with Tokenization and our Tokenization ‘Go Live’ Guide.

For even more developer questions and deeper discussions, check out our FAQs — or if you want it all, access the complete VGSdocs repository.

Kenneth Geers, PhD

Information Security Analyst at VGS

