Learn about Tokenization

What is Tokenization?

Tokenization definition

Generally, tokenization is a process in which original data is replaced with a non-sensitive equivalent. This equivalent is usually a string of randomly generated characters, known as a token, created during the tokenization process.

Tokens replace sensitive data, such as credit card numbers, PII, or Social Security numbers, with a string of symbols that does not reveal the real cardholder information. This makes it possible to use credit cards for payments without the risk of exposing sensitive data, which makes tokenization solutions a great fit for the payment card industry.
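To make the replacement concrete, here is a minimal Python sketch (a hypothetical illustration, not any vendor's actual implementation) that swaps a card number for a random token and records the one-to-one mapping in an in-memory stand-in for a vault:

```python
import secrets

# In-memory stand-in for a token vault: token -> original value.
vault = {}

def tokenize(sensitive_value: str) -> str:
    """Replace a sensitive value with a random, meaningless token."""
    token = "tok_" + secrets.token_hex(8)  # e.g. "tok_9f2c4e1ab07d3c55"
    vault[token] = sensitive_value         # one-to-one mapping kept server-side
    return token

token = tokenize("4111 1111 1111 1111")
print(token)         # safe to pass around; reveals nothing about the card
print(vault[token])  # only the vault can map the token back
```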

Additionally, tokenization helps companies reduce the cost of meeting security standards like PCI DSS and government regulations, since using tokens minimizes the amount of data a business needs to store securely while keeping credit card transactions safe.

You can learn more about tokenization and its benefits here.

Token definition

A token is the piece of data that replaces the original during the tokenization process. At its core, a token is a random string of characters that serves as a unique identifier for the information it replaces in a one-to-one exchange (sensitive data for non-sensitive data).

Tokens enable secure data storage in a token vault and payment transaction processing within the system, which means sensitive data never reaches third parties.

By applying encryption during token generation and storage, tokenization has become one of the most efficient and secure ways to transfer sensitive information. Moreover, tokens make the entire payment process more secure and easier for customers.

Tokenization in payment processing provides strong transaction security and excellent protection against fraud, which is especially relevant today, when data leaks surface from time to time.

There are three main types of tokens:

  • Currency or payment tokens. These tokens are created as a means of payment for goods and services not tied to the platform where they are stored.
  • Asset or security tokens. These tokens provide income payments and investment opportunities; economically, they are similar to bonds and stocks.
  • Utility tokens. These tokens are digital assets for use within a certain platform, product, or service.

Who invented tokenization?

Tokenization is not a brand-new idea. In 2001, TrustCommerce developed the concept of tokenization to protect sensitive payment data for Classmates.com.

To minimize the risk of data leakage from a hack, TrustCommerce created TC Citadel. This system used a token in place of cardholder data, while TrustCommerce processed the payment on behalf of the merchant. This made it possible to securely make recurring payments without storing the original cardholder data.

With tokenization, card numbers and other sensitive data are replaced with random strings of characters that have no value to a hacker, even if intercepted.

Amazon was also among the first to use tokenization. That approach eliminated the need to request user data for every payment, which made customer service easier.

The concept of tokenization dates back to the 1970s as a means of isolating real data elements from other data systems. Early currency payment systems used tokenization to minimize risk when dealing with high-value financial instruments by replacing them with surrogate equivalents: tokens.

What does tokenization do?

The main purpose of tokenization is to replace confidential data with randomly generated data, that is, to create a token.

In many cases, using tokens improves and secures payment processing. It also significantly reduces the scope of data compliance requirements, because tokenization irreversibly replaces sensitive data with a token. For example, exchanging a Primary Account Number (PAN) for a token makes it easier to comply with PCI DSS requirements.

In addition, tokenization simplifies data access management, preventing third parties from de-tokenizing sensitive data. If the data is stored in the cloud, tokenization ensures that only authorized consumers can access the confidential data.

Also, when processing payments and working with third-party vendors, software providers, and service providers that need access to data, tokenization minimizes the risk of data leakage by keeping the original data in a token vault.

During tokenization, the original data is replaced with randomized elements (tokens), which increases cybersecurity and protects sensitive data from potential intruders.

There are three main approaches to tokenization (a code sketch of the second and third follows the list):

  1. The token is recoverable and stored by an external service provider. A small number of credit card authorization networks support this method. After authorization, the merchant receives a token in place of the card number and holds it for subsequent payments. The payment network keeps the recoverable version of the card number, while the merchant never stores the original card number.
  2. The token is created and stored on local servers. The token is protected by encryption and, if necessary, can be recovered by decryption. In this approach, tokens are used in databases, which reduces the risk of losing sensitive data. This method is often used to protect credit card numbers, Social Security numbers, and similar data.
  3. The token is stored locally and cannot be recovered. Since the original value is not stored, there is no risk of losing sensitive data from the vault. This approach is often used in test and QA environments.
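As a simplified illustration of the second and third approaches, the Python sketch below (using the third-party cryptography package; names are illustrative) contrasts a recoverable, encryption-protected token with a non-recoverable random one:

```python
import secrets
from cryptography.fernet import Fernet  # pip install cryptography

# Approach 2: recoverable token, protected by encryption on local servers.
key = Fernet.generate_key()
f = Fernet(key)

recoverable_token = f.encrypt(b"4111111111111111")
original = f.decrypt(recoverable_token)   # recoverable with the key
assert original == b"4111111111111111"

# Approach 3: non-recoverable token; the original value is never stored.
non_recoverable_token = secrets.token_hex(16)
# There is no way back from non_recoverable_token to the card number,
# which is why this style suits test and QA environments.
```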
Learn more about our tokenization solution in our documentation.

What is the difference between Digitization and Tokenization?

Tokenization and digitization are two key elements in the payment card industry, and they can be combined.

To understand the difference between these two methods, it is worth looking at each of them.

Tokenization involves generating a token linked to specific data; the data is kept in secure storage while the token is used for subsequent credit card transactions within a given system or platform.

Card digitization (for example, of a Visa or Mastercard card), in turn, is a more extensive process, which may include tokenization. Unlike a token, which represents only a piece of credit card data through token generation and mapping, digitization covers the entire process of digitizing a card and its data.

Thus, tokenization is one part of digitization. The key difference between the two is scope: digitization involves the full digitization of an asset or card, while tokenization involves creating a token for a specific piece of sensitive credit card data.

What businesses would benefit the most from tokenization?

Merchants and service providers most often turn to payment tokenization. By implementing tokenization, a business gains several advantages:

  • Trust between client and company. The key argument for tokenization is security. Generating individual tokens ensures correct formatting and transmission of data and greatly reduces the risk of fraud or data leakage during cyberattacks. A reliable, secure service builds a trusting relationship between the business and the customer, promoting trust and a good reputation in the long term.
  • Compliance with PCI. Tokenization ensures regulatory compliance and meets industry standards. By using tokens in payment processing, businesses can reduce the cost and complexity of complying with the PCI DSS payment security standard, helping them avoid unnecessary expenses and fines from financial institutions.
  • Recurring payments. Tokenization offers great benefits for companies that accept recurring payments. Tokens provide convenience and simplify the payment process by turning sensitive data into randomly generated strings that cannot be reversed outside the internal system, allowing businesses to store meaningful card information in secure virtual vaults. Tokenization is therefore a great option for businesses that handle credit card payments regularly: banks, e-commerce retailers, service providers, subscription services, mobile application developers, and more.

Tokenization Process

How does tokenization work?

Tokenization of credit card payments changes how transactions are processed. In a traditional card payment, the card data must first be sent to the payment processor to make a purchase, and the original card information is stored in the POS terminal or the merchant's internal systems. With token integration, the payment process works a little differently.

Tokenization technology sends the card data to the tokenization system, where the sensitive data is replaced with a random string of characters, known as a token. Once the token is generated, it is passed to the POS terminal and payment processor, where the transaction proceeds. Payment tokenization thus ensures a secure payment environment: credit card tokenization replaces sensitive credit card data with a token, while the original information is stored securely in a vault that is itself protected with encryption.

The benefit of this method is that a token has no value outside the payment system. Even in the event of a breach, criminals cannot use the card data, because the token exposes nothing. That’s why tokenization can be considered a great fraud-prevention technology, especially for online purchases.

The scheme of payment tokenization looks like this:

  • At transaction initiation, the original card data is replaced with a random token.
  • The token is sent to the acquiring bank and passed to the card network for authorization.
  • The token is linked to the account number, while the original data is kept in a vault.
  • The bank verifies the card information (such as the balance) and processes the transaction.

Once the transaction is complete, the merchant receives a token for the customer account, which can be used for current and future card payments.
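The following Python sketch (hypothetical names and a stand-in balance check) simulates this flow end to end: the token is created at initiation, passed through authorization, and resolved back to the PAN only inside the tokenization system:

```python
import secrets

vault = {}  # token -> card data, held by the tokenization system

def initiate_transaction(card_number: str) -> str:
    """Step 1: replace the card data with a random token."""
    token = "tok_" + secrets.token_hex(8)
    vault[token] = card_number
    return token

def bank_verifies(pan: str, amount: float) -> bool:
    """Stand-in for the issuing bank's balance / card checks."""
    return amount <= 500.00

def authorize(token: str, amount: float) -> bool:
    """Steps 2-4: the acquirer forwards the token; only the vault
    side resolves it back to the PAN for the bank's checks."""
    pan = vault.get(token)
    if pan is None:
        return False  # unknown token: worthless outside this system
    return bank_verifies(pan, amount)

token = initiate_transaction("4111111111111111")
print(authorize(token, 42.00))  # True; the merchant never saw the PAN
```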

See how we can help you tokenize sensitive data.

How to implement data tokenization

The VGS Platform is built for developers. Here’s a quick link to our comprehensive guide to Getting Started with Tokenization.

What is the difference between tokenization and encryption?

Tokenization and encryption are among the most popular methods of securing information transmitted over the Internet, especially in the online payment industry. Each of these technologies protects data and helps meet regulatory requirements like PCI DSS, ITAR, and GDPR.

Tokenization and encryption both protect data effectively, but they have some differences and are not interchangeable. The choice of technology therefore depends on the circumstances and the company's goals.

To understand the differences between tokenization and encryption, it is worth considering each of these technologies.

Tokenization
Tokenization is a process that replaces sensitive data, such as a credit card number or cardholder information, with a randomly generated string of characters, called a token, which has no value if breached.

The token acts as a reference to the original data but cannot be reversed to reveal it, since no mathematical process links the token to the value it replaces. Thus, the original data cannot be derived from the token.

Tokenization uses a token vault (a database) that stores the relationship between the sensitive data and the token. The data in the token vault is itself often protected by encryption.

The token is used as a stand-in for the original data. During a payment, the token is sent to the vault, where its index retrieves the real value for the authorization process.

A token has no mathematical relationship with the original data. This helps prevent the leakage of valuable data during cyberattacks, since even if stolen, the token has no value.

One example of tokenization is a credit card token that displays the last four digits of the card and masks the rest with asterisks or other symbols. The merchant holds only the credit card token, not the card number itself, which provides better security.
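A minimal sketch of that kind of masked display (illustrative only; the merchant would store the token, not the masked string):

```python
def masked_display(card_number: str) -> str:
    """Show only the last four digits, masking the rest."""
    digits = card_number.replace(" ", "")
    return "*" * (len(digits) - 4) + digits[-4:]

print(masked_display("4111 1111 1111 1111"))  # ************1111
```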

Encryption
Encryption is the process of transforming text into an unreadable form called ciphertext, using an encryption algorithm. An algorithm and a key are likewise used to decrypt the data and return it to its original form. Today, SSL encryption is the most popular way to protect information on the Internet.

Encryption methods are widely used not only for payments but also for protecting data on a PC. Thanks to built-in encryption capabilities, sensitive data can be encrypted to prevent leaks or unauthorized access if a device is stolen. Encryption also protects businesses from the loss of sensitive corporate data.

Encryption and decryption can also use asymmetric keys. In this case there are two keys: a public key and a private key. The public key is used only for encryption; a merchant can use it to encrypt payment data before a transaction, while the processing company needs the private key to decrypt the card data and complete the payment. Asymmetric encryption is also commonly used to validate SSL certificates.
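Here is a short sketch of that public/private-key exchange using Python's cryptography package (an illustration of asymmetric encryption in general, not a specific payment API):

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# The processor generates the key pair and shares only the public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Merchant side: encrypt payment data with the public key.
ciphertext = public_key.encrypt(b"4111111111111111", oaep)

# Processor side: only the private key can decrypt it.
plaintext = private_key.decrypt(ciphertext, oaep)
assert plaintext == b"4111111111111111"
```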

The main challenge of a successful encryption implementation is finding a balance between functionality and encryption strength. Get it wrong and encryption can break application functionality such as sorting and searching. Encryption also changes the data type, which can cause field validation problems. Format-preserving and searchable encryption schemes solve these problems, protecting data without compromising application functionality.

In conclusion, the main differences between tokenization and encryption are:

  • Encryption uses cryptographic methods to transform plaintext into ciphertext. Tokenization generates random tokens to replace data.
  • Encryption works on both structured and unstructured fields. Tokenization works on structured data fields (like a credit card number).
  • Encryption is a great option for working with third parties that hold the encryption key. Tokenization requires direct access to the token vault that maps the values.
  • With encryption, original data can leave internal systems. With tokenization, original data never leaves internal systems, which makes compliance requirements easier to meet.

You can also read about tokenization and encryption at VGS.

What is Payment Tokenization?

Payment tokenization definition

Payment tokenization is a process of one-to-one replacement of sensitive card data with a non-sensitive equivalent, also known as a token.

To do this, tokenization replaces all the sensitive data with a unique, random string of characters, safeguarding the card's primary account number (PAN) and other sensitive cardholder information. Replacing the original data with a token makes credit card transactions more secure and more convenient for the customer.

Tokenization is widely used in the modern payments industry, especially for online transactions and recurring or one-click payments. In addition, using tokens for data protection helps reduce the scope of PCI compliance, which makes tokenization a good strategy for e-commerce services.

How does tokenization work in payments?

With the growing popularity of digital currencies and online payments, data security has become a prevalent concern. To provide a high level of payment security, businesses use various methods, and tokenization is one of them.

Tokens replace cardholder data with a string of characters that is randomly generated or produced by an algorithm. This provides additional protection and minimizes the risk of fraud, since tokens are useless outside the payment system and do not reveal the original data.

To make a purchase, the customer enters their credit card data. A token is then generated that represents all the information needed for card-on-file or recurring transactions. Tokens also enable one-click online purchases and refunds.

The best part of tokenization is that the card data needs to be entered only once; the token can then be used for both current and future transactions. This approach offers more flexibility across the payments landscape, including e-commerce and subscription services.
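A small sketch of that enter-once, charge-many pattern (hypothetical helper names; a real gateway would expose similar calls over an API):

```python
import secrets

vault = {}

def store_card_once(card_number: str) -> str:
    """Card data is entered a single time and exchanged for a token."""
    token = "tok_" + secrets.token_hex(8)
    vault[token] = card_number
    return token

def charge(token: str, amount: float) -> str:
    """Every later charge (recurring, one-click) reuses the same token."""
    assert token in vault, "unknown token"
    return f"charged ${amount:.2f} against {token}"

token = store_card_once("4111111111111111")
print(charge(token, 9.99))   # first subscription cycle
print(charge(token, 9.99))   # next cycle: no card entry needed
```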

How does credit card tokenization work?

The main purpose of credit card tokenization is to protect sensitive data by replacing it with a generated token: a random string of characters. With this method, a merchant can process payments safely across networks without the risk of leaking actual card details.

Tokens make credit card transactions both secure and convenient for the customer: they expose no sensitive data and move safely and smoothly through the payment system.

To better understand a token's purpose, think of it as a map: it does not show the real card data, but it indicates where the bank stores it.

Tokenization of cards includes a few steps:

  • Once sensitive card data is entered into an app or website, it is replaced with a token.
  • The token is sent to the bank and then transmitted to the authorized card networks.
  • The token is mapped to the customer account number, while the original card data is kept in a vault.
  • After the bank completes the transaction, the merchant receives the token back and can use it for the customer's current and future credit card payments.

Dive deeper into credit card tokenization here.

You can learn even more about different card tokenization features at VGS.

How storing credit card tokenization works

With tokenization, sensitive credit card data is replaced one-to-one with randomly generated characters, called a token.

After a token is created, it is sent to the card-issuing bank and the card networks for authorization, while the original card data is stored safely in a token vault that is itself protected with encryption.

The token is mapped to the real card information and can be used only within the merchant’s payment gateway. Moreover, the underlying value is unreadable to anyone, including the merchant.

Thus, credit card tokenization is a great option both for making online payments without the risk of credit card fraud and for storing sensitive credit card information in secure virtual vaults. In some cases, the data can even be stored vaultless.

Credit card tokenization vault

Tokenization of cards means that sensitive data is replaced with a randomly generated token, while the actual card data is stored in a token vault or with a vaulting service.

A data vault is a protected database consisting of a pair of linked tables that store the information. Data can be retrieved from the vault to make a payment; after the transaction completes, the original data returns to secure storage. Tokenized cards use tokens to carry out transactions: to make a payment, a token is mapped to the sensitive card data stored in the vault.
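As a rough model of such a vault, the sketch below uses two linked SQLite tables, one for the card data and one for the tokens that reference it (table and function names are illustrative):

```python
import secrets
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE card_data (id INTEGER PRIMARY KEY, pan TEXT);
    CREATE TABLE tokens    (token TEXT PRIMARY KEY,
                            card_id INTEGER REFERENCES card_data(id));
""")

def vault_store(pan: str) -> str:
    """Insert the PAN in one table and the linked token in the other."""
    cur = conn.execute("INSERT INTO card_data (pan) VALUES (?)", (pan,))
    token = "tok_" + secrets.token_hex(8)
    conn.execute("INSERT INTO tokens VALUES (?, ?)", (token, cur.lastrowid))
    return token

def vault_lookup(token):
    """Join the two linked tables to retrieve the PAN for a payment."""
    row = conn.execute(
        "SELECT pan FROM card_data JOIN tokens ON card_data.id = tokens.card_id"
        " WHERE tokens.token = ?", (token,)).fetchone()
    return row[0] if row else None

t = vault_store("4111111111111111")
print(vault_lookup(t))  # resolved only inside the vault
```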

In addition, securing data elements in a vault reduces the scope of PCI DSS compliance and makes de-identification of PII, PHI, and personal data possible.

The key benefits of token vaults are:

  • Tokenization provides better security for sensitive credit card data and cardholder information. Vaults protect the original data and reduce the risk of breaches and credit card fraud.
  • A token vault helps meet PCI DSS standards. Under PCI DSS, cardholder data stored in internal systems remains in scope even when encrypted, so a secure vault is a great way to meet payment industry requirements and avoid fines from financial regulators.

What is Tokenization in Relation to PCI Compliance?

PCI Tokenization

PCI, the Payment Card Industry, sets standards for storing and transmitting credit card data securely. One of the main tasks for any business that accepts payments is therefore to protect sensitive cardholder data; failing to do so can result in fines from regulators.

To uphold security standards and avoid problems like data leakage or non-compliance with PCI requirements, businesses use credit card tokenization. Because the credit card data is replaced with a randomly generated token and the original data is kept securely in a database, this approach satisfies PCI data protection requirements and reduces compliance scope.

PCI DSS Tokenization

The Payment Card Industry Data Security Standard (PCI DSS) aims to protect customers’ sensitive data and ensure secure card payments. To meet PCI compliance scope, businesses use tokenization tools.

As tokenization systems transmit cardholder data, they must be configured and maintained in a PCI DSS-compliant manner.

Requirements of payment industry security standards include:

  • The tokenization system provides the PAN only within the merchant’s defined cardholder data environment (CDE).
  • Tokenization elements are located in secure internal systems and isolated from any out-of-scope or untrusted networks.
  • Tokenization solutions implement strong security protocols to protect sensitive cardholder data at rest and during transmission over open networks.
  • Tokenization solutions implement security controls and authentication measures according to PCI DSS requirements.
  • Tokenization systems meet the standards and are protected from vulnerabilities and security breaches.
  • Tokenization solutions include a mechanism for secure data deletion requests, as required by data-retention policies.

What is the Tokenization process for PCI?

Tokenization is a process of replacing sensitive card data with a random string of characters, called a token. By using this method, a business can meet PCI DSS requirements while keeping card payments and data storage secure, since tokens do not expose the original data even in a breach.

The tokenization process can yield two types of tokens: single-use and multi-use. A single-use token is typically used for one specific transaction. A multi-use token maps a particular PAN value to the same token value every time within the tokenization system. Depending on their needs and environments, companies can choose single-use tokens, multi-use tokens, or a combination of both.
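The difference between the two token types can be sketched in a few lines of Python (illustrative only):

```python
import secrets

multi_use_map = {}  # PAN -> token: the same PAN always yields the same token

def single_use_token(pan: str) -> str:
    """A fresh token per transaction; never repeats for the same PAN."""
    return "tok_" + secrets.token_hex(8)

def multi_use_token(pan: str) -> str:
    """One stable token per PAN within the tokenization system."""
    if pan not in multi_use_map:
        multi_use_map[pan] = "tok_" + secrets.token_hex(8)
    return multi_use_map[pan]

pan = "4111111111111111"
print(single_use_token(pan) == single_use_token(pan))  # False
print(multi_use_token(pan) == multi_use_token(pan))    # True
```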

The main reasons for implementing tokenization to meet PCI DSS requirements are to reduce the amount of sensitive data within the CDE (cardholder data environment) and to minimize the need to safeguard stored data.

A PCI-compliant tokenization system provides secure generation, mapping, storage, and management of the tokens it produces. Under PCI DSS guidance, tokens may be generated using mathematically reversible cryptographic functions, one-way non-reversible cryptographic functions such as hashing, or index functions and randomly generated values.
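Sticking to illustrative Python, the sketch below shows two of those generation methods, a one-way keyed hash and a randomly generated index token (the reversible cryptographic style resembles the encrypted token shown earlier):

```python
import hashlib
import hmac
import secrets

SECRET = secrets.token_bytes(32)  # key material held by the tokenization system

def hashed_token(pan: str) -> str:
    """One-way, non-reversible: an HMAC over the PAN with a secret key."""
    return hmac.new(SECRET, pan.encode(), hashlib.sha256).hexdigest()

def indexed_token(pan: str, index: dict) -> str:
    """Index function: a randomly generated value looked up via a table."""
    token = secrets.token_hex(16)
    index[token] = pan
    return token

index = {}
print(hashed_token("4111111111111111"))          # cannot be reversed
print(indexed_token("4111111111111111", index))  # reversible only via the index
```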

Tokenization provides secure, centralized data storage and reduces the number of places data appears in an environment. Integrating tokens into card payments lets merchants reduce or even eliminate the need to retain PANs in their environments once the initial transaction has been processed. Tokenization solutions therefore help reduce the cost of PCI DSS compliance by minimizing the number of system components that must be protected to payment industry standards.

PCI Tokenization vendors

Any company that processes credit card payments is required to meet PCI DSS standards. To stay compliant with PCI scope, businesses are advised to use reliable tokenization providers.

Vendors offer tokenization services, which can reduce the cost of compliance with PCI DSS requirements, allowing merchants to avoid fines from industry regulators.

Tokenization vendors provide services to transmit, store, and process sensitive cardholder data (CHD) and to secure the cardholder data environment (CDE).

A business should be managed as a PCI service provider for a merchant if it is part of the merchant's transaction flow or if it might affect the merchant's PCI security controls.

PCI Tokenization scope

According to PCI DSS, all system components affiliated with the CDE (cardholder data environment) fall within the scope of payment industry requirements. Meeting every security requirement can be a complex task, which is why many companies adopt tokenization solutions to reduce their PCI DSS scope. Tokenization provides secure cardholder data storage, makes payments easy and safe, and makes the PCI compliance process accessible to businesses of all types that accept card payments.

The main PCI DSS scoping principles include:

  • All tokenization elements are considered part of the CDE, since they store, transmit, and process data.
  • To protect the original data, tokenization systems generate a token to replace the PAN and other sensitive cardholder information.
  • All tokenization system components connected to the CDE fall within the scope of PCI DSS compliance. Even components that do not perform tokenization, if located within or connected to the CDE, must be considered in scope.

PCI Tokenization vs Encryption

Tokenization and encryption are two popular methods that provide access control and prevent unauthorized access to sensitive cardholder data.

Tokenization
With payment tokenization, card data is replaced with non-sensitive units called tokens. Tokens are mapped to the original data values but do not expose them. While tokens transmit payment information, the sensitive card data stays in a secure vault.

Encryption
With encryption, the original data is turned into ciphertext using an encryption key. A decryption key, accessible only to authorized parties, converts the ciphertext back to the real data. The main purpose of encryption is to devalue sensitive card data, so that even if compromised, the original data remains unreadable and unusable to thieves.

By using tokenization and encryption together, companies can protect sensitive customer card data and significantly reduce the cost of PCI DSS compliance: encryption renders the real card information unreadable, while tokenization replaces it with a random token and stores the original data securely in a vault.

PII Tokenization

Personally Identifiable Information, or PII, is information that can be used to distinguish or trace an individual’s identity. Because PII can be linked to a specific person, it must be protected and stored securely to meet regulatory requirements like PCI DSS, GDPR, and CCPA.

PII includes the cardholder's name, bank account numbers, Social Security numbers, postal and email addresses, unique personal identifiers (like fingerprints), driver's license numbers, phone numbers, and more. Since PII touches many types of personal information, it is necessary to provide a secure environment for handling it.

With tokenization, businesses can both protect the original PII by replacing it with tokens and reduce the scope of industry requirements.
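As a simple illustration, the sketch below tokenizes the PII fields of a customer record while leaving non-PII fields untouched (field names and policy are hypothetical):

```python
import secrets

PII_FIELDS = {"name", "ssn", "email", "phone"}
vault = {}

def tokenize_record(record: dict) -> dict:
    """Replace each PII field with a token; leave other fields intact."""
    safe = {}
    for field, value in record.items():
        if field in PII_FIELDS:
            token = "tok_" + secrets.token_hex(8)
            vault[token] = value
            safe[field] = token
        else:
            safe[field] = value
    return safe

customer = {"name": "A. Customer", "ssn": "078-05-1120",
            "email": "a@example.com", "plan": "premium"}
print(tokenize_record(customer))  # PII fields now carry tokens only
```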

See how to get started with PII tokenization.

Tokenization PII PHI

To meet regulatory standards for personal data security, businesses must provide an environment where sensitive customer data is stored safely and remains uncompromised even in the event of a security breach. Data tokenization serves this purpose.

Personally Identifiable Information (PII) covers a wide spectrum of an individual's data, which can be used alone or in combination with other information to distinguish or trace that individual. PII includes card numbers, cardholder data, SSNs, addresses, and so on. PII may also include non-sensitive information like race, ethnicity, gender, and date of birth.

Protected Health Information (PHI) is an individual's identifiable health information related to healthcare.

Both PII and PHI can reveal sensitive personal data and in some cases can be used for fraud, so regulatory standards require a high level of security for these types of information. Replacing PII and PHI with randomly generated tokens prevents data leakage and meets industry requirements: the tokens map to the original data without exposing it, while the real PII and PHI are securely stored in a vault.