Apr 18, 2023 · 4m read

Tokenize your sensitive data

According to the Cambridge Dictionary, to tokenize data is "to replace a private piece of data with a token (= a different piece of data that represents the first one), in order to prevent private information being seen by someone who is not allowed to do so". Today, many companies, especially in the financial and healthcare sectors, tokenize their data as a key strategy for meeting cybersecurity and data privacy requirements (GDPR, CCPA, HIPAA, and LGPD). But why not use encryption? Tokenization is more commonly used than encryption to protect sensitive data for the following reasons:

  1. Better performance: encrypting and decrypting data on the fly during intensive operational processing degrades performance and requires more processing power.
  2. Tests: you can tokenize a production database, copy it to a test database, and keep the test data realistic enough for meaningful unit and functional tests.
  3. Better security: if a hacker cracks or steals the secret key, all the encrypted data becomes available, because encryption is a reversible process. Tokenization is not reversible. If you need to recover the original data from tokenized data, you must maintain a secure, separate database that links the original and tokenized values.
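The irreversibility point above can be illustrated with a minimal sketch (illustrative only, not the article's actual implementation): a token is just random data, and the only way back to the original value is a lookup table kept in a separate, secured store.

```python
import secrets

# Stand-in for the separate, secured token database
token_vault = {}

def tokenize(value: str) -> str:
    token = secrets.token_hex(8)   # random token, no mathematical link to the value
    token_vault[token] = value     # the only way back to the original
    return token

def detokenize(token: str) -> str:
    return token_vault[token]      # requires access to the vault

card = "4450 3456 1212 0050"
t = tokenize(card)
assert t != card and detokenize(t) == card
```

Unlike a cracked encryption key, leaking a token reveals nothing unless the vault itself is also compromised.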

Tokenization architecture

The tokenization architecture requires two databases: the App DB, which stores tokenized data along with the rest of the business data, and a Token Database, which stores the original and tokenized values, so your app can retrieve the original values to show to the user when needed. There is also a Tokenizator REST API that tokenizes the sensitive data, stores it in the token database, and returns a ticket. The business application stores the ticket, the tokenized data, and the other data in the app database. See the architecture diagram:
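The flow between the two databases can be sketched as follows (a simplified model using in-memory dicts in place of the real databases; names and the masking rule are illustrative, not the API's code):

```python
import uuid

token_db = {}  # Token Database: ticket -> (original, tokenized)
app_db = {}    # App DB: business record id -> (ticket, tokenized value)

def tokenizator_tokenize(original: str) -> dict:
    # Mask the middle of the value (similar to the STARS method shown later)
    tokenized = original[:4] + " **** **** " + original[-4:]
    ticket = str(uuid.uuid4())
    token_db[ticket] = (original, tokenized)          # kept only in the token DB
    return {"ticket": ticket, "tokenizedValueString": tokenized}

def tokenizator_get_original(ticket: str) -> str:
    return token_db[ticket][0]

# The business application stores only the ticket and the tokenized value:
resp = tokenizator_tokenize("4450 3456 1212 0050")
app_db["customer-1"] = (resp["ticket"], resp["tokenizedValueString"])
assert tokenizator_get_original(app_db["customer-1"][0]) == "4450 3456 1212 0050"
```

The key design point is that the App DB never holds the original sensitive value, only the ticket and the tokenized form.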


Tokenizator Application

See how it works in the Tokenization application:

This application is a REST API that can tokenize:

  • Any value to stars. Example: credit card 4450 3456 1212 0050 to 4450 **** **** 0050.
  • Any real IP address to a fake IP address.
  • Any person data to fake person. Example: Yuri Gomes with address Brasilia, Brazil to Robert Plant with address London, UK.
  • Any number value to a fake number value. Example: 300.00 to 320.00.
  • Any credit card data to a fake credit card number. Example: 4450 3456 1212 0050 to 4250 2256 4512 5050.
  • Any value to a hash value. Example: System Architect to dfgdgasdrrrdd123.
  • Any value matching a regex expression to a fake value matching the same pattern. Example: EI-54105-tjfdk to AI-44102-ghdfg using the regex rule [A-Z]{2}-\d{5}-[a-z]{5}.
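Two of the methods above can be sketched in a few lines (illustrative implementations, not the API's actual code): masking a value with stars, and generating a random fake value that matches the regex rule [A-Z]{2}-\d{5}-[a-z]{5}.

```python
import random
import re
import string

def stars(value: str) -> str:
    # STARS: keep the first and last groups, mask each middle group,
    # e.g. "4450 3456 1212 0050" -> "4450 **** **** 0050"
    parts = value.split(" ")
    return " ".join([parts[0]] + ["*" * len(p) for p in parts[1:-1]] + [parts[-1]])

def fake_from_pattern() -> str:
    # REGEX: emit a random value matching [A-Z]{2}-\d{5}-[a-z]{5}
    return "-".join([
        "".join(random.choices(string.ascii_uppercase, k=2)),
        "".join(random.choices(string.digits, k=5)),
        "".join(random.choices(string.ascii_lowercase, k=5)),
    ])

print(stars("4450 3456 1212 0050"))  # 4450 **** **** 0050
assert re.fullmatch(r"[A-Z]{2}-\d{5}-[a-z]{5}", fake_from_pattern())
```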

If you want any other option, open an issue on the GitHub project.

To tokenize values and retrieve the original values later, follow these steps:

  1. Open your Postman or consume this API from your application.
  2. Create a request to tokenize using the STARS, PERSON, NUMBER, CREDITCARD, HASH, IPADDRESS, and REGEX methods with this sensitive data sample:
    • Method: POST
    • URL: http://localhost:8080/token/tokenize
    • Body (JSON):
            {
                "settings": [
                    { "originalValueString": "Yuri Marx Pereira Gomes" },
                    { "originalValueString": "System Architect" }
                ]
            }
    • See the results. You get a tokenized value (tokenizedValueString) to store in your local database.
  3. Copy the ticket from the response (store it in your local database along with the tokenized value).
  4. Now, with the ticket, you can get the original value. Create a request to get the original value using the ticket:
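The tokenize request in step 2 can also be built from a script instead of Postman. In this hedged sketch, the endpoint URL comes from the article, but the exact body schema is an assumption based on the sample above; the actual call is commented out since it requires the API running locally.

```python
import json
from urllib import request

# Assumed request body shape (field name "originalValueString" is from the sample)
body = {
    "settings": [
        {"originalValueString": "Yuri Marx Pereira Gomes"},
        {"originalValueString": "System Architect"},
    ]
}

req = request.Request(
    "http://localhost:8080/token/tokenize",   # URL from the article
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# with request.urlopen(req) as resp:          # requires the API running locally
#     print(json.loads(resp.read()))
```

Your application would then persist the returned ticket and tokenizedValueString, exactly as in steps 2 and 3.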

All generated tokens are stored in InterSystems IRIS Cloud SQL, allowing you to retrieve your original values with performance and confidence.

Enjoy it!
