What is Tokenization?
With the explosion of data being collected all over the world, there is an increasing need to analyze data while maintaining privacy and security, and also to securely return to the original data for post-analysis follow-up. A technique called "Tokenization" can be used to safeguard sensitive data in these situations.
There are a number of scenarios in the financial, healthcare, and other industries where Tokenization applies:
Financial Industry Use Case - Analysis of transactions for anti-money laundering (AML) is a key concern for many financial institutions. However, personal information often cannot be taken across national boundaries. DMsuite's Tokenization process allows an organization to quickly tokenize the data in the source country, provision it to a central analysis location, and then re-identify records deemed suspect back in the source country.
Healthcare Use Case - Analysis of patient data is opening the door to new cures as well as a greater understanding of diseases. DMsuite's Tokenization process allows an organization to quickly tokenize data from each source organization while maintaining referential integrity across multiple source organizations, without any sensitive data leaving the sources.
DMsuite's Tokenization process quickly and easily creates a secure, non-sensitive "token" based on one or more sensitive data elements, without requiring any coding. The token maps back to the original data through a re-identification process available only in DMsuite™. Setup takes minutes, and in addition to Tokenization, DMsuite provides a full range of Sensitive Data Discovery, Provisioning, and Data Masking (aka De-Identification) functionality.
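To make the idea concrete, here is a minimal sketch of vault-based tokenization in Python. The `TokenVault` class and its methods are hypothetical illustrations, not DMsuite's actual implementation (which is not public): a random token replaces each sensitive value, the mapping stays in a vault held by the data owner, and only the vault holder can re-identify a token. Reusing the same token for repeated values is what preserves referential integrity across tokenized datasets.

```python
import secrets

class TokenVault:
    """Illustrative vault-based tokenization.

    Sensitive values are replaced by random tokens; the token-to-value
    mapping is kept only inside the vault (e.g. in the source country),
    so tokenized data can be shared for analysis and suspect records
    re-identified later by the vault holder.
    """

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the same token for repeated values so joins and
        # referential integrity survive tokenization.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # The token is random, not derived from the value, so it
        # cannot be reversed without access to the vault.
        token = "TOK-" + secrets.token_hex(8)
        self._value_to_token[value] = token
        self._token_to_value[token] = value
        return token

    def re_identify(self, token: str) -> str:
        # Only a party holding the vault can map a token back.
        return self._token_to_value[token]

vault = TokenVault()
t1 = vault.tokenize("4111-1111-1111-1111")
t2 = vault.tokenize("4111-1111-1111-1111")
assert t1 == t2  # same input yields the same token (consistency)
assert vault.re_identify(t1) == "4111-1111-1111-1111"
```

In practice a production system would also persist the vault securely and control who may call the re-identification step; this sketch only shows the core token/mapping mechanism.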