The Business Case for Tokenization
TOKENIZATION IS A NEW TECHNOLOGY that companies are using to reduce the risk of losing sensitive data to identity thieves. This paper discusses how you can make practical use of tokenization to reduce your risk, and identifies specific target applications in your organization that you should consider for an implementation of tokenization.
Tokenization is a technology that helps reduce the chance of losing sensitive data - credit card numbers, social security numbers, banking information, and other types of PII. Tokenization accomplishes this by replacing a real value with a made-up value that has the same characteristics. The made-up value, or "token", has no relationship to the original person and thus has no value if it is lost to data thieves. As long as a token cannot be used to recover the original value, it works well to protect sensitive data.
For more information about how tokenization works, see the White Paper Tokenization: A Cost-Effective and Easy Path to Compliance and Data Protection.
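As a rough illustration of the vault approach described above, the sketch below replaces a card number with a random token of the same length and format, and keeps the real value in a lookup table. The function names and the in-memory dictionary are assumptions for illustration, not any vendor's API; a real product would store the mapping in a secured, access-controlled database.

```python
import secrets

# Hypothetical in-memory token vault mapping token -> original value.
_vault = {}

def tokenize_card(pan: str) -> str:
    """Replace a card number with a random token of the same length and format."""
    token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
    _vault[token] = pan  # only the vault can map the token back to the real value
    return token

def detokenize(token: str) -> str:
    """Recover the original value; restricted to authorized callers."""
    return _vault[token]

token = tokenize_card("4111111111111111")
```

Because the token is drawn at random rather than computed from the card number, it has no mathematical relationship to the original value: without access to the vault, the token is useless to a thief.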
Tokenization in Development and QA Environments
Tokenization is an excellent method of providing developers and testers with data that meets their requirements for data format and consistency, without exposing real information to loss. Real values are replaced with tokens before being moved to a development system, and the relationships between databases are maintained. Unlike encrypted values, tokens maintain the data types and lengths required by the database applications. For example, a real credit card number might be replaced with the token 7132498712980140. The token has the same length and characteristics as the original value, and it is the same in every table. By tokenizing development and QA data you remove the risk of loss from these systems, and remove suspicion from your development and QA teams in the event of a data loss.
The work of the IT department includes many activities:
- Enhancing applications for new business requirements
- Upgrading vendor applications for new versions
- Applying security updates
- Analyzing problems reported by users and customers
- Fixing problems in software and data
Developers and QA teams need to maintain separate copies of production data so that they can do their work without corrupting or damaging production information, or exposing it to loss. These copies must have the exact characteristics of the production data, or the teams will not be able to properly test and analyze their systems. Tokenization is the right technology choice to help your organization meet these goals without exposing sensitive data to loss.
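The consistency requirement above (the same production value must become the same token in every copied table, so that joins between tables still work) can be sketched with a forward map. All names here are hypothetical, and the in-memory dictionary stands in for a real tokenization service:

```python
import secrets

# Hypothetical forward map: the same production value always yields the
# same token, which is what keeps joins between copied tables intact.
_token_for = {}

def consistent_token(value: str) -> str:
    """Return the same same-format token every time for a given value."""
    if value not in _token_for:
        _token_for[value] = "".join(secrets.choice("0123456789")
                                    for _ in range(len(value)))
    return _token_for[value]

# Two "tables" copied to a QA system that join on the card number:
orders   = [{"order_id": 1, "card": "4111111111111111"}]
disputes = [{"dispute_id": 9, "card": "4111111111111111"}]

for row in orders + disputes:
    row["card"] = consistent_token(row["card"])
```

After tokenization, the join key still matches across both tables, but neither table holds a real card number.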
Tokenization for Historical Data
In many companies, sensitive data is stored in production databases where it is actually not needed. For example, we tend to keep historical information so that we can analyze trends and understand our business better. Historical databases may contain sensitive information such as first and last names, credit card numbers, health insurance numbers, and so forth. The real values of these fields may not be important, but they have typically been copied to the database simply for the sake of convenience.
Tokenizing sensitive data, in this case, provides a real reduction in the risk of loss. In many cases it may take an entire server or database application out of scope for compliance regulations. At one large US company, the use of tokenization removed over 80 percent of the servers and business applications from compliance review. This reduced the risk of data loss, and it greatly reduced the cost of compliance audits.
Tokenization for Customer Service
The loss of sensitive information from in-house and outsourced customer service organizations is a growing fact of life for many companies. Sometimes data loss is accidental, and sometimes data loss is due to criminal activity by insiders. In either case, the cost to a company can be very high. It is estimated that each lost record costs an average of $200. This can quickly add up to millions of dollars in costs for a company experiencing even a moderately large loss.
Tokenization can reduce risk in the customer service department by removing sensitive data from customer service databases. For outsourced operations, you should tokenize data before sending it to the outside service. A customer service worker can still accept real information on the phone from an end customer, but there is no need to store the actual information in a database that can be stolen. Tokenization services associate real information with tokenized information for data retrieval.
Using tokenization in a customer service environment can't completely remove the risk of data loss, but it can dramatically reduce the amount of data at risk and help you identify potential problems.
Tokenization for Outside Services
Many companies send data to outside services for analysis and reporting. For example, many retail companies send their Point-Of-Sale transaction information to analytics service providers for trend and business analysis. The service provider identifies trends, spots potential problems with supply chains, and helps evaluate the effectiveness of promotions. In some cases, service providers consolidate information from a large number of companies to provide global trends and analysis.
Unfortunately, sending sensitive data to outside companies exposes the data to loss. The data might be lost in transit to the service provider, or it may be lost from the service provider's systems. In either case, your company will be responsible for the cost of notification and remediation.
You can avoid the risk of data loss by replacing the sensitive data (names, credit card numbers, addresses, etc.) with tokens before sending the data to the service provider. Tokenized data will be meaningless if lost, but will serve the needs of the outside service provider. By tokenizing the data before it leaves your IT systems you eliminate the threat of data loss.
Tokenization can also help in the medical industry by removing private patient information from records before they are sent to outside services or to state agencies for reporting.
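Tokenizing a record before export can be sketched as a simple field scrub. The field names and helper below are hypothetical; here the token format does not need to match the original, since the outside service only needs a consistent placeholder for its analysis:

```python
import secrets

# Assumed schema: which fields in an export record are sensitive.
SENSITIVE_FIELDS = {"name", "card_number", "address"}

def export_safe(record: dict, vault: dict) -> dict:
    """Return a copy of the record with sensitive fields tokenized."""
    safe = dict(record)
    for field in SENSITIVE_FIELDS & record.keys():
        token = secrets.token_hex(8)
        vault[token] = record[field]   # kept in-house, never exported
        safe[field] = token
    return safe

vault = {}
sale = {"store_id": 42, "amount": 19.99,
        "name": "Pat Example", "card_number": "4111111111111111"}
safe = export_safe(sale, vault)
```

The business fields needed for trend analysis (store, amount) pass through untouched; only the identifying fields are replaced before the record leaves your systems.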
Tokenization for Business Intelligence and Query
Many IT departments help their business users analyze data by providing them with business intelligence (BI), query reporting tools, and databases of historical information. These tools and databases have empowered end users to create their own reports, analyze business trends, and take more responsibility for the business. This practice has decreased workloads and increased efficiency in IT departments.
Unfortunately, these tools and databases open a new point of loss for sensitive information. A database with years of historical information about customers, suppliers, or employees is a high value target for data thieves. Criminals aggregate this type of information to provide a complete profile of an individual, making it easier to steal their identity.
You can use tokenization to remove the threat of data loss from BI and query environments. Replacing names, addresses, and social security numbers with tokens makes the BI database unusable for identity theft, while maintaining the relational integrity of the data. Tokenizing business intelligence data is an easy win to reduce your risk of exposure.
Patrick Townsend Security Solutions
We know that data gets out, and that it can and routinely does fall into the wrong hands. When this happens, our solutions for encryption, key management, and system logging ensure that your enterprise is compliant with regulations and that your sensitive data is protected.
Our data security solutions work on a variety of server platforms including IBM i, IBM z, Windows, Linux, and UNIX. Many of these solutions are currently used by leaders in retail, health care, banking, and government.
You can contact Patrick Townsend Security Solutions for an initial consultation at the following numbers:
Phone: (800) 357-1019 or (360) 357-8971
International: +1 360 357 8971
A fully functional free trial is available for all Alliance products. You can evaluate Alliance capabilities on your own server systems.