The need for pervasive data access has dramatically increased over the last 18 months. As knowledge workers have shifted to remote work during the pandemic, their need to remotely access business applications and data has accelerated their companies' move to the cloud.
These major shifts in the way companies do business mean data flows everywhere, and that means data protection must be everywhere. Companies that allow users to log into cloud services and access unprotected data no longer face the question of whether their data will be stolen, but when—and when may already be in the past.
To support the need for greater access to data while protecting it no matter where it resides—on remote worker systems, in cloud storage, or in third-party cloud applications—companies need to adopt modern format-preserving data protection technologies.
Format-preserving data protection allows data to flow in its protected form, wherever it needs to, without constant re-protection. Its deterministic nature means it preserves referential integrity, permitting analytics to be run while the data is protected, and because the secured data looks and feels like the cleartext data, it can be moved and used everywhere.
Here are five items to consider regarding data protection in the age of remote work, pervasive cloud, and frequent breaches.
1. Assume unprotected data is vulnerable, everywhere and always
The growth of bring-your-own-device policies and SaaS and other cloud offerings means companies can no longer define a data security perimeter—and the rise of persistent threat actors and insider threats belies the idea that you can trust people or even software internal to your organization.
Rather than keeping data "switched on" and trying to protect it wherever or whenever it may be at risk, assume that your data, no matter where it is, is always vulnerable and will eventually be compromised. Keep it "switched off" by default, using deterministic tokenization technologies such as format-preserving encryption (FPE) that require no changes to data stores, systems, or applications. Grant temporary, and perhaps only partial, decryption or detokenization solely for specific use cases where the actual data is absolutely needed.
For example, a customer service representative might only need access to the last four digits of a US Social Security number in the clear to verify a customer's identity. In this case, providing partial access to only those digits ensures the full SSN remains undisclosed. In addition to enhancing security, this can build customer trust and confidence regarding protection of customers' information.
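The partial-access pattern above can be sketched in a few lines. This is a minimal illustration only: it deterministically replaces the first five SSN digits with pseudorandom digits derived via HMAC while leaving the last four in the clear, preserving the ###-##-#### format. The key and function names are hypothetical; a real deployment would use a standardized FPE mode such as NIST FF1 (SP 800-38G), not this sketch.

```python
import hmac
import hashlib

KEY = b"demo-key-not-for-production"  # hypothetical key, for illustration only

def tokenize_ssn_partial(ssn: str) -> str:
    """Deterministically tokenize the first five digits of a US SSN,
    leaving the last four in the clear for identity verification.
    Sketch only: the digit mapping is biased and not reversible;
    real systems would use NIST FF1 format-preserving encryption."""
    digits = ssn.replace("-", "")
    assert len(digits) == 9 and digits.isdigit(), "expected ###-##-####"
    head, tail = digits[:5], digits[5:]
    mac = hmac.new(KEY, head.encode(), hashlib.sha256).digest()
    token_head = "".join(str(b % 10) for b in mac[:5])  # map bytes to digits
    return f"{token_head[:3]}-{token_head[3:5]}-{tail}"

# Deterministic: the same input always yields the same token,
# and the token looks and feels like a real SSN.
print(tokenize_ssn_partial("123-45-6789"))
print(tokenize_ssn_partial("123-45-6789") == tokenize_ssn_partial("123-45-6789"))  # True
```

Because the output keeps the SSN's shape, the representative's screen, the database column, and any downstream application can handle the token without modification.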
2. Protect everything, not just your crown jewels
Attempting to identify every location where your most precious data is stored, secure only those locations, and grant access to just a limited number of internal people is doing it wrong. You will fail, if you haven't already.
Keeping data in the clear, for example, in on-premises or cloud data warehouses or data lakes and then dynamically masking it when used by unprivileged users is also doing it wrong. The data is fundamentally unprotected at rest and subject to breach. And data protection and privacy regulators will not be sympathetic to arguments about why this may be a better option for the business. Turn the model on its head: Protect the data as the default, and dynamically unmask the data only as privileged users request access.
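The inverted model can be sketched as follows. This is a hypothetical, vault-style design, not a production implementation: records hold only tokens, and cleartext is released dynamically and only to callers who hold the privilege for that field. All names here (`TokenVault`, `read_field`) are illustrative.

```python
import hashlib

class TokenVault:
    """Hypothetical vault-backed tokenizer: data is stored only in
    tokenized form; the vault alone can map a token back to cleartext."""

    def __init__(self):
        self._clear_by_token = {}

    def tokenize(self, value: str) -> str:
        # Deterministic: the same cleartext always maps to the same token.
        token = "tok_" + hashlib.sha256(value.encode()).hexdigest()[:12]
        self._clear_by_token[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._clear_by_token[token]

VAULT = TokenVault()

def read_field(record: dict, field: str, user_privileges: set) -> str:
    """Protected-by-default read: unmask dynamically only on privilege."""
    token = record[field]
    if field in user_privileges:
        return VAULT.detokenize(token)  # privileged user sees cleartext
    return token                        # everyone else sees only the token

# Data lands in the warehouse already tokenized:
record = {"email": VAULT.tokenize("ana@example.com")}

print(read_field(record, "email", user_privileges={"email"}))  # cleartext
print(read_field(record, "email", user_privileges=set()))      # token only
```

Note the design choice: the default path returns the token, so an unprivileged query, a misconfigured dashboard, or a breached replica exposes protected values rather than cleartext.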
Rather than trying to build walls around your crown jewels, protect as much data as possible—more than just the few elements that are obviously sensitive. In addition to avoiding the fruitless focus on trying to define perimeters and controlling where data is used, this minimizes the threat of data re-identification accomplished through analysis of associated, unprotected elements.
3. Avoid needing to unprotect data just to move it
Many companies try to rely only on the security options offered by cloud service providers. But if they use multiple cloud service providers—each offering some security services, but only in their own environments and without full coverage across service sets—companies cannot guarantee that their data is protected, either from the service providers themselves or as it flows between cloud services and applications. A better strategy is to enable data to remain persistently secure as it moves. Application- and platform-specific, environment-based security cannot accomplish that goal.
A global, format-preserving data protection approach eliminates the attack surface created by unprotecting and re-protecting the data every time it crosses a service, organization, or vendor boundary—an increasingly common scenario as organizations adopt multi-cloud strategies and use more cloud services.
4. Preserve the ability to analyze data in its protected form
Gaining protection at the expense of usability is a poor tradeoff for any data protection technology. A data security infrastructure needs to allow selective use of unprotected data as required by business analysts, data scientists, and other users who have privileges to see that data in the clear.
In addition, companies need protection that still allows analytics to be run on the protected data. This requires information to be tokenized in a deterministic way, preserving the internal references of the data. Returning to the example of SSNs—often used as primary database keys for personal information and as foreign keys elsewhere—databases need the reference to be consistent to tie specific personal records together for reporting and analytics.
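The referential-integrity point can be demonstrated directly. The sketch below uses a deterministic HMAC-based token as a stand-in for format-preserving protection (key and table contents are invented for illustration): because identical inputs always produce identical tokens, the SSN column still works as a join key with no detokenization required.

```python
import hmac
import hashlib

KEY = b"demo-key"  # hypothetical key, for illustration only

def tokenize(value: str) -> str:
    # Deterministic: identical inputs always yield identical tokens,
    # so tokenized columns preserve referential integrity across tables.
    return hmac.new(KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

# The SSN is a primary key in one table and a foreign key in another.
customers = [{"ssn": tokenize("123-45-6789"), "name": "Ana"}]
orders    = [{"ssn": tokenize("123-45-6789"), "total": 42.50}]

# Analytics can join on the protected column without ever seeing cleartext.
by_ssn = {c["ssn"]: c for c in customers}
joined = [{**by_ssn[o["ssn"]], **o} for o in orders]
print(joined[0]["name"], joined[0]["total"])  # Ana 42.5
```

The same property holds at warehouse scale: a SQL join on two tokenized key columns matches exactly the rows the cleartext join would have matched.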
5. Ensure your provider's protection is independently validated
Do not blindly trust vendors' claims that their tokenization, encryption, or data-masking process is secure. An attacker with access to a great deal of ciphertext could be a threat to that data if the protection offered is weak. Standards-based protection technologies offer assurance that the technology is solid through rigorous cryptanalysis, so your vendors' data protection should be standards-based when possible. Where no standards exist, independent expert assessment is a minimum qualification. Ignoring this requirement is to naively place your trust in security vendors that may be peddling snake oil incapable of meeting real-world challenges.
Plan for your next fail
Revamping your data security approach should be part of a transition to the concept of cyber resilience: not only working to secure data and systems, but also building in the processes, technology, and mechanisms for when those security measures fail.
We all know that data is in constant movement across hybrid IT, shared with third parties, and transferred internationally. Your company needs to pervasively protect sensitive data wherever it resides, while provably supporting privacy and security regulation requirements and maintaining data usability. It is a challenge that can be met—and format-preserving data protection can help.
Keep learning
Get up to speed on unstructured data security with TechBeacon's Guide. Plus: Get the Forrester Wave for Unstructured Data Security Platforms, Q2 2021.
Join this discussion about how to break the Groundhog Day repetition with better data management capabilities.
Learn how to accelerate your analytics securely into the cloud in this webinar.
Find out more about cloud security and privacy, and selecting the right encryption and key management in TechBeacon's Guide.
Learn to appreciate the art of data protection and go behind the privacy shield in this webinar.
Dive into the new laws with TechBeacon's guide to GDPR and CCPA.