Difficult as it may be to believe, there was a time when data was stored locally, if at all. But with data traversing more domains — from public cloud storage to on-premises data centers to third-party services — how should organizations think about managing their data?
Cloud computing, an increase in remote work, and a surge in AI usage have chipped away at the compliance and security measures organizations had in place.
Against this backdrop, here are three things organizations need to know in order to manage their data: the who, what, and where of data. Knowing these also lays the groundwork for securing data and avoiding hefty compliance fines.
1. Know who accesses the data
Customer data must be read and processed, whether manually or by automated systems, for the organization to provide services. But for the sake of preventing breaches and maintaining compliance, organizations must have strict rules around who can view and access data. For instance, to process payments, an organization may require customer credit card numbers. But very few internal systems or users should have access to those credit card numbers.
The challenge is that organizations must secure data while still allowing authorized users to access it. And they must do so in a rapidly evolving digital environment, in which internal users can access systems from anywhere on almost any device. They also must track who accessed data, and when, for auditing purposes.
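To make this concrete, here is a minimal sketch in Python of what a role-based access check with built-in audit logging might look like. The roles, permissions, and masking behavior are hypothetical, not a prescription for any particular system:

```python
import logging
from datetime import datetime, timezone

# Hypothetical role-to-permission mapping: only the payments service
# may read raw card numbers; support staff see masked values.
PERMISSIONS = {
    "payments-service": {"credit_card:read"},
    "support-agent": {"credit_card:read_masked"},
}

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("audit")

def can_access(role: str, permission: str) -> bool:
    """Return True only if the role explicitly holds the permission."""
    return permission in PERMISSIONS.get(role, set())

def read_card_number(user: str, role: str, card_number: str) -> str:
    """Gate access to a card number and record every attempt for audits."""
    allowed = can_access(role, "credit_card:read")
    audit_log.info(
        "user=%s role=%s action=credit_card:read allowed=%s at=%s",
        user, role, allowed, datetime.now(timezone.utc).isoformat(),
    )
    if allowed:
        return card_number
    # Default to least privilege: return a masked value instead.
    return "**** **** **** " + card_number[-4:]

# A support agent sees only the masked number, and the attempt is logged.
print(read_card_number("alice", "support-agent", "4111111111111111"))
```

The key design point is that every access attempt, allowed or denied, produces an audit record, which is what regulators and incident responders will ask for later.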
Controlling access was simpler when work was location-based: on-premises contractors and employees connected to an internal network, which was often tightly controlled. But scaling up VPNs to account for the global and remote nature of the modern organization is, to put it bluntly, not working.
Security and IT need a scalable way to make sure the right people, and only the right people, can handle sensitive data, even in hybrid work environments.
2. Know what accesses the data
Large language models (LLMs) may help increase productivity. But CISOs around the world are likely waking up in a sweat imagining customer data in ChatGPT, or internal file systems in Llama (and these are far from hypothetical examples). Confidential data being brought outside a secure environment is a security nightmare.
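As one illustration of how a safeguard might work, a data loss prevention (DLP) layer can scan outbound prompts for sensitive patterns before they ever reach a third-party LLM. The sketch below is deliberately simplified: the regex patterns are illustrative assumptions, and production DLP engines use far more robust detection (for instance, Luhn validation for card numbers):

```python
import re

# Hypothetical, simplified detectors; production DLP uses validated
# patterns and covers many more data types than these two.
SENSITIVE_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact_prompt(prompt: str) -> str:
    """Replace anything that looks like sensitive data before it leaves."""
    for label, pattern in SENSITIVE_PATTERNS.items():
        prompt = pattern.sub(f"[REDACTED-{label}]", prompt)
    return prompt

# The redacted prompt, not the original, is what gets sent to the LLM.
print(redact_prompt("Customer 123-45-6789 paid with 4111 1111 1111 1111"))
```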
But data uploaded to an unauthorized third-party platform is not the only concern when controlling what systems and platforms handle data. Devices that are infected with malware, or are simply insufficiently secured, can also result in a breach.
The explosion in software-as-a-service (SaaS) tools presents another threat to data: one source estimates that enterprises use between 270 and 364 SaaS applications on average. Third-party SaaS tools can vacuum up, or even leak, confidential data without IT and security teams being aware of it.
With visibility gaps, sprawling cloud deployments, and "alert fatigue" introduced by a mishmash of systems, networks, and security solutions, verifying everything that touches data is a bigger challenge than ever.
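One common starting point for closing that visibility gap is mining logs the organization already collects, such as DNS queries, for SaaS domains that IT never sanctioned. Here is a rough sketch, assuming a hypothetical CSV log format with a 'domain' column and a made-up allowlist:

```python
import csv
from collections import Counter

# Hypothetical allowlist of sanctioned SaaS providers.
SANCTIONED = {"salesforce.com", "slack.com", "workday.com"}

def find_shadow_saas(log_path: str) -> Counter:
    """Count requests to SaaS domains that IT has not sanctioned.

    Assumes a CSV of DNS logs with a 'domain' column; real logs vary.
    """
    hits = Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            # Reduce e.g. 'files.dropbox.com' to 'dropbox.com'.
            base = ".".join(row["domain"].rsplit(".", 2)[-2:])
            if base not in SANCTIONED:
                hits[base] += 1
    return hits

# Usage (with a real log file in place):
# print(find_shadow_saas("dns_logs.csv").most_common(10))
```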
3. Know where data is accessed
The amount of data being generated and stored continues to grow exponentially. According to IDC, the global datasphere reached 64 zettabytes in 2020 and is still growing. New data sources such as Internet of Things (IoT) devices generate more data continually, and big data collections are essential for training machine learning and AI models.
This data traverses the globe and is often stored in a distributed fashion. Many organizations store data across various cloud deployments as well as physical locations. And the often-siloed nature of data storage makes it difficult to see what data exists and where it resides.
Complicating the matter still further is the globally distributed nature of both workforces and customers today. An employee might access stored data from within an internal corporate network in office headquarters, or from the other side of the world. And confidential data might have passed through any number of regions and across multiple unsecured networks before it reaches an organization's backend storage infrastructure, increasing the risks of a breach. All of those regions are likely to have their own separate regulations for data privacy as well.
Put it all together, and it is easy to see why the where of data makes compliance and security so difficult. It is challenging to ensure that data is protected in a way that meets security standards across all the different places it is stored, let alone while it is in transit.
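To illustrate the kind of guardrail this requires, consider a minimal residency check an application might run before writing data to a storage region. The jurisdiction-to-region map and record fields below are hypothetical assumptions for the sake of the sketch:

```python
from dataclasses import dataclass

# Hypothetical mapping of data subjects' jurisdictions to the storage
# regions where their data may reside (e.g., to satisfy GDPR residency).
ALLOWED_REGIONS = {
    "EU": {"eu-west-1", "eu-central-1"},
    "US": {"us-east-1", "us-west-2"},
}

@dataclass
class Record:
    subject_id: str
    jurisdiction: str  # where the data subject lives

def assert_residency(record: Record, target_region: str) -> None:
    """Refuse to store a record outside its permitted regions."""
    allowed = ALLOWED_REGIONS.get(record.jurisdiction, set())
    if target_region not in allowed:
        raise ValueError(
            f"Record {record.subject_id} ({record.jurisdiction}) "
            f"cannot be stored in {target_region}; allowed: {allowed}"
        )

assert_residency(Record("cust-42", "EU"), "eu-west-1")   # passes
# assert_residency(Record("cust-42", "EU"), "us-east-1") # would raise
```

The hard part in practice is not the check itself but applying it consistently across every storage system and network path, which is where a unified policy layer comes in.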
Security and IT need a way to apply policies everywhere: across all infrastructure and in front of all their applications. A connectivity cloud is a cloud-based security platform that integrates with on-premises infrastructure, the cloud, and everything in between.
The Cloudflare connectivity cloud is a composable platform that helps enterprises streamline compliance, with extensible security controls across the systems they use to connect, protect, and build. It helps organizations meet compliance requirements for a wide range of regulations, frameworks, and standards, including PCI DSS, HIPAA, and more, thereby mitigating risk.
This article is part of a series on the latest trends and topics impacting today’s technology decision-makers.
Learn more about how to simplify critical IT and security tasks in The connectivity cloud: A way to take back IT and security control ebook.
After reading this article, you will be able to understand:
How digital modernization has impacted data
The challenges of managing the who, what, and where of data
How new models like the connectivity cloud allow organizations to gain visibility of and secure their data