How to Design a Cloud Data Protection Strategy

Evolving Data Protection to Enable Continuous Risk Management
Over the last decade, data has become one of the most powerful tools an organization can employ. Data is the driving force behind innovation, efficiency, and, ultimately, the success of an organization. The ability to collect, analyze, and interpret data provides an organization with insights that can and should guide strategic decisions, both internal and external.

At Netskope, we believe that data is not only a tool but also a key value-creation asset for organizations. Organizations serve customers, and as a result, business processes and customer interactions generate the data needed to facilitate business operations. Without data, no business process can be executed and, consequently, no service can be provided to a customer. Data must be protected to maintain competitive advantage, uphold the privacy rights of customers and employees, and ensure the stability and accuracy of business operations.

Data is also growing at an exponential rate, which in turn is making it harder to manage and secure. As a result, a new way of thinking about data security is required. Data protection is the process of protecting data throughout its lifecycle, from creation through processing, modification, and transmission to destruction. The old ways of thinking about data protection aren’t fit for the era of digital transformation.

Modern data protection has five key drivers, all of which an organization must seek to understand. These drivers equally apply to Cloud and non-Cloud-related data and should form the basis of any robust data protection strategy.

These five drivers are:

1. Determine what information is stored locally, in the cloud, or at a third party, and understand jurisdictional data privacy requirements to help determine true digital risk.

2. Understand the sensitivity of the data, its importance to the business, and the likely impact to the business should this data be made available to non-authorized parties (including being made public) or be modified or corrupted.

3. Understand where the data is flowing, and ensure that only authorized access is permitted and that data is not transferred to non-authorized or unprotected environments.

4. Assess third-party suppliers and partners, and understand who has access to the data. Determine whether the right identities (machine and person) have access, and determine who should not have access to the data.

5. Know what controls are being used to protect the data. Are they operating as designed, and are they operating effectively?

Today, data exists in two broad environments: on-premise and in the Cloud. Digital Transformation is seeing a dramatic shift with data rapidly moving from on-premise to cloud, especially public cloud. Organizations are looking to cloud environments to reduce operational costs, enhance user experience (and performance), and make it easier to collaborate with partners and third parties.

Netskope research shows that most organizations now have more than 50% of their web gateway traffic related to Cloud services and applications, a dramatic shift from the days when data was contained within the on-premise data center. Furthermore, 83% of users use personal app instances on managed devices and upload an average of 20 sensitive, corporate files to personal instances each month.

This trend will continue to grow, and the need for organizations to manage their own data centers will continue to decrease. As a result, an organization needs to establish, if it has not already, a Cloud Data Protection Strategy.

Within these two environments, on-premise and Cloud, data is used in three states: in motion, in storage, and in memory. Each permutation of environment and state will require a different approach to adequately manage the risk, within the risk appetite of the organization, to that data asset.

Protecting data in the cloud requires a different approach than that used when protecting on-premise data. Data in the cloud is more exposed because access to it typically traverses the public internet, whereas on-premise data is usually reachable only via the internal network for most internal operations.

95% of organizations allow personal devices in some way in the workplace. (Source: Cisco)
Additionally, the universal availability and ubiquitous accessibility of this data have facilitated the shift towards increased use of Bring Your Own Device (BYOD), as well as Work-From-Home (WFH) solutions—trends that also expand the attack surface and eliminate the previously discernible “perimeter” that was the front line for controls in an older era.

Physical security, backups, and disaster recovery are still important aspects of data protection, but for cloud-stored data this responsibility has (typically) been adopted by Cloud Service Providers (CSPs). This relationship brings about a shared security model, which must be fully understood and agreed with the service provider so that such services are implemented appropriately for each party’s needs.

Unlike on-premise data, cloud data is accessed and manipulated through API requests and JSON, the language of the Cloud. It is therefore imperative that the security tools in use are capable of interpreting that language. This allows for proper application of controls and for visibility that traditional on-prem security solutions simply cannot provide, because they are blind to Cloud traffic.

This also means the control stack needs to be able to decode JSON and interpret API requests. Without this level of inspection capability, context (the understanding of aspects such as data classification, user actions, inter-application transactions, anomalous user behaviour, and device type) cannot be properly established.
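As an illustration of what “decoding the language of the Cloud” means in practice, the sketch below parses a hypothetical JSON API event and extracts the context a policy engine would need. The event shape, field names, and the block/allow rule are all illustrative assumptions, not any specific vendor’s schema.

```python
import json

# A hypothetical API event, loosely modeled on the activity logs that
# SaaS and IaaS providers emit; all field names here are illustrative.
raw_event = """{
    "user": "alice@example.com",
    "action": "upload",
    "app": "cloud-storage",
    "instance": "personal",
    "object": {"name": "payroll.xlsx", "classification": "confidential"}
}"""

def extract_context(raw: str) -> dict:
    """Decode a JSON API event into the context a policy decision needs."""
    event = json.loads(raw)
    return {
        "user": event.get("user"),
        "action": event.get("action"),
        "instance_type": event.get("instance"),
        "classification": event.get("object", {}).get("classification"),
    }

context = extract_context(raw_event)

# A simple context-aware policy: block uploads of confidential data
# to personal app instances, allow everything else.
if (context["action"] == "upload"
        and context["instance_type"] == "personal"
        and context["classification"] == "confidential"):
    decision = "block"
else:
    decision = "allow"
```

A tool that only sees packets or URLs cannot make this decision; it is the decoded JSON fields (instance type, classification, action) that supply the context.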

If the context isn’t understood, policy and control actions will fall short, preventing the organization from realizing the value of Cloud services and inhibiting its Digital Transformation strategy.

Furthermore, with cloud-based systems, organizations are exposed to ephemeral security boundaries. Unlike on-prem systems, with their static IP addresses and constant number of resources, in the cloud new resources are continually started and stopped.

The resources are far more dynamic in nature, and simple misconfiguration can lead to excessive exposure. In fact, Netskope Threat research indicates that misconfiguration in IaaS and PaaS environments is currently the leading factor contributing to the rise in data breaches.
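Because resources appear and disappear continuously, misconfiguration checks need to run as automated scans rather than one-off reviews. The sketch below flags a few classic exposure patterns in simplified, hypothetical resource metadata; in practice the inputs would come from the cloud provider’s APIs and the rules would be far more extensive.

```python
# Hypothetical resource configurations, shaped like simplified IaaS
# metadata; field names and values are illustrative assumptions.
resources = [
    {"id": "bucket-1", "type": "object_store", "public_read": True, "encrypted": False},
    {"id": "bucket-2", "type": "object_store", "public_read": False, "encrypted": True},
    {"id": "sg-1", "type": "security_group", "ingress_cidr": "0.0.0.0/0", "port": 22},
]

def find_misconfigurations(resources: list) -> list:
    """Flag common IaaS/PaaS misconfigurations that lead to excessive exposure."""
    findings = []
    for r in resources:
        if r.get("public_read"):
            findings.append((r["id"], "publicly readable storage"))
        if r.get("encrypted") is False:
            findings.append((r["id"], "encryption at rest disabled"))
        if r.get("ingress_cidr") == "0.0.0.0/0" and r.get("port") == 22:
            findings.append((r["id"], "SSH open to the internet"))
    return findings

findings = find_misconfigurations(resources)
# bucket-1 is flagged twice (public and unencrypted); sg-1 once.
```

Run on every change to the environment, a scan like this turns ephemeral boundaries from an unknown risk into a continuously measured one.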

Having data in the cloud can increase third-party risk, as you are required to entrust the data to the CSP and are reliant on them having the appropriate security measures in place, not only to protect your data but also to ensure that you meet any regulatory compliance requirements.

However, this can improve the overall technical risk posture of the organization, as the CSP may be far more skilled and capable than the organization itself at securely managing the environment, especially with basic hygiene such as patching and maintaining technology currency. Moving to a CSP can greatly enhance these fundamentals, where some organizations have typically struggled.

The idiom “don’t put all your eggs in one basket” has always been relevant to data security. There is a tradeoff when deciding whether or not to split data or systems across multiple CSPs. On one hand, the split increases the attack surface; on the other, splitting data and systems across separate networks reduces the impact of a breach, or even a service delivery failure, at any single CSP.

Having systems split across multiple CSPs yields multiple benefits such as access to a greater variety of tools, better cost optimization, improved redundancy and a lower impact in case of a breach. Existing in a multi-cloud environment does however come with an expanded attack surface and so, it is imperative to take the necessary steps to secure it appropriately. Cloud-based SaaS that is used by an organization further increases the attack surface and should also be considered in the security plan.

The shared security model increases in complexity with each cloud service that is used. Each service presents its own unique vulnerabilities, which an organization’s security team must consider. Furthermore, an organization must do its due diligence and understand the security posture of the service provider, to see that it meets both the organization’s risk appetite and any regulatory compliance requirements.

When running in a multi-cloud environment an organization should:

1. Identify all the cloud services used within the organization. Business units often adopt SaaS solutions that help them achieve their objectives without considering the security of the data that flows between the organization and the service provider. Identifying these services as early as possible is necessary in order to ensure that appropriate security controls are applied.

2. Ensure security settings and policies are aligned across different cloud deployments. Doing so reduces the complexity of the security environment, and provides consistency for the organization’s employees.

3. Acknowledge that the rapid deployment and turnover of SaaS solutions in the workplace make it impossible for security teams to secure each service to the desired standard by hand. Additionally, the constant changes increase the chance of misconfiguration when handled manually. To solve this, organizations need to implement automated security processes, as they provide several indispensable benefits:
a. They reduce/eliminate the possibility of errors/misconfigurations—increased consistency
b. They deploy policies and monitoring tools far more quickly—increasing efficiency
c. They perform tedious/repetitive tasks allowing the security teams to focus their efforts
4. Use tools that provide effective visibility across the cloud services used by the organization.

TN Media News