“Best of Breed” – CASB/DLP and Rights Management Come Together | McAfee Blogs
Securing documents before cloud
Before the cloud, organizations collaborated on and stored documents using desktop and laptop computers, email, and file servers. Private cloud use cases, such as accessing and storing documents on intranet web servers and network-attached storage (NAS), improved the end-user experience. The security model followed a layered approach: keeping this data safe was just as important as keeping unauthorized individuals out of the building or data center. A directory service protected sign-in to your personal computer, and permissions on files stored on file servers ensured safe usage.
Enter the cloud
Most organizations now consider cloud services essential to their business. Services like Microsoft 365 (SharePoint, OneDrive, Teams), Box, and Slack are depended upon daily. The same fundamental security concepts still apply; however, many are now covered by the cloud service provider. This is known as the “Shared Security Model”: the cloud service provider handles basic security functions (physical, network, and operations security), but the customer must correctly grant access to data and is ultimately responsible for protecting it.
The big difference between the two is ownership of the controls. In the first model, the organization owned and controlled the entire process; in the cloud model, the customer owns only the controls surrounding the data they choose to put in the cloud. This is the risk that collaborating and storing data in the cloud brings: once a document has been stored in M365, what happens if it is mishandled from that point forward? Who is handling these documents? If my most sensitive information has left the safe confines of the cloud service, how can I protect it? Fundamentally: how can I control data that can live anywhere, including places I do not control?
Adding cloud-native protection layers
McAfee and Seclore have recently extended their integration to address these cloud-based use cases. The integration fundamentally answers this question: if I put sensitive data in a cloud I do not control, can I still protect that data regardless of where it lives?
The solution puts guardrails around end-user cloud usage while adding significant compliance protection, security operations capability, and data visibility for the organization.
Data visibility, compliance & security operations
Once an unprotected sensitive file has been uploaded to a cloud service, McAfee MVISION Cloud Data Loss Prevention (DLP) detects the upload. Customers can assign a DLP policy to find sensitive data such as credit card data (PCI), customer data, personally identifiable information (PII), or any other data they consider sensitive.
Sample MVISION Cloud DLP Policy
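MVISION Cloud's actual policy syntax is product-specific, but the classification step can be sketched in a few lines of Python. The rule names and patterns below are illustrative stand-ins, not McAfee's real DLP rules:

```python
import re

# Illustrative detection rules -- a real DLP engine uses validated
# classifiers (e.g. Luhn checks for card numbers), not bare regexes.
RULES = {
    "PCI": re.compile(r"\b(?:\d[ -]?){13,16}\b"),   # card-number-like digit runs
    "PII": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),    # US SSN pattern
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
}

def classify(text):
    """Return the set of rule names the document text violates."""
    return {name for name, pattern in RULES.items() if pattern.search(text)}
```

For example, `classify("SSN 123-45-6789")` flags the PII rule, while clean text returns an empty set; in the product, each match would become a policy violation with its own configured response.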
If data is found to be in violation of policy, it must be properly protected. For example, if the DLP engine finds PII, rather than let it sit unprotected in the cloud service, the policy the customer sets should enact protection on the file. This action is known as a “Response”, and MVISION Cloud records the detection, the violating data, and the actions taken in the incident data. In this case, McAfee calls Seclore to protect the file. These actions can be performed in near real time on new uploads, or applied to data already sitting in the cloud service via an on-demand scan.
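The blog does not publish the integration's API, but the detect-then-respond flow it describes can be sketched as follows. All names here (`handle_upload`, the incident fields, the single SSN rule) are hypothetical:

```python
import re

# Illustrative PII rule standing in for the customer's DLP policy.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def handle_upload(file_name, text, protect):
    """If the uploaded text violates the rule, invoke the protection
    hook (standing in for the call to Seclore) and return an incident
    record for security operations; otherwise return None."""
    if not SSN.search(text):
        return None
    protect(file_name)  # hand the file off for rights-management protection
    return {"file": file_name, "rule": "PII", "action": "protected"}
```

The same handler shape covers both modes the article mentions: run it on each new upload for near-real-time protection, or iterate it over existing files for an on-demand scan.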
“Seclore-It” – Protection Beyond Encryption
Now that the file has been protected, downstream access to it is managed by Seclore’s policy engine. Policy-based access can consider end-user location, data type, user group, time of day, or any combination of these. The key principle is that the file is protected regardless of where it goes, enforced by a Seclore policy the organization sets. Whenever a user accesses the file, an audit trail is recorded, giving the organization confidence that the data remains properly protected. The audit logs record both allowed and denied access attempts, completing the data visibility requirements.
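A policy combining the conditions listed above (user group, location, time of day) amounts to a conjunction of checks evaluated at open time. This is a minimal sketch with hypothetical field names, not Seclore's actual policy model:

```python
from datetime import time

def may_open(user, policy, now):
    """Grant access only when every policy condition holds:
    user group, country, and time-of-day window."""
    return (user["group"] in policy["groups"]
            and user["country"] in policy["countries"]
            and policy["hours"][0] <= now <= policy["hours"][1])

# Hypothetical policy: finance users, US/DE locations, business hours only.
policy = {"groups": {"finance"},
          "countries": {"US", "DE"},
          "hours": (time(8, 0), time(18, 0))}
```

In practice each evaluation, allow or deny, would also be appended to the audit trail the article describes.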
One last concern: what if a file is “lost”, or access must be restricted to files no longer in the organization’s direct control (for example, when a user leaves the company), or the organization simply wants to update the policies on already-protected files? In each case, the policy on those files can be dynamically updated. This addresses a major data loss concern that companies have with cloud service providers and with remote data use in general. Ensuring files are always protected, regardless of scenario, is as simple as updating a policy with Seclore. Once the policy has been updated, even files on a thumb drive stuffed in a drawer are re-protected from accidental or intentional disclosure.
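The reason an update reaches even a copy on a thumb drive is that a rights-managed file carries no permissions of its own; every open consults the central policy. A toy sketch of that design (hypothetical store and function names):

```python
# Central policy store: protected files embed no permissions, so one
# server-side update changes the effective rights of every copy at once.
policies = {"doc-42": {"allowed_users": {"alice", "bob"}}}

def can_open(file_id, user):
    """Every open consults the server-side policy, wherever the copy lives."""
    policy = policies.get(file_id)
    return policy is not None and user in policy["allowed_users"]

def revoke(file_id, user):
    """e.g. the update made when an employee leaves the company."""
    policies[file_id]["allowed_users"].discard(user)
```

After `revoke("doc-42", "bob")`, every copy of the file, including offline ones, denies bob on his next open attempt, because the decision never lived in the file itself.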
Conclusion
This article addresses several notable concerns for customers doing business in a cloud model. Important or sensitive data can now be effortlessly protected as it migrates to and through cloud services to its ultimate destination. The organization can prove to auditors that the data was, and continues to be, protected. Security operations can track incidents and follow the access history of files. Finally, the joint solution is easy to use and enables organizations to confidently conduct business in the cloud.
Next Steps
McAfee and Seclore partner both at the endpoint and in the cloud as an integrated solution. To find out more and see this solution running in your environment, send an inquiry to cloud@mcafee.com.