The cardinal rule of first aid is to first stop the bleeding and stabilize the patient. When it comes to data protection, that means preventing sensitive data from getting out the door, or at least knowing what data is leaving, and where it is going.

Due to the challenges of successfully implementing Data Loss Prevention (DLP) as an effective control, some organizations try to skip this fundamental first step and jump ahead to more sophisticated solutions, like identification of Insider Threats. That is analogous to conducting CAT scans and MRIs to assess a patient's injuries while they're still bleeding from their wounds. Insider Threat identification is an important solution to protect your organization, but it needs to be built atop a firm foundation of an optimized DLP program.

Those who have successfully implemented DLP programs realize that success requires attention to both the technology and the process. Failed attempts and false starts typically come from focusing exclusively on technology while ignoring the people and process sides of the program implementation triangle.

DLP technology is a powerful and critical control that is fundamental to protecting your company's critical data, adhering to data privacy regulations like GDPR and CCPA, and implementing the latest and greatest zero trust architectures. It is not a "set it and forget it" technology. Making DLP work in your organization requires an understanding of what data needs protection, how that data is used by the business, and how you will identify and handle intentional or accidental mishandling of that data.

DLP platforms identify targeted sensitive data using an ever-growing set of techniques, and take action defined by user-configured policies and rules. Initially defining these policies, rules, and associated actions is where the optimization journey begins. One size does not fit all, and it will take ongoing time and effort to make sure you are achieving your data protection goals without hindering the business from doing its job (or better yet, while enhancing the business's ability to get its job done more efficiently and securely!).
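To make the policy-rule-action relationship concrete, here is a minimal sketch of how a DLP engine pairs detection rules with configured actions. The policy names, patterns, channels, and actions are purely illustrative assumptions, not any vendor's actual API, and real products use far more robust detection than these naive regexes.

```python
import re

# Hypothetical policy set: each policy ties a detection pattern and a
# channel to an action. Names, patterns, and actions are illustrative only.
POLICIES = [
    {
        "name": "us-ssn-outbound-email",
        "pattern": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),   # naive SSN pattern
        "channel": "email",
        "action": "block",      # e.g. block, quarantine, alert, or log
    },
    {
        "name": "credit-card-upload",
        "pattern": re.compile(r"\b(?:\d[ -]?){13,16}\b"),  # loose card-number pattern
        "channel": "web_upload",
        "action": "alert",
    },
]

def evaluate(content: str, channel: str) -> list[dict]:
    """Return the policy actions triggered by this content on this channel."""
    return [
        {"policy": p["name"], "action": p["action"]}
        for p in POLICIES
        if p["channel"] == channel and p["pattern"].search(content)
    ]
```

Tuning in this sketch means editing the patterns, channels, and actions; in a real deployment it is the iterative adjustment of exactly those knobs, informed by feedback from the business, that the rest of this article describes.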

Cast the net of policies too wide, with overly restrictive actions like blocking, and you may shut down your business's ability to function. Define your policies too narrowly, and valuable data may leak out, potentially costing your company in many ways, including lost IP, damaged reputation, lawsuits, and regulatory fines. Focusing actions too heavily on alerting can result in overwhelmed analysts, unsure of where to begin or how to connect the dots across multiple DLP events that indicate a pattern of wrongdoing by an employee or contractor. Results from the initial configuration, together with an ongoing connection with the business to understand its changing needs, will drive subsequent iterations and adjustments to policies, rules, actions, and response workflows.

The challenge many enterprises have is that they start with a brute-force technology implementation that quickly results in chaos due to either business pushback or an overwhelming flow of alerts. This often leads to a halt or slowdown in the DLP implementation, resulting in reduced protection and exposed data.

A pure technology approach does not work well in any domain. Would you implement sales force automation without first defining the metrics and workflows needed to ensure it works for your organization and to optimize the process on an ongoing basis? Part of the reason Salesforce (SFDC) was so successful is that it provided organizations with out-of-the-box tools to align it with the way they do business, and the reporting to ensure it performs in line with expectations. When it comes to DLP, that means understanding key data points including policy effectiveness, remediator performance, user risk, and data exposure.

Let the data tell the story

As in any business process, the right data and supporting analytics provide the insights needed for optimization. Thankfully, DLP technology captures a wealth of data that can be unlocked to provide those insights.

In the case of DLP, analytics provides insight into policy effectiveness and associated user behavior, which helps drive policy definition, analyst performance measurement, endpoint DLP agent coverage, data-at-rest exposure risk, and user risk measurement, prioritizing events and allowing analysts to view and remediate them based on user/person risk. This last point is particularly important, as it greatly enhances your analysts' effectiveness and efficiency.

By looking at events at the person/user level, based on risk and behavior, analysts can focus on the most important events as a group to investigate and remediate, rather than being left to identify and connect the dots between individual incidents flagged as high severity by the DLP system. This also lays a foundation for more sophisticated Insider Threat analytics that can subsequently be layered on top of an optimized DLP program by adding additional user activity sources and indicators of attack/compromise.
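The rollup from individual alerts to person-level risk can be sketched as a simple aggregation. The severity weights and event fields below are assumptions for illustration, not a standard scoring model; real platforms factor in behavior, data sensitivity, and history.

```python
from collections import defaultdict

# Assumed severity weights for illustration only.
SEVERITY_WEIGHT = {"low": 1, "medium": 3, "high": 10}

def user_risk(events: list[dict]) -> list[tuple[str, int]]:
    """Sum severity weights per user and return users ranked by total risk."""
    scores: dict[str, int] = defaultdict(int)
    for e in events:
        scores[e["user"]] += SEVERITY_WEIGHT[e["severity"]]
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

events = [
    {"user": "alice", "severity": "low"},
    {"user": "alice", "severity": "medium"},
    {"user": "alice", "severity": "medium"},
    {"user": "bob", "severity": "high"},
]
# Under these assumed weights, alice's three lower-severity events (score 7)
# rank just below bob's single high-severity event (score 10), so an analyst
# triaging by user sees both patterns, not just the one "high" alert.
```

The point of the sketch is the shape of the analysis: grouping by person surfaces a pattern of repeated smaller events that severity-sorted alert queues would bury.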

Ultimately, the goal is to use machine learning and analytics to optimize the DLP program so it serves as an effective control preventing the exposure of sensitive data. From there, more sophisticated data and analyses can be added to the process seamlessly and incrementally, providing more proactive detection of accidental and malicious threats.

Stop the bleeding and stabilize the patient, then conduct the right tests to deal with the greater issues at hand.