Cybersecurity: Track data activity before it becomes dangerous
A security expert warns that failing to identify and track unusual data activity can have dangerous consequences.
There’s usual data activity, unusual data activity, and then there’s dangerous data activity. Christian Wimpelmann, identity and access management (IAM) manager at Code42, is concerned that too little attention is paid to data activity at the company level. In the article "When Does Unusual Data Activity Become Dangerous Data Activity?", Wimpelmann looks at each type of data activity and offers advice on detecting unusual activity before it becomes dangerous.
What is usual data activity?
To begin, Wimpelmann defines usual data activity as activity during normal business operations. “Sophisticated analytics tools can do a great job of homing in on the trends and patterns in data,” Wimpelmann said. “They help security teams get a baseline around what data is moving through which vectors—and by whom—on an everyday basis.”
By using analytics, specialists can compare a given action against:
- Common activity patterns of users
- Normal activity patterns of a specific file or piece of data
Wimpelmann cautions that too many security teams focus solely on the user, adding, “It’s the data that you care about, so taking a data-centric approach to monitoring for unusual data activity will help guard what matters.”
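To make that data-centric baselining concrete, here is a minimal sketch, assuming hypothetical event records of the form (user, vector, day, files moved); it illustrates the idea rather than Code42's implementation. It builds a per-user, per-vector history of daily file movements and flags days that sit far outside that history, as well as vectors a user has never used before.

```python
"""Minimal baselining sketch: flag file-movement counts that fall far
outside a user's history on a given vector. Illustrative only."""
from collections import defaultdict
from statistics import mean, pstdev

# Hypothetical event records: (user, vector, day, files_moved)
events = [
    ("alice", "email",      "2024-03-01", 4),
    ("alice", "email",      "2024-03-02", 6),
    ("alice", "email",      "2024-03-03", 5),
    ("alice", "usb",        "2024-03-04", 240),  # vector alice has never used before
    ("bob",   "cloud_sync", "2024-03-01", 55),
    ("bob",   "cloud_sync", "2024-03-02", 60),
]

def build_baseline(events):
    """Group daily counts by (user, vector) so each channel gets its own baseline."""
    counts = defaultdict(list)
    for user, vector, _day, moved in events:
        counts[(user, vector)].append(moved)
    return counts

def flag_outliers(events, z_threshold=3.0):
    """Flag days whose count sits more than z_threshold std devs above the channel's mean."""
    baseline = build_baseline(events)
    flagged = []
    for user, vector, day, moved in events:
        history = baseline[(user, vector)]
        if len(history) < 2:  # no meaningful history: a brand-new vector is itself notable
            flagged.append((user, vector, day, moved, "new vector for this user"))
            continue
        mu, sigma = mean(history), pstdev(history)
        if sigma and (moved - mu) / sigma > z_threshold:
            flagged.append((user, vector, day, moved,
                            f"{(moved - mu) / sigma:.1f} std devs above baseline"))
    return flagged

for item in flag_outliers(events):
    print(item)
```

Commercial analytics tools build far richer baselines than this (per file, per time of day, per peer group), but the comparison step—an observed action measured against an established pattern—is the same idea.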
SEE: Checklist: Securing digital information (TechRepublic Premium)
What is unusual data activity?
Unusual data activity is the suspicious modification of data on a resource. An example would be the deletion of mission-critical files on a data storage device. “Unusual data activity is the earliest warning sign of Insider Risk and a potentially damaging data leak or data breach,” Wimpelmann said. “Whether malicious or unintentional, unusual data access and unusual data traversing networks or apps is often a precursor to employees doing something they shouldn’t or data ending up somewhere much more problematic—outside the victimized organization.”
What are the signs of unusual data activity?
Through experience, Wimpelmann has created a list of unusual data activities (Insider Risk indicators) that tend to turn into dangerous data activities. Below are some of the most common indicators:
- Off-hour activities: When a user’s endpoint file activity takes place at unusual times.
- Untrusted domains: When files are emailed or uploaded to untrusted domains and URLs, as established by the company.
- Suspicious file mismatches: When the MIME (media) type of a high-value file, such as a spreadsheet, is disguised with the extension of a low-value file type, such as a JPEG, it typically indicates an attempt to conceal data exfiltration (see the detection sketch after this list).
- Remote activities: Activity taking place off-network may indicate increased risk.
- File categories: File categories, determined by analyzing file contents and extensions, help signify a file’s sensitivity and value.
- Employee departures: Employees who are leaving the organization—voluntarily or otherwise.
- Employee risk factors: Risk factors may include contract employees, high-impact employees, flight risks, employees with performance concerns and those with elevated access privileges.
- ZIP/compressed file movements: File activity involving .zip files, since they may indicate an employee is attempting to take many files or hide files using encrypted zip folders.
- Shadow IT apps: Unusual data activity happening on web browsers, Slack, AirDrop, FileZilla, FTP, cURL and commonly unauthorized shadow IT apps like WeChat, WhatsApp, Zoom and Amazon Chime.
- Public cloud sharing links: When files are shared with untrusted domains or made publicly available via Google Drive, OneDrive or Box.
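As a rough illustration of the suspicious file mismatch indicator, the sketch below compares a file's leading magic bytes against the signature its extension claims. The signature table and the exfil_check helper are assumptions made for this example, not part of any vendor product.

```python
"""Hedged sketch: detect files whose contents do not match their claimed extension."""
from pathlib import Path
from typing import Optional

# Leading-byte signatures for a few common formats (well-known magic numbers).
SIGNATURES = {
    ".jpg":  [b"\xff\xd8\xff"],
    ".jpeg": [b"\xff\xd8\xff"],
    ".png":  [b"\x89PNG\r\n\x1a\n"],
    ".pdf":  [b"%PDF"],
    ".zip":  [b"PK\x03\x04"],
    ".xlsx": [b"PK\x03\x04"],  # Office Open XML files are ZIP containers
    ".docx": [b"PK\x03\x04"],
}

def extension_matches_content(path: Path) -> Optional[bool]:
    """Return True/False when the extension is known to us, None otherwise."""
    expected = SIGNATURES.get(path.suffix.lower())
    if expected is None:
        return None
    header = path.read_bytes()[:16]
    return any(header.startswith(sig) for sig in expected)

def exfil_check(path: Path) -> str:
    verdict = extension_matches_content(path)
    if verdict is False:
        return f"MISMATCH: {path.name} does not look like a {path.suffix} file"
    if verdict is None:
        return f"UNKNOWN: no signature on record for {path.suffix}"
    return f"OK: {path.name}"

if __name__ == "__main__":
    import os, tempfile
    # Write ZIP magic bytes into a file that claims to be a JPEG (a spreadsheet in disguise).
    fd, fake = tempfile.mkstemp(suffix=".jpg")
    with os.fdopen(fd, "wb") as fh:
        fh.write(b"PK\x03\x04" + b"\x00" * 12)
    print(exfil_check(Path(fake)))  # MISMATCH: ... does not look like a .jpg file
    os.unlink(fake)
```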
SEE: Identity is replacing the password: What software developers and IT pros need to know (TechRepublic)
Why is it so hard to detect unusual data activity?
Put simply, most security software isn’t designed to detect unusual data activity and Insider Risk. Most conventional data security tools, such as data loss prevention (DLP) and cloud access security broker (CASB) products, use rules defined by security teams to block risky data activity. “These tools take a black-and-white view on data activity: An action is either allowed or not—and there’s not much consideration beyond that,” Wimpelmann said. “But the reality is that many things might fall into the ‘not allowed’ category that are nevertheless used constantly in everyday work.”
On the flip side, there are plenty of things that might be “allowed” but that could end up being quite risky. What’s important are the true outliers—whichever side of the rules they fall on.
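A toy example makes the black-and-white problem easier to see. The allowlist, events and verdicts below are invented for illustration; they simply show how a binary destination rule can both block routine work and wave through genuinely risky activity.

```python
"""Toy illustration of a binary, rule-based DLP-style verdict and its blind spots."""
ALLOWED_DESTINATIONS = {"sharepoint.example.com", "mail.example.com"}

events = [
    # Everyday work that a strict rule blocks anyway:
    {"user": "dev1",   "dest": "github.com",       "file": "build_script.sh"},
    # Risky behavior a strict rule happily allows:
    {"user": "sales7", "dest": "mail.example.com", "file": "entire_customer_db.xlsx"},
]

def rule_based_verdict(event):
    """Conventional check: the destination is either allowed or it is not."""
    return "allow" if event["dest"] in ALLOWED_DESTINATIONS else "block"

for e in events:
    print(e["user"], "->", e["dest"], "=", rule_based_verdict(e))
# dev1 -> github.com = block          (false positive: routine engineering work)
# sales7 -> mail.example.com = allow  (missed: a full customer database leaving by email)
```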
What to look for in analytical tools
Wimpelmann suggests using user and entity behavior analytics (UEBA) tools to separate unusual from usual data activity. He then offers suggestions on what to look for in forward-thinking security tools, which should:
- Be built using the concept of Insider Risk indicators.
- Include a highly automated process for identifying and correlating unusual data and behaviors that signal real risks.
- Detect risk across all data activity—computers, cloud and email.
- Start from the premise that all data matters, and build comprehensive visibility into all data activity.
And, most important of all, the security tool should have:
- The ability to accumulate risk scores to determine event severity (a minimal scoring sketch follows this list).
- Prioritization settings that are easily adapted based on risk tolerance.
- A simple risk exposure dashboard.
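To show how accumulated risk scores and adjustable prioritization might fit together, here is a minimal sketch; the indicator names, weights and severity bands are assumptions for illustration, not a published scoring model.

```python
"""Sketch of risk-score accumulation with tunable severity bands. Illustrative only."""
INDICATOR_WEIGHTS = {
    "off_hours": 2,
    "untrusted_domain": 4,
    "file_mismatch": 5,
    "zip_movement": 3,
    "departing_employee": 4,
    "remote_off_network": 1,
}

# Severity bands, highest first; lowering the thresholds tightens risk tolerance.
SEVERITY_BANDS = [(9, "critical"), (6, "high"), (3, "moderate"), (0, "low")]

def score_event(indicators):
    """Accumulate the weight of every indicator observed on a single event."""
    return sum(INDICATOR_WEIGHTS.get(name, 0) for name in indicators)

def severity(score, bands=SEVERITY_BANDS):
    """Map an accumulated score to a severity label a team can prioritize against."""
    for threshold, label in bands:
        if score >= threshold:
            return label
    return "low"

# A departing employee zipping files and sending them to an untrusted domain.
event = ["departing_employee", "zip_movement", "untrusted_domain"]
s = score_event(event)     # 4 + 3 + 4 = 11
print(s, severity(s))      # 11 critical
```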
Final thoughts
Security teams need a company-wide view of suspicious data movement, sharing and exfiltration activities by vector and type. The right security tool, paired with adequately trained team members, focuses attention on the in-house and remote activity that needs investigation. Wimpelmann concluded, “This empowers security teams to execute a rapid, rightsized response to unusual data activity before damage can be done.”