Taking a few simple steps can help reduce the likelihood that your sensitive data will experience a breach.
With so many files in existence and so many more being created every moment, it’s no wonder so many breaches and data loss incidents occur. We asked the experts for some of the top tips on keeping storage data protected.
Many of the big data breaches we read about in the news trace back to one, and most likely both, of two issues: too much access and little or no monitoring of that access. These are some of the biggest problems in data security, according to Rob Sobers, Director at Varonis.
The 2017 Varonis Data Risk Report found that 20 percent of folders are open to every employee. Forty-seven percent of organizations in the report had at least 1,000 sensitive files containing personal data, health records, financial information or intellectual property open to every single user. Not only are sensitive files open to more people than necessary, but access abuse is not monitored and flagged. That is why 63 percent of data breaches take months or years to detect, according to the report.
“One of the most immediate ways to increase an organization’s security posture is to remove unnecessary access, including global access groups, broken permissions, inconsistent Access Control Lists (ACLs) and unique permissions,” said Sobers. “You don’t have to look further than the DNC, Sony, The Panama Papers and numerous other WikiLeaks publications to see the damage weak access controls can wreak on an organization.”
In a recent Ponemon study, 62 percent of end users said they have access to company data they probably shouldn’t see. Hackers, nation-states and malicious insiders will take advantage of overly permissive access and limited detection controls to snoop, steal or hold hostage an organization’s valuable data. Just as you lock the front door, access to files has to be controlled at a basic level.
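A first pass at finding over-exposed data like the folders flagged in the Varonis report can be as simple as scanning for world-accessible paths. This is a minimal sketch for a POSIX filesystem, not a stand-in for a commercial data security platform; the scan root shown is hypothetical.

```python
import os
import stat

def world_accessible(path):
    """Return True if 'other' users have read or write access to path."""
    mode = os.stat(path).st_mode
    return bool(mode & (stat.S_IROTH | stat.S_IWOTH))

def audit_open_folders(root):
    """Walk a directory tree and collect folders open to every user."""
    open_dirs = []
    for dirpath, dirnames, filenames in os.walk(root):
        if world_accessible(dirpath):
            open_dirs.append(dirpath)
    return open_dirs

# Example: flag everything under a hypothetical shared drive for
# review by the data owner.
# for d in audit_open_folders("/srv/share"):
#     print("open to everyone:", d)
```

A real audit would also resolve group memberships and Windows ACLs, but even this crude scan surfaces the "open to every single user" folders that the report describes.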
Data protection and security can and should be seen as a revenue generator when the right people have access to the right data at the right time. The problem is that organizations swing too far toward either productivity or data security: leaving things so wide open that information is insecure, or locking data down so tight that it disrupts productivity. The correct approach is to avoid the pendulum swings and find an appropriate middle ground.
“With the right solutions in place, an organization can lock down data without disrupting productivity through automation and regular attestations from the data owners themselves,” said Sobers.
Many users will tell you that file access controls can be a hassle. There are plenty of war stories about being unable to open documents that should be accessible. Some companies, for example, have sent out press releases that required permission to open, print or copy. Their personnel had gotten into the habit of protecting everything, to the point where they would inadvertently prevent people from even reading, copying text from or printing their own media announcements. This is digital rights management (DRM) gone mad. In some cases, certain rights are applied to everything, and a default lockdown setting comes into play for any and all documents, even an invite to the office picnic.
“You can avoid this by reserving DRM for specialty cases and instead focus on ensuring that only the right people have access to the right data in the first place, that their use is monitored, and abuse is flagged,” said Sobers.
Permissions on storage data files can often go overboard, and safeguards can sometimes become overly elaborate – especially if most of what is protected is non-confidential. For example, even if a PDF file is set not to print or not to allow cut and paste, nothing precludes just taking a screenshot. And if that doesn’t work, take a photo with your phone.
“Lots of what is implemented today is not necessarily useful and could be ‘security theater’ like the TSA,” said David Ginsburg, vice president of marketing, Cavirin.
What about securing your data when you send it to the cloud? How best should you control it so it can’t be compromised without making things cumbersome for IT or the end user, or external collaborators?
The same access controls and monitoring that are often in place for files on-premises can and should be applied to files in the cloud. The good news is that most cloud storage platforms have the necessary controls concerning users, groups, and permissions.
“The pervasiveness of hybrid environments makes it harder for IT to control access to data,” said Sobers. “Many organizations are adopting a data security platform, which lets you manage and protect data for both on-premises and cloud storage from a unified interface.”
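The same kind of permission check translates to cloud storage. The sketch below inspects an S3-style ACL response (the shape boto3's `get_bucket_acl` returns) for grants to all users; the sample ACL and bucket name are illustrative, and a production check would also cover bucket policies and public access blocks.

```python
# Grantee URI AWS uses for "everyone"; a grant to it makes the bucket public.
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

def public_grants(acl):
    """Return the permissions granted to all users in an S3-style ACL dict.

    `acl` has the shape returned by boto3's get_bucket_acl():
    {"Grants": [{"Grantee": {...}, "Permission": "READ"}, ...]}
    """
    return [
        g["Permission"]
        for g in acl.get("Grants", [])
        if g.get("Grantee", {}).get("URI") == ALL_USERS
    ]

# Illustrative ACL response; in practice you would fetch it with
# boto3: acl = boto3.client("s3").get_bucket_acl(Bucket="my-bucket")
sample_acl = {
    "Grants": [
        {"Grantee": {"Type": "CanonicalUser", "ID": "owner-id"},
         "Permission": "FULL_CONTROL"},
        {"Grantee": {"Type": "Group", "URI": ALL_USERS},
         "Permission": "READ"},
    ]
}
```

Running `public_grants(sample_acl)` flags the world-readable grant, which is the cloud equivalent of the wide-open on-premises folder.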
People pay a lot of attention to encryption – and with good reason. Many of the big cloud providers go to great lengths to reassure potential customers that their encryption algorithms are watertight. But what sometimes gets overlooked is the server itself – the file system, who has access to it, permissions, etc.
“This is a source of breaches, and in many cases, data at-rest is still unencrypted, so very vulnerable,” said Ginsburg.
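Closing that at-rest gap is mostly a matter of encrypting before data hits disk and keeping the key somewhere the file system's readers can't reach. A minimal sketch, assuming the third-party `cryptography` package (the function names here are illustrative, not a standard API):

```python
# Authenticated symmetric encryption of a blob before it is written to
# storage. The part that decides whether "encrypted at rest" means
# anything is key management: the key should live in a KMS or vault,
# never beside the data it protects.
from cryptography.fernet import Fernet

def encrypt_at_rest(plaintext: bytes, key: bytes) -> bytes:
    return Fernet(key).encrypt(plaintext)

def decrypt_from_rest(token: bytes, key: bytes) -> bytes:
    return Fernet(key).decrypt(token)

key = Fernet.generate_key()   # in production, fetch from a KMS/vault
blob = encrypt_at_rest(b"patient record 4711", key)
```

Even with this in place, the file-system permissions Ginsburg points to still matter: an attacker who can read both the ciphertext and the key has everything.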
Data might be well protected on-premises. And the cloud provider may well do a top-notch job of protecting it once it is in the cloud. But what about when it is en route? Data gets shifted around constantly, and those transit points can be where it is at its weakest.
“Watch out when you are migrating workloads to the cloud or have deployed a hybrid environment,” said Ginsburg. “There are companies that specialize in cloud migrations.”
For example, AWS offers the Virtual Private Cloud (VPC). And beyond the technology side, there are various best practices for setting up secure tunnels, Ginsburg added.
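At the application level, the baseline for protecting data in transit is TLS with certificate verification left on. A short sketch using Python's standard library `ssl` module; the storage host name is hypothetical:

```python
import ssl

# A default context enforces certificate verification and hostname
# checking. Disabling either -- a common shortcut during migrations --
# is exactly what leaves data exposed en route.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname is True

# To actually move data, wrap the socket, e.g.:
# import socket
# with socket.create_connection(("storage.example.com", 443)) as sock:
#     with ctx.wrap_socket(sock, server_hostname="storage.example.com") as tls:
#         tls.sendall(payload)
```

Secure tunnels such as VPNs or a VPC peering link protect the network path; TLS like this protects the payload even if the path is compromised.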
First there was backup. Then encryption, deduplication, compression, DRM, VPNs, DR, replication, undelete, archiving and more and more products, all designed to protect data. While they all are fine in their own right, things can get a little cumbersome.
“People often build their data protection environments using point products such as protection storage, protection software, servers, switches, etc., often from different vendors which leads to fragmented data protection, lengthy and complex deployment and a more complex environment,” said Richa Dhanda, director of product marketing for data protection, Dell EMC. “This often results in separate environments for different apps and platforms — meaning siloed data protection leading to high operational and management overhead and risks.”
But the market is gradually moving towards convergence. Two or more of these data protection functions can be found in several products on the market. And more converged data protection products are coming.
“Market trends continue to drive the need for solutions that can support and scale with growing data and deliver accelerated time to protect, leading to the emergence and growth of converged solutions,” said Dhanda. “The goal of integrated solutions is to accelerate time to protect, reduce data silos, and cut operational and management costs and risks.”