Data loss from insiders continues to pose a growing security threat, and emerging technologies such as AI and generative AI (GenAI) are only compounding the issue, making swift action necessary, according to Code42.
Since 2021, there has been a 28% average increase in monthly insider-driven data exposure, loss, leak, and theft events. 85% of respondents expect this trend to continue over the next 12 months.
Data protection falls short
While 99% of companies have data protection solutions in place, 78% of cybersecurity leaders admit they’ve still had sensitive data breached, leaked, or exposed. As today’s risks are increasingly driven by AI and GenAI, the way employees work, and the proliferation of cloud applications, respondents state they need more visibility into source code sent to repositories (88%), files sent to personal cloud accounts (87%), and customer relationship management (CRM) system data downloads (90%).
“Today, data is highly portable. While AI and cloud technologies are igniting new business ventures that allow employees to connect, create, and collaborate, they also make it easier to leak critical corporate data like source code and IP,” said Joe Payne, CEO of Code42.
“This year, the research highlights the new challenges posed by AI as data sets are being pushed outside of organizations to train LLMs. We also see that source code is now considered the most important data to protect, other than financial information and research data. This is a critical finding, as most data protection tools are incapable of spotting the most common source code exfiltration techniques,” added Payne.
Organizations concerned over AI impact on sensitive data
79% of respondents believe their cybersecurity team has a shortage of skilled workers, leading cybersecurity leaders to turn to AI (83%) and GenAI (92%) technology to fill the talent gap. But these technologies aren’t a 1:1 replacement for skilled staff, and the report also warns of the data loss risks the tools themselves can introduce.
73% of cybersecurity leaders state that data regulations are unclear, while 68% are not fully confident their company is complying with new data protection laws. 98% believe their data security training requires improvement, with 44% of respondents believing it requires a complete overhaul.
89% of respondents agree that their company’s sensitive data is increasingly vulnerable to new AI technologies. 87% are concerned their employees may inadvertently expose sensitive data to competitors by inputting it into GenAI. 87% are concerned their employees are not following their GenAI policy.
Risks can vary by employee age and role: companies are more concerned about Generation Z and Millennial employees falling victim to phishing attacks (61%), oversharing company information online (60%), sending company files or data to personal accounts or devices (62%), and putting sensitive data into GenAI tools (58%).
Respondents also believe senior management (81%) and board members (71%) pose the greatest risk to their company’s data security, likely because they have wide-reaching access to the most sensitive data.
Insider data loss drains time and money
Insider-driven data exposure, loss, leak, and theft events can have vast financial repercussions, with cybersecurity leaders estimating that a single event would cost their company $15 million, on average. Respondents spend an average of 3 hours per day investigating insider-driven data events.
72% of cybersecurity leaders are worried they could lose their job over an unaddressed insider breach. To be effective, companies believe that data protection solutions should offer speed and ease of investigation (42%), provide visibility into file contents and metadata (39%), and integrate with other tech solutions (38%).
Greater visibility is needed so that companies can see when data is being copied into GenAI tools and can identify and remediate the risk before it’s too late.