Insider Threat Signals
3–31–2025 (Monday)
Hello, and welcome to The Intentional Brief - your weekly video update on the one big thing in cybersecurity for middle market companies, their investors, and executive teams.
I’m your host, Shay Colson, Managing Partner at Intentional Cybersecurity, and you can find us online at intentionalcyber.com.
Today is Monday, March 31, 2025, and we’re continuing to see cyber - and information security more broadly - sit at the center of the national discussion here in the US. What lessons can we draw from the ongoing Signal scandal? Quite a few, as it turns out.
Insider Threat Signals
By now, the news that Cabinet officials in the United States used the encrypted messaging app Signal to coordinate a missile strike on Houthi rebels in Yemen almost feels like old news, which saves me the need to weigh in on the political implications.
That said, I do think it highlights some very real risks that we all face in our own companies around “insider threats” in a way that’s maybe not obvious.
When we think of an “insider threat,” we commonly frame it as malicious - someone trying to steal from or defraud the company in some way.
And, to be clear, that does happen. Just last week, we saw news that Flexport is accusing two former employees of stealing source code to start a competitor.
But generally, the threat is not malicious, but rather either complicit or simply unintentional - to view it through the most magnanimous lens.
What we’re seeing in the fallout from the Signal story is more akin to that latter category - I can see a framing where these folks were just trying to get things done, and used a tool at their disposal to accomplish the goal.
The problem, of course, is that in doing so, they created a significant amount of risk for their organization - in this case, the US Government, and specifically the F-18 pilots who were, in many ways, lucky not to have been shot down that day.
We see this same pattern, however, in lower-stakes situations. One version, of course, involves the same use of these encrypted messaging platforms to conduct business outside of regulations. Whether that’s malicious or just incompetent, I’ll leave to your judgment, but we saw regulators fine a slew of financial institutions back in 2023 for this very thing: using Signal, WhatsApp, and other messaging platforms for off-channel communications in violation of federal recordkeeping requirements (which, it would seem, were also violated in this latest Signal story).
So while we’ve got regulatory exposure for these practices, we also see material cyber risk generated by similar employee actions - notably with file-sharing tools.
I’m sure you’ve seen this happen: there’s a file that needs to go outside the organization but is too large for email, so it ends up in someone’s personal Dropbox. We can all see lots of reasons why this is risky, but we can also understand how it happens.
The problem, of course, is that from a data inventory perspective it’s very difficult to know what’s been sent where, and to manage that risk appropriately (like requiring multi-factor authentication, or time-boxing how long the file lives on that external service to a week, etc.).
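To make that concrete, here’s a rough sketch of what time-boxed external sharing can look like - assuming, purely for illustration, a company-controlled AWS S3 bucket; the bucket and file names below are hypothetical, and most corporate file-sharing platforms offer an equivalent expiring-link feature:

import boto3

# Generate a pre-signed download link from a company-controlled bucket
# that expires after one week, so the exposure window is bounded and known.
s3 = boto3.client("s3")
share_url = s3.generate_presigned_url(
    ClientMethod="get_object",
    Params={"Bucket": "corp-external-transfers", "Key": "q1-diligence-package.zip"},
    ExpiresIn=7 * 24 * 3600,  # one week, in seconds
)
print(share_url)  # send this link instead of uploading to a personal Dropbox

The specific tool matters less than the properties: the organization decides how long the link lives, and the transfer leaves an audit trail you can actually inventory.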
In fact, I’ve even worked an incident where the wrong person was emailed because the email client auto-filled the name and the employee didn’t double-check before the data was out the door to the wrong Brian.
So what can we do about this risk?
First of all, I think we need to recognize that the vast majority of insider threat is not malicious, but rather either complicit or unknowing with regard to the risk exposure. Cultures where the security team views everyone as malicious can get toxic very quickly.
That said, we do need to put some guard rails in place to prevent folks from crossing these bright lines when they shouldn’t. This looks like a good set of policies laying out expectations for employees, paired with training to ensure they know where those lines are.
It also looks like least-privilege access controls that limit what any one employee can reach, and ensure there’s a genuine business need behind that access.
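If it helps to picture it, least privilege at its core is just deny-by-default: nobody reaches a data set unless it’s been explicitly granted. Here’s a toy sketch of that idea - the roles and data sets are made up for illustration, not taken from any particular product:

# Deny-by-default access check: a role can reach only the data sets
# explicitly granted to it, so every access maps to a documented need.
ROLE_GRANTS = {
    "accounts-payable": {"vendor-invoices"},
    "sales-ops": {"crm-exports", "pricing-sheets"},
}

def can_access(role: str, data_set: str) -> bool:
    """True only if this role was explicitly granted this data set."""
    return data_set in ROLE_GRANTS.get(role, set())

assert can_access("accounts-payable", "vendor-invoices")
assert not can_access("sales-ops", "vendor-invoices")  # no grant, no access

Real identity platforms are obviously richer than this, but the guard rail is the same: the default answer is no.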
Third, from a technical perspective, we should adopt the notion of “paved paths” - a framing I’m cribbing here from Netflix, which boils down to making the easy, default path for employees the secure path as well.
This might mean providing sanctioned messaging capabilities, file-sharing capabilities, or coverage for any of these other potential areas of data loss, leakage, or exposure - which can be hard for some more traditional IT departments. I see this often with the idea of starting a BYOD program so that you can ensure corporate data is removed from personal devices when employees leave.
But the truth of the matter is that if you don’t offer it, employees will still seek it out.
People remain one of the hardest parts of an organization to secure, and we’ve just gotten another stark reminder of that.
Fundraising
From a fundraising perspective, a very big week with almost $35B in newly committed capital, the majority of that (€21.5B) coming from EQT, which raised its sixth infrastructure fund.
We also saw Oakley Capital of London raise €4.5B for its sixth flagship PE fund, so the European market continues to be quite active.
Total fundraising for Q1 now comes in at just over $200B, which will have to get put to use at some point.
The CoreWeave IPO, which we spoke about last week, seems to be off to a bit of a bumpy start: after reducing the size of its initial offering, the stock traded flat on the first day and is now down about 10% as I record this on Monday.
Perhaps the private markets aren’t such a bad place to be, after all.
A reminder that you can find links to all the articles we covered below, find back issues of these videos and the written transcripts at intentionalcyber.com, and we’ll see you next week for another edition of the Intentional Brief.
Links
https://www.cnn.com/2023/08/08/business/regulator-wall-street-fine-whatsapp/index.html
https://systemweakness.com/security-like-netflix-1dcde455e8cd