Lessons Learned from the Vault 7 CIA Insider Threat Leak
We should thank the Department of Justice for their consideration in wrapping up one of the most (potentially) consequential government insider threat cybersecurity stories in time for us to cover it for this year’s National Insider Threat Awareness Month.
And this one was a doozy.
In case you missed it in July, ex-CIA programmer Joshua Schulte was convicted by a jury of stealing and exposing a massive trove of government-owned cybersecurity tools and secrets, passing nearly 9,000 stolen files to WikiLeaks back in 2017.
According to the reports, “The cache of cyber spy secrets includ[ed] zero-day vulnerabilities in Android, iOS, and Windows, along with known bugs in routers, smart TVs, and smart vehicles that were allegedly being exploited for spycraft by the CIA.”
In the dump passed on to WikiLeaks, known as the Vault 7 leak, was a whole host of source files containing an estimated 91 hacking tools used by some of the most sensitive parts of the Agency’s operations.
Damian Williams, the United States attorney in Manhattan, where the trial was held, hailed the verdict, saying in a statement that Mr. Schulte had been convicted of “one of the most brazen and damaging acts of espionage in American history.”
Having decided to represent himself and forgo a plea deal, Schulte faces up to 80 years in federal prison for his actions. That estimate comes before any ruling on separate charges related to the child abuse materials that were also found in his possession at the time of his arrest.
Examining the Impact of this Insider Threat Attack
The CIA has had traitors before, and will again in the future.
In an interview with Wired following the leak to WikiLeaks back in 2017, another CIA leaker named John Kiriakou told the interviewer, “There are three holy of holies: sources and methods, liaison relationships, and anything having to do with NSA. Never ever talk about stuff like that.”
Schulte crossed that bridge and threw the match.
His was a case of a sworn member of the United States’ intelligence community deciding to burn years’ worth of work and knowingly put active operations at risk, possibly even bringing harm to agents in the field who could be traced back to the use of these hacking tools.
Looking at the repercussions of this incident, we can see a number of impacts that are common across government data leaks, and are acutely felt when it comes to intelligence agencies.
1. Disruption of Operations
We are unlikely to ever learn how many operations were burned in this leak as adversaries learned of the tools that the CIA was using.
Schulte was a member of the Operations Support Branch, which was tasked with building tools for immediate espionage purposes. When someone in the field needed a tool for a specific purpose on a short timeline, they would reach out to this department for help.
It is more than likely that the more sophisticated actors used the information from the leaks to scan for the tools and patch the vulnerabilities that the CIA was exploiting. These included more than a few 0-days that were probably pretty valuable until they weren’t.
2. Putting Agents at Serious Risk
While many of the tools that were burned in this leak were for remote use, more than a few required a human to install the tools on physical devices, closing the air gap. Carrying out these kinds of operations is risky because the agent can be caught red-handed while still at the physical location.
When adversaries of the US heard about these leaks, it is more than likely that they started checking to see if they were impacted. If they found any of these tools in their machines or environments, they may have looked to see which agents may have been responsible for installing them.
In 2017 it was reported that China had killed or imprisoned at least 18 US spies in that country. Espionage is a serious game, and if prospective agents do not feel that the US can protect them, then it makes it increasingly difficult to recruit more spies in the future.
3. Burning Tools and Methods
Along with the 0-days that became less useful N-days, having 91 hacking tools become public could not have been a good day at the office down in Langley. Considering the years of work that had gone down the drain, it also left the Agency with fewer tools at their disposal to carry out their mission.
Beyond the damage to their stockpile and capacity against adversaries, the leak was also a public relations headache. This is mostly because, similar to the Snowden case, it showed how US intelligence agencies were exploiting vulnerabilities in ways that arguably left American users less secure, both because the vulnerabilities were left unpatched and because the tools could be turned against them.
There is an ongoing debate about whether the government should report vulnerabilities to vendors when they find them. While smart people can disagree on where the line falls, it never makes the government look good and can reduce trust in the agencies amongst the public.
A side effect of hacking tools getting burned in a leak like this is that they often trickle out into the civilian world once they are exposed. Similar to how the American EternalBlue exploit was picked up by criminal groups after it became public, later powering the Russian NotPetya attack, it is more than likely that the source code for these tools ended up in malware deployed by criminal hackers once it was exposed.
Beyond the impacts of his leak, Schulte stood out for going against the usual motivations for this kind of treason.
An Uncommon Case of Treason
Interestingly, Schulte represents one of the less common profiles for a malicious insider threat.
Government leakers usually fall into two categories:

1. Financially Motivated Insiders

These are the malicious insiders doing it for the money. Maybe they are selling secrets to the Soviets or criminal groups to support an expensive lifestyle. They might sell personally identifiable information to criminal groups for some cash on the side.
There have also been cases of insiders working with former government officials to help them with their personal businesses.
The connective tissue for these actors is that they all want to get paid.
2. Ideologically Motivated Insiders

While many of these actors may seek out financial compensation for their malicious activity, they also have an ideological or nationalistic drive behind their actions.

Malicious insiders like Edward Snowden, former CIA officer Alexander Yuk Ching Ma, Reality Winner, and many others violated their commitments to secrecy at least in part to serve another country or to oppose the US government.
Every case is different, often in their scope and severity as well as the motivations, but compensation is not the only factor in these instances.
Then there is a third category.
By all accounts, Schulte falls into neither of the categories above. Instead he was allegedly a really smart yet explosive guy who thought that he knew better than everyone else.
According to the reporting, he was unhappy with his work at the Agency and left to work elsewhere. Apparently though, he made sure not to leave totally empty handed.
Indicators of an Insider Threat
Catching an insider threat can be tricky because, as a member of your organization, the insider is supposed to be there.
Unlike an external hacker, they have their own credentials and know where all of the valuable information is stored away. This makes them harder to detect.
However, there are signs that someone may be a risk to your organization’s security.
1. Asking for materials outside of their normal areas of work
Organizations should be segregating sensitive materials and resources so that no single insider can cause too much damage.
Infamous leaker Edward Snowden reportedly had to convince unwitting colleagues to give him their credentials because of segmentation policies that his agency had in place.
Someone asking to “borrow” your creds for some sort of need should be a red flag.
2. Downloading Giant Quantities of Data
Some businesses may appreciate employees taking their work home with them, but government or other similarly sensitive organizations should view this differently.
Since they know where the good stuff is, insiders can target the most valuable data and exfiltrate only the best bits. Or they can pull a Chelsea Manning (who downloaded nearly 500,000 documents) and grab the mother lode.
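As a minimal sketch of catching the bulk-download pattern described above, the snippet below totals per-user download volume from a hypothetical access log and flags anyone over a policy threshold. All names and the threshold value are illustrative assumptions, not recommendations.

```python
from collections import defaultdict

# Hypothetical daily access-log entries: (user, bytes_downloaded).
events = [
    ("alice", 120_000_000),
    ("bob", 4_000_000),
    ("alice", 380_000_000),
    ("carol", 9_000_000),
]

# Example policy value only; a real threshold would come from your own baseline.
THRESHOLD_BYTES = 250_000_000

def flag_bulk_downloaders(events, threshold=THRESHOLD_BYTES):
    # Sum each user's downloads for the day, then flag users over the threshold.
    totals = defaultdict(int)
    for user, nbytes in events:
        totals[user] += nbytes
    return sorted(user for user, total in totals.items() if total > threshold)

print(flag_bulk_downloaders(events))  # alice's 500 MB total trips the threshold
```

A fixed threshold is the simplest possible rule; in practice you would tune it per role or per user, which is exactly the gap behavior analytics tools try to close.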
3. Anger and Erratic Behavior
A good sign that an employee may try to do something that harms the organization is if they are acting out in anger and outside the bounds of their normal behavior.
Manning had a violent incident that reportedly had some folks considering revoking her access to sensitive materials. Schulte had developed a reputation for explosive anger during his time at the Agency, earning him nicknames like Voldemort and the Nuclear Option. This anger and erratic behavior should have been a sign that he was a risky employee.
It is very much in keeping with character for an ideological or disgruntled malicious insider to act out before taking their big destructive action.
Given these indicators, organizations need to take steps to pick up on them before they have an incident on their hands. Using the right tools to deter, detect, and deduce can play a big role in helping to mitigate their risks.
3 Tips for Deterring an Insider Threat
The Department of Homeland Security’s National Cybersecurity and Communications Integration Center offers a number of tips and technologies for detecting potential insider threats and mitigating the damage from an incident.
Here are a few that should be at the top of your list.
1. Let Employees Know You Use Data Loss Prevention Tools
In their guide, they say that organizations should, “Announce the use of policies that monitor events like unusual network traffic spikes, volume of USB/mobile storage use, volume of off-hour printing activities and inappropriate use of encryption.”
Letting everyone know that your organization is using technologies to record sessions, alert on improper actions, and otherwise monitor misuse of data can help to deter your people from taking risky, malicious actions in the first place.
2. Monitor User Behavior Analytics
Look for uncharacteristic behavior that may indicate that an employee is acting inappropriately or may be drifting towards risky behavior.
User Behavior Analytics allows you to harness big data, analyze it for trends, and draw valuable conclusions that can alert you to risks. By automating this process, you can extend your capacity for reviewing more data, all the time, and reduce the workload on your team.
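To illustrate the kind of per-user baselining that behavior analytics relies on, here is a hedged sketch that flags a day's download volume when it sits far above a user's own historical mean. The users, numbers, and z-score cutoff are all hypothetical; real UBA products use far richer models.

```python
import statistics

# Hypothetical per-user history of daily download volumes (MB).
history = {
    "alice": [40, 55, 48, 60, 52, 45, 50],
    "bob": [10, 12, 9, 11, 10, 13, 12],
}

def is_anomalous(user, today_mb, history, z_cutoff=3.0):
    """Flag today's volume if it is more than z_cutoff standard
    deviations above this user's own historical mean."""
    baseline = history[user]
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    if stdev == 0:
        return today_mb > mean
    return (today_mb - mean) / stdev > z_cutoff

print(is_anomalous("alice", 58, history))  # within alice's normal range -> False
print(is_anomalous("bob", 900, history))   # a huge spike for bob -> True
```

The key design point is that the same absolute volume can be normal for one user and alarming for another, which is why comparing against a per-user baseline beats a single global threshold.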
3. Monitor Data Transfers
Along with alerts for transfers of data to external drives or via other digital means, you should use technologies that allow you to set rules for what can be accessed after hours, block specific attachment types for unauthorized users, and even prevent the accidental leaks that can affect any team.
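Rules like the ones just described can be expressed as a simple policy check. The sketch below is a toy evaluator under assumed policy values: the business hours, blocked extensions, and allowlisted service account are all invented for illustration.

```python
from datetime import datetime

# Hypothetical policy: hours, extensions, and account names are illustrative only.
BUSINESS_HOURS = range(8, 18)                 # 08:00-17:59 local time
BLOCKED_ATTACHMENTS = {".db", ".pst", ".bak"}  # bulk-data file types to stop
AFTER_HOURS_ALLOWLIST = {"oncall-svc"}         # accounts exempt from the hours rule

def evaluate_transfer(user, filename, when):
    """Return a list of policy violations for a proposed outbound transfer."""
    violations = []
    if when.hour not in BUSINESS_HOURS and user not in AFTER_HOURS_ALLOWLIST:
        violations.append("after-hours transfer")
    if any(filename.endswith(ext) for ext in BLOCKED_ATTACHMENTS):
        violations.append("blocked attachment type")
    return violations

print(evaluate_transfer("jdoe", "customers.db", datetime(2022, 9, 14, 23, 5)))
# -> ['after-hours transfer', 'blocked attachment type']
```

In a real deployment these checks would run inline in a DLP gateway or endpoint agent, blocking or alerting rather than just returning a list.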
Listen and Provide an Outlet to Help Prevent a Good Employee from Turning Bad
One point that stood out in the DHS guide was that, beyond the technology, departments can help to prevent their people from turning on them if they, “Provide avenues for employees to vent concerns and frustrations to aid in mitigating the insider threat motivated by disgruntlement.”
It sounds simple and it definitely will not prevent a determined insider in every instance, but it is amazing how much allowing a person to feel heard and appreciated can help to keep them in their job and loyal to their team.
So as we mark another Insider Threat Awareness Month, try to find the right mix of solutions and approaches that will help to keep your organization safe from an insider threat for another year.