Part 1 of this series revisited Pfc. Manning’s activities at Forward Operating Base Hammer, focusing on indicators of how much oversight and risk management actually went into protecting Secret and Top Secret data.
In Part 2, we will look more closely at the information security environment at FOB Hammer and the DoD’s initial response to the leaks. We will begin to see how incompetent the entire chain of command, from the leadership at FOB Hammer all the way up to the Secretary of Defense, was at protecting high-value information.
Elephant 2 after the fold.
Elephant 2
In theory, the Government and the DoD take information security seriously. Appendix III to OMB Circular No. A-130 is the prime directive for information security in the federal government and the Armed Forces. It defines “adequate security” as:
Security commensurate with the risk and magnitude of harm resulting from the loss, misuse or unauthorized access to or modification of information.
On the question of who is responsible for security on classified networks, it says:
Policy and procedural requirements for the security of national security systems (telecommunications and information systems that contain classified information or that support those critical national security missions (44 U.S.C. 3502(9) and 10 U.S.C. 2315)) is assigned to the Department of Defense pursuant to Presidential directive.
From that directive flow years and volumes of studies and publications on how to secure information. The Army did not provide “adequate security” in this case, but not because it didn’t know better or lacked the authorization.
Since the early ’80s, beginning with the “Rainbow Series,” the DoD has developed and published information security guidelines, policies and procedures. From the “Brown Book” (1989):
Systems that are used to process classified or other sensitive information must be designed to guarantee correct and accurate interpretation of the security policy and must not distort the intent of that policy. Assurance must be provided that correct implementation and operation of the policy exists throughout the system’s life cycle.
The DoD has also been an active partner with the Computer Security Resource Center at the National Institute of Standards and Technology in developing the Special Publications (800 Series) information security guidelines. (These guidelines are best practices for securing IT systems.) The DoD has been addressing the insider threat problem for some time; for example, see here (pdf), here (pdf), and here (pdf). It has developed and implemented policies on how facilities that handle highly confidential information should be operated and what security controls should be in place on the IT systems and components involved . . . Having said that . . .
From the Tech Broiler blog entry titled “Wikileaks: How our Government IT Failed Us:”
However, . . . Manning would never have been able to transfer that data if the Army had been following the same standard IT practices that it follows stateside and on military bases and other government installations.
So what nailed us was simple. We allowed this guy to walk into work with writeable DVD media and gave him laptops with functional read/writeable DVD drives and possibly even USB ports, at an Iraq field operations center in a theater of war, when the standing policy on military bases and in other government installations (such as at US Central Command) is to prohibit personnel from bringing USB devices, Smartphones, iPods and CDs onsite.
That’s just plain stupid.
Well said! Yes, it is. And it clearly demonstrates breach of duty and negligence on the part of Manning’s chain of command.
In the face of standing policy, it is hard to envision an environment in which a breach would be easier to perpetrate than this one. The excuse offered to the press was that the information on those networks had to be available to forward operational areas. Somehow or other, that is supposed to justify ignoring policy.
A Computerworld article about the scramble within the DoD to figure out what happened captured these gems from Secretary of Defense Gates:
Because of the leak, Gates said that the Department of Defense will review whether the approach should be changed. “Or do we continue to take the risk,” said Gates.
Gates said there are “technological solutions” to the dilemma, but they aren’t immediately available.
This is a red herring. It is also total bull and a blatant display of how clueless the leadership of the Executive Branch and the Department of Defense really is.
Controls that would have prevented this from occurring are common sense, not high tech, and readily available to anyone at a Forward Operating Base. A screwdriver to remove the DVD drive and some epoxy to plug the USB/Firewire/XXX ports would have done it. And for the situations in which an analyst legitimately needs to move data between classified and unclassified networks (and removable media are required), there are well-defined policies, processes and procedures for doing so without jeopardizing security.
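Even the software half of that control is trivial. As a hedged illustration, and only an illustration (the script below, its paths, and the idea of running it as a periodic audit are my assumptions, not a description of how the FOB Hammer terminals were actually configured), here is a minimal Python sketch that flags removable block devices on a Linux box by reading the kernel’s sysfs “removable” flag:

```python
# Illustrative sketch: audit a Linux terminal for removable block devices
# (USB drives, writable optical media, etc.) via the sysfs "removable" flag.
# Not a description of any actual Army configuration.
from pathlib import Path

def removable_devices() -> list[str]:
    """Return the names of block devices the kernel flags as removable."""
    found = []
    for dev in Path("/sys/block").iterdir():
        flag = dev / "removable"
        if flag.exists() and flag.read_text().strip() == "1":
            found.append(dev.name)
    return found

if __name__ == "__main__":
    offenders = removable_devices()
    if offenders:
        print("Removable media present (policy violation):", offenders)
    else:
        print("No removable media detected.")
```

Run from cron, that is an afternoon’s work, not an unavailable “technological solution.”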
It is not clear from published reports how the “secure room” that housed the SIPRNet and JWICS “terminals” was configured, but in most situations it is a stand-alone building or a sealed-off area in a building (or ship) guarded by at least one armed guard, usually more. Access to these areas is supposed to be strictly limited to those with a need to be there, and the rule, enforced by search, is “nothing in, nothing out.” Basically, you can’t take anything in with you that you could use to smuggle anything out, and if you come out with anything that could be used for smuggling, you get to go to the brig while everything you had on you is taken to pieces.
Manning noted that the SIPRNet and JWICS terminals were “air gapped.” This is tech-speak for saying that neither SIPRNet nor JWICS was directly connected to any other network. In other words, they were completely separate networks, and the only way to access resources on either one was from a terminal directly connected to that specific network.
That the terminals were air-gapped implies that the systems were built around the Bell-LaPadula model, which enforces access control by attaching security labels to objects and granting access based on the clearance level of the entity requesting it. Under this model, there are very strict rules about how data is handled, “where” it can go, and how it is transferred.
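For the curious, the heart of the model fits in a few lines. Here is a minimal Python sketch of the two core Bell-LaPadula properties; the level names and their ordering are illustrative, not the actual configuration of SIPRNet or JWICS:

```python
# Bell-LaPadula in miniature. The labels below are illustrative only.
LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def can_read(subject_clearance: str, object_label: str) -> bool:
    """Simple security property: no read up."""
    return LEVELS[subject_clearance] >= LEVELS[object_label]

def can_write(subject_clearance: str, object_label: str) -> bool:
    """Star property: no write down (data must not flow to a lower level)."""
    return LEVELS[subject_clearance] <= LEVELS[object_label]

# An analyst cleared to SECRET can read SECRET material...
assert can_read("SECRET", "SECRET")
# ...cannot read TOP SECRET ("no read up")...
assert not can_read("SECRET", "TOP SECRET")
# ...and cannot write SECRET data to an UNCLASSIFIED object ("no write down").
assert not can_write("SECRET", "UNCLASSIFIED")
```

The “no write down” rule is exactly the rule a stack of burned CDs walking out the door violates.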
There are occasions when it is necessary to move data from the highly classified side (in this case, SIPRNet or JWICS) to a less secure network (for instance, the local operations network) or even to hard copy. This transfer is only supposed to happen in the “secure room,” and customarily “under four eyes.” (The idea behind the “four-eyes rule” is that any “funny business” would require collusion between the parties involved.) In most cases, removable media used in the “secure room” are stored in a locked container, such as a safe, except while actually in use. The storage and decommissioning of media used for these transfers are closely monitored and managed, and nothing leaves the room without having been decommissioned. These site security controls aren’t rocket science or Mission Impossible stuff.
It’s not that hard:
Rule 1: As a general rule: Nothing in, nothing out.
Rule 2: Anything that goes in stays in until it’s decommissioned, i.e., crushed, shredded, burned, wiped, disabled . . . whatever. And usually there is special handling of the remains once they come out.
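To make the two rules concrete, here is a toy Python sketch of a media custody log that enforces them; the class and names are hypothetical, but the logic mirrors the four-eyes and decommissioning process described above:

```python
# Toy custody log for secure-room media. Names are hypothetical.
class MediaCustodyLog:
    def __init__(self):
        self._in_room = {}  # media_id -> (custodian, witness)

    def check_in(self, media_id: str, custodian: str, witness: str) -> None:
        """Rule 1: media enters only under the four-eyes rule."""
        if custodian == witness:
            raise ValueError("four-eyes rule: custodian and witness must differ")
        self._in_room[media_id] = (custodian, witness)

    def decommission(self, media_id: str, method: str) -> None:
        """Rule 2: media leaves only as decommissioned remains."""
        if media_id not in self._in_room:
            raise KeyError(f"{media_id} was never checked in")
        del self._in_room[media_id]
        print(f"{media_id} decommissioned by {method}; remains released for special handling")

log = MediaCustodyLog()
log.check_in("DVD-0042", custodian="analyst_a", witness="security_officer_b")
log.decommission("DVD-0042", method="shredding")
```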
This is done every day in the real world. These are not “technological solutions” that “aren’t immediately available.” This is Commonsense Security Measures 101: Effective Uses of Duct Tape, Chewing Gum and Baling Wire. They just weren’t being practiced.
To make matters worse, there was a lapse in personnel management. In the real world, when someone is demoted, fired, or even resigns, two things happen simultaneously: in the meeting where the change of status occurs, a company officer collects the tokens, badges, or whatever else the person uses to gain access to the organization’s operational systems, while security removes all company documents from the person’s workstation and disables all of the person’s logins. Even mom-and-pop stores take the cash register key back from an employee before he leaves their sight. Manning, by contrast, was allowed full, unattended access to Top Secret systems until the end. He was disgruntled, he had been demoted, and he was being shipped out. And nobody thought to take the keys to the cash register from him. Danger, Will Robinson!
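For the record, “taking the keys back” is not exotic in software terms either. Here is a hedged sketch; the account store, network names, and function below are hypothetical stand-ins for a real directory and badge system:

```python
# Hypothetical offboarding step: revoke every credential in one pass,
# before the person leaves the room. Account and network names are made up.
def offboard(user: str, accounts: dict[str, set[str]], badges: set[str]) -> dict:
    """Revoke all logins and physical tokens for `user` at once."""
    revoked_logins = accounts.pop(user, set())
    badge_pulled = user in badges
    badges.discard(user)
    # In a real system these would be calls to the directory service,
    # badge system, and network access control, executed together.
    return {"logins_revoked": sorted(revoked_logins), "badge_pulled": badge_pulled}

accounts = {"demoted_analyst": {"siprnet", "jwics", "local_ops"}}
badges = {"demoted_analyst"}
print(offboard("demoted_analyst", accounts, badges))
# {'logins_revoked': ['jwics', 'local_ops', 'siprnet'], 'badge_pulled': True}
```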
Manning was a classic example of an insider threat. The concept is not new; it predates the computer. The information security world – inside and outside the DoD – has been addressing this problem since at least the mid-1980s. (See the documents referenced above for examples of DoD involvement. One of them even has the title “DoD Insider Threat Mitigation.”) Outside of the DoD, several organizations devote considerable resources to the subject. Bin-Goo-hoo the phrase “insider threat” and you will have weeks of interesting reading. A great place to start is the Insider Threat page at US Security Awareness. It’s not a tough concept to master, it’s easy to recognize, and managing the risk does not require rocket science.
In an interview with BankInfoSecurity.com, well-known information security pioneer Marcus Ranum had this to say when asked about lessons learned from WikiLeaks:
Well, to me the biggest lesson is that the people who are inside your organization are the ones who can really hurt you, because they know where the good stuff is. That is a serious problem. So, to me that is the big takeaway. Now that is not [news]. I think I’ve been banging that drum for 20 something years, and security practitioners before me were banging it for 20 years before that. So that is nothing new.
A 2006 report commissioned by the DoD, titled “Comparing Insider IT Sabotage and Espionage: A Model-Based Analysis,” listed six major observations:
Observation #1: Most saboteurs and spies had common personal predispositions that contributed to their risk of committing malicious acts.
Observation #2: In most cases, stressful events, including organizational sanctions, contributed to the likelihood of insider IT sabotage and espionage.
Observation #3: Concerning behaviors were often observable before and during insider IT sabotage and espionage.
Observation #4: Technical actions by many insiders could have alerted the organization to planned or ongoing malicious acts.
Observation #5: In many cases, organizations ignored or failed to detect rule violations.
Observation #6: Lack of physical and electronic access controls facilitated both IT sabotage and espionage.
Just for grins and giggles, let’s compare Manning and his situation at FOB Hammer with the list. Let’s see . . .
Observation #1: check
Observation #2: check
Observation #3: check
Observation #4: check
Observation #5: check
Observation #6: check
Wow! Six out of six! Congratulations! A perfect score!
D’oh! Wait, the DoD gets 10 points extra credit with Manning! From an article in the Washington Post titled “Mental health specialist recommended WikiLeaks suspect not be deployed to Iraq”:
Manning’s immediate supervisor, an Army master sergeant, required him to seek mental health counseling after he displayed signs of instability. The master sergeant and an Army major then discussed whether to deploy Manning based on concerns that he was a risk to himself and others. The master sergeant and the unit’s commander, a captain, decided to send him to Iraq because the unit was short of intelligence personnel, because Manning’s behavior had started to improve, and because he seemed receptive to therapy.
Here endeth the Lesson.
Manning was the archetypal insider threat, the final component of the perfect storm. And the whole thing could have been avoided if his access to the data had been cut off when it should have been. Just take the “keys to the T-bird” away. There is not an HR manual in the world that doesn’t have something in it about the actions to take when disciplining a high-risk employee. And in every way imaginable, Manning was high-risk: emotionally unstable, demoted, being sent home to what was most likely going to be a very unpleasant environment, and still holding unfettered access to highly sensitive information. There are no words to describe the degree of negligence shown by not cutting off Manning’s access to SIPRNet and JWICS.
To sum up:
- Site security at FOB Hammer was abysmal to non-existent.
- Manning was the archetypal insider threat.
- No one took any precautions to limit his access to sensitive systems.
- The DoD’s responsibilities with respect to information security are clearly defined.
- The DoD’s responsibilities with respect to information security were completely ignored.
Manning may have been the one who perpetrated the deed, but the responsibility for allowing an environment that enabled him (as opposed to thwarting him) lies directly with his commanding officer and the chain of command. There is no excuse whatsoever for that to have happened. The explanation is criminal negligence and incompetence on the part of Manning’s chain of command.
RIP Elephant 2.
The incompetence exposed at FOB Hammer is nothing compared to that of those who ran SIPRNet and JWICS. Part 3 will explore that and examine some of the implications.
This blog entry was cross-posted to lartwielder at Daily Kos.