So far in our story, we have focused on the role of the Department of Defense in facilitating Bradley Manning’s opportunity to acquire and export so much classified information. In Part 4, we will shift focus to the Department of State and meet the matriarch of the herd.
In many ways, the plot doesn’t change, but in other ways, the only way to describe what went on at State is that it was a leviticusly deuteronomous Charlie Foxtrot.
Way past Epic Fail . . .
The tale picks up after the break . . .
There are (at least) eight ways in which State screwed the pooch on this one. The last seven follow from the first:
- There was no risk assessment done. (By this I mean a real, grown-up risk assessment as described in NIST Special Publication 800-30, “Risk Management Guide for Information Technology Systems.” (pdf))
- There were no more network security controls in place on the State side of the network or on the database than there were on the Defense side.
- The database was not designed with security in mind – there were no security controls in place.
- There was a total absence of data management and access control.
- There was no data classification/segregation process in place.
- There was a mix of data types/classifications in the database.
- Users used the database inappropriately.
- Worst of all, there was no oversight of the system through its lifecycle.
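The risk assessment named in the first bullet is not exotic. NIST SP 800-30’s example risk-determination matrix rates threat likelihood at 1.0/0.5/0.1 and impact magnitude at 100/50/10 and multiplies them. Here is a toy sketch of that rating in Python; the threat list and its ratings are hypothetical illustrations, not anything drawn from the actual system:

```python
# Toy sketch of the likelihood-times-impact rating at the core of a
# NIST SP 800-30 risk assessment. Level values follow the example
# matrix in the publication; the threats below are made up.

LIKELIHOOD = {"low": 0.1, "medium": 0.5, "high": 1.0}
IMPACT = {"low": 10, "medium": 50, "high": 100}

def risk_score(likelihood, impact):
    """Risk = likelihood rating x impact magnitude."""
    return LIKELIHOOD[likelihood] * IMPACT[impact]

# Hypothetical threat scenarios: (description, likelihood, impact)
threats = [
    ("insider copies data in bulk", "medium", "high"),
    ("outsider breaches network",   "low",    "high"),
    ("user mislabels a cable",      "high",   "medium"),
]

# Rank the threats so the highest risks get mitigated first.
for name, likelihood, impact in sorted(
        threats, key=lambda t: -risk_score(t[1], t[2])):
    print(f"{risk_score(likelihood, impact):5.1f}  {name}")
```

The point isn’t the arithmetic; it’s that the exercise forces someone to enumerate the threats before the system goes live, which is exactly what never happened here.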
An article in the Washington Post titled “WikiLeaks cable dump reveals flaws of State Department’s information-sharing tool” is a good starting point for this discussion. As I go through the article, I’m going to be calling attention to many no-nos, sometimes in a harshly critical way. But, before I begin, I want to say that this was a case of many well-intentioned people trying to do a good thing. The issue I have is not with them or their intent. It is with the lack of guidance and leadership that would have allowed them to accomplish their goal in such a way that it could be accessible without being vulnerable. The people who were asked to put this thing together were way out of their league . . .
Here goes . . . The second paragraph of the article introduces the source of the cables that Manning eventually passed to WikiLeaks:
“It had a bureaucratic name, Net-Centric Diplomacy, and served an important mission: the rapid sharing of information that could help uncover threats against the United States. But like many bureaucratic inventions, it expanded beyond what its creators had imagined. It also contained risks that no one foresaw.” (Emphasis mine)
No one foresaw the risks because there was no formal risk assessment done before the database was designed and implemented. No one asked the tough questions before they started doing . . .
Note carefully the observation that it “expanded beyond what its creators had imagined.” What this means is that, basically, a mashup of “stuff” was commissioned, put into play and then left to drift as it may.
This is in contrast to systems that are built and managed by professionals. In the grown-up world, there exists this notion of a System Development Life Cycle (SDLC) . . . That is, that there are various phases through which a system moves from the time it is first conceived until it is discontinued, and each phase is actively managed. Part of the planning process involves planning for the growth and evolution of the application as the user base matures and the use of the application evolves. Another part of the planning involves how to respond to situations that arise that are not in the anticipated scenarios.
Amateurish does not begin to describe the implementation of the NCD database.
One of the NIST 800-series publications is titled “Security Considerations in the System Development Life Cycle.” (pdf) This document offers advice and best practices on how to manage the security requirements of a system as it progresses through its active life. Nowhere in the document does it say, “Bring it online and forget about it.” From where I sit, on the reader’s side of the Washington Post article, the level of sophistication I see in the implementation of the NCD database is about what one would expect from a final project in a first-semester programming course.
Partly because of its design but also because of confusion among its users, the database became an inadvertent repository for a vast array of State Department cables, including records of the U.S. government’s most sensitive discussions with foreign leaders and diplomats. Unfortunately for the department, the system lacked features to detect the unauthorized downloading by Pentagon employees and others of massive amounts of data, according to State Department officials and information-security experts.
Wow. Let’s see how many things we can find wrong in those two sentences:
- poor database design
- confusion among its users (about how it was supposed to be used and what was and what was not supposed to be stored in the database)
- included most sensitive contents
- lacked features to detect unauthorized downloading of massive amounts of data
Jesus, Mary and Joseph! We’re not talking about recipes to go into The Official Foggy Bottom Cookbook here. This is Real Important Stuff that could get people killed or keep people from being killed! Jesus!
The article continues . . .
“It wasn’t clear what was to be shared or not shared,” the official said. “So you end up with a cable in the database that contains embarrassing stuff about [German Chancellor Angela] Merkel. Is that the kind of stuff that a war fighter really needs to see?”
No, not really. We’ll address data classification in more detail in Part 5 when we talk about risk management.
“A few State Department officials expressed early concerns about unauthorized access to the database, but these worries mostly involved threats to individual privacy, department officials said. In practice, agency officials relied on the end-users of the data – mostly military and intelligence personnel – to guard against abuse.”
Really? Like maybe there’s a banner at the top of each page that says: “We have no idea who the hell you are or whether or not you have any business accessing this information, but we are relying on you not to abuse your privilege or use this data in any way that is inappropriate.”? Right! Pull my finger!
There is an old, honored adage in the risk management business. “Guard against capability, not intent.” Envision a person walking down the street toward you with a hand grenade in his hand. The spoon is still in place but the pin is missing. You may know this person. He/she may even be your sibling. They want you to take the grenade from them. They promise they will be very, very careful and be sure not to turn loose of the grenade until you tell them that you have it fully under control.
You have two options: one is to take the grenade from them and then take it over to the bomb disposal unit’s explosion trap. The other is for you to tell him/her to take it over there and dump it in the trap. Now, you may absolutely, completely, totally and unreservedly trust your sibling that they will do what they say they will do. And you may be strongly inclined to take the grenade from them. But what if, just outside your reach, they stumble and fall and drop the grenade? Security is never about trust. It is always about capability. When it comes to information security, all other rules come after Murphy’s Law. If you can’t count on anything else, you can count on that . . .
The only safe assumption one can make when it comes to security-related issues is that no one can be trusted. And not because all people are bad or have ulterior motives. People are human, and they make mistakes. They can do things that have tremendously negative consequences not because they’re evil, but simply because they didn’t know any better . . . bad things can be done by good people . . . they make a misstep and fall down, or they didn’t hear something they needed to hear . . . whatever. It’s not a case of don’t trust people because they’re bad; it’s a case of expect people to make mistakes because they’re human and humans make mistakes. And that’s just the people who have no ulterior motive . . . Then there are those who intentionally take advantage of weaknesses in the system . . .
The department was not equipped to assign individual passwords or perform independent scrutiny over the hundreds of thousands of users authorized by the Pentagon to use the database, said Kennedy, the undersecretary of state.
<sarcasm>But that’s OK. It really wasn’t that important. And there were no other access control mechanisms available either.</sarcasm> Idiots! “Gee, there are going to be hundreds of thousands of people accessing this database of Secret, Top Secret and Top Secret NOFORN cables . . . I just don’t know how we can assign passwords to all those people, so we’ll just skip that and let anyone look at anything . . .” Jesus! Does this guy still have his job!?!?!?!?! This is a perfect indicator of how clueless these people are. More on this later.
“It is the responsibility of the receiving agency to ensure that the information is handled, stored and processed in accordance with U.S. government procedures,” he said.
<more sarcasm>I’m sure that each and every person who had access to that database received in-depth training on where the data came from and what the US government procedures were for the handling, storage and processing of such data . . . Along with the closing: “Now, guys, remember how important this is . . . pretty, pretty please be careful. We won’t be monitoring you, so you’re on the honor system to do the right thing.” Really? Five hundred thousand people with access to this database and you really trust every one of them to be completely perfect . . . and I don’t mean perfectly honest . . . I mean perfect as in never making a mistake, never doing something they didn’t mean to do or didn’t know was the wrong thing to do</more sarcasm> . . . EVERYBODY MAKES MISTAKES! How can one possibly believe that out of 500,000 people, no one is going to make a mistake? And that out of 500,000 people, not a single one will take advantage of their access?
OK, the DoD gets the rap for nonexistent site security and nonexistent network security, but State owns the data protection problem. As evidenced in the excerpts from the Washington Post article, they didn’t take it very seriously. This wasn’t a database of fantasy football teams we’re talking about . . . If our diplomats (and other attached entities) were doing their jobs, those cables contained invaluable data and observations and very frank discussions about US relations with other countries and personalities . . . The sort of things that in the business world would be classed as competitive intelligence. In other words, it would be better if the competition didn’t know what we knew or what was being said . . . But it wasn’t important enough to State to put effective protective measures in place.
All of the observations and criticisms of the degree of security on the DoD classified network apply equally to State’s network and the NCD. If they had had adequate security controls and protocols in place, they would have caught Manning red-handed.
One of the Swampland blog entries about the events in Washington as the leaks first became public had talked about what happened when Secretary of State Clinton informed the White House of what had happened:
the White House came back with a question: “What’s our corrective action?”
Clinton’s undersecretary for management, Patrick Kennedy, had a simple suggestion: pull the plug on SIPRNet . . .
If State had had a functioning Information Security function in place:
- If the appropriate controls had been in place, they wouldn’t have had this problem in the first place, but, in the event it did happen,
- They would/should have had an incident response plan in place, and
- Instead of having to run back and call a panic meeting, Secretary Clinton would have been able to respond with a description of the incident response process that was already in place.
Now, let’s shift focus a bit.
Recent attention has been directed at the lack of access control and the assumption that anyone who could physically get to the database could get to anything contained in the database. Depending upon what was in the database and who could get to it, that may not be a problem. Apparently, there was a problem, though. More than one, as a matter of fact.
The first problems surfaced with the early observation that, basically, nobody had a clue what information was being dumped in there or who was getting what out . . . In most information security contexts, that would be a problem.
Adding insult to injury, there was no data classification or data management in place on the affected data sets. In the rush to make data available, fundamental information security principles and policies were ignored.
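To make concrete what “data classification in place” would even mean: each record carries a classification label, and a query returns only the records the requester’s clearance dominates. A minimal sketch follows; the labels, records, and clearance model here are simplified illustrations, not the government’s actual marking scheme:

```python
# Minimal sketch of classification-based filtering -- the kind of
# control NCD lacked. The cable records below are hypothetical.

# Ordered from least to most restrictive.
CLASSIFICATION_ORDER = ["UNCLASSIFIED", "CONFIDENTIAL", "SECRET", "TOP SECRET"]

def rank(level):
    """Numeric rank of a classification level."""
    return CLASSIFICATION_ORDER.index(level)

def visible_to(records, clearance):
    """Return only the records the requester's clearance dominates."""
    return [r for r in records if rank(r["classification"]) <= rank(clearance)]

cables = [
    {"id": 1, "classification": "UNCLASSIFIED"},
    {"id": 2, "classification": "SECRET"},
    {"id": 3, "classification": "TOP SECRET"},
]

# A SECRET-cleared user sees cables 1 and 2, never cable 3.
print([c["id"] for c in visible_to(cables, "SECRET")])  # → [1, 2]
```

Ten lines of logic. But the filter is only as good as the labels, and at NCD nobody was labeling anything, which is exactly the point.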
Undersecretary of State Kennedy bemoaned the fact that “The department was not equipped to assign individual passwords or perform independent scrutiny over the hundreds of thousands of users authorized by the Pentagon to use the database.” That statement shines a spotlight on how little the State Department understands about information security.
There is a critical distinction to be made here, but, as in this case, it often goes unmade. The distinction is between authentication (are you really who you say you are?) and authorization (do you have permission to do something?). The problem with the NCD was the lack of access control – a means of determining whether an entity is authorized to access particular pieces of information.
Consider going to the movies. If I use a credit card to purchase my ticket, the person at the ticket window might want to see my driver’s license to verify that I am the legitimate holder of the credit card. That process is called authentication. A password is an authentication token that I present (like the driver’s license) to demonstrate that I am who I represent myself to be. Authorization is a completely separate function. As far as the ticket-taker at the movie theater is concerned, I’m authorized to come in because I have a ticket that says I can. The ticket-taker couldn’t care less who I am . . . only whether or not I have a ticket to a movie that’s currently playing. I might have a valid driver’s license, but no ticket, no entry. But I had to identify (authenticate) myself to the ticket seller in order to pay for the ticket (authorization token). So when it comes to data access, the only question of interest to the “authorizer” is whether the requester has permission to access the data.
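In code, the two functions are cleanly separable, which is why conflating them is inexcusable. A sketch, with hypothetical users, credentials, and permissions:

```python
# Sketch of the authentication/authorization split described above.
# The user, password, and ACL entries are hypothetical placeholders.

import hashlib

# Authentication: is the requester who they claim to be?
# (Stored as a hash; never store plaintext passwords.)
CREDENTIALS = {"analyst1": hashlib.sha256(b"hunter2").hexdigest()}

def authenticate(user, password):
    return CREDENTIALS.get(user) == hashlib.sha256(password.encode()).hexdigest()

# Authorization: separately, does that identity have this permission?
ACL = {"analyst1": {"read:diplomatic_cables"}}

def authorize(user, permission):
    return permission in ACL.get(user, set())

# The two checks are independent: a valid login does not imply access.
assert authenticate("analyst1", "hunter2")
assert authorize("analyst1", "read:diplomatic_cables")
assert not authorize("analyst1", "delete:diplomatic_cables")
```

A driver’s license gets you nothing at the ticket door, and a ticket proves nothing about who you are. Kennedy’s “we couldn’t assign passwords” complaint answers the first question while the leak happened at the second.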
So State confused an access control problem with an authentication problem. Big, big, big, humongous difference. It’s way beyond the scope of this piece to go into detail about access control mechanisms . . . suffice it to say that there are several that would have worked well, made State’s job much easier, and which, by the way, would have provided yet another trail back to Manning . . .
At this point one may be tempted to observe that it really didn’t matter anyway because there was no data classification, hence, no need to have a mechanism in place to determine who got to see what. The observation is true, but all the more damning in its veracity. We will get back to this issue in Part 5.
State was the final component of Manning’s perfect storm. Not only were his comings and goings as invisible to State as they were to the DoD, but State had no idea that he was methodically copying everything he could get his hands on in the database. There was no database activity monitoring, no access control, no connection monitoring, no traffic monitoring . . . The lights were on, the doors were open, but there was nobody home . . .
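Database activity monitoring doesn’t have to be sophisticated to catch someone hoovering up a quarter of a million records. The crudest possible version – count reads per user and flag the outliers – would have lit up like a Christmas tree. A sketch, with a made-up access log and an arbitrary threshold:

```python
# Crude sketch of database activity monitoring: count read events per
# user and flag anyone above a bulk threshold. The log format and the
# threshold value are invented for illustration.

from collections import Counter

BULK_THRESHOLD = 100  # reads per window considered suspicious (arbitrary)

def flag_bulk_readers(access_log):
    """access_log: iterable of (user, record_id) read events.
    Returns a sorted list of users whose read count exceeds the threshold."""
    reads = Counter(user for user, _ in access_log)
    return sorted(user for user, n in reads.items() if n > BULK_THRESHOLD)

# Hypothetical log: one user pulls 250 records, another pulls 5.
log = ([("heavy_user", i) for i in range(250)]
       + [("light_user", i) for i in range(5)])

print(flag_bulk_readers(log))  # → ['heavy_user']
```

Even this toy would have surfaced a bulk downloader within one reporting window. Nothing of the kind was running.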
And, irony of ironies, the State Department’s Deputy Chief Information Officer was one of the recipients of the 2010 National Cyber Security Leadership Award for his work on “’continuous monitoring’ of deployed computer systems.”
Finally, just to drive a stake in this elephant’s heart, let’s revisit one of Deputy Secretary of State Kennedy’s comments to the Washington Post.
“It is the responsibility of the receiving agency to ensure that the information is handled, stored and processed in accordance with U.S. government procedures,” he [Kennedy] said.
This is what’s known as passing the buck. From Appendix III to OMB Circular A-130 through the entire NIST Special Publication 800 series, the Department of State seems to have ignored or violated just about every information security directive and policy extant, starting with the requirement to provide “adequate security.” It is the responsibility of every government agency to implement and maintain a risk management process. State either doesn’t have one or doesn’t pay any attention to it. Or it didn’t feel that the cables were important enough to warrant protection . . .
From the tone of the Washington Post article and all the bluster and moaning from Secretary of State Clinton, it seems that State wants to portray itself as the aggrieved party. Well, Hillary, you’ve led with your chin on this. If you and your staff had half a clue, this never would have happened. Darlin’, you were pwned and you didn’t even know it until you read it in the papers. There is no excuse whatsoever for this happening. There is an explanation, though. It’s called lack of governance, lack of oversight, lack of risk management and lack of management. You know, . . . incompetence.
This does it for Part 4. Part 5 will revisit Manning’s perfect storm and show that if existing directives and policies had been implemented and enforced, none of the components would have existed. It will show that this whole debacle resulted from a lack of risk management, governance, and management oversight . . . beginning with the fact that recommendations by mental health experts that Manning not be posted to Iraq were ignored and he was sent anyway. The culpability for this whole debacle lies with the leadership of the Departments of Defense and State.
This entry was cross-posted to lartwielder on Daily Kos.