Operational Metrics: The Next Step in the Evolution of Defensive Cyberspace Operations

As a senior First Lieutenant, I volunteered to teach new Second Lieutenants about defensive cyberspace operations. Over the course of a year I spent more than twelve hours with three different Cyber Basic Officer Leaders Course (BOLC) classes; each time, they asked the same question: what did your missions accomplish? Each time, I struggled for an answer.

After participating in nearly twenty operations over the course of three years, I had seen many Cyber Protection Teams hunt, clear, assess, and enable hardening. But while those teams had always accomplished their missions, I had not seen those operations lead to significant change in the operating environment. As the Cyber Protection Brigade looks to the future on the eve of the Branch's ten-year anniversary, it should consider altering its operational approach to address this. After years of hard work to train, staff, and equip the Army's premier defensive cyberspace operations force, it is time for the next step in the evolution of that unit: the incorporation of metrics in general, and measures of effectiveness specifically, into its operations.

Background

A 2014 edition of ARMOR magazine, the Armor Branch's professional development bulletin, included an article by Captains Tom Westphal and Jason Guffey titled Measures of Effectiveness in Army Doctrine. That review highlighted many discrepancies in doctrinal definitions of measures of performance (MOPs) and measures of effectiveness (MOEs); this article relies on definitions from JP 5-0: Joint Planning. JP 5-0 defines a measure of performance as "an indicator used to measure a friendly action that is tied to measuring task accomplishment," and a measure of effectiveness as "an indicator used to measure a current system state, with change indicated by comparing multiple observations over time." Put simply, MOPs concern themselves with friendly action, while MOEs concern themselves with those actions' effects on the system.

JP 3-12: Cyberspace Operations explains that "[defensive cyberspace operations (DCO)] missions are executed to defend the DODIN, or other cyberspace DOD cyberspace forces have been ordered to defend, from active threats in cyberspace." ADP 3-90: Offense & Defense then explains the purpose of the defense as follows, emphasis mine: "The purpose of the defense is to create conditions for the offense that allows Army forces to regain the initiative. Other reasons for conducting the defense include *retaining decisive terrain* or denying a vital area to an enemy, attriting or fixing an enemy as a prelude to the offense, countering enemy action, and increasing an enemy's vulnerability by forcing an enemy commander to concentrate subordinate forces." This article relies on these definitions to describe defensive cyberspace operations as missions to retain decisive terrain in the fifth domain. Although many more complex definitions exist, this one best captures the purpose of the four Cyber Protection Team functions as described in Cyber Warfare Publication 3-33.4: Cyber Protection Team (CPT) Organization, Functions, and Employment. Missions to hunt, clear, enable hardening, and assess directly support the doctrinal definition of retain in FM 3-90-1: Offense & Defense by "[ensuring] that a terrain feature controlled by a friendly force remains free of enemy occupation or use."

This article discusses MOPs as they measure friendly action in service of retaining decisive terrain in the fifth domain, and MOEs as they measure the effectiveness of those actions in achieving that objective.

Ops as metrics

Over three years I watched CPT after CPT execute its assigned mission according to a series of primarily administrative standards. For example, "Did the team depart on time?" "Did the team collect data?" "Did they turn their report in?" Every question focused on friendly action, not system state, the purview of a measure of effectiveness. Put another way, mission success depended on executing the mission itself, not on whether those missions changed the operating environment. In the words of one Officer, "We use ops as metrics." No one asked if those teams' operations affected the system's state, if they contributed meaningfully to the retention of decisive terrain. That would have necessitated shrewd questions like, "Did the team collect sufficient data, and analyze it in a suitably rigorous manner, to effectively illuminate malicious cyberspace activities?" If the team did uncover evidence of malicious cyberspace activities, "Was its elimination or neutralization effective?" After years of returning to the same site, "How many recommendations were effectively implemented?" Questions like those, to assess the effectiveness of operations to hunt, to clear, and to enable hardening and assess, respectively, went unasked, and so they went unanswered, and the system remained unchanged as a result.

This "ops as metrics" mentality extended beyond operations and into training as well. The questions asked were always "Did the training occur?" or "How many attended?", and seldom how well the training improved the trainees' abilities to function in their work roles, or how capable it left network owners to staff their defenses. Though certainly important, those questions measured only friendly action, not changes in system state resulting from it. Those crucial questions went unasked and unanswered, too, and the system remained unchanged.

This is not a new problem. Some have called it the "growing pains" of a branch that has been "building the plane as we fly it" for almost a decade. Nor is this problem unique to defensive cyberspace operations. Across the Army, commanders fall prey to the siren song of measures of performance every day, asking easy questions like, "Are the soldiers on the PT field?" instead of the harder question, "Are the soldiers getting adequate physical training?" The answers are almost always "yes" and "no," respectively, but when those commanders define only measures of performance, and ignore measures of effectiveness, the "yes" is the only answer that matters.

Metrics as metrics

The Army in general, and Cyber Protection Teams in particular, must make a concerted effort to incorporate meaningful MOPs and MOEs into their operations. Across the force, commanders should ask both "Are the soldiers on the PT field?" and "Are the soldiers getting adequate physical training?" Commanders of defensive cyberspace operations forces should ask both "Did the team collect data?" and "Did the team collect sufficient data, and analyze it in a suitably rigorous manner, to effectively illuminate malicious cyberspace activities?" Measuring effectiveness has become the exception, not the rule, which in the realm of defensive cyberspace operations has led to operations of limited rather than optimal impact. To change this, Cyber Protection Teams should begin to incorporate metrics in general, and measures of effectiveness in particular, into their operations.

Incorporating measures of effectiveness into Cyber Protection Team operations would force planners to define success with unprecedented transparency. The imprecise meaning of "retain decisive terrain in the fifth domain," or the slightly more specific yet still vague tasks "hunt," "clear," "assess," and "enable hardening," has allowed success at the tactical level to become a subjective matter of opinion. Measures of effectiveness would serve as a guardrail against the resultant inconsistency. Mission Element Leaders, and the Team Leads above them, would finally have to frame their missions in terms of their actions' effects on the system, not just the actions themselves.

This is a tall order. Even the private sector continues to struggle with this challenge; few agree on standard security operations metrics. A good start would involve first divorcing administrative metrics from operational ones. Although important milestones in accomplishing a given mission, "Did the team depart on time?" and "Did they turn their report in?" are not meaningful operational metrics.

Next, Cyber Protection Teams should adopt basic, generally accepted measures of performance like Time to Detect (the amount of time from the earliest evidence of related malicious activity to the start of an investigation, also called the adversary's "dwell time"), Time to Investigate (the amount of time from the start of an investigation to its conclusion), and Time to Remediate (the amount of time from the end of an investigation to full remediation). Together, these three make up Time to Resolution, the amount of time from the earliest evidence of related malicious activity to full remediation. These are meaningful measures of friendly action tied to the accomplishment of hunt and clear operations, and this data should be captured and compared across operations.
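As a minimal sketch of how a team might capture these four measures, consider the following Python example. The record layout and field names are illustrative assumptions of mine, not an established CPT data standard; all it presumes is that each investigation carries four timestamps.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Investigation:
    """Key timestamps for a single investigation (field names are illustrative)."""
    earliest_evidence: datetime    # first evidence of related malicious activity
    investigation_start: datetime  # investigation opened
    investigation_end: datetime    # investigation concluded
    fully_remediated: datetime     # remediation complete

def time_to_detect(inv: Investigation) -> timedelta:
    """Adversary dwell time: earliest evidence to the start of the investigation."""
    return inv.investigation_start - inv.earliest_evidence

def time_to_investigate(inv: Investigation) -> timedelta:
    """Start of the investigation to its conclusion."""
    return inv.investigation_end - inv.investigation_start

def time_to_remediate(inv: Investigation) -> timedelta:
    """End of the investigation to full remediation."""
    return inv.fully_remediated - inv.investigation_end

def time_to_resolution(inv: Investigation) -> timedelta:
    """Earliest evidence of related malicious activity to full remediation."""
    return inv.fully_remediated - inv.earliest_evidence

# Notional example; in practice these values would be compared across operations.
inv = Investigation(
    earliest_evidence=datetime(2022, 1, 3),
    investigation_start=datetime(2022, 2, 14),
    investigation_end=datetime(2022, 2, 21),
    fully_remediated=datetime(2022, 3, 1),
)
print(time_to_detect(inv).days)      # 42 (dwell time, in days)
print(time_to_resolution(inv).days)  # 57 (end to end, in days)
```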

Cyber Protection Teams should also adopt basic measures of effectiveness such as Root Cause Remediation, a simple percentage of investigations in which the root cause of the compromise was identified and then remediated. This is a meaningful measure of system state that could be evaluated over time to gauge the impact of missions to assess and enable hardening. Classification, another MOE, would also serve Cyber Protection Teams well. This metric involves classifying investigations as true positives, false positives, or false negatives, another meaningful measure of system state that could be compared over time to identify improvement as true positives increase and false positives and false negatives decrease.
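A sketch of how these two MOEs might be computed, again under assumed field names (`root_cause_remediated` and `classification` are hypothetical, not an established schema):

```python
from collections import Counter

def root_cause_remediation(investigations: list[dict]) -> float:
    """Percentage of investigations whose root cause was identified and remediated."""
    if not investigations:
        return 0.0
    remediated = sum(1 for inv in investigations if inv["root_cause_remediated"])
    return 100.0 * remediated / len(investigations)

def classification_counts(investigations: list[dict]) -> Counter:
    """Tally investigations as true positives, false positives, or false negatives."""
    return Counter(inv["classification"] for inv in investigations)

# Notional records from two visits to the same site, a year apart.
last_year = [
    {"classification": "false_negative", "root_cause_remediated": False},
    {"classification": "false_positive", "root_cause_remediated": False},
    {"classification": "true_positive",  "root_cause_remediated": True},
]
this_year = [
    {"classification": "true_positive",  "root_cause_remediated": True},
    {"classification": "true_positive",  "root_cause_remediated": True},
    {"classification": "false_positive", "root_cause_remediated": False},
]

print(root_cause_remediation(last_year))  # 33.3...: one root cause of three remediated
print(root_cause_remediation(this_year))  # 66.6...: improvement over time
print(classification_counts(this_year))   # Counter({'true_positive': 2, 'false_positive': 1})
```

Tracked across repeat visits like this, a rising Root Cause Remediation percentage and a classification mix shifting toward true positives would indicate genuine change in the system's state, not merely friendly action.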

These basic measures would not guarantee that Cyber Protection Team operations lead to greater improvement of the DODIN's defensive posture. Cyber Protection Teams receive missions to hunt, clear, assess, and enable hardening in networks over which they have no actual control, which requires difficult coordination with network owners who must balance operating and maintaining their environments against securing them. Responsibility for meaningful change rests with both parties. These basic measures would, however, at least begin to arm commanders with the knowledge to understand the actual impact, or lack thereof, of their operations, a necessary next step in the evolution of defensive cyberspace operations. And that, at least, would be a step in the right direction.

Contributors

Thanks to the following people for providing feedback for this article.

Their input was considered, but this paper may not accurately reflect their opinions.

Note: An edited version of this article was published by West Point's Modern War Institute on May 10, 2022.
