As a senior first lieutenant, I volunteered to teach new second lieutenants about defensive cyberspace operations. Over the course of a year, I spent over twelve hours with three different Cyber Basic Officer Leaders Course classes, during which the students invariably asked the same question: What did your missions accomplish? Each time, I struggled for an answer.
After participating in nearly twenty operations over the course of three years, I had seen many cyber protection teams (CPTs) hunt, clear, assess, and enable hardening. But while those teams had always accomplished their missions, I had not seen those operations lead to significant change in the operational environment. As the Cyber Protection Brigade looks to the future on the eve of the cyber branch’s ten-year anniversary, it should consider altering its operational approach to address this. After years of hard work to train, staff, and equip the Army’s premier defensive cyberspace operations force, it is time for the next step in the evolution of that unit: identifying the right metrics—particularly measures of effectiveness—and incorporating them into its operations. Otherwise, we risk a future in which we remain unable to answer the question the lieutenants I taught were so quick to ask: What are we actually accomplishing?
Background
A 2014 edition of Armor magazine, the US Army armor branch’s professional development bulletin, included an article by Captains Tom Westphal and Jason Guffey titled Measures of Effectiveness in Army Doctrine. That review highlighted many discrepancies in doctrinal definitions of measures of performance (MOPs) and measures of effectiveness (MOEs). Their broad point, that doctrine defines these terms inconsistently, is an important one and deserves mention in any discussion of operational metrics, but it should not stop those discussions from happening. This article relies on definitions from Joint Publication (JP) 5-0, Joint Planning, which defines a measure of performance as “an indicator used to measure a friendly action that is tied to measuring task accomplishment” and a measure of effectiveness as “an indicator used to measure a current system state, with change indicated by comparing multiple observations over time.” Put simply, MOPs concern themselves with friendly action, while MOEs concern themselves with those actions’ effects on the system.
JP 3-12, Cyberspace Operations, explains that “DCO [defensive cyberspace operations] missions are executed to defend the DODIN [Department of Defense information network], or other cyberspace DOD cyberspace forces have been ordered to defend, from active threats in cyberspace.” Turning to service-specific doctrine, Army Doctrine Publication 3-90, Offense and Defense, explains the purpose of the defense as follows: “The purpose of the defense is to create conditions for the offense that allows Army forces to regain the initiative. Other reasons for conducting the defense include retaining decisive terrain or denying a vital area to an enemy, attriting or fixing an enemy as a prelude to the offense, countering enemy action, [and] increasing an enemy’s vulnerability by forcing an enemy commander to concentrate subordinate forces.” This article relies on these definitions to describe defensive cyberspace operations as missions to retain decisive terrain in the fifth domain. Although many more complex definitions exist, this one best captures the purpose of the four cyber protection team functions as described in Cyber Warfare Publication 3-33.4, Cyber Protection Team (CPT) Organization, Functions, and Employment. Missions to hunt, clear, enable hardening, and assess directly support the doctrinal definition of the retain tactical mission task found in Field Manual 3-90-1, Offense and Defense, by ensuring “that a terrain feature controlled by a friendly force remains free of enemy occupation or use.”
This article discusses MOPs as they measure friendly action in service of retaining decisive terrain in the fifth domain, and MOEs as they measure the effectiveness of those actions in achieving that objective.
Ops as Metrics
Over three years I watched CPT after CPT execute its assigned mission according to a series of primarily administrative standards. Did the team depart on time? Did the team collect data? Did the team turn its report in? Every question focused on friendly action, not on system state, the purview of a measure of effectiveness. Put another way, mission success depended on executing the mission itself, not on whether those missions changed the operating environment. As I heard one officer say, “We use ops as metrics.” No one asked whether those teams’ operations affected the system’s state, whether they contributed meaningfully to the retention of decisive terrain. That would have necessitated shrewd questions: Did the team collect sufficient data, and analyze it in a suitably rigorous manner, to effectively illuminate malicious cyberspace activities? If the team did uncover evidence of malicious cyberspace activity, was it effectively eliminated or neutralized? After years of returning to the same site, how many recommendations had actually been implemented? Questions like those, meant to assess the effectiveness of operations to hunt, clear, enable hardening, and assess, went unasked, and so they went unanswered, and the system remained unchanged as a result.
This “ops as metrics” mentality extended beyond operations and into training as well. The questions always asked were Did the training occur? and How many attended? Seldom asked was how well that training improved the trainees’ ability to function in their work roles, or how well it prepared network owners to staff their own defenses. While certainly important, the questions that were asked measured only friendly action, not the changes in system state that resulted from it. In training as in operations, those crucial questions went unasked and unanswered, too, and the system remained unchanged.
This is not a new problem. Growing pains might even be expected of a young branch, especially one that has essentially been building the plane while flying it for almost a decade. Nor is this a problem unique to defensive cyberspace operations. Across the Army, commanders fall prey to the siren song of measures of performance every day—for example, by asking easy questions like Are the soldiers on the PT field? instead of the harder question, Are the soldiers getting adequate physical training? The answer is almost always yes to the former and no to the latter. But when those commanders define only measures of performance and ignore measures of effectiveness, that yes is the only answer that matters.
Metrics as Metrics
The Army in general—and cyber protection teams in particular—must make a concerted effort to incorporate meaningful MOPs and MOEs into their operations. Across the force, commanders should ask Are the soldiers on the PT field?, but also Are the soldiers getting adequate physical training? Commanders of defensive cyberspace operations forces should ask Did the team collect data?, but also Did the team collect sufficient data, and analyze it in a suitably rigorous manner, to effectively illuminate malicious cyberspace activities? Measuring effectiveness has become the exception, not the rule, which in the realm of defensive cyberspace operations has led to operations of limited rather than optimal impact. In order to change this, cyber protection teams should begin to incorporate metrics in general—and measures of effectiveness in particular—into their operations.
Incorporating measures of effectiveness into cyber protection team operations would force planners to define success with unprecedented transparency. The imprecise meaning of retain decisive terrain in the fifth domain—or of the slightly more specific yet still vague tasks hunt, clear, assess, and enable hardening—has allowed success at the tactical level to become a subjective matter of opinion. Measures of effectiveness would serve as a guardrail against that inconsistency. Mission element leaders, and the team leads above them, would finally have to frame their missions in terms of their actions’ effects on the system, not just the actions themselves.
This is a tall order. Even the private sector, where few agree on standard security operations metrics, continues to struggle with this challenge. A good start would be to divorce administrative metrics from operational ones. Although important milestones in accomplishing a given mission, Did the team depart on time? and Did the team turn its report in? are not meaningful operational metrics.
Next, cyber protection teams should adopt basic, generally accepted measures of performance like time to detect (the amount of time from the earliest evidence of related malicious activity to the start of an investigation; also called the adversary’s dwell time), time to investigate (the amount of time from the start of an investigation to its conclusion), and time to remediate (the amount of time from the end of an investigation to full remediation). Together, these three make up time to resolution, the amount of time from the earliest evidence of related malicious activity to full remediation. These are meaningful measures of friendly action tied to the accomplishment of hunt and clear operations, and this data should be captured and compared across operations.
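To make the arithmetic concrete, a minimal sketch follows. It derives each measure from four timestamps recorded for a single, hypothetical investigation; the record format and field names are assumptions chosen for illustration, not an existing CPT reporting format.

```python
from datetime import datetime

# Hypothetical investigation record; field names are illustrative only.
investigation = {
    "earliest_malicious_activity": datetime(2022, 3, 1, 8, 0),    # adversary's first observed action
    "investigation_start":         datetime(2022, 3, 15, 9, 30),  # hunt begins
    "investigation_end":           datetime(2022, 3, 18, 17, 0),  # analysis concludes
    "fully_remediated":            datetime(2022, 3, 25, 12, 0),  # clearance and hardening complete
}

# Measures of performance, per the definitions above.
time_to_detect      = investigation["investigation_start"] - investigation["earliest_malicious_activity"]
time_to_investigate = investigation["investigation_end"] - investigation["investigation_start"]
time_to_remediate   = investigation["fully_remediated"] - investigation["investigation_end"]
time_to_resolution  = time_to_detect + time_to_investigate + time_to_remediate

print(f"Time to detect (dwell time): {time_to_detect}")
print(f"Time to investigate:         {time_to_investigate}")
print(f"Time to remediate:           {time_to_remediate}")
print(f"Time to resolution:          {time_to_resolution}")
```

Captured consistently, those four timestamps are all a team needs to compute and compare these measures across operations.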
Cyber protection teams should also adopt basic measures of effectiveness such as root cause remediation, the percentage of investigations in which the root cause of the compromise was identified and then remediated. This is a meaningful measure of system state that could be evaluated over time to gauge the impact of missions to assess and enable hardening. Classification, another MOE, would also serve cyber protection teams well. This metric involves classifying investigations as true positives, false positives, or false negatives: another meaningful measure of system state that could be compared over time, with improvement indicated as true positives increase and false positives and false negatives decrease.
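These two measures likewise reduce to simple computations. The sketch below, using hypothetical investigation summaries (again, the field names are assumptions for illustration), computes root cause remediation as a percentage and tallies classifications; tracked period over period, the first should rise, and false positives and false negatives should fall, as the system state improves.

```python
from collections import Counter

# Hypothetical summaries of closed investigations; fields are illustrative only.
investigations = [
    {"classification": "true positive",  "root_cause_identified": True,  "root_cause_remediated": True},
    {"classification": "true positive",  "root_cause_identified": True,  "root_cause_remediated": False},
    {"classification": "false positive", "root_cause_identified": False, "root_cause_remediated": False},
    {"classification": "false negative", "root_cause_identified": True,  "root_cause_remediated": True},
]

# Root cause remediation: percentage of investigations in which the root cause
# of the compromise was identified and then remediated.
remediated = sum(
    1 for i in investigations
    if i["root_cause_identified"] and i["root_cause_remediated"]
)
root_cause_remediation = 100 * remediated / len(investigations)

# Classification: counts of true positives, false positives, and false negatives,
# compared across reporting periods to show whether the system state is improving.
classification = Counter(i["classification"] for i in investigations)

print(f"Root cause remediation: {root_cause_remediation:.0f}%")
print(dict(classification))
```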
These basic measures would not guarantee that cyber protection team operations lead to greater improvement of the DODIN’s defensive posture. Cyber protection teams receive missions to hunt, clear, assess, and enable hardening in networks over which they have no actual control, which requires difficult coordination with network owners who must balance operating and maintaining their environments against securing them. Responsibility for meaningful change rests with both parties. These basic measures would, however, begin to arm commanders with the knowledge to understand the actual impact—or lack thereof—of their operations, a necessary next step in the evolution of defensive cyberspace operations. And that, at least, would be a step in the right direction.
Captain Zachary Szewczyk is a cyber operations officer with three years of experience in defensive cyberspace operations. He commissioned into the Cyber Corps in 2018 after graduating from Youngstown State University with an undergraduate degree in computer science and information systems. After entering the operational force in June 2019, he supported or led fifteen operations across two battalions, including several high-level incident responses. He also wrote a field manual for defensive cyberspace operations and several technical white papers distributed across the Cyber Protection Brigade and at the Army Cyber and US Cyber Command levels. Captain Szewczyk remains one of the few senior analytic support officers in the Army, a work role that combines domain expertise, planning, and data science to support decision making at the tactical edge.
The views expressed in this work are those of the author and do not reflect the official policy or position of the United States Military Academy, the Department of the Army, or the Department of Defense.
Thanks to J.C.F., D.J.F., and N.G.S. for providing feedback on this paper. Their input was valuable, but this paper may not accurately reflect their opinions.
Image credit: Sgt. Tom Lamb, US Army National Guard