Authors’ note: This article applies economic theory to help unpack lessons from a particular experience of one of the authors in Afghanistan. For ease of reading, we have used the first person singular when describing that experience.
“You might want to reconsider taking tomorrow off. We have a lead on a mission . . . and it’s a good one!”
The ground unit had gotten a tip that a high-level threat was moving into the area and would not be staying long. If they acted, and were successful, the payoff for security in southeastern Afghanistan would be huge. However, if they were slow to act—delaying the mission by even an extra day—their window of opportunity would close. The target would move on, likely for good.
In an aviation unit, the commander’s intent can almost universally be characterized as “support the customer.” If the ground commander needs it, our goal is to provide it. In this case, that meant a ride to the fight delivered with the speed and element of surprise essential to mission success. In the foothills of Afghanistan’s Hindu Kush mountains, covering any distance quickly requires helicopters. Without airlift, the ground commander’s mission was a nonstarter.
However, every mission needed higher headquarters’ approval, and “quickly” was not exactly a central feature of that approval process. This was by design. Since the likelihood of any one mission winning the war was low, the approval system was built to weigh high-risk missions deliberately before approving them. The higher a mission’s assessed risk level, the farther up the chain of command the approval authority sat, with the approval process becoming more deliberative with each move up the HQ ladder.
In this specific case, deliberative meant too slow. As the air mission commander, I rapidly began to understand that whether this mission would occur or not hinged on how I chose to characterize its risk level to higher headquarters. I was left with a tough decision that tested the boundaries of my ability to conduct what economists call “marginal analysis.”
Marginal Analysis
Marginal analysis forms the backbone of most introductory courses in economics. In Gregory Mankiw’s popular textbook, which we use for our course at West Point, he begins with what he describes as the “Ten Principles of Economics.” The third principle states that “rational people think at the margin,” which captures this idea of marginal analysis. Mankiw explains that “rational people often make decisions by comparing marginal benefits and marginal costs.” As a rational actor, you don’t compare the fullness of two alternative courses of action, like “blowing off your studies or studying 24 hours a day.” Rather, you determine “whether to spend an extra hour reviewing your notes instead of watching TV.” You would only spend an extra hour reviewing your notes if the marginal benefit of doing so exceeded the marginal cost. Rational people think on the “edge”; they make decisions using marginal analysis.
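Stated as a simple decision rule (a generic textbook formulation, not Mankiw’s exact notation), the logic is: take the extra hour of studying only if

\[ MB_{\text{extra hour}} > MC_{\text{extra hour}} \]

where the marginal benefit might be a modestly better grasp of the material and the marginal cost is the hour of television forgone. The comparison is always between increments, never between the extremes of studying around the clock or not at all.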
According to Mankiw, how I chose to characterize the risk associated with the mission in Afghanistan would depend on the marginal benefits and costs of my choice. In hindsight, this type of calculus was much harder—and more nuanced—than I realized.
A “Just This Once” Trap
Not everybody agrees with Mankiw’s third principle. Harvard Business School professor Clayton Christensen argues that there are flaws to this way of thinking. “We end up paying for the full costs of our decisions, not the marginal costs, whether we like it or not.” In other words, if companies make decisions on the “edge,” they might miss out on potential opportunities or be blinded to potential risks. From an ethical perspective, Christensen warns of a “just this once” mindset inherent in marginal thinking.
Many of us have convinced ourselves that we are able to break our own personal rules “just this once.” In our minds, we can justify these small choices. None of those things, when they first happen, feels like a life-changing decision. The marginal costs are almost always low. But each of those decisions can roll up into a much bigger picture, turning you into the kind of person you never wanted to be.
Christensen illuminates the dangers of “just this once” thinking with a personal story. As a college student, he had made a decision not to work on Sundays—a commitment based on his faith—and this decision extended even to the collegiate basketball court. He discovered that it’s easier to do the right thing—in his case, keeping his commitment not to work or play basketball on Sundays—100 percent of the time than to do the right thing 98 percent of the time. Doing the right thing 100 percent of the time avoids the slippery slope that can easily make 98 percent turn into 97 percent and soon into 90 percent—and the next thing you know, you’re not even going to church on Sunday.
As I considered the approval process for this mission, the competing wisdom of Christensen and Mankiw clashed in my head. On the one hand, I knew that if I wanted to get this mission approved, I needed to understate the level of risk—I needed to cut an administrative corner. On the other hand, doing so would not only go against policy and procedure, but might also set a tone of shortcuts and misleading behavior for future missions. Even as I attempted to weigh the costs and benefits of this decision, my own value judgment was probably influenced by the anchors that had pulled on our unit from the outset of the deployment.
An Anchor’s Pull
In their bestselling book Nudge, economist Richard Thaler and legal scholar Cass Sunstein explore the biases that affect decision-makers. For leaders attempting to think on the margin, an “anchoring bias” can distort marginal analysis. Thaler and Sunstein warn that our “adjustments [from anchors] are typically insufficient.” In other words, decision-makers start from a familiar reference point and fail to move far enough away from it; a company anchored to its current way of doing business will tend to overstate the costs and understate the possibilities of an alternative course of action.
Anchoring biases can be cultural. An organization’s cultural cues often push leaders towards a certain decision. In Afghanistan, we had been on the receiving end of mixed messages in this regard. Upon our arrival, the commander of the unit we were replacing offered a cautionary anchor. “No one here is capturing bin Laden,” he told us. With this punchy phrase, he was attempting to appropriately calibrate our risk tolerance, as if to say, “There is little that you will do here that will warrant accepting extraordinary risk. Therefore, you should pause and re-evaluate your cost-benefit analysis if you find yourself undertaking high-risk missions.”
His perspective was hard-earned. After a year of working closely with coalition special operations forces, providing them with a variety of aviation assets for time-sensitive missions, he knew all too well the allure such missions would hold for the new pilots—and the danger. His organization had the scars to prove it, having suffered casualties, including deaths, executing some of those missions. We found ourselves returning to his words many times in the first months of our deployment.
But we also felt an anchor towards action. As part of its description of the concept of mission command, the Army identifies enabling “disciplined initiative” in subordinates as a key objective. Among other things, disciplined initiative is “action . . . when unforeseen opportunities or threats arise.” Done correctly, effective mission command empowers junior leaders to make independent decisions to accomplish the commander’s intent. As former Chairman of the Joint Chiefs of Staff Gen. Martin Dempsey was fond of saying, successful mission command fosters a culture that has a “bias for action.”
Cultivating a culture with a bias for action is essential to the success of any organization, military or otherwise, that operates in a fluid environment. Gen. Dempsey popularized the term for military leaders, but the concept has long been popular in the business community as well. Amazon has enshrined bias for action in the company’s “leadership principles,” concluding that “speed matters in business.” The implication in the sentiments of both Gen. Dempsey and Amazon is that an organization can gain the speed necessary to be agile and adaptive to a changing marketplace or battlefield only if it devolves decision-making to the lowest possible level.
Risk and Short-termism
How we think about risk impacts how we think on the margin. In finance, discount rates help decision-makers account for the riskiness of an investment. A riskier investment has a higher discount rate, and a higher discount rate makes the near term matter more and the long term matter less. Overstate risk, and you fall into a trap of short-termism. This appreciation of horizon and risk further complicates the notion of thinking on the edge.
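To see why a higher discount rate shortens the horizon, consider the standard present-value formula (a stylized illustration, not a calculation tied to the mission itself):

\[ PV = \frac{\text{future payoff}}{(1 + r)^{t}} \]

At a 5 percent discount rate, a payoff five years away retains roughly 78 percent of its value (since \(1/1.05^{5} \approx 0.78\)); at a 20 percent rate, it retains only about 40 percent (\(1/1.20^{5} \approx 0.40\)). The riskier a venture looks, the higher the rate, and the more the decision-maker privileges whatever can be captured right now.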
For this mission, our ground commander had “upped the ante”—increased our discount rate—in the way he conveyed the opportunity at hand. This was a high-value, fleeting target of opportunity. Waiting until tomorrow wasn’t an option; we felt we needed to act now. If we were to adopt a “bias for action,” our risk tolerance in this situation should be elevated, allowing us to seize the initiative. But we also had reason to be risk-averse, not only because of the guidance from the commander of the unit we had replaced, but also because of our desire to return safely from our deployment.
If our “feeling” about the importance of this mission—our true measure of risk—was off, we would be overstating risk and falling into the trap of short-termism. Paradoxically, to get this high-risk (high discount rate) mission approved, we needed to understate its risk to our higher-ups. Executives go to prison for misleading investors. What made my situation any different?
As the air mission commander, I felt values, anchors, and risk all clash as I pondered how to get the mission approved. The decision-making authority did not reside at a level low enough to work within the time constraints the situation imposed. This presented a challenge: how do we satisfy competing commanders’ intents? How do we reconcile “support the customer,” which suggests a bias for action, with “no one here is capturing bin Laden,” which suggests a bias for inaction?
I chose to characterize the mission as a standard quick reaction force (QRF) mission, which was largely what it was, I assured myself. We were executing a hasty plan to a non-standard landing zone (LZ). Conveniently, we left out the fact that we already had an idea of the target and that we suspected a high-threat environment.
No need to get higher headquarters unnecessarily worked up over it. If there really is an elevated threat there, that’s the S-2’s (our intelligence officer’s) job to catch it, I reasoned to myself.
As the helicopters glided toward the horizon, leaving the airfield behind, everyone’s focus shifted to mission accomplishment. We would leverage the element of surprise, using the mountains to shield our approach as much as possible, hopping over the ridgeline into the valley of the objective at the last possible minute. It was less than a minute from the time we crested the ridge (giving the enemy the first probable opportunity to hear the approaching helicopters) to the time that the ground force was off the aircraft and rushing toward the compound. The helicopters left the valley to stage for extraction.
A few hours later we received the call over the radio that the ground force was mission-complete and ready to come home. I held my breath as I precariously balanced the helicopter on a rocky outcropping, letting troopers leap in, one by one. With everyone on board we headed back to base. The mission was deemed a success, with no friendly casualties. However, as we flew back toward home, I could not shake the feeling that we were quick to define success only because nothing went wrong. We had incurred a great deal of risk to execute the mission, and at the end of the day, the mission brought us no closer to “catching bin Laden.”
As any introductory economics course will tell you, thinking on the edge is a powerful concept. But in our case, competing anchors and a heightened sense of urgency pushed marginal analysis in a dangerous direction. There is a deeper lesson in this story about the interplay of risk and anchors.
It is easier to balance between two anchors—an anchor towards caution and an anchor towards action—when the pressure is low. But when pressure mounts, straddling becomes harder; our instincts and loyalties push us in a particular direction, and that direction may not be the right one for mission accomplishment. Perhaps the best way to prevent these detours is to learn from detours in the past. Open up. Separate outcomes from decisions. Talk about the decisions themselves. We got lucky that we had an uneventful and generally positive outcome. But luck shouldn’t inhibit our willingness and ability to unpack our decisions, especially the bad ones.
As the Army itself recognizes in Army Doctrine Publication 3-0, Operations, “War is chaotic, lethal, and a fundamentally human endeavor.” This means that, by default, battlefield decisions are governed by human fallibility and influenced by decision-making biases. Leaders should not ignore this fact, but should instead embrace this inherent humanness in the decision-making calculus and account for it. The best leaders understand that a bias for action avoids tipping into recklessness, and tactical restraint avoids becoming battlefield paralysis, only when leaders develop a well-calibrated ability to assess risk and reward accurately. Only through reflection on our own biases can we learn and safeguard against the dangerous edges we might be tempted to cross, while still retaining the disciplined initiative essential to achieving victory.
Sean McMahon is currently an Instructor of American Politics in the Department of Social Sciences at West Point. Sean deployed twice to Afghanistan with the 101st Airborne Division and now serves as a Foreign Area Officer. He earned his BS in Comparative Politics from West Point, an MS in Engineering Management from Missouri S&T, and an MA in International Economics and Latin American Studies from the School of Advanced International Studies at Johns Hopkins.
Ben Summers currently serves as an Assistant Professor of Economics in the Department of Social Sciences at West Point. He deployed twice to Afghanistan with the 101st Airborne Division and was a recipient of the 2013 General Douglas MacArthur Leadership Award. He earned his BS in Economics from West Point in 2006 and an MBA with Distinction from Harvard Business School in 2015.
The views expressed are those of the authors and do not reflect the official position of the United States Military Academy, Department of the Army, or Department of Defense.
Image credit: Capt. Peter Smedberg, US Army