How Red Teams Function

Zenko provides a series of guidelines for implementing red teams, as well as examples of how they have been used in different types of organizations, such as the military, government, and business.

HOW RED TEAMS FUNCTION

The term Devil’s Advocate generally refers to a skeptic, or someone who is deliberately contrary. It originated in the Roman Catholic Church, where the role belonged to an official whose job was to independently investigate the life and virtue of a person nominated for sainthood. The Devil’s Advocate has since been considered the first regular practice of a red team, a term that came into use during the Cold War era. Today, red teaming is known as a systematic approach to determining the intentions, strengths, and weaknesses of a given competitor. It is a technique employed in a number of fields but still vastly underused. Red teams provide a fresh perspective to any organization, helping it discover its own weaknesses and improve performance.

When used correctly, a red team can help an organization ensure it is making the right decision in a crucial situation. When leaders employ a team whose job is to find the negative aspects of a decision, they are provided with a more complete picture and can be more confident that their final decision will be the correct one. Any competitive institution runs the risk of operating with incomplete information, especially in the modern world of rapid change. Self-evaluation becomes especially dangerous in this environment. Senior leadership is usually incapable of performing accurate evaluations within its own organization because of different biases, such as confirmation bias and organizational bias. And while many leaders claim to encourage dissent in the workplace, it is seldom effective because employees often do not have the level of expertise required to identify problems correctly. More and more organizations have been leveraging red teams for core functions, such as simulations, vulnerability probes, and alternative analyses.

Red teams can be used internally (like the Vatican’s Devil’s Advocate), or externally to attempt to gain valuable information about key competitors. Red teams can still fail, though, depending largely on how willing and receptive the organization’s leaders are to them. Sometimes the design or execution of the red team itself is flawed. A red team’s primary difficulty is to find a way to operate effectively within an organization while still being independent enough to challenge it.

BEST PRACTICES IN RED TEAMING

The concept of “best practices” is generally incompatible with red teaming. Nearly all red team practitioners are deeply skeptical of the strict guidelines that generally define best practices. While no single set of rules could or should define the job of all red team members at all times, most would agree that a set of principles can help a red team effectively work around and within its parent organization. Red team members should adhere to the following six guidelines:

1. The boss must buy in. Chain of command is important in nearly every institution, and is helpful for implementing change, solving problems, and establishing accountability. Therefore, the “boss” of an organization must be willing to support the red team, commit the required resources, and accept its results. He or she must also allow the red team to be completely honest about its findings. And ultimately, it is the boss who decides whether to implement change based on these findings.

2. The red team must be outside and objective, while inside and aware. Red teams must be able to strike a balance between being part of an organization and being independent from it. The red team should report only to the boss and anyone else directly affected by its decisions. The red team, and the rest of the organization, must understand its goals and the scope of its activities.

3. Members should be fearless skeptics with finesse. A red team needs to be staffed with the right types of people: for example, those who do not always fit within the “norm” of the larger organization. Most people have what is called an existence bias, the assumption that something is good simply because it exists. The best red team members have overcome this bias and can examine things more critically.

4. The team must have a big bag of tricks. A red team’s techniques should be anything but routine. Members should have diverse, creative, broad-minded approaches to their tasks. A red team should also be flexible and leverage technology to adapt to a wide variety of uses in order to meet its goals.

5. The organization must be willing to hear bad news and act on it. If a red team’s findings are ignored, the work is for nothing. Leaders must be willing to consider and implement the team’s recommendations, even if the results are upsetting or difficult for organizational leaders to process.

6. A red team should be used just enough, but no more. If red teaming is employed too frequently, an organization will not have enough time to implement the recommended changes, and the staff tasked with making the changes may feel demoralized. When used appropriately, however, red teaming prevents an organization from becoming too complacent.

ORIGINS: MODERN MILITARY RED TEAMING

The version of red teaming most organizations use today was developed by the U.S. military. The costs and risks involved in military operations make red teaming extremely important for all branches of the armed forces. The strategy came into use during the Cold War, but the system was not developed and codified until after 2000. Following 9/11, red team programming was expanded to help meet the terrorist threat. Vulnerability probing is an especially important red team task for the military and its many facilities and assets. Simulations often take place in the form of tabletop war games, descended from the Prussian Kriegsspiel, which give leadership a chance to think through the potential consequences of tactical decisions. The military has made efforts to promote dissent among junior officers, but progress is slow; the hierarchy within the armed forces is powerful, and creative thinking is not rewarded as much as following orders is.

In 2004, “Red Team University,” officially called the University of Foreign Military and Cultural Studies (UFMCS), was founded at Fort Leavenworth, Kansas. The core of the multi-week program focuses on teaching students how to improve their awareness and understanding. The handbook the school developed, The Applied Critical Thinking Handbook, is available for free online. More than 10 years later, the school remains an important, viable, and valuable part of the Army’s training program.

The Marines adopted an official red teaming approach in 2010. The commanders of the Marine Corps understood the need for red teams on their staffs, especially during times of war. The initiative met some resistance at first, most of it rooted in officials’ unfamiliarity with or distrust of the process. Eventually a number of junior officers adopted the program and saw measured success with it. Red teaming has since been adopted by military organizations outside the United States as well, in countries like Israel and the United Kingdom. The British military has published the Red Teaming Guide to help counter the same doubts the U.S. Marines faced. The greatest challenge for these forces is justifying the expense of the program, but by now most commanders agree on the value it provides.

ALTERNATIVES: INTELLIGENCE COMMUNITY RED TEAMING

The U.S. Intelligence Community (USIC) comprises 17 organizations and more than 100,000 employees. Analysts within these organizations are challenged to identify trends and then make predictions based on them. Analysts sometimes make mistakes, and those mistakes can have far-reaching consequences. Red teaming serves these organizations by providing independent, competitive analysis that reduces the risk of organizational bias and other errors.

A National Intelligence Estimate (NIE) is the product of senior analysts and takes months to develop. These reports are often met with skepticism or criticism, and are sometimes incorrect. Red Team analysis was introduced to the NIE process in 1976, when George H. W. Bush was Director of Central Intelligence, in the form of two teams: Team A would perform the standard analysis, while Team B was made up of outside experts who could work independently to find answers. However, the process was heavily structured and its results were inconsequential. Team B’s findings were flawed, and the government declined to act on them, falling back on Team A’s more traditional NIE.

Not only has red teaming been misused by the U.S. government, but in some situations the technique was not used at all, such as in 1998 when the CIA was planning an attack on Al Qaeda. To prevent leaks, access to intelligence reports was limited to a dozen senior officials. The group received information that Osama bin Laden had access to VX nerve gas, and that there would soon be a meeting of militants in Afghanistan. They launched an attack, but it was only marginally successful because the entire operation was handled within the Counterterrorism Center. Other departments, including the nonproliferation unit, were not informed or given a chance to raise concerns about the plan.

In the wake of 9/11, White House officials needed to know whether additional attacks were being planned. The director of the CIA formed a group designed to challenge conventional thinking, which became known as the CIA’s Red Cell. Members were carefully chosen for their analytical and creative-thinking skills, not for any expertise in counterterrorism. The Red Cell evolved to have a wider scope beyond just terrorism. Not every senior official has found value in the Red Cell, but nearly all of them read every report because they appreciate the alternative thinking. During the hunt for bin Laden, reports placing the terrorist leader at a compound in Abbottabad, Pakistan, were unconfirmed. The risks of a failed raid on the compound were substantial, yet even after extensive red teaming, analysts claimed to be 60 to 80 percent sure he was there. Challenging the analysis further would risk leaking information about the planned raid. A final red team was created within the National Counterterrorism Center (NCTC). When its findings were presented to President Obama, the likelihood of finding bin Laden at the compound had been reduced to around 40 to 50 percent. Despite the perceived coin-toss odds, the government knew that the operation had been analyzed from every possible angle.

ADVERSARIES: HOMELAND SECURITY RED TEAMING

In the 1970s and 1980s, professor and terrorism expert Stephen Sloan developed a method for simulating terrorism in order to better understand terrorists’ mind-sets and tactics. The techniques he developed are used by companies, governments, and military and law-enforcement organizations around the world, including the Department of Homeland Security (DHS).

One of the primary challenges for red teaming focused on the government’s physical defenses is demonstrating the value of preventive security. Organizations generally do not see the flaws in their systems until those systems have been seriously breached. Even then, the findings of red teams are not always acted upon. For example, in the late 1990s, the Federal Aviation Administration’s (FAA) red team consistently uncovered huge gaps in airport security, but nothing was done to correct them. These persistent vulnerabilities were among the failures that made the 9/11 attacks possible. Since then, red team reports on aviation security have generally been taken much more seriously.

Following the 2008 terrorist attacks in Mumbai, India, the New York Police Department (NYPD) realized that it would not be equipped to respond to a similar attack, and conducted an extensive analysis of the terrorists’ tactics, weapons, and motivations. A tabletop simulation was developed to test the findings. These tabletop scenarios are now regularly used by the NYPD as an exercise in strategy, response, and imagination. The situations are often designed to be “unwinnable” in order to truly test the officers. The recommendations that emerge from these exercises are often integrated into the NYPD’s operations.

COMPETITORS: PRIVATE-SECTOR RED TEAMING

Limited versions of red teaming have been adapted for the private sector. Vulnerability probes are the most common form, along with simulations used to evaluate important decisions. Many companies hire external consultants who specialize in red teaming, often to identify ways to improve efficiency and thereby save money.

When making vital strategic decisions, business executives generally employ a combination of three approaches:

  1. Strategic planning.
  2. Liberating structures.
  3. Promotion of a corporate culture in which employees help identify flaws.

All three of these approaches suffer from some degree of organizational bias; therefore internal red teaming is seldom realistic or effective. Most employees are unwilling to be a dissenting voice, so real problems go unaddressed, often with serious consequences.

Some companies use business war games as a method for executives and employees alike to assume the roles of competitors and customers in order to better understand potential pitfalls. This method requires facilitators who can prevent “players” from retreating to their traditional roles, as well as buy-in from leadership within the organization. These war games are most often based on either statistical models or moderated discussions. Either way, they help companies develop insights they might not have otherwise found.
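To make the statistical-model variant concrete, here is a minimal Python sketch of a pricing war game under an entirely hypothetical scenario: one role cuts prices by 10 percent, and another role plays the competitor choosing a response. Every number below (starting market share, response probabilities, share swings) is an illustrative assumption, not data from the book or any real engagement.

```python
import random
import statistics

def simulate_round(rng: random.Random) -> float:
    """Play one round of a hypothetical pricing war game; return our market share."""
    our_share = 0.30  # assumed starting market share

    # The "competitor" role reacts to our 10 percent price cut.
    # These response probabilities are assumptions for illustration.
    response = rng.choices(
        ["match", "undercut", "hold"], weights=[0.5, 0.3, 0.2]
    )[0]

    if response == "match":
        our_share += rng.uniform(-0.02, 0.02)   # gains mostly cancel out
    elif response == "undercut":
        our_share -= rng.uniform(0.03, 0.08)    # a price war costs us share
    else:  # "hold"
        our_share += rng.uniform(0.04, 0.10)    # we win price-sensitive buyers

    return min(max(our_share, 0.0), 1.0)

rng = random.Random(42)  # fixed seed so the game is reproducible
outcomes = [simulate_round(rng) for _ in range(10_000)]
print(f"mean market share: {statistics.mean(outcomes):.3f}")
print(f"5th-percentile (downside) share: {statistics.quantiles(outcomes, n=20)[0]:.3f}")
```

Running thousands of rounds and examining the downside percentile, rather than a single forecast, is what separates this framing from ordinary planning: leadership is forced to confront the full range of competitor responses instead of the one it expects.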

Cyber security in the private sector also benefits greatly from red teaming. The potential cost of a breach caused by poor cyber security can be enormous. Most cyber breaches can be prevented or mitigated with simple best practices, but hackers are always developing new attack methods. Red teams are used to conduct penetration tests, but the success of these tests often hinges on how willing the target organization is to participate. Organizations that are compelled to test by regulation tend to narrow the scope so far that the tests are of little use. When an organization is genuinely aware of cyber security risks, however, the tests it requests tend to be far more exhaustive and yield better results. Cyber penetration tests are becoming a widespread business practice, and are even mandatory in some industries.
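As a rough illustration of where such a test often begins, the Python sketch below performs the simplest form of vulnerability probe: checking a host for open TCP ports. The target host and port list are placeholders, and a real engagement would run only against systems the tester is explicitly authorized to probe.

```python
import socket

# Placeholder target: scanme.nmap.org is a host whose operators permit
# benign scanning. Never probe systems without explicit authorization.
TARGET = "scanme.nmap.org"
COMMON_PORTS = [21, 22, 25, 80, 110, 143, 443, 3306, 8080]

def scan(host, ports, timeout=1.0):
    """Return the subset of `ports` that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 when the TCP handshake succeeds
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    print(f"Open ports on {TARGET}: {scan(TARGET, COMMON_PORTS)}")
```

An open port is only the starting point; the scope questions raised above determine whether the red team may go further, fingerprinting the services behind those ports and attempting to exploit them.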

Similarly, physical penetration tests help red teams determine how difficult it actually is to break into a building. Most companies spend only the minimum amount necessary on physical security measures. While physical security may seem easier to maintain than cyber security, studies have shown there is much less understanding in the private sector of what constitutes an effective, integrated security plan. Legal requirements and industry standards both make physical penetration tests necessary for organizations, and they are carried out in much the same way as their cyber counterparts.

MODESTY, MISIMPRESSIONS, AND THE FUTURE OF RED TEAMING

Red team projects usually cannot be classified as complete successes or failures. However, red teaming always uncovers insights that an organization would not have discovered through an internal investigation alone. Even when red teaming fails, it reveals something new about the values and thought processes of the examined institution. When an organization introduces red teaming, it may fall victim to numerous misuses and misimpressions, such as ad hoc approaches, freelance red teaming, shooting the messenger, mistaking red teams for policymakers, and using red teams to make rather than inform decisions.

The success of red teaming depends on how it is valued by an organization’s leadership, and whether that leadership is willing to accept and act on whatever it uncovers. Its effectiveness as an awareness and management tool should become more widespread with use. It can be a powerful device, yielding great benefit to those who are willing to learn from it.
