A red team is an internal group that explicitly challenges a company's strategy, products, and preconceived notions. It frames a problem from the perspective of an adversary or sceptic in order to find gaps in plans and avoid blunders. Red teams are one way to manage the biggest corporate risk of all: thoughtlessness.
The term red team comes from the Cold War practice of having US officers take a Soviet, ie "red", perspective. (Moscow did the same thing and called it a "blue team.") These officers would "think red", attempting to defeat US plans and systems rather than mirror-imaging US thinking onto the Soviets. For example, the US Navy used a red team to try to defeat its own submarine force using Soviet concepts and technology.
Today, red teams are used to double-check important assumptions and overcome groupthink. The CIA used a red team to challenge both the intelligence behind and the plan for the attack on Osama bin Laden in Pakistan. An insurance company hires plaintiffs' attorneys to critique its contracts before they are issued. Following the collapse of Arthur Andersen, Enron's auditor, the Big Four accounting firms established what were, in effect, ambassadors in Washington: senior partners whose job it was to think like a regulatory agency, Congress, or some other branch of government. In the UK, the Department for Work and Pensions uses a red team to monitor welfare reforms.
The common feature in all of these examples is an adversary's or sceptic's outlook taken on by an independent group. This shift in perspective recognises the powerful psychological force in all organisations that discourages challenging the way problems are framed – something that can lead to disaster. Arthur Andersen decentralised risk management to its Houston, Texas, ie Enron, office. Its Chicago headquarters had no independent view of the risks the firm was taking on, nor did it appreciate the depth of the reaction of regulators and politicians to Enron's collapse.
A red team is especially useful for reviewing decisions of large scale and complexity. The momentum needed to launch such projects can create a feeling that team loyalty requires supporting them, and the tendency to get lost in the many details leads people to overlook project risks as a whole. The Segway (the two-wheeled personal transporter) and Windows 8 would both have benefited greatly from red-team review. Red teams are also useful for breaking out of the most dangerous bubble of all: the tendency of senior management to become detached from field operations (as Arthur Andersen's was) and the realities of global business.
Red teams are very useful at the tactical level as well. Technology giants such as Microsoft and Apple use red teams to try to hack their own software, knowing that if they relied on the software's own developers to judge its security, many holes and vulnerabilities would be overlooked.
Despite their value, red teams are difficult to use well. They need to be more than window dressing – a mere rubber stamp for decisions already made. Another problem is that a red team may not truly capture the thinking of an adversary: team members may simply impute the way they would frame a problem to a competitor. Many US firms find it difficult to understand how Chinese companies think; one sees this in the legalistic way US firms approach intellectual property disputes. Red teams that can genuinely "think Chinese" are necessary in today's global markets.
Getting the right people on a red team is important. Creativity, trust, and good communication skills are vital. So are clear lines of authority, to ensure that the red team does not report to a department with a clear interest in promoting the strategy under review.
A final point should be kept in mind: "free" red teaming may come from government bodies. The US Congress may launch hearings into corporate malfeasance and call in critics to testify. The Justice Department, the SEC, or the Serious Fraud Office is more than willing to "red team" corporate corruption, with enormous damage to a company's reputation. So one final reason to use a red team is that if a company doesn't, the authorities may be only too happy to do it for them.