'Red Team: How to Succeed By Thinking Like the Enemy'

Tuesday, November 10, 2015
Speaker
Micah Zenko

Senior Fellow, Council on Foreign Relations

Presider
James M. Lindsay

Senior Vice President, Director of Studies, and Maurice R. Greenberg Chair, Council on Foreign Relations

In Red Team, CFR Senior Fellow Micah Zenko draws on little-known case studies and unprecedented access to red teamers to reveal the best practices, common pitfalls, and winning strategies of these modern-day Devil’s Advocates. The book shows how policymakers, business leaders, and curious minds alike can succeed by thinking like the enemy.

The CFR Fellows’ Book Launch series highlights new books by CFR fellows. It includes a discussion with the author, cocktail reception, and book signing.

LINDSAY: Good evening, everyone. On behalf of Richard Haass, the president of the Council on Foreign Relations, I want to welcome you all here this evening. We thank you for coming, particularly since the weather outside is a little bit dreary. I am Jim Lindsay. I am the director of studies here at the Council on Foreign Relations. And that means I oversee the work of the Council’s in-house think tank. I also want to welcome everyone who’s joining us from the Internet as we livestream tonight’s event.

I think whether you’re here in this room or watching over the Internet, you’re in for a real treat. The reason for that is tonight’s special guest, Micah Zenko. He is a real talent. And it is both my honor and my pleasure to get to introduce Micah. Here’s where he can’t rebut me; he has to sit and be quiet. Micah is a senior fellow here at the Council on Foreign Relations. I think many of you know him as one of the best young national security scholars in the country, particularly for his expertise on the use of armed drones. He’s the author of an outstanding blog, “Politics, Power, and Preventive Action.” You can also follow him on Twitter. He’s very active, @MicahZenko. So it’s pretty straightforward on that score.

But we’re not here tonight to talk about Micah’s blog or his latest tweet, but rather to talk about the publication of his terrific new book. And here’s where we have to go with a prop. “Red Team: How to Succeed By Thinking Like the Enemy.” It is a terrific book and I am not alone in that judgment. Publishers Weekly wrote of it, and I’m quoting here, “Zenko explains in absorbing detail the value of red teams and offers readers much to consider.” Booklist decided to go for the succinct summary. It simply says: an excellent book. So please join me in welcoming Micah Zenko. (Applause.)

ZENKO: Thank you. Thank you so much.

LINDSAY: OK, Micah, terrific book. I know you’re relieved it’s done. Now you get to enjoy the fun part of it. Micah spent an awful lot of hours in his office working on the book. But why don’t we sort of begin with basics. What is red teaming?

ZENKO: I like to think of red teaming as a few things. It’s a mindset, an approach, and a series of specific tactics and techniques. The mindset—the theme of the book—is you can’t grade your own homework. And institutions are the least likely to identify their own blind spots. They’re the least likely to really challenge the assumptions of their strategies and plans. And they have a great deal of difficulty really understanding the intentions and capabilities of their adversaries. People show up to work every day, they do what they do, they become constrained by the limits of the institution. So they have trouble conceiving of doing what they’re doing differently. So once you recognize that and you have that humility, you might be interested in trying things like red teaming.

Specific techniques that I go through in the book are simulations, alternative analysis, and vulnerability probes. The last one is the one people are greatly interested in now because of the issues of airline security. These are people from TSA and the Department of Homeland Security who covertly smuggle fake explosives and guns through commercial airline screenings. Recently, they succeeded in 67 out of 70 attempts at six different airports. Now, the reason you do it isn’t to embarrass the security staff, but to prove that you can, and then to provide corrective measures for how to change your architecture to protect yourself.
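
A rough illustration of the bookkeeping such a probe program implies: the sketch below is hypothetical Python, not a TSA or DHS tool, and only the 67-of-70 aggregate comes from the remarks above; the per-airport split is invented for the example. It tallies covert test results and computes the evasion rate, which for 67 undetected out of 70 attempts works out to roughly 96 percent.

```python
# Illustrative sketch only (hypothetical code, not a TSA/DHS tool).
# Tallies covert smuggling-test results per checkpoint and computes the
# overall evasion rate. Only the 67-of-70 aggregate comes from the talk;
# the per-checkpoint split below is invented for the example.
from collections import defaultdict

def evasion_rate(attempts):
    """attempts: iterable of (checkpoint, detected) tuples."""
    per_checkpoint = defaultdict(lambda: [0, 0])   # checkpoint -> [tries, caught]
    for checkpoint, detected in attempts:
        per_checkpoint[checkpoint][0] += 1
        per_checkpoint[checkpoint][1] += int(detected)
    tries = sum(t for t, _ in per_checkpoint.values())
    caught = sum(c for _, c in per_checkpoint.values())
    return per_checkpoint, (tries - caught) / tries

if __name__ == "__main__":
    # Hypothetical split of the cited 70 attempts across six airports (A-F),
    # with only 3 detections in total.
    attempts = []
    for airport, (tries, caught) in {
        "A": (12, 1), "B": (12, 0), "C": (12, 1),
        "D": (12, 0), "E": (11, 1), "F": (11, 0),
    }.items():
        attempts += [(airport, i < caught) for i in range(tries)]
    _, rate = evasion_rate(attempts)
    print(f"Overall evasion rate: {rate:.0%}")   # -> 96%
```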

LINDSAY: OK. So this seems like a no-brainer, that you’d want to know your vulnerabilities, whether you’re a government, a private organization, particularly if you’re in the private sector because if you’re vulnerable, if that vulnerability is exploited you could lose an awful lot of money. So I guess the question is, why isn’t it commonly done?

ZENKO: It’s not a core business practice if you’re in the private sector. It is ancillary to what you do routinely. It is also slightly expensive. If you hire people to break into your computer networks, they might cost $20,000. If you’re, for example, a pharmaceutical company and you have a drug going off patent and you want to conceive of all the ways that your competitors, the market, and regulators will respond, you might hire an external consultant—what they call a business war gamer. This is somebody who makes you rigorously think through all the different ways these competitors will respond to your actions. So it’s not a core business practice, and it costs a little money.

But more importantly, it can be demoralizing and it can be perceived to put unit cohesion at risk. For example, if you’re in a command staff in the military, it’s a hierarchical institution. You work together 16, 18, 20 hours a day. You are mission focused. You know, when I talk to lots of majors, lieutenant colonels, and colonels, they’ll tell you that when you’re at headquarters, you are on mission lock. Your job is to achieve the objective, to succeed. If you bring in dissenting or challenging viewpoints, you might not just put the mission at risk; you’re challenging the thinking of everyone you’re next to 16, 18, 20 hours a day. That’s very difficult to do. So we assume people can’t do that, and that’s why you might use a red team.

LINDSAY: So what are the keys to doing red teaming well?

ZENKO: Well, first, this is the whole second chapter of the book, which really—you know, this book consisted of talking to people who red team. I interviewed over 200 people in a range of different fields, because it’s ultimately a social phenomenon. It’s not something you can just read about and try to understand from a distance through anecdotal evidence and case studies. I talked to people. And in the course of talking to them, there were six best practices that really emerged. If you’re interested in all six, take a look at the book, but I’ll just give the three here which I think are the most important.

The first one is, the boss must buy in. If the senior leader doesn’t care about the red team and signal that they care down the chain of command, the red team will not get the resources, it will not get the access, and it will not produce any product that anybody listens to. So that’s the critical thing.

The second thing is the red team has to be situated correctly relative to the target institution that it’s red teaming. If they are put in, as a Marine Corps red teamer described it to me, like the children at the kids’ table—put in a corner, in a separate room, in a separate building, not really listened to—they’re not going to be effective. Similarly, though, if you’re too ingrained within the target institution, you become captured by the institutional culture. And then you’re just saluting and trying to execute the mission. So it has to be situated just right, and that’s very difficult.

The final thing is you have to be willing to listen to the bad news and do something with it. It is quite routine—you know, the most interesting people I talked to for the book are people who break into things for a living. These are people who hack, who break into buildings. They never fail. And they do it, in a way, to demonstrate how easy it is to do what they do. They wear GoPro cameras. They get in through the smoking door. They pretend they’re there for an interview. They pretend that they’re delivering something. You know, and they always break in. They never fail.

The point of it is not to embarrass or to humiliate, but to show how easily they could do it. Once they’ve proven it, the institution—the security team and the IT staff—has to be willing to say, we have something wrong and we are going to change what we’re doing. And that’s what the red team provides, a series of corrective measures in a prioritized way that you can then act upon. So those three things I think are the three critical best practices.
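
As a purely illustrative sketch of “corrective measures in a prioritized way”—the fields, findings, and scoring below are hypothetical, not drawn from the book—a red team’s report might rank fixes so that high-severity, low-cost items surface first:

```python
# Illustrative sketch only: one way a red team might prioritize its findings.
# The fields and example findings are hypothetical, not from Zenko's book.
from dataclasses import dataclass

@dataclass
class Finding:
    title: str
    severity: int          # 1 (low) to 5 (critical)
    remediation_cost: int  # rough effort, 1 (cheap) to 5 (expensive)

    @property
    def priority(self) -> float:
        # Favor high-severity, low-cost fixes.
        return self.severity / self.remediation_cost

findings = [
    Finding("Badge reader accepts cloned cards", severity=5, remediation_cost=2),
    Finding("Smoking-door tailgating", severity=4, remediation_cost=1),
    Finding("Flat internal network, no segmentation", severity=5, remediation_cost=5),
]

# Report the corrective measures in priority order.
for f in sorted(findings, key=lambda f: f.priority, reverse=True):
    print(f"{f.priority:.1f}  {f.title}")
```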

LINDSAY: OK. So you’ve talked about using red teaming, particularly in trying to detect vulnerability to cyberattacks. But outside of cyber, who else does red teaming really well?

ZENKO: So one of the groups that I found really useful—and this is on the simulation type of red teaming—was the NYPD Commissioner’s Office. The NYPD commissioner conducts what are called tabletop exercises. They’re done at 1 Police Plaza. They consist of all the two-stars who are the regional commanders for Brooklyn South, Brooklyn North. They consist of the Bomb Squad, the Detective Bureau, everybody else—FBI, Joint Terrorism Task Force. And it’s about 40 or 50 people who meet around a room and are then given a series of fictional scenarios for an upcoming scheduled event.

So I was there just about 10 days ago for the New York City Marathon preparation. And there were members of New York Road Runners there. And they went through a series of situations like, for example, a drone shows up over the Verrazano-Narrows Bridge. Why is it there? And another one shows up. It looks like it’s carrying a package. What do you do? And people have to respond plausibly with the resources they have at the time. Or at mile 16, when people come into Manhattan and then turn north up First Avenue, what if between 100 and 200 of them suddenly got dizzy and fainted and started throwing up? Did they drink tainted water, or is it just an aberration—a cluster of people doing it? And if they drank tainted water, where would you find it?

And what comes out of these ultimately is people are pressed to come up with realistic responses with the resources at hand. And they learn collectively how each other will respond. So it was fascinating to see how the NYPD learned how New York Road Runners is prepared for a ton of contingencies. For example, the race can be diverted at any location. They have alternate finish lines for both the elite runners and everybody else. And these people who had been protecting the marathon for years didn’t know about them. So it was only as a result of that red team simulation that they learned how to better protect the marathon.
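
For readers who want a feel for the mechanics, here is a minimal, hypothetical sketch of how a facilitator’s script of timed “injects” for such a tabletop exercise might be structured; the prompts paraphrase the marathon examples above, and the code is illustrative, not an NYPD tool:

```python
# Illustrative sketch only: a facilitator's script of timed scenario "injects"
# for a tabletop exercise, paraphrasing the marathon examples above.
from dataclasses import dataclass, field

@dataclass
class Inject:
    time: str              # point in the exercise timeline
    prompt: str            # the situation read to the room
    responses: list = field(default_factory=list)

exercise = [
    Inject("Start", "A drone appears over the Verrazano-Narrows Bridge; a second "
                    "one arrives carrying what looks like a package. What do you do?"),
    Inject("Mile 16", "100-200 runners on First Avenue suddenly feel dizzy and faint. "
                      "Tainted water or coincidence? Where would you check?"),
]

for inject in exercise:
    print(f"[{inject.time}] {inject.prompt}")
    # In a live exercise the facilitator records each agency's answer here.
    inject.responses.append("(response captured from the room)")
```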

LINDSAY: Are there any risks to doing red teaming? Are there any times when you shouldn’t red team? And I ask, in part, because if you get really smart people to sit around and think about vulnerabilities, isn’t there a risk they will point out vulnerabilities that maybe you really shouldn’t be worrying about—that they could focus on the extreme worst-case scenario as opposed to the more mundane ones?

ZENKO: So this is a constant worry of security researchers. These are people who do ethical, responsible disclosures of software vulnerabilities. These are hackers who, you know, literally break into everything via the Internet of Things. The attack surface is just growing exponentially. And if you go to the security conferences and you understand how they do it, it’s actually not that difficult. So people say, well, if you’re proving this, aren’t you giving a road map to the bad guys? But if you do it responsibly, you talk to the person who makes the software, you talk to the car manufacturer, they send out a patch, and then they get to brag and show off what they did. So that’s the responsible way to do it.
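
The “responsible way” described here follows the familiar coordinated-disclosure pattern: report privately, let the vendor patch, then publish. A minimal sketch follows, assuming the common (but not universal) 90-day disclosure window, which is an industry convention rather than anything specified in the talk:

```python
# Illustrative sketch only: a coordinated-disclosure checklist in the spirit of
# the "responsible way" described above. The 90-day window is a common industry
# convention, not a figure from the talk.
from datetime import date, timedelta

DISCLOSURE_WINDOW = timedelta(days=90)

STAGES = [
    "Report the vulnerability privately to the vendor",
    "Vendor acknowledges and develops a patch",
    "Patch ships to users",
    "Publish the advisory (and, yes, brag a little)",
]

def publication_date(reported_on: date) -> date:
    """Earliest date to publish if the vendor never responds."""
    return reported_on + DISCLOSURE_WINDOW

if __name__ == "__main__":
    reported = date(2015, 11, 10)  # the date of this event, used purely as an example
    for step, stage in enumerate(STAGES, start=1):
        print(f"{step}. {stage}")
    print("Publish no earlier than:", publication_date(reported))
```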

The irresponsible way to do it is just to put on the Internet some zero-day exploit—or, not a zero-day anymore—but some piece of malware and say this is how you could use it. The other situation where you shouldn’t do it is if you’re not going to do anything with the findings. So if you empanel a red team and they do a serious scrub of your strategy, or your plans, or processes, and they come up with something outside of the box for how you should change things, and the senior vice president just puts it in the inbox and doesn’t listen to it, well, now people recognize that this is a boss who isn’t going to listen to dissenting viewpoints or things that challenge them. And subsequently, it demoralizes the workforce.

So I always say, don’t ask anybody to red team unless you’re interested in what they have to say. Red teams are never determinative. They provide additional analytical insights that you might not find in your institution, but they help decision makers and leaders think through problems.

LINDSAY: OK, you talked to dozens of people in putting the book together. Your favorite story?

ZENKO: So my favorite story is in many ways the most heartbreaking story of the book. It’s about a gentleman who became a whistleblower. He was the head of the FAA Red Team. His name is Bogdan Dzakovic. And the FAA Red Team was set up basically after the Lockerbie bombing. And in the mid-1990s they were conducting relatively low-level covert smuggling tests to test personnel, procedures, and screening technology.

And what they found was really disheartening. I mean, they essentially, using really rudimentary tactics with just a small amount of surveillance, broke into everywhere they wanted to break into. They captured it very specifically. They reported it up the chain of command to the individuals who could have done something with it. And they found out that nothing was being done. Basically, at the time the FAA could send a letter of correction or impose a small fine on a commercial airline. And then, through a series of arbitrations, those fines would get knocked down to almost nothing.

And over time they basically saw security vulnerabilities everywhere. And one of the more tragic events was just a few months before 9/11, out of Logan International Airport, with a local Fox News crew, they did a covert smuggling test to show how they could get things through. And they were sending this to everybody and they were telling everybody. And after 9/11, Bogdan became a whistleblower. And he said, this was a clear and present danger to safety. And the Office of Special Counsel agreed, and he actually won his whistleblower case.

So this was a case where red teaming was being done correctly, faithfully, with the vulnerabilities being reported to the right people, and airline security was not really improved. Even if they had acted on it, they might not have prevented 9/11, but it certainly would have raised awareness of the threats and improved the security culture.

LINDSAY: OK. At this point, I want to bring the rest of the room into the conversation. I want to remind everybody this is on the record. I’m going to ask you to please wait for the microphone to come to you. I’m going to ask you to speak directly into it, put it very close to your mouth, stand while you do so, and state your name and your affiliation. So who wants to go first? Mitzi in the front row.

Q: I’m Mitzi Wertheim with the Naval Postgraduate School.

I was lucky enough to join the Defense Department 38 years ago, but I’d started at the Peace Corps. And actually, when I went to the Defense Department, I said: You need Peace Corps skills. You need to understand the others. Red teaming has been going on for a long time. But one of the things I’m aware of is changing the way people think, feel, behave, and believe is the most difficult thing to do. And if you don’t have a personnel system that rewards people for identifying the things that you need to fix, it doesn’t matter. And nobody pays attention to personnel systems to figure this out. So how would you deal with this?

ZENKO: That’s a great question. And you know, one of the things you realize when you talk to people about red teaming is they always say, well, I have somebody who works at my office, or in my command, or at my agency, and they’re kind of a maverick. So we sort of red team. You know, this person—this man or this woman—is kind of a pain in the ass, and it’s a good thing. But the truth is that mavericks get hunted down and killed. And the reason is that they pose a risk to the institution. And the fact that in every institution people can identify the one or maybe two people out of hundreds sort of demonstrates how rare they really are.

The Army, for example, recognizes this problem significantly. It is a big problem. And in fact, at the Army War College they do a survey of every colonel who becomes a general officer, who becomes a one-star. There’s a national survey they’ve been giving out for decades, called the Openness to New Ideas Survey. And the Army colonels who become generals are the least open to new ideas relative to the American population. That’s not by accident, right, because in peacetime you want individuals who know what they’re doing, know doctrine, have a steady set of hands, and can implement. But they’re going to be the least likely to see things in alternative and divergent ways.

They’re trying to train people to do this. And I went out to Fort Leavenworth, where there is a red team university. I’ve been out there five times. I took the two-week short course. It’s a really remarkable thing to have people read constructivist literature, read Benedict Anderson, go to the Nelson-Atkins Museum, where they make them think about Mark Rothko color field paintings. (Laughter.) You know, watching that is kind of a fascinating exercise. But when they become the O-5s, the O-6s, the general officers, those people who truly think differently tend to either leave or get winnowed out. And how you promote and protect your mavericks is one of the things that a lot of institutions think about.

Q: Do they have solutions?

ZENKO: So you either have a senior sponsor who identifies you and puts a tarp over you and hides you and protects you. (Laughter.) Or there’s the famous case in the Army, as many of you know—H.R. McMaster, now a three-star, who is the maverick of the Army. Everybody knows H.R. And he is now in a critical position to find his own mavericks and to promote and protect them. But they’re really rare individuals. But it’s a great point.

LINDSAY: In the back. Mr. Brake.

Q: Ben Brake. I’m at Department of State, INR Cyber.

Following up on Mitzi’s question, I think another way—short of having dedicated red teams or mavericks, would be more inclusive workspaces and diversity in some of these organizations. And I was curious to know if any of your research touched on that issue.

ZENKO: Well, diversity is a great thing in terms of a lot of perspectives, but the problem is people who work together every day tend to suddenly think alike. This is demonstrated in tons of sociological experiments. They tend to think alike after time because the command climate signals it, the values that are transmitted down become apparent to everybody. Institutional culture, everywhere you work, it’s in the walls. You know what it is. It’s tacit. It’s sometimes hard to identify, but we all are constrained by it every day, whether you know it or not. As I always say, nobody shows up to work every day and says, what are we going to do today? They don’t start anew with a new plan to start doing things. There are deeply ingrained SOPs and culture and values, and we all know what they are.

So the longer people are together, the less diverse they become. That’s one problem. The second one is we know that, especially in hierarchical institutions, people are unwilling to voice up. They don’t challenge. They don’t provide dissenting viewpoints for several reasons. Some people think they’re going to be retaliated against, but mostly they think it’s pointless. They recognize there’s really no reason to. Things aren’t really going to change, so I’m not going to speak my mind anyway. So it is useful to promote diversity. It is useful to have mavericks. But over time, they become diminished significantly. And I think red teams are one way to have a semi-independent look at things once you recognize that.

LINDSAY: Gentleman here in the front.

Q: Thank you. Christopher Graves, Ogilvy.

Is there an educational or pedagogical approach that delivers more effective red team thinking? And what’s the most effective red team thinking you’ve seen in the private sector?

ZENKO: So interestingly enough—I’ll answer the second question—the private sector is the most secretive sector of any I sort of surveyed. Intelligence community, actually very open. Military, more open than anybody, especially once you’re at the O-6 level and above. You know, Peter Schoomaker, the wonderful general, you know, a former four-star, like, once you shed a little ketchup, like, you’re just not afraid to speak anymore. And they will help you think through some of your problems. There’s a former great hacker, his name is Mudge. If you know Mudge, he’s sort of a legend. Mudge was at DARPA, a senior official for many years. Then he went to Google. And he said, I hated going to Google because I suddenly couldn’t talk to people. And he was, like, the head hacker at DARPA for many years. (Laughter.)

So it’s really hard to know what happens in the private sector because consultants only tell stories of success. And when they’re not successful, the fault always lies with the business that they’re helping. So it’s very hard to know. There are some people, though, I would say—I became, as part of this process, a certified business war gamer, if you need this help. (Laughter.) I took the two-day course. I learned how to do it. There are people who are really good at that. A couple of them I feature in the book. But the private sector is the hardest to know. But I agree.

Teaching it is a very interesting thing. So there is now a sort of wide-scale effort to teach red teaming in a massive way, both at Quantico with the Marine Corps and in the Army at Fort Leavenworth. It’s hard to teach people at an industrial scale to be different, right? And the initial goal, actually, at Leavenworth was—these were people who came out of the Iraq War, people who planned the ground campaign in the Iraq War. As they like to say, I was on mission lock. I thought we were going to take down Saddam Hussein and then the Iraqi military was going to take over.

We had no idea what to think about if they didn’t—you know, we didn’t pre-mortem our plan. We didn’t really challenge the assumptions. Everybody who told us something bad, we dismissed, and we listened to our chain of command. That’s what they’ll tell you. So they created this place at Leavenworth. Initially they wanted to change the minds of all officers, but you can’t do that, right? So now they’re trying to train specific red teamers. And so there’s a two-week course, a nine-week course, and a sixteen-week course.

And I think they do about as good a job as anybody teaching it. And by the way, if you’re interested, just Google it—the University of Foreign Military and Cultural Studies, Red Team University. They have Volume Seven, the Applied Critical Thinking Handbook. Anybody can find it. Anybody can read how they teach this. It’s not a secret. So they’re about as good as it gets at that, where there’s public information.

LINDSAY: Gentleman here in the front.

Q: All right. I’m James Turner for the Daniel Alexander Payne Community Development Corporation.

Do you have any lessons learned or things you can share with us to impose some discipline on red teams? In other words, even though the boss might be in favor of it, don’t come back with, you know, a thousand findings. Come back with something I can do something with, like, you know, 10 or 20 critical things that would really help the problem that I can get my arms around?

ZENKO: So the way you do this effectively is through the initial scoping conversation you’ll have with the targeted institution. You know, people use this term, the seagull: you come in at the last minute, you go to the bathroom on the plan, and then you fly away. That’s not helpful, right? You need to be there at the beginning to understand plausibly what they can do with your information. Like, if you’re trying to help improve somebody’s computer network architecture, and they don’t have many resources, don’t tell them about the most advanced piece of malware used to break in.

Help them with what they can do with the plausible amount of money they have, because one of the things we learn about IT security is most of it’s spent really poorly. It’s not a matter of more resources. Most of it is being used to buy things that big-name companies put out, but they’re actually not the most effective. So that’s one thing. The other thing is, the final product has to be, I would say, prioritized. It has to have specific, concrete recommendations. And it has to have a timeline for when you can do it, right? So if a red team comes back, right, with a long list of things that you can’t plausibly do, and it’s too late to really make the changes, that’s not helpful.

But most leaders—senior vice presidents, senior officials in the intelligence community or in business—will say: Tell me the three things I really need to do and why. So that’s the homework, and then the recommendations at the end. The other thing is, there are certain people in red teams who should never meet with the target institution. Red teamers are strange, almost universally. They often behave poorly with others. (Laughter.) They’re either too young and they don’t know any better, or they’re too old and they just don’t care anymore, right?

People always say the best red teamer is the terminal colonel. This is the individual who’s not going for general officer, they know everything about the institution and the issues, they’re not looking to impress or get promoted, so they’re all of a sudden—they’re as honest as ever. But that person might not be able to speak in a way that is sensitive and, I would say, consistent with the needs of the target institution. So sometimes there’s a buffer and an intermediary there.

Q: Thanks. Elisa Massimino with Human Rights First.

So I run a human rights advocacy organization. And the reason I came tonight is because I think we have a chronic problem in my community, if you will, of failure to think about what the other side, whatever it is, on an issue is thinking and why. And I just wonder if you could talk about applications of this concept to advocacy work.

ZENKO: So it’s funny, because I’ve actually been in touch with colleagues of yours who say, we want to red team our advocacy issue, or we want to red team philanthropy, for example. That’s one that a lot of people raise: we don’t really know how to demonstrate impact. We have trouble measuring it. We’re doing the same things over and over again because they get funded well and our board seems to sign off on them.

And that’s a big challenge. I would say, when it comes to understanding your adversaries, on your own you won’t do it, right? You will not show up in a conference room one day and say, let’s think hard, right? People at Leavenworth like to say that if you just show up—as the acronym BOGSAT goes, a bunch of guys or gals sitting around the table—and say we’re going to think hard, it’s like riding a stationary bicycle. You will go nowhere, because everybody already has the same culture ingrained, there’s groupthink issues. If the boss is in the room, there’s hierarchy issues right away.

So there are a series of tips and techniques. They’re called liberating structures. You can go online and find them. You can hire people. They cost relatively little money; they take some time to talk to people in the institution; they run really focused, structured brainstorming sessions, which will help you break out of your groupthink problems. That’s one way to do it. And I’d say in many ways it’s the most effective.

But I always like to say, it’s easier to red team your enemies than it is to red team yourself. Red teaming what the adversaries are thinking about is actually not that hard. Red teaming your own institution puts at risk you and everyone you work with. It’s also easier to red team operational issues—like, what can the enemy do—than it is to red team ideational, which is let’s really rethink this strategy or this plan or what methods and processes we have in place. That’s actually harder. But you should grab me later, because I’ll have more ideas for you, please.

Q: You won’t share them with the rest of us? (Laughter.)

ZENKO: It’s too long a conversation, there’s wine, people, you know. (Laughter.)

LINDSAY: I’m going to come down here in the front.

Q: Hi. Ray Kokolski (sp), former naval officer and government worker.

I was thinking about the government procurement process and how red team would work there, because your first idea is you have to go to the top. Well, where is the top, because if you stay in the government you have to go to Congress. So at what level would you attack it? Is it doable at all?

ZENKO: So interestingly enough, the first-ever historical example of red teaming we could find was in 1963, and it had to do with a Pentagon procurement issue. There was a political columnist who described going to visit Secretary McNamara’s office. And the columnist says: Secretary McNamara is running a strange game backwards. And he calls his red team his devil’s advocates. The concept of the red team, as best I can determine, was invented just a few years earlier—the “red” coming from the Soviet Red Army, and from thinking about NATO and the U.S. and the different ways you would defend against it. But actually, that was the first-ever example of red teaming.

And actually a lot of big procurement decisions, as you know, get red teamed. When the aerial refueling tanker or the long-range strike bomber gets announced, immediately somebody appeals to the GAO, or to the Pentagon OIG, to sort of do a red team assessment. But in many ways, they’re just checking to see that due diligence was done. It’s not a real sort of red team rethinking of the issue of why do we need a long-range strike bomber? I mean, that’s what a red team would really do. What are the missions and capabilities and roles and responsibilities that lead you to this significant procurement decision?

I don’t see a lot of that happening, but it could certainly be done. I would have Congress demand one. You give them about $5 million. It takes about a year. And they could do a realistic assessment. That’s how I would pitch it.

LINDSAY: Let me ask you a question to follow up. You’ve drawn this sketch of the maverick who’s going to help improve the organization. But is it also possible for some people to be cranks? And how does your senior vice president know how to distinguish between the maverick who is bringing smart advice and the crank who’s going to steer them wrong? Because I was particularly struck, you talk about the selection bias problem, particularly in the private sector, where you hear about the successes but no one tells you about the advice you got from the red teaming that took you down the wrong avenue. So how do you puzzle through that, Micah?

ZENKO: Well, again, you have to understand, why do you have the red team, what do you want them to do? The crank by definition doesn’t have actionable, useful advice, usually. The crank knows what’s wrong, but doesn’t understand the series of pressures, the limited time, the limited resources that a decision maker has to deal with to make a final decision, a final choice. But, again, the initial scoping conversations are so critical. And many red teamers describe them as being like a therapy session. They say, especially in the business world—it’s quite remarkable—they say, like: Tell me, what is your most important thing? Is it to protect your data? Is it to get quarterly market share? Is it quarterly earnings? Is it reputational? Do you not want to lose your job, right?

And you find these leaders don’t really know what their most important thing is. And once you have identified the most important thing, that’s the thing the red team should spend most of its time working on. What is it that’s most critical to you? And the crank is going to be less likely to have that sort of therapeutic, I would say, conversation. They’re just going to come in with a problem and very few practical solutions.

LINDSAY: I think the gentleman over there has a question.

Q: Hello. Tomas Bilbao with the Cuba Study Group.

In your 200 interviews, did you find that there were any cultures, whether national cultures or institutional cultures, that are more open or receptive to the type of feedback a red team would provide? In other words, more willing to be introspective or self-critical?

ZENKO: I would say, interestingly enough, in the military, in a command staff, they tend to be very open to it. The problem is the people, the more junior staff officers, are often the least willing to give it. You know, one of the lieutenant colonel Marines I spent a lot of time interviewing, who now teaches red teaming at Quantico at Marine Corps University, he said the way I got promoted in every job was to read my boss’s mind, and then act upon what he wanted before he asked me.

And it was quite interesting, because then I talked to lots of senior leaders, colonels and above who lead command staffs, and they beg people to come in and tell them what’s wrong. They want to know it. So how do you signal down, through command climate and through protecting individuals, that you value what they have to say? You won’t be retaliated against. It won’t be pointless. I might not do anything with the information you tell me, but I welcome it. And that’s a difficult thing to demonstrate.

Q: Did you at all look at the Israeli military? The “Start-Up Nation” book talks a lot about how the Israeli armed forces reinforce and almost (horde ?) that type of question.

LINDSAY: Tomas, you asked the right gentleman. And now he has an answer for you.

ZENKO: Well, there is a unit which was very hard to learn about, except from some former Israeli military and intelligence officials. It’s called Ipcha Mistabra, which means “the opposite is true” in Hebrew. They are directed to come to the opposite conclusion of whatever the current plan or conventional wisdom is. They’re in the military intelligence component of the IDF, and they don’t just brief generals. They go to parliament. They brief the prime minister’s office and the prime minister’s Cabinet. They describe their jobs—one of the individuals I know did the briefings—as exhausting. You have to essentially be argumentative by design. You have to challenge and doubt everything that happens.

And this is the classic, in the Vatican sense—this is the opening of the book—the devil’s advocate. This is the person empowered to challenge and question every fact that came up before someone came up for sainthood. It existed for over a thousand years. And some of these cases for canonization lasted decades because they just found more, and more, and more damaging, harmful information. It’s hard to do that every day. And you have to do it in a way, again, that is sensitive, that doesn’t get you shut out, so they don’t stop listening to you. But it does exist in the Israeli military. And the people who do those jobs don’t last more than two or three years. That’s about as long as they can do it.

And I’d say one other point, which is—gets to this issue of red teaming. There’s a temporal aspect to it, which is people shouldn’t red team forever. People should serve on red teams for short periods of time. It tends to change their perspective. They suddenly see things differently. They feel more empowered to speak up. But if you’re a red teamer forever, if you don’t go back to blue or you don’t go back within the institution, you probably forgot what the institution’s doing in the first place.

LINDSAY: Over here we have a gentleman with a question.

Q: Dwight Holloway, BlueCreek Investment Partners.

Based on your research, what percentage of the time are red teams tasked with discovering best practices, as well as discovering vulnerabilities? You gave the example of the FAA team and the New York Marathon. Would those teams have also been tasked with, tell me who’s got the best airline security in the world, or tell me who does marathon security the best in the world?

ZENKO: No. And basically, a red team, I always say it should not supplant the typical operations and plans and policy staff in an institution. The red team, you know, in a military command staff should not just tell the director of operations in the three section what they’re doing wrong and then say here’s what you should be doing. That’s not what they should do, right? Their job is to challenge assumptions, to identify blind spots, to poke holes in plans and strategies.

But they on their own should probably not be the same ones who develop the plans because, as you know, you fall in love with the plan. Similarly, in the private sector, you fall in love with the deal. If your job is to develop it, you simply will not see it in a critical and divergent way. So they can help them, and they should have a series of conversations with them, not just at the very last minute, but they are not the ones who come up with best practices or necessarily a new plan to, I would say, replace the one they found holes in.

LINDSAY: Did you have a question, sir? Yes, stand, and bring the microphone.

Q: Hi. Bale Dalton. Department of State, Navy Reserve.

We’ve touched a little bit on this, but oftentimes the critique of a devil’s advocate is that they are able to say sort of these negative or antagonistic things and get away with it, and then don’t have buy-in in the final product or don’t have buy-in in the company as a whole. So in your research, did you find that people preferred sort of external groups to come in and do the red team, which obviously have fly-away capability and not buy-in, or prefer to rotate people in through internal teams, as you were sort of talking about before?

ZENKO: I mean, my personal preference is internal teams that have limited time and exposure, because they haven’t given up on the institution and they don’t have a profit motivation to simply have some bang and then fly away. A great example of this that I feature in the book is the U.K. Ministry of Defense’s red team. They’re in Swindon, which is about 50 miles west of London. And I went on a visit to them. And it’s quite fascinating, because they’re surrounded by all these people who do training, and doctrine, and education support.

But they are independent. Their products are not coordinated with anybody else. They decide what they work on, and then they say no to people who aren’t going to do anything with them. And people are there for a short period of time, and then they go away and others come in. And once they’ve left, they’re sort of changed. When you’re exposed to red teaming, you become a changed person. Over time that wears off, but at least for a period of time after you’ve been exposed to it, you become slightly changed. So I tend to go for more of the in-house approach.

LINDSAY: Lady in the back.

Q: Thank you. Paula Stern.

I wanted to follow up on your—the questioner asked about cultural differences, and ask about gender differences, and what your observation is. Generally when you are part of a group, and you’re a minority in a majority situation, you tend to be able to see things and be willing often, if you’re not completely cowed, to say things. So I’m wondering what your observations have been when you have had a gender diverse or other forms of diversity on a red team. Does it make a difference?

ZENKO: It does. And it’s interesting, because there are certain communities where there is very little diversity. And I would just say, in the hacking world it’s terrible. I mean, not just information security, but people who actually do penetration tests of computer networks. And I’ve been to many security conferences. The number of women you can count on one hand among thousands. And I actually know somebody up at NYU who teaches undergraduate women how to hack. That’s all he teaches. And the problem is they eventually leave the field because—

Q: It’s toxic.

ZENKO: It’s not just toxic, but breaking into things is boring. It’s actually not hard. Women want to be part of something that is a collective, that builds an institution, and is successful. Guys like to break into things and then just walk away. (Laughter.) They found this sort of thing over and over again. And that was one of the reasons we had a really difficult time finding more women in the computer security world—on the hacking side of things, they’re just not there.

I would say the most effective red teamer that I found in this book, as it was described to me, is—there’s a one-star—a retired one-star who runs this Ministry of Defense red team in the U.K. He said the best red teamer he ever had was an American woman who was 20, undergrad, working for the summer for them. She was so different from everyone else there, not in uniform, didn’t know people’s ranks, didn’t care, said things that nobody else would say because she was leaving, she was a fearless individual. And there are certain very specific situations where she pointed out things to very senior British defense officials which made them change what they were going to do with some—with the defense white paper specifically.

And it was only because she was so different, so uninvested, came in with sort of an outside perspective. But you can’t count on that, right? It’s hard to find that sort of lightning in a bottle with that individual. But you’re right, diversity is a critical component, not just of—not just gender and background, but experiences and worldviews. A good red team should have a diverse set of skills and background.

LINDSAY: I’m going to go all the way in the back. All the way in the back. Wave your hand, they’ll bring you a microphone.

Q: (Laughs.) Thank you. The name is Mercedes Fitchett with U.S. Air Force.

If you’d like to take the Fifth on this question, please feel free to do so. If you were to apply a red team to CFR, how is it that you would see that taking place?

ZENKO: Well, the one thing is—and this has probably been reported—our computer networks, like many, are at risk. So I would absolutely have a red team that would think about certain state-based adversaries who are trying to get inside computer networks. So that would be one critical red team. The other is just sort of a red team of the processes, which is trying to find specific projects or upcoming events—like, for example, we’re about to hit our hundredth anniversary. It’s a big deal, right? And there are people who have been thinking about this and how to roll it out. I might red team the 100th anniversary of CFR. That’s a good time for a really big strategic rethink. So that would be the sort of thing you can plausibly do and have an impact on.

LINDSAY: Sir.

Q: Hi. I’m Jack Gocke with Marsh & McLennan Companies.

One of our operating companies is called Mercer, a human resources consulting firm. And we look very closely at employee engagement and how it impacts productivity. Many CEOs view employee engagement and productivity as their number one challenge within their organization. So I’m wondering how you can apply some of your work to helping boards and the C-suite better understand what engages employees, what doesn’t, what works, what doesn’t, and what needs to change within the organizational cultural milieu to make it work better.

ZENKO: I don’t have a good answer. I’ll just say, though, that in the introduction to the private sector chapter, I go into a lot of what it is that makes people both unhappy and unfulfilled at work. I mean, there was a great Pew study that just came out recently: basically, 50 percent of all workers don’t know why they show up every day. They literally—I mean, they drill down on, like, what are you doing here? What’s your job? How is it tied to the core mission of the institution? Most people don’t know, right? So to engage them, you should probably provide some leadership and some guidance and make it clear, through a combination of command climate and rewards, and incentives, and encouragement. Like, that’s the first step.

But the other issue is finding the individuals who can identify problems and are willing to voice them. So, I would say, if I were going to read one thing on red teaming other than my book—

LINDSAY: But you should read your book, right?

ZENKO: You should read the book. (Laughter.) It is the—and it’s available, it’s free online—the GM report to the—

LINDSAY: Not your book.

ZENKO: No, not free online. (Laughter.)

LINDSAY: It’s available in the back of the room.

ZENKO: Maybe in certain other countries it’s free online now. (Laughter.) But there was a law firm that, as a result of the ignition-switch fatality issue with General Motors, was hired to do a study of the organizational culture within General Motors. And it’s a fascinating 200 pages. Among other things—I mean, nobody has been found criminally culpable, although about 180 people lost their lives, it has cost GM hundreds of millions of dollars, and many senior people lost their jobs. But nobody will ever be held criminally culpable.

The problem was that the command climate was that quarterly earnings mattered over everything. And people were actually instructed to use specific language to diminish security threats. So if there was a security or safety issue, use these words, don’t use those words. And the culture made clear that security didn’t matter. And nobody ever said, don’t care about security or safety, but it was signaled throughout what mattered, what was valued, and what got you promoted. And it’s a devastating study. Nobody’s to blame for the problem, necessarily. But the people who knew, knew enough to keep their mouths shut, right?

So how would you have found them and made them voice up? It’s unlikely they would have in any situation. A sort of empowered red team, given scope and access to a lot of an institution like that, might have found it. But you never know.

LINDSAY: Well, we have time for one more question. Before I pick the lucky last questioner, I have to remind everybody that this meeting was on the record. Young lady in the middle row, you get to ask the final question.

Q: Thanks. Thanks, Micah, hi. Alex Toma with the Peace and Security Funders Group, which is a network of funders and philanthropists.

So my question is going to be about scale. A lot of these funders in our network, although they fund at hundreds of millions of dollars a lot of the peace and security, national security work that we’re all doing, have tiny teams. I mean, we’re not talking, you know, Human Rights First numbers. We’re not talking CFR numbers. We’re talking a dozen, maybe a half-dozen or so program officers, directors. So does scale matter? Can you do red teaming on a small scale?

ZENKO: It’s harder, because you likely don’t have the resources, right? And it’s also hard to identify—if things are going along and there’s no clear problem, it’s hard to find the money, the resources, and the time to red team, right? And one of my favorite quotes is from the former head of security for Dow Chemical, Dan Erwin. He said the surest way to get senior management invested in a security plan is to burn down the building across the street. (Laughter.) The problem is, you rarely have a spare building to burn down. (Laughter.) And it’s usually not until there’s a crisis situation that people will actually hire somebody to do this.

But the good news is, crises provide opportunities. In the business sector, they often say it’s actually in a crisis that people are going to be most willing to hear the bad news. They’re going to most faithfully understand alternative perspectives and assume the role of the adversary, for example. So it’s harder to do when you don’t have the resources, but because you have fewer at-bats, fewer chips to put on the table, it could potentially have greater impact. But I also have thoughts, if you want to talk to me later.

LINDSAY: Well, I think you have just seen why “Red Team: How to Succeed by Thinking Like the Enemy” is such a terrific book, and why Micah Zenko’s such a valued member of the David Rockefeller Studies Program. So please join me in thanking Micah.

ZENKO: Thank you.

(END)

This is an uncorrected transcript.
