'Red Team: How to Succeed By Thinking Like the Enemy'

Thursday, November 5, 2015
Speaker
Micah Zenko

Senior Fellow, Council on Foreign Relations

Presider

Richard Haass

President, Council on Foreign Relations

In Red Team, CFR Senior Fellow Micah Zenko draws on little-known case studies and unprecedented access to red teamers to reveal the best practices, common pitfalls, and winning strategies of these modern-day Devil’s Advocates. The book shows how policymakers, business leaders, and curious minds alike can succeed by thinking like the enemy.

The CFR Fellows’ Book Launch series highlights new books by CFR fellows. It includes a discussion with the author, a cocktail reception, and a book signing.

HAASS: Well, good evening, and welcome to the Council on Foreign Relations. I’m Richard Haass, and I’m fortunate enough to be president.

And this is—events like tonight are always on my shortlist of favorite events, since we’re in the idea business and there are few better ways to generate, develop, and disseminate ideas than to write a thoughtful book. And Micah Zenko has done just that.

So what we are going to do is spend a few minutes talking about his book, he and I, and then we’ll open it up to you all to ask the tough questions. And then he will mill around afterwards, accepting your congratulations. And rumor has it he’s more than willing to affix his name to his book if you are more than willing to, you know, cough up your credit card. (Laughter.) So I think that is the nature of the—of the exchange. Is that about right?

ZENKO: That sounds fair.

HAASS: That sounds fair. I think it sounds eminently reasonable. And then, whether you actually go ahead and read the book, that’s up to you. But increasingly for authors that is—that is secondary. (Laughter.) Only kidding, only kidding. (Laughter.)

So explain at the beginning, just to make sure we all begin from a platform of knowledge, Micah—I should say, by the way, Micah is a senior fellow here at the Council on Foreign Relations, as I expect you all know. He has probably written about and done more to shape the public debate about drones and the use of drones than really anybody else. And I think he’s actually had an impact on not just the debate, but on the policy that has been influenced by the debate. And his background reflects what a lot of people here have, and it’s a background that I tend to value in particular—indeed, I try to live it every now and then—which is someone who’s spent time in government and also time academically. And scholar-practitioners are our sweet spot here because we’re trying to be more scholarly than those who are full-time practitioners, but more practical and practitioner-like than those who are full-time scholars in the academy. And the Council on Foreign Relations occupies a kind of middle ground, and we’re trying to do policy-relevant work that is, in fact, relevant and makes a difference. And Micah has been doing just that.

Sir, why don’t you begin with the concept of “Red Team.” Exactly what is red teaming? And where—from where does the concept and the phrase emanate?

ZENKO: Thank you. Well, I like to think of red teaming as both an approach and a mindset, and a set of specific tactics and techniques. And really the approach and the mindset is the recognition that, if you’re a leader in an institution in a competitive environment where there are relative gains and losses, you probably don’t know everything that’s going on and you can’t conceive of everything your adversaries are doing. So once you recognize that and you have that, I would say, humility about your own institution, you try some of these tactics and techniques. And specifically in the book it’s simulations, like the NYPD tabletop exercises I talk about in the book. It’s vulnerability probes—these are people who break into buildings and computer networks for a living. And it’s alternative analysis—these are people outside of the mainline authoritative analytical process who are, by design, contrarian and think differently to help leaders think through different problems.

And so the phrase itself, “red team,” comes from the Cold War, “red” being the Soviet Red Army that NATO and the United States planned for extensively in the ’50s and ’60s. And the first use we found in the public literature was in 1963, in a great article by a columnist who describes a strange game happening in the Pentagon, where a red team—which Secretary Robert McNamara calls his devil’s advocates—is playing a game backwards: they start with the end result and then try to think about how they came to this specific issue, which was about procuring a specific bomber. So “red team” really comes out of the U.S. response to the Soviet Union, and now it’s applied specifically to how you think about both your own problems and those that your adversaries can pose.

HAASS: What, then, are the most significant historical examples? For example, the old Team A, Team B thing, is that one?

ZENKO: Yeah. In 1976, after a huge fight, there were hardliners in the Ford administration who wanted to reassess the Soviet strategic threat posed to the United States, and they didn’t like the CIA’s conclusion. The CIA said that the United States actually had both a missile and a nuclear warhead superiority gap over the Soviet Union, and these hardliners simply didn’t believe it. So they got the first President Bush, who was then the director of the CIA, to empanel a Team B of outside experts who had access to all the intelligence and came to their own conclusion. And the conclusion they came to was very different from that of Team A, which was the normal CIA analysis, and it was much more hardline. In the end, President Bush didn’t do much with the intelligence other than release it, but that’s one prominent example—and I have that story in my book.

And the other prominent historical example, one of the more tragic ones, was the FAA Red Team, which was formed in 1995. This was a small group of six to seven security experts who broke into civilian airliners and consistently smuggled weapons and explosives past screeners. They reported the security vulnerabilities that they found to the FAA, which did little to make the commercial airline industry improve. So those are two classic examples that go back.

HAASS: After 9/11, when I was in government at a place that you had worked at before, the Policy Planning Staff, something began called the Red Cell. I don’t think I’m telling secrets. And the whole idea, if I remember correctly, was we—the CIA set aside a bunch of analysts to basically dream up ways that they could cause us major problems.

ZENKO: Yeah, it was actually two days after 9/11. George Tenet, the director of CIA, brings up the director of intelligence on the analytical side, Jami Miscik, and a couple of other seniors, and says, I want you to think differently, specifically about the issue of terrorism. And I want you to—this is using his language—I want you to “piss off” senior officials. I want you to be so different from mainline authoritative analysis that we are shocked by your suggestions and your outcomes in your analytical product.

So they formed a group of about six or seven people at the time, none of whom were experts in terrorism—none of them—by design, because one of the things you learn about the value of red teaming is that people who are experts suffer from the tyranny of expertise. They know their issue down to the ground level, but they’re the least likely to see alternative ways to frame it or to spot discontinuities.

HAASS: Just for the record, we guard against that at the Council on Foreign Relations. (Laughter.)

ZENKO: I mean, one of the ways we guard against that is by looking at different fields, looking at different industries, and through what I would call cross-field peer review. And we actually do some of that here.

But George Tenet did not want people who knew about terrorism to work on it. And they put out things that were very different, and they were stamped differently so everyone knew it was a Red Cell. And the Red Cell exists to this day, and has tremendous appeal and traction in the intelligence community.

HAASS: What, then, are the situations—if you basically had a checklist—if you were working in any organization—because this, by the way, in no way is limited to foreign policy and national security. You could do red teaming at General Motors. You could do red teaming at Starbucks, I assume.

ZENKO: Definitely.

HAASS: If you had a checklist of conditions or prerequisites that needed to be met before you would establish a red team, what would you need to see that you would then say—it’s almost like it’s time for Ghostbusters—it’s time for a red team? What would you need to see?

ZENKO: So part of it is understanding your situational environment. When you’re in a highly threatening environment—for example, if you work at an institution that is facing cyberattacks all the time—you need to red team quite frequently. If you’re somewhere that doesn’t face them, you don’t need to do it that frequently. Or if you have a scheduled event that is coming up—the most common users of red teaming in the private sector are pharmaceutical companies, because they have a lot of other actors they need to worry about: the regulatory environment, their competitors, the marketplace. And so, for example, when they have a drug that’s going off patent, they will hire outside people to help them think through every potential scenario that could go wrong. This is what’s called a pre-mortem analysis, which is thinking through failure before you execute. So you have to, one, identify that.

And the second thing I would say is you really have to have, as I call the best practice in the book, the boss’s buy-in. If the boss doesn’t care, red teaming doesn’t matter. The people who need to do it won’t get the money they need, they won’t get the access they need, and nobody will do anything with their final product. And that’s actually one of the biggest problems.

And the third thing is that you need to have a clear understanding between the red team and the target institution as to why we are here. What is it I can really do to help you? Which is to say, what is it you really value in the institution? Are you trying to protect your data? Do you care about your reputation more than your financial market interest? And once you have those conditions set, you can kind of see what red teams can do to help you.

HAASS: A slightly different but related question: What, then, are the do’s and don’ts of successful red teams? You mentioned before the tyranny of expertise—you don’t want people who are so wedded to the almost conventional wisdom that they can’t think outside the box, which in some ways is the raison d’etre of a red team. What other do’s and don’ts are there for people who want to establish something that challenges orthodoxy?

ZENKO: So one of the big don’ts is don’t assume that the people you show up with and work with every day can suddenly think differently. You don’t just go into a room one day and say, we’re going to brainstorm and red team right now, because you already suffer from the same cognitive biases, the institutional pathologies, the command climate that the hierarchy structures; you already see the world in a very limited way, and you do it collectively. And it’s very hard to show up to the people you work with every day and suddenly challenge them and say, I think you’re wrong, actually. People are hardwired to make good impressions on the people they work with every day and on their bosses, so they don’t give dissenting and challenging viewpoints. So you can’t just show up one day and do it. You need to have an outside moderator who leads these discussions. In the business world they call these liberating structures, where they force you to do structured brainstorming in a way that is often very uncomfortable. That’s one of the big don’ts.

And one of the big do’s is be willing to do something with the findings and, as I call it, be willing to hear bad news and act on it. One of the most consistent things about people who break into computer networks for a living is that they find the same vulnerabilities over and over and over again: weak passwords, networks that aren’t segmented, passwords that aren’t hashed, people who still get caught by the same phishing scams over and over again, poor security culture. And these guys, they break into your building, they break into your computer network, and they never fail—ever. And then, once they’ve done that, they have a readout to show exactly how they did it and what the corrective steps are. Oftentimes the chief information security officer or the CEO looks at this and says, this is troubling, this is troubling, and then they put it on a table and never do anything with it.

HAASS: I mean, I can see, again, another question. The CIA one, the Red Cell, that’s one that’s identified. And one thing you said there surprised me, which is that you thought it had great traction. I’m not so sure, because in a funny sort of way, once something self-identifies as a red team, people then see it in a certain way. So what about having red teams that aren’t known or called red teams? Aren’t they then much more, in some ways, dangerous, in the best sense of the word, to the bureaucracy? Because it’s not so easy just to say, oh, they’re in the business of challenging conventional wisdom, they’re a bunch of gadflies. To what extent, in a sense, do they marginalize themselves by calling themselves red teams?

ZENKO: Well, it’s funny, because George Tenet picked the phrase Red Cell because it sounded conspiratorial and alluring. He really wanted to shake up the bureaucracy, and that sort of stuck. But over time, the Red Cell itself becomes institutionalized. People assume that the Red Cell’s going to think differently. They become sort of inured to it, and I think that’s a real challenge.

The really valuable red teaming, though, is when it’s done in a way that helps decision-makers think through problems. So if you drop a controversial Red Cell product into the middle of the White House Situation Room when the White House isn’t worried about that problem, it won’t be read. One of the things they’ve tried to do at the CIA is to time Red Cell products with the president’s schedule and the upcoming diplomatic schedule. So they’re most useful when they can be used.

HAASS: OK. So if you are looking at this world the United States confronts now—and there’s no shortage, shall we say, of stuff coming into the inbox—where would you think—if you were the assistant to the president for red teams or for whatever, what would you think are some areas or questions that you say, this might warrant a red team?

ZENKO: So the biggest one would have been the approach in the summer of 2014 to going to war in Syria. And, in fact, not that long ago I was asked to come down and speak to a U.S. government agency that spends all of its time working on Syria problems, and they said, we want you to red team Syria—how should we think about red teaming as it applies to Syria? And I laid out some of the best practices and the conditions under which it tends to be successful. And they all went, “oh,” because they recognized the command climate will not accept the sorts of changes, the sorts of challenges to assumptions and to the conventional wisdom that is set in place about how the campaign plan is working and progressing and making iterative progress. They recognized that there’s just not receptivity in this institution to do it, so they didn’t really do it.

But the time you can really do it is before you go to war. And so there was a very short window with a very limited number of people who got access to the speech President Obama gave on September 10th, 2014. And then there were a lot of people in the intelligence community and the Pentagon who had to go and execute the plan, and who saw that speech and that strategy and said, “oh, dear God,” this won’t occur—the strategic objective of eliminating/destroying/degrading ISIS won’t be achieved. And so if they had been read in on it early, they might have been able to challenge it before it got underway.

HAASS: You just mentioned ISIS. I would think that a really valuable exercise would be to have a permanent cell of people who were the ISIS cell. And their idea wouldn’t be to think outside the box so much as to impersonate ISIS as best they could, so it’s as if we could read ISIS’s traffic. And you would have a bunch of people who would so insinuate themselves into their boots or sandals that they would give us insights into how their minds work.

ZENKO: That does exist at U.S. Central Command, which is the military command for the geographic area of the Middle East, in the intelligence shop. They do have people who do that. But the fact of the matter is, it’s not much different than what you would read in The Economist or if you spoke Arabic and you had a good sense of what ISIS does. Commanders describe it as not being that practical and not being that useful.

What’s harder than red teaming your adversary is red teaming yourself. It’s actually easier to put your feet in the sandals of an ISIS terrorist and think about how they would see the world. It’s harder to challenge your own processes and your own blind spots, to understand your own strategic weaknesses.

HAASS: Isn’t that, though, what people in the hack—I mean, basically you hire people to hack your own computer system so you learn the vulnerabilities, and then you go patch it.

ZENKO: That’s right, that’s right. I mean, that’s the core theme of the book: you can’t grade your own homework. The same people who set up your computer networks, the IT shop, as great as they are, as much as they think about security, as many defensive measures as they put in place, they cannot think as deviously and as proficiently as the common street hacker can. And so you hire these hackers to use off-the-shelf malware to find ways into your software or your computer network or your configuration, and they do it all the time. Again, they show their homework very carefully and then they say, these are a set of corrective remediation measures you can take to make yourself more secure, but it’s never perfect.

HAASS: I could go on, but I will show uncharacteristic restraint. So let me open it up to you all, to our members and guests, to raise any questions/issues with the author, with Micah Zenko.

Yes, ma’am. You have to wait for a microphone, and please just identify yourself. We’ll do—

Q: Nancy Truitt.

I’m wondering about the differences in culture. For example, thinking about ISIS, they essentially are Muslim; we are not. We don’t think that way. I could understand how red teaming would work in the Western Hemisphere, in Europe with Russia, but how does it work with a culture that is so totally different than ours?

ZENKO: Well, this is trying to, again, assume the role of your adversary, assume the role of other actors, and it’s a problem the United States has faced for a long time. As you recall, during the Cold War there was a whole industry in the intelligence community that did what they called Kremlinology, which was trying to see the world through the eyes of the individuals who were leading the Kremlin. And believe me, at that time we thought communism was more alien and foreign than we think about even Islamic terrorism today. So it’s a problem they face, I would say, consistently. You’re right on.

HAASS: Are there areas—just related—where you think red teaming might be a waste of time? Like, I could see where it might be a waste of time in some situations. Are there some situations where you’ve actually come to think it’s counterproductive?

ZENKO: So there are many situations, especially with computer networks and with people who break into buildings, where red teams cause what’s called fratricide: they shut down computer networks or they actually cause harm and damage to the building. That’s a bad situation. You don’t want to do that.

The other thing is red teaming has to take place in what I call a safe-to-fail environment. The people who break into buildings for a living carry GoPro cameras. These are broadcast to the desktop of the CEO, so they can watch people break into their building. And as they’re doing this they walk past security guards, they walk past employees who see strangers walking around—I mean, one of the guys I feature in my book wears a shirt that says “your company’s IT guy.” (Laughter.) And he gets in everywhere, because he points to his shirt. (Laughter.) And he has a card which says, “I’m the IT guy.” And then he has a fake letter from the chief information security officer that says, I’m here to fix your IT system. They go, oh, OK, go ahead.

And so this guy breaks in everywhere he goes, and he exposes the poor security culture. All those people could be fired because they let a stranger walk around the building, but you can’t fire them, because it’s not any one individual’s fault.

HAASS: So, I mean, in that area—and we’ve actually found it here, as at any other institution—a lot of it is what you would call culture, or what I’ve heard a CEO recently describe as digital hygiene. You can do all the fancy stuff with all the fancy software, but it’s the simple day-to-day behavioral things that really, more than anything else, make a system more or less vulnerable.

ZENKO: Yeah, it’s the same—there are six best practices in sort of cyber hygiene, as they call it, that, if you implement them, will protect you from roughly 80 percent of the vulnerabilities—

HAASS: One of them is not to have a password called “password.”

ZENKO: Absolutely. (Laughter.)

HAASS: So if any of you have a password called “password” or “ABCD” or “1234,” when you go home tonight, please change it. Is that right?

ZENKO: That’s right.

HAASS: (Laughs.) Yes, sir. It probably works. Don’t worry.

Q: Evan Michelson from the Sloan Foundation.

How does red teaming differ, if at all, from some of the scenario planning or forward-looking activities that companies like Shell might have done back in the ’70s during the oil embargo?

ZENKO: Yeah, I mean, I actually do get into some of that—and Shell was famous for it, and just a few people there did it. It can be similar, but the truth of the matter is that most scenario planning is not that contrarian, is not that outside of the box. The participants are people from the institution itself, so they already know the bandwidth for how far they can go. A real red team says, this is the full range of scenarios that could really happen, that could really cause harm to your business, that could really pose risk to your business environment. So it’s a little bit more aggressive and devious, and I would say out-there, than classical scenario planning. But it has similar brethren, I would say.

HAASS: What surprised you in the course of researching or writing this book? What did you come across that you went, wow?

ZENKO: So many things. One of the things that fascinates me—and this is the honor of getting to write a book at a place like this—is that everybody thinks wisdom lies elsewhere, as I like to describe it. People in the business world think the people who really know about leadership and management are all in the military. Senior leaders in the military say if you really want to know how to manage a company and lead, you need to go to Silicon Valley, because they’re at the cutting edge. People in the homeland security industry are similar—everybody wants to look everywhere else for how to do it, and I sort of see them working at cross purposes. I would say that’s one of the things that really shocked me: people just aren’t confident in the systems and the processes they have in place at their institutions. A lot of them aren’t.

And the second thing—

HAASS: Just to interrupt—isn’t that OK? I mean, I actually think there might be an argument that, for people in charge of places, false confidence or overconfidence might be, in a funny sort of way, dangerous, because it could lead you into a kind of smugness or complacency that could probably get you into trouble.

ZENKO: Absolutely. I agree. The other thing I would say is, I was fascinated by how interesting these people are. They are by design and by selection bias different. Some of the best red teamers, as they like to say, are either very young or old—people who don’t know better yet, or people who don’t care anymore. (Laughter.) In the military, one of the phrases people use is the terminal colonel. This is the O-6 who is at the end of their career, who isn’t going for general officer. They don’t care anymore, and they’re really going to say whatever they want. It’s also young people who aren’t steeped in the institutional pathologies and the culture and the expected behavior of an institution. And they say things that are outlandish and surprising.

One of the groups I spent a lot of time researching—and I went to Swindon, west of London, to talk to them—is the red team the Ministry of Defence has in the U.K. It’s made up of people who never wear uniforms. Most of them are civilians. There’s no hierarchy. There’s no rank. Everyone calls each other by their first name. They do special, specific, targeted assignments for various branches of the government. And the guy who runs it is a retired one-star general. He said the best red teamer he ever had was an American undergraduate woman who spoke to all these British generals in a way that they had never been spoken to, and identified problems and raised issues that none of them would ever talk about. And she served there for four months over the summer, then she went home. If she had stayed there and tried to make it a career, she likely would not have made those sorts of arguments.

HAASS: How do you deal with the problem, which you mentioned, that red teams often come up with really interesting stuff, and then it dies in the inbox?

ZENKO: Well, this is—this gets back to the whole issue of demonstrating impact. How do you know when red teams make a difference? And that’s the question I ask everybody—everybody who red teams and everyone who receives red teaming information. You know, it’s interesting, the CIA Red Cell is a great example, because I interviewed Bob Gates and Steve Hadley, the national security advisor, at the White House. And they all said, I read every single Red Cell product that came in my inbox because it made me think differently, even if for a moment.

And the one thing that red teaming always does is it changes the way you conceive of the world and the problems. It’s often temporary, because you go back to your desk and you re-login to your computer and you’re back in the same cognitive biases and mindsets. But for a short period of time, people are both willing to see problems they didn’t see before, and they feel this obligation—this moral obligation to identify them to people who can do something about it.

HAASS: It was actually—I read them every day also. And part of it, it was just different. You were bombarded with intel, which all was kind of classic. And the Red Cell was just so different. It was almost like a little bit of a break and a little bit of a—it was different. It was kind of—it was intellectually—it exercised different muscles.

ZENKO: I mean, mainline authoritative analysis, by design, is intended to tell senior policymakers: This is what the world looks like. And what the world looks like tends to be what they already know. It tends to look like what it looked like yesterday and the day before. And one of the big problems the Red Cell escapes in terms of process is that typical intelligence community products have to be coordinated. And the coordination process sands down all the rough edges. Everyone has to agree to the text. They might be able to add some footnotes, but most policymakers don’t go to the footnotes. They just read the text. So the final product tends to look like the last product and the product before that. Red Cell products, by design, don’t go through the coordination process. They’re signed off by one person. And they’re basically never turned down. So they get to look different.

HAASS: So they’re not sandpapered off, which is—sure, Mr. Oppenheimer.

Q: Thanks. Micah, are there also particular occasions when the cognitive challenges of this are less? In the aftermath of a shock, the beginning of an administration, a sudden surge in capacity? You mentioned earlier that the best time to do this may be before war. But that may be a moment when decision makers are least receptive to this kind of input. But moments when their minds are suddenly jogged and their thoughts are challenged might be the moment when this can really have an impact.

ZENKO: No, that’s a great point. And one of the ways people try to get around this is they make the impact vivid and immediate. One of my favorite quotes is from the former chief security officer of Dow, Dan Erwin, who says: The best way to get senior management to buy in on an emergency plan is to burn down the building across the street. (Laughter.) The problem is, there’s rarely a spare building to burn down to demonstrate that this could happen to your building. (Laughter.) So one of the things these people like to do is hack into the CEO’s personal desktop, or break into the building and leave their card underneath the mousepad.

HAASS: Don’t anyone get any ideas here, all right? (Laughter.)

ZENKO: And now it’s not just a case of, oh, I had a sense my computer networks were insecure. Now I know, because I felt it directly. The other way is to use historical examples, where you can point to cases—and I’ll just say, one of the best red team things you can read out there is free. GM hired a firm to do an assessment of the ignition switch problem, which has cost General Motors hundreds of millions of dollars and cost about 150 people their lives over the time we have known about the problem. It’s the General Motors after-action report to the board of directors. And it was a devastating example, where people were shown specifically keeping their mouths shut about problems, using very specific language to downplay safety problems. And people were rewarded for this because the pressure for quarterly earnings was so significant that people didn’t want to hear about safety and security problems. So when you use historical examples like that, they go, oh, this could be happening here. But you don’t make it 250 pages. You make it short and simple for people to digest.

HAASS: I see two more people and we’re going to probably bring it to the end. Mr. Gelb and then Liz Economy I think had her hand up.

Q: Listening to your description gives me the feeling that there is no problem that cannot be solved—all you need is to get people to be honest and talk about the problems. Well, the problems are so diverse, so completely across the board, that there is no way you can get everything. For example, we just watched a plane go down from Sharm el-Sheikh on its way to Russia. And the first statement was that it was just something to do with the back of the plane. Now the feeling is it looks like it was clearly a bomb that was put in. There’s nothing new about a bomb being put in a plane and bringing a plane down. We’ve been watching planes going down for the last 30 years, whether it’s Pan Am or whatever it is. What you’re talking about is, if we just get the right people who ask the tough questions, that’s the way we’re going to get the answers to all the things that are going to be hurtful to us. We’re in a war at this point, and there is no answer to every question, whether it’s red, green, or blue, in my humble opinion.

ZENKO: Well, when you read the book you will appreciate the final chapter, because the final chapter is all about having modesty about red teaming. A lot of the consultants who sell red teaming are selling snake oil. They can actually charge higher billing rates as consultants if they’re doing red teaming versus typical consulting analysis. It has a certain lure and appeal. But the whole point is you have to have modesty, as you should for any other strategic planning tool that management uses. It’s interesting you mention the case of taking down the plane, because the U.S. does covert smuggling tests, even after 9/11.

And there was one recently—the Department of Homeland Security inspector general had a report that auditors with no specific special training—they literally take them off their desks—smuggled banned explosives and guns past commercial airline screeners 67 out of 70 times at six specific airports. So, as I point out, they’ve made a lot of improvements to their defensive systems, but it’s never perfect. The way you find the vulnerabilities and the problems can be through red teaming.

HAASS: Liz Economy, you get the last question.

Q: So, Micah, thanks. Really interesting. I just had a question about process. I mean, a lot of what you’re discussing sort of gets into the dark recesses of, you know, failure, really, for some companies. How did you get people to talk to you, you know, in terms of sort of contemporary, you know, red teaming efforts and things? Do you have any stories about that?

ZENKO: So it’s interesting, because this is a hidden world by design. The people who do it are either in the intelligence community, where it’s classified information, or in the military, where you have to have a public affairs officer at your hip to make sure you don’t say anything you shouldn’t. Or, even worse, I found the most secretive field of all is the private sector. There’s this famous hacker, his name is Mudge, he’s been around forever. And he was a senior official at DARPA and at Google. And as Mudge likes to say, when I was a senior official in the Pentagon, I could talk to the press more than I could when I was at Google, because there the information is proprietary or you have to sign nondisclosure agreements. So it was very hard to elicit these stories and these conversations. It took time—they had to trust me with what they were going to tell me. They had to be sure that I wasn’t going to try to screw their institution or anything. And then I had to go and talk to them.

I went and became—I’m sort of a certified business war gamer, if you need one. I went to Boston to sit through the courses. I went to Fort Leavenworth, Kansas—I’ve been there five times, including taking the two-week short course to become a red teamer. If you need red teaming at your military command, I can help you. (Laughter.) And I went and talked to a lot of people who break into computers and buildings for a living. And it was only by building those relationships that you could do it. I did over 200 interviews, and these are fascinating people.

HAASS: If you want to catch a thief, get a thief. The name of the book is “Red Team.” The subtitle is, “How to Succeed by Thinking like the Enemy.” And please join me in congratulating the author, Micah Zenko. (Applause.)

(END)

This is an uncorrected transcript.
