Tarah Wheeler: Legacy Systems, Insider Threats, Rules vs Norms, Efficiency for Whom?
Cyber without the security.
Fighting through all the trash memes, trolling, and terrible DOGE database security we somehow made it to Episode 3! This week we interview cybersecurity expert Tarah Wheeler, CEO of Red Queen Dynamics.
Tarah serves as the Senior Fellow for Global Cyber Policy at Council on Foreign Relations and is a member of the Electronic Frontier Foundation’s Board of Directors. You can find her on Bluesky @tarah.org.
Take a listen and let us know what you think on our socials!
We’d also love to hear your ideas and questions! Get in the comments!
Transcript Time!
We’re going to try and set up our production pipeline so we can offer you transcripts of all our episodes in a timely (and accurate!) way. Here’s our first stab!
Julian Sanchez: Hello, people of the Internet. I am Julian Sanchez.
Noah Kunin: And I'm Noah Kunin, and you're listening to WatchCats.
Julian: This episode, we have noticed, and perhaps you have too, that the activities of the Department of Government Efficiency and Elon Musk's Children's Crusade of young, unvetted engineers have been raising alarm in cybersecurity circles. Information security professionals and experts have been calling out, with increasing alarm, what they see as a dangerous series of practices that are at odds with best standards for cybersecurity. There's a lot of concern in the air about the kind of access being granted to critical federal infrastructure, both in terms of the data being extracted from those systems and the changes being made to the functioning of those systems.
Noah: And it certainly doesn't encourage confidence in that crusade when one of the first flagship projects they did, on DOGE.gov itself, was just to put up a kind of glorified spreadsheet of their claimed savings. They left their database open for anyone to push content into it. 404 Media broke that story; I'll put a link to it in the description below.
Julian: So to try and get our heads around that, Noah and I thought we would talk to our friend Tarah Wheeler. She is a cybersecurity expert who is the CEO of the cybersecurity compliance firm Red Queen Dynamics. She's also a Senior Fellow for Global Cyber Policy at the Council on Foreign Relations and is on the board of the Electronic Frontier Foundation, which those of us in tech policy nerd circles know as an important civil liberties and privacy organization. Tarah actually has a whole lot more in her bio that would take up a big chunk of the episode to read out loud, so you can go to tarah.org to read a little bit more about her and her experience.
Tarah Wheeler: Hi, I'm Tarah Wheeler. It is a pleasure to be with you. Thank you so much, Julian and Noah for having me on board today. I am looking forward to being wonky as hell over the intersection of cyber security, politics, aviation, and just sort of general opinions on the state of the world. So I do plenty of different things that intersect and make this an exciting conversation.
I'm the CEO of Red Queen Dynamics. We build a cybersecurity product that helps small businesses work with their managed service providers to get safer and more compliant. We translate a bunch of the legalese of the regulatory world, in terms of cybersecurity and government compliance, into something small businesses can use to keep their businesses safe and sell to their enterprise vendors and providers and partners. So that's my day job. And then why do I have these long-ass conversations about policy on the weekends? I am also the Senior Fellow for Global Cyber Policy at the Council on Foreign Relations. I've been a US-UK Fulbright Scholar in Cyber Security. I've spent a lot of time looking into information security. And I'm sort of a failed academic with extremely sketchy skills on a computer who has ended up in this place where I often bridge conversations between the technical elements of what's going on inside computers and the policymakers who have to make decisions about them. And just in case you wonder, I have lots of opinions. I'm thrilled to be here today to share so, so many of them.
Julian: Just in the past week and change, we've seen an op-ed from Bruce Schneier saying the US government has experienced what may be the most consequential security breach in its history, referring to DOGE. There was an internal threat assessment from the Treasury Department, writing that “continued access to any payment system by DOGE members, even read-only, likely poses the single greatest insider threat risk the Bureau of the Fiscal Service has ever faced.” So can you, I guess, put in context for our listeners, what is going on that is provoking this level of panic from information security professionals?
Tarah: I will do my best to explain that the way I usually do, by analogizing what's going on. In information security, we have three principles that are in tension and in balance with each other, which we strive to assure in a given system. And when I say assure, I mean that in a way that is not just a checklist, but something that we genuinely think through and try to build into systems. Those three principles are called the CIA triad: confidentiality, integrity, and availability. Bruce is a dear friend; if Bruce says something about security, you should probably pretty much just believe him. I read his Foreign Policy essay and it is absolutely brilliant. It puts into the written word what so many of us were feeling. And really, what we're all struggling with right at the moment is that people have got to write down what's happening inside people's minds and hearts, right? Whether it's economics or it's Bruce Schneier translating this for people. So Bruce did it for the foreign policy crowd. I'm going to try to do it for a more general audience, and for people who are struggling, I think, to figure out why we're all panicking. I mean, after all, a tech CEO is fixing the inefficiencies that are happening inside the government. Why are we all so panicked about it?
I think the best way to put it is: there are systems in cybersecurity where availability matters so much that confidentiality and integrity take a backseat. Most of the time we care about confidentiality, keeping something secret, right? We don't want to let your username and password escape out there. That's a data breach: confidentiality is broken. When integrity is broken, it means you can't trust the information you're getting. A good example is: could you tell if somebody was listening in on one of your encrypted communications? If there was an additional person inside a group chat who was able to listen, technically, to what you were saying inside Signal, we would describe the integrity of that group chat as having been broken. It's when you can't tell if what you're seeing is real or not; it could have been broken in the middle. And availability is the usefulness of that system to the users who are using it.
If we had perfect confidentiality and integrity in a system, it would be offline. Users wouldn't be able to touch it. Users are the thing that sometimes cause an issue with confidentiality and integrity. But if they can't use a system, what use is it? That's the availability portion. As an example, I work all day long with companies, and with my own company. It's incredibly important that people be able to actually use their phones and computers and cars and planes and refrigerators and heating systems and nanny cams and sous-vide thermometers. It's really important that people be able to use the things that have computers in them; pretty much everything is a computer with a motor attached to it at this point. So the problem is that there are some systems where that confidentiality and integrity matter a lot. You've got a few highly skilled users, and you can really lock that system down. Like somebody who's writing code at Signal, a chat application many of us use. You can be pretty sure that that is a profoundly confidential system with a great deal of integrity, internally, right, like their access to their code base. They probably don't need but 20 to 30 people who are extreme experts in encryption and development to have access to that system.
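For the developers in the audience, the integrity property Tarah describes can be sketched in a few lines of Python. This is purely illustrative, assuming a shared secret key (the key and messages below are invented, and this is not how Signal or any federal system actually works): a message authentication code (MAC) lets a recipient detect that data was altered by anyone who doesn't hold the key.

```python
import hashlib
import hmac

# Illustrative only: a hard-coded key standing in for a securely provisioned one.
KEY = b"illustrative-shared-secret"

def tag(message: bytes) -> bytes:
    """Compute an HMAC-SHA256 authentication tag for the message."""
    return hmac.new(KEY, message, hashlib.sha256).digest()

def verify(message: bytes, received_tag: bytes) -> bool:
    """True only if the message matches its tag, i.e. integrity is intact."""
    return hmac.compare_digest(tag(message), received_tag)

original = b"Disburse $1,200 to account 42"
t = tag(original)

assert verify(original, t)                               # untampered: passes
assert not verify(b"Disburse $9,200 to account 42", t)   # altered: detected
```

Without the key, someone who alters the message can't produce a matching tag, so the recipient knows the integrity of the channel was broken.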
Let's contrast that with a hospital emergency room. A couple of weeks ago, I was providing some advice on upgrading systems at a hospital. I want to come back to this, Noah, because we've got to talk about WannaCry and some pretty famous breaches in American history, and why they're a good analogy for what's happening right now. In a system like that, confidentiality and integrity go out the window if a patient is on the operating table. If your laptop can't unlock and that's where the patient notes are, it doesn't matter if it's up to date. It doesn't matter if it's secure. You have a problem where you need to save a human life, and you worry about the information security consequences later. Here's the problem. In a system like a hospital, we are extremely careful when we update and upgrade systems, and it's pretty common for hospitals to have out-of-date systems that are easy to use, that everybody's familiar with, that can cause information security problems. Again, we'll talk about WannaCry a little bit later. And when we update those systems, we're cautious because there's an impact on human life. They tend to be complex, they tend to be older, they tend to be accessible to a lot of people, and if they don't work, people die.
It's a pretty good analogy for what's happening right now inside the Treasury. If people don't get their Treasury disbursements, their Social Security checks, their Medicare, their grants, their funding, they don't just lose money. People who don't have a lot can die from the inability to buy food, and that is not real, I think, to the kids who are altering code inside the US Treasury. They don't feel that weight of responsibility, and we know that. We can see it. We can see it with our eyes. We can see they don't feel that weight of responsibility for the people who are dependent on the system that they're operating. And when you think about last summer, July 2024, remember the CrowdStrike incident, where Delta just kind of didn't fly planes for about five days, and a bunch of logistics were impacted around the country? It was much more of a big deal inside my industry; most people really noticed it as Delta not working for about a week, okay? When Delta ceased working, that's because it had an application called CrowdStrike on its endpoints, on the computers that ran inside Delta and did things like print out tickets. Well, CrowdStrike didn't have that system appropriately safeguarded against a change by one person. CrowdStrike, Delta going down, the banking system experiencing fluctuations, logistics that depended upon Delta's air cargo ceasing: that, by the way, included a lot of hospital medical supply chain logistics. A lot of hospitals just went without supplies for a couple of weeks as a result last summer. What happened at CrowdStrike was that one contract engineer pushed one piece of content, a content update, like DLC for your Xbox, one untested piece of content, upstream to a production system, and it broke every single CrowdStrike endpoint. Every place it was installed, millions and millions of computers just quit, because one person, not even a full-time employee of the company, made one mistake.
That guy should never have had access at a level where he could break American shipping and logistics. And yet the processes inside CrowdStrike didn't prevent it.
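The standard guard against exactly this failure is a staged ("canary") rollout: ship the update to a small slice of machines first, and promote it fleet-wide only if they stay healthy. Here's a minimal, hypothetical Python sketch; the fleet, the health probe, and the 1% canary fraction are all invented for illustration, not a description of CrowdStrike's actual (since-improved) pipeline.

```python
# Hypothetical sketch of a staged ("canary") rollout gate.
def healthy_after_update(host: str) -> bool:
    # Stand-in for a real post-update health probe (did the machine boot?
    # is the agent responding?). Here, hosts named "bad-*" simulate crashes.
    return not host.startswith("bad-")

def staged_rollout(fleet: list[str], canary_fraction: float = 0.01) -> str:
    """Update a small canary slice first; promote fleet-wide only if healthy."""
    canary_count = max(1, int(len(fleet) * canary_fraction))
    canaries = fleet[:canary_count]
    if all(healthy_after_update(h) for h in canaries):
        return "promote to full fleet"
    return "halt rollout, page an engineer"
```

With a gate like this, a bad update stops after hitting roughly 1% of machines instead of every endpoint everywhere.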
They fixed a bunch of that. But the difference is that the US Treasury is a system where we can't afford to ever have a break like that from one person. And right now, the people who are inside that code probably haven't spent their lives writing COBOL, the language much of the US Treasury's systems are written in. I mean, they're 19. Frankly, COBOL ceased being a useful language to learn about 50 years ago if you were actually in the field. We know, number one, that the engineers in the US Treasury don't feel the weight and responsibility for the people they are guarding with their actions. And we know that one person's action can cause massive downstream effects. We simply don't think that the people who are creating these kinds of changes either feel the weight, understand the responsibility, or realize how closely they are toying with the ability of people in this country to survive. That's the reason we are all panicked right now: almost every one of us has experienced what it means to screw up production. We've all screwed up production. That's how you learn.
Noah: Absolutely.
Tarah: You learn. But you can't screw this production up. It's not just that you're not allowed to; you can't do it, or people die. Many of us have experienced being the person who broke a system, and people were inconvenienced or lost money. Some of us have broken systems where, downstream, actuarially, people died because we screwed up a hospital production system. But this, en masse, is the opportunity for the biggest "screw it, let's push to production on Friday" fuck-up in American history. That's why we're scared.
Noah: I want to connect two things here, because, yes, and it's even worse, because there's actually a deep connection between the fragility, which is tied to the availability metric you mentioned earlier, and these systems' status as legacy systems. When you're developing a new, modern software system from scratch these days, one of the things you're able to have, because of the way infrastructure has modernized and code has become more flexible, is lots of different environments: environments called dev or test or staging. Even if you don't work in tech, you can imagine what those things are there to do, which is to experiment with changes, to run tests, to run load balancing, to figure out: hey, is the change I made to this very complex system going to create an error, a bug, a problem on any of those metrics we just mentioned? Legacy systems, by contrast, are very, very fragile and often require specialized hardware and specialized code to run; some of these systems, especially at the IRS, for example, which we're going to talk about in a bit, actually run on tape as opposed to solid-state drives. And those legacy systems don't have those environments. Or if they do, they're very tightly scoped, and they don't actually re-run the entirety of the code base or the entirety of the functions, because those functions are themselves not even fully mapped in any kind of machine-readable language, much less having tests that run on an automated cadence to see if the change I made allows all of those functions to continue working happily.
And so even if the number of people who have the ability to push changes into that code base is 5, 10, 20, or one, and even if they were the most well-vetted people imaginable, they can't actually experiment and figure out what the consequences of their changes are going to be. So what happens in government, and not just in government, this is true of large companies as well that have those legacy systems, is you have to fall back on human reliance. What you end up having to do is have a lot of people who have had years, if not decades, of experience with that system, who can help you think through and slow-walk your change, so you have a higher degree of confidence that it's not going to break something. Is that system perfect? By no means, but it's the best you can do. If you are not working with those people, you are flying blind. No amount of reading the documentation, or feeding what you can into a capable LLM to try to extract some sense out of these 2,000-, 3,000-, 4,000-page document repositories about how these legacy systems work, is going to give you confidence about what those changes are going to incur.
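To make the dev/test/staging point concrete, here's a toy Python sketch of environment promotion. The environment names and the simulated test suite are invented (real pipelines are far more involved); the point is that a change reaches production only after passing automated checks in each earlier environment, which is exactly the safety net most legacy systems lack.

```python
# Toy model of promoting a change through environments before production.
ENVIRONMENTS = ["dev", "test", "staging", "production"]

def run_test_suite(change: dict, env: str) -> bool:
    # Stand-in for real automated tests. A legacy system with no lower
    # environments has nowhere to run this before the change goes live.
    return env not in change.get("breaks_in", [])

def promote(change: dict) -> str:
    """Advance a change through each environment; stop at the first failure."""
    for env in ENVIRONMENTS[:-1]:  # every environment before production gates it
        if not run_test_suite(change, env):
            return f"rejected in {env}"
    return "deployed to production"
```

On a system with no lower environments, the loop above is empty: every change is effectively its own first test, in production.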
Tarah: I love that you just said flying blind. I would love to talk about the FAA and why that's meaningful. If we want to, there's several different institutions, but the FAA is near and dear to my heart as a student pilot. And I've got so many opinions right now, guys. So many opinions. And I don't know if we want to have that conversation now. I mean, the IRS by itself is such a complex system and is such a fascinating one. We could sit here for the next hour and talk just about that.
Noah: We could. I do want to circle back to the FAA, but I want to go to the other side of the finance equation first, because there's a natural flow there, and then we can circle back to the FAA because of some breaking news over the weekend. Like you were saying, Tarah, DOGE is now taking over outlays, right? The Treasury payment systems, the money going out.
But of course, what just broke over the weekend, it's now February 17th, and this broke on February 16th in The Washington Post, quote: "Musk's DOGE seeks access to personal taxpayer data, raising alarm at the IRS." This broke yesterday and was updated again at 10:12 p.m. Eastern time. The subheading said the unusual request "could" put sensitive data about millions of American taxpayers in the hands of Trump political appointees. I want to correct that "could" in a second; I don't know why they say could there. The way the IRS systems work, this is, again, another IBM assembly and COBOL system, accessed through a terminal. If they gain access to what's called the Integrated Data Retrieval System, IDRS, and its documentation is actually public, you can go on irs.gov right now and look at the documentation about how this system works, and if this memorandum of understanding between the White House and the IRS is concluded successfully, that is in fact what will happen. A particular individual, Gavin Kliger, K-L-I-G-E-R, I'm not sure if I'm pronouncing that last name right, is a DOGE software engineer for the White House, and he was detailed to USAID. He's the one who told all the USAID staff to stay home and went through all of that data. He's now back at the White House, and what the White House and IRS are trying to do is detail him over to the IRS for 120 days, subject to a 120-day extension, to gain access to IDRS. It is not entirely strange, albeit rare, for a cross-agency detail from the White House or any other institution to gain access to a sensitive IT system.
One of the reasons why you do a detail in government is because you realize you have a staffer in one agency who has a skill that will be useful somewhere else. That's perfectly sensible. What is strange, and would be radically departing from previous standards... actually, that's not quite right: what would be departing from previous norms, because so much of this is encoded in norms and not in a spec or a standard, is the political appointee status. He did not come up through federal career or term employment. He's a political appointee, direct to DOGE, probably under the temporary-organization status of DOGE, which we've talked about in the past. And in that case, you'd have a direct political appointee with access to the fundamental data of the IRS, because IDRS, and again, don't take my word for it, you can go to irs.gov and look this up yourself, has access to the master file. What that means is that through that system you can query any data point in the tax return of any particular individual. And so it's not just the outlays we're talking about now, it's also the incoming, and therefore there will now be centralized within DOGE, for the first time in our nation's history, a consolidated viewpoint of all money and all personal data. So that is the news that broke this weekend.
Notably, despite their attempt to move fast over the weekend, this MOU, this Memorandum of Understanding, has not yet been signed, as reported by The Washington Post. Kliger met with Ken Corbin, the current IRS Chief of Taxpayer Services, and Heather Maloy, the IRS's top enforcement official. The agency is at the same time preparing for Trump-ordered layoffs as soon as this week that could hit another 10,000 probationary employees there. And apparently, Kliger has not returned to IRS headquarters, which is just down the block from the White House, since those initial meetings. Now, maybe he's reading the multi-thousand-page documentation for the IDRS. It is impressive and Byzantine even by government standards. It was one of the few systems that, when I worked in government, everybody wanted to work on, wanted to reform, wanted to improve; but so much of that legacy was in the way that it seemed sensible to go after quicker wins instead. And to describe just how much legacy impacts the IDRS system: it reportedly holds the Guinness World Record as the longest continuously operating piece of software. Essentially the same system has existed, with additional functions, you know, bits put in, but the same core software and same core hardware, since 1960.
Tarah: This is the system that tracks the kind of information that is often used by law enforcement when they can't otherwise punish criminals, or, you know, whoever, for whatever they may find wrong. Often suspects are prosecuted initially for a violation of the tax code, because it's so easy to find when someone's done something wrong on their taxes. You can then use that as an opportunity to look further into someone's activity, and knowing their financial activity means knowing their life. This is how we put Al Capone in prison, folks. So now imagine that in the hands of a political appointee who has the ability to look at any person's data, warrantless, and use it to try to find and capture information that can be used against a person, for any reason whatsoever. I'm not sure even local law enforcement would be able to receive this kind of access.
Noah: Not at this level. No, they wouldn't have access to the direct terminal. They'd have to make a request. That would go through a giant IRS bureaucracy, and then they would be given specific data, and only if there was a sufficient warrant and other checkboxes on it. But they wouldn't simply have terminal access to be able to issue these commands directly. And let me, it's not so much defending, just making sure we're being clear about this: there are bureaucrats right now, technologists in government, who have access to the system. That's how your taxes get done, right? This system is built on trust. Yeah, the system says: hey, don't use it to check up on your family members. Don't use it to check on your own taxes, right? Don't use the data you gain from this system in your private life. Of course there are rules around that, but especially with these legacy systems, they don't have the continuous monitoring necessary to check on that. It is primarily around trust. And that is why political appointees haven't been given access to it: because they don't have that reputation and that history of coming up through the agency and proving over time that they're trustworthy. This isn't a matter of whether or not Gavin is trustworthy. It is that, in lieu of other technical controls, the way agencies, and large organizations in the private sector too, achieve this is through the repeated observation of trust. And again, you, listener, might be going: well, there's got to be, like, a standard. There's got to be a spec. And the reality is that there sort of isn't. The governing law for federal IT systems is something called the Federal Information Security Management Act, or FISMA. There are others, but this is the main one. The way FISMA dominoes down to actual standards is through something called Access Control 6, AC-6, this is a real control, and it talks about least privilege.
So the operating principle of the government, and of private entities, is that organizations should employ least privilege for specific duties and systems, right? The idea is that if you have a staffer, a human being, you should only give that human being the technical access to a system needed to accomplish their job, their business function. They should not have more privileges than they need to do their job. And this is why the original EO, the Executive Order for DOGE, was so radical and potentially dangerous: it expanded that scope to, I think, the largest degree reasonably possible, right? So armed with that EO, not only could a staffer get access, but what would be subjectively determined as their job is everything, right? In order to accomplish the goal of the executive order, it is perfectly logical that you need to give them access to all the data. And now they have a piece of paper allowing them to cover their ass and have, like, air cover as to why they should get that data. I'm not saying that's a good thing. I'm just saying that is the radical departure from how things work that has been effected by the original DOGE executive order.
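The AC-6 idea Noah describes can be sketched in a few lines: grant each role only the permissions its job function requires, and deny everything else by default. The roles and permission strings below are invented for illustration and don't describe any real Treasury or IRS system.

```python
# Hypothetical least-privilege (NIST SP 800-53 AC-6 style) access check.
# Each role maps to the minimal set of permissions its duties require.
ROLE_PERMISSIONS = {
    "payments-operator": {"read:payment-queue", "submit:payment"},
    "auditor":           {"read:payment-queue", "read:audit-log"},
}

def authorize(role: str, permission: str) -> bool:
    """Deny by default; allow only permissions explicitly granted to the role."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

An "everything is their job" executive order effectively collapses a table like this into a single role holding every permission, which is the scope expansion being described.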
Tarah: The principle of least privilege is part of the basis of information security. You don't give people access to something they don't need. It's not just a question of trust. It's also that people can be fooled. People can be vectors for attack. And the fewer ways there are to access data that should not be in wide distribution, the better. The reason it's so strange, the reason this is a breaking of norms, is the thing you were just talking about, Noah: that there's a trust of employees, that they won't just go access information about people and their taxes. That is one of the controls. Trust can be a control on confidentiality, one of those principles in information security. The principle of least privilege is about making sure that we maintain not only confidentiality but the integrity of systems. When someone has the ability to alter a system, they have the ability to alter its outputs. Right now, there's nothing that stops somebody at DOGE, not just from accessing and searching and looking at the information of individual US taxpayers, or nothing will when this happens. And as you noted, Noah, there are no administrative controls, no technical controls, and no auditing over it, nothing to prevent those employees from altering the data. It would be very uncomfortable if somebody all of a sudden added a zero to my adjusted gross income from last year, and all of a sudden I had a tax bill that was overdue. How would I go about proving that that wasn't real? How would I go about proving it wasn't true, when the system that is the system of record for my country can be altered at will by a 19-year-old?
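Tarah's "how would I prove it?" question is exactly what tamper-evident audit logs exist to answer. Here's a minimal, hypothetical Python sketch of a hash-chained log, where each entry's hash covers the previous entry's hash, so silently editing an old record breaks the chain. It's illustrative only, not a description of how IRS records actually work.

```python
import hashlib
import json

def append(log: list[dict], event: str) -> None:
    """Append an entry whose hash covers both the event and the prior hash."""
    prev_hash = log[-1]["hash"] if log else "genesis"
    digest = hashlib.sha256(
        json.dumps({"event": event, "prev": prev_hash}, sort_keys=True).encode()
    ).hexdigest()
    log.append({"event": event, "prev": prev_hash, "hash": digest})

def verify_chain(log: list[dict]) -> bool:
    """Recompute every hash; any silent edit to an old entry breaks the chain."""
    prev_hash = "genesis"
    for entry in log:
        expected = hashlib.sha256(
            json.dumps({"event": entry["event"], "prev": prev_hash},
                       sort_keys=True).encode()
        ).hexdigest()
        if entry["hash"] != expected or entry["prev"] != prev_hash:
            return False
        prev_hash = entry["hash"]
    return True

log: list[dict] = []
append(log, "AGI recorded: $85,000")
append(log, "return accepted")
assert verify_chain(log)            # untouched log verifies

log[0]["event"] = "AGI recorded: $850,000"  # someone quietly adds a zero
assert not verify_chain(log)        # the alteration is now detectable
```

Auditing controls like this don't prevent alteration, but they make it provable after the fact, which is precisely what's missing in the scenario being described.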
Noah: Yeah. And the reality is, you couldn't. If you file via paper, you'd have had to maintain a photocopy and then probably have it notarized. You know, in some ways, this is a reason why, despite the fact that IRS Direct File exists and is a great piece of software, if you had a chance to use it, you might want to go back to TurboTax or something similar, just so you have another copy of that data in case you feel like you might be targeted.
Julian: Apropos of trust, one of the DOGE staffers who's been in the news is Edward “Big Balls” Coristine, a 19-year-old who has, well, a checkered history in a bunch of ways, but most notably was fired from an internship at an Arizona-based company called Path Network for allegedly sharing internal data with competitors. A lot of these folks, as reporting is showing, have in various ways dubious online histories. It really doesn't seem like these folks, who have in many cases worked at various Musk companies, have undergone the kind of vetting one would normally expect for access to highly sensitive federal systems. Tarah, in your experience, what does that process usually look like, and how does it differ from what appears to have been done in this case?
Tarah: So, Julian, I like to say that information security isn't an industry; it's a mafia trying to become a guild. Our field is still incredibly new, and people who are at mid to upper levels of leadership in cybersecurity don't have cybersecurity degrees, because those didn't exist when our field was being created. You can tell: those of us who came in through multiple side doors have degrees in fine arts, political science, English, math, electrical engineering. There's a lot of people who came in through doors that involved tinkering. So we've created our field, and the rules and norms of our field, really meaningfully over the last 20 to 30 years as a system of trust, very much that handshake trust. You build that trust with your colleagues over time by being willing to share information, by practicing principles that make you a solid member of the community, and by being somebody whom multiple people will vouch for. Now, I want to say that somebody who has been down the dodgy path can absolutely come back into information security. We can rehabilitate people who started as black hats. We do. In fact, that's a pretty common way for people to get started in my field: perhaps being places they have absolutely no business being, learning how systems work, and then going, God, I could do this better, after you get tired of watching people screw it up so much. So, RIP my 20s, and frankly my teens while we're at it too. Sometimes we had to learn the hard way. But the challenge I think we're seeing here is that not only is it incredibly difficult to gain the trust of the information security industry.
When you vouch for somebody and you say, this person is going to make a great leader at your company, you are investing much more of your personal reputation in that recommendation than would be true in some other fields, I think, because there's really no way to credential people in cybersecurity. We don't have a bar, like the American Bar Association, that tells you whether you are a member in good standing of your field. We don't have an AMA that could take your medical license away. There's nothing that stops you from practicing in cybersecurity if you have a bad reputation, other than not being given access to things that people who have earned trust have access to. You can see behind me right now, I pretty clearly have a giant chip on my shoulder about the fact that you can't be credentialed in cybersecurity. I've got as many pieces of paper slain, skinned, stuffed, and mounted on my wall as possible to tell people I'm a trustworthy member of my profession. I've put time and energy, decades of my life, into being somebody that someone would trust with systems that can kill people. That, I think, is what's bewildering, rage-inducing, and such a challenge for those of us in information security to watch: people who clearly have never put that time and energy into being someone you could trust with a system like that. Not only that, but they appear to have a deep contempt for that trust and skill and expertise. That's scary as shit to me. And that's the reason I think it's so difficult for us to watch these people editing and changing and having access to data that we all have worked so hard to protect the ancillary systems for. Now the keys to the kingdom are already gone. Why are we locking the doors?
Julian: Apparently, some employees at the Office of Personnel Management have sued over what they say is an unvetted private server that DOGE had connected to their network. The OPM network incidentally apparently connects to the Defense Counterintelligence and Security Agency, which is one of the reasons for concern there. This seems a little ironic given the extent to which a private server loomed large over the 2016 election.
Tarah: But her emails!
Julian: Yes. What kind of concerns are there about unvetted equipment being connected to networks like OPMs?
Tarah: I think the concern that we have is that this is literally how we test and break into companies and systems and government offices. I've spent a good chunk of my time in information security as something called a Red Teamer. We call it an information security researcher, which, as I've said before a couple of times in front of people who were chairs of committees, is how you say hacker in front of legislative committees. When you are somebody who tests those systems, red teaming means checking a system all the way through, not just necessarily the digital system, but sometimes even the ability to walk into a building, steal a computer, plug something else in. Plugging a server in at OPM that has access to external systems, without the appropriate technical controls, is literally a hack. It is literally what we do to demonstrate security is bad, and they just did it on purpose. That's why that's bad. It's like somebody who swears they're going to be in charge of security for the Air Force driving a plane through a fence and saying, we've just decided to make this the way planes get onto airfields now, and leaving the fence open, when all of the people who have been involved in security have been meticulously checking for decades to make sure that fences are continuous and appropriately guarded.
Noah: I want to tie in something else here to give voice to some tremendous nuance; I've spoken a lot on this issue, and on why these 120-day details under the current authority are so problematic. Many agencies haven't figured out how to do the 10 most critical things when running this assessment, and as a result, 60, 70, 80 percent of their time is spent on security theater, right? So there's a spectrum here. There's doing absolutely nothing and being wildly irresponsible, and then there's letting digital security theater take up all of your time and resources so you don't actually get to do the stuff that matters. And this is true in government as well. So it was fairly frequent, to the current day in government, to take 90 days out of those 120 days just to get your government-furnished equipment, right? And if you're talking about access to a sensitive system, that easily could be 120 days right there, right? Absolutely. Because these agencies didn't modernize fast enough, because they don't have a 14-day or 18-day sprint, right, to get you through 99% of that assessment with quality, so we can bring you in in a controlled fashion, in a secure fashion, it's the 90 days or nothing, right? So because DOGE is moving at extreme speed, they are occasionally right in diagnosing some of these symptoms, that these processes take too long and don't deliver high-quality outcomes. But because there isn't a happy golden path to revert back to, they're doing nothing, right? So these are the outcomes we're getting.
The IRS very likely has a two-week, a 30-day, a 60-day and a 90-day training program for IDRS, depending on the privilege and responsibility you have in that system. And I will bet my entire life savings that Gavin is not going to go through any of those trainings. His access to the system is not going to be contingent on going through those trainings, because of the political posturing and the nature of the executive order. So, to tie this together: it's not just tremendous disrespect for the system. That's also true. It's that, in the attempt to modernize, we didn't move fast enough. And one of the things I'm sure we'll cover in the grand history of WatchCats is why we didn't get to where we need to be, from a non-ideological perspective, to have safe ways to modernize these systems. IDRS is a particularly terrible example. Its modernization processes have actually failed twice. There was one in the 90s that failed. There was another one that failed in the Aughts. The current one, which was supposed to be done in 2013-2014, has not just failed. It was pushed out another 10 years, which they've now missed. Also, the scope has been pulled back in. The replacement for a lot of these systems was supposed to be a full system replacement: tax returns would just be processed by it. Now, only the core functions will be done by the new system. The more exotic, one-in-100,000, one-in-a-million functions the system needs to do, those will actually still be run by the classic system. Then, over time, the idea was to slowly modernize those.
Tarah: Yeah, but you have to have a path to slowly modernize something. You have to have a tested process and you have to be able to demonstrate that the new system is just slightly better than the previous system and then update everyone on the state of the world under the new system. That is at least part of the problem that we're seeing. It's part of the problem with the firings at the FAA. It's part of the problem we're seeing with the access to IRS. It's part of the problem we're seeing with attempting to modernize the connection between OPM and the Department of Defense. The OPM is the victim of essentially the most famous hack in, if not the most famous...
Noah: Another victim of it right here.
Tarah: Yeah, exactly. You got your parcel in the mail over that one. That's 20 million people applying for clearances, connected to the Defense Department, whose SF-86, that form you have to fill out so a bunch of people can look into your background and get you cleared to look at government stuff, was stolen by the Chinese because OPM didn't have modern, secure systems. Cybersecurity matters profoundly in these cases. I can, believe it or not, both believe that the IRS was doing an incredibly terrible job at cybersecurity and at updating and modernizing its systems, and think that this is the wrong way to go about fixing it. You can have both of those problems happening at the exact same time.
Noah: Exactly. More than one thing can be true at once. Before I throw it to Julian, I want to make sure we're on the record holding DOGE accountable. DOGE could have come in, especially with their new tech focus, which they pivoted to at the beginning of this administration, and said: hey, it's not a big secret what the top five impediments are to doing this the right way and modernizing at speed. Let's actually work with Congress on those. They're not working on those at all. When DOGE leaves, it is very unlikely, at the current moment, that we're going to have any of the tools necessary on the technical side to do this work better in the future. We may have what remains, the smoking, charred corpses of some political authorities that may or may not be useful, regardless of their morality or legality. Julian, back to you.
Julian: Yeah, I actually wanted to circle back to the OPM hack. We talked a little bit about some of the risks of pushing out code updates, breaking systems that however cataclysmic it might be, if people stop getting social security checks or Medicaid payments stop going out, that at least would be immediately detectable. One thing a lot of people are concerned about is that if the Children's Crusade here introduces novel vulnerabilities, that could, as I think happened in the case of the OPM breach, create attack surfaces that are exploited for months or potentially years before they're detected. In a sense, the most obvious threat is they're going to break something that is important to people's lives. But I wonder if we could look at some of the more high-profile breaches we know about, and the federal government has suffered its share. And maybe, is there something about how those were carried out and executed that can inform us about the risks of what's happening now in terms of creating novel vulnerabilities in critical systems?
Tarah: Well, the issue, Julian, is that when you have people who have to run a system, and you have to balance equities across that system, confidentiality, availability, integrity, you always end up with the possibility for one person to introduce error. It's why we always have processes to check and triple-check and quadruple-check when people are updating systems that involve people's lives. And the most important one I can think of right at the moment is the SolarWinds hack that happened in 2020, which led to the executive order in May 2021, which really spurred the creation of the Cyber Safety Review Board. And I have lots of opinions on that one too; I've given plenty. But the reason that particular hack is interesting is that it was a piece of software that was compromised by a Russian agent during its build process inside SolarWinds. And it happened, we think, because an intern set up a production server and set the password on that production server to solarwinds123. The fact that an intern would have access to a production server, that there were no checks on what was uploaded to that server, and that the software was then used on hundreds of thousands, millions of computers, including, most importantly here, at the National Nuclear Security Administration, on multiple government laptops and computers, theoretically to protect and run those laptops better, is absolutely mind-blowing. One mistake, one person, opened the door for an attacker, specifically Russia in this case, to compromise the security of America's nuclear security agency. This is, by the way, the same agency that a couple of days ago Trump fired a bunch of people from, and then it apparently blew his mind that he had, by accident, fired the nuclear safety specialists there.
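A password like "solarwinds123" is exactly what a basic deny-list check, along the lines of NIST SP 800-63B's guidance on screening memorized secrets, is meant to catch. Here's a minimal sketch; the function name, length threshold, and rules are illustrative assumptions, not any real organization's policy:

```python
def is_weak_password(password: str, org_name: str = "solarwinds") -> bool:
    """Flag passwords that a basic deny-list policy should reject."""
    p = password.lower()
    if len(password) < 12:            # too short to resist guessing
        return True
    if org_name.lower() in p:         # 'solarwinds123'-style passwords
        return True
    if p.isalpha() or p.isdigit():    # only one character class
        return True
    return False
```

A real deployment would also check the candidate against lists of breached passwords, but even this trivial check rejects the password that reportedly guarded that build server.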
This is the same agency that was impacted by the SolarWinds hack, which was so devastating to US government systems that it created the need for the Cyber Safety Review Board.
As we watch that particular hack be completely ignored by the current administration, including its impacts everywhere, I sit here and think: the consequences inside of that organization were that the CEO blamed the intern and, it's my understanding, fired him. And later on at SolarWinds, as far as I've been able to tell, the CEO was not prosecuted; the CISO was, and one other person inside the organization. But I have to check my facts on that one; it's early on Monday morning and I haven't gotten into all of my coffee, so I want to check on that one. One of the real challenges is that the process we use for software everywhere is a test and a check and a quality assurance before it ever gets pushed to production. Skipping that is like publishing the Declaration without going through a draft process, right? You're just kind of stuck with whatever you end up with at the end of that conversation. There are more pretty famous hacks in American history. I specifically specialize in WannaCry, which, yes, impacted the US a great deal. The reason it's so important is that it impacted the National Health Service in the United Kingdom and shut down hospitals there for three weeks. That's what I have spent a lot of time studying. It's personally impacted my life in a pretty profound way.
The difference between that hack, or the CrowdStrike one that impacted logistics in the US, or the SolarWinds one, and the ones that are happening right now inside the US government is that the ones we're dealing with now are just dumb. They're a single flaw in a process. WannaCry was a complex one that had a lot of really interesting elements to it. Yet the things it impacted most were the most important systems in the US and in the UK: hospitals, places that couldn't afford to have systems being updated and potentially out of service. So, does that help to start explaining a little bit about some of the systems that we've seen impacted, Julian? OPM matters a lot. But remember that when we talk about breaching systems, a lot of times the way that we breach systems is by knowing a lot about the people that run them. The OPM hack, the original one in 2014, isn't dangerous specifically because that information was stolen by China. It's dangerous because for every single one of those 20 million people who run systems, the information about their mother's maiden name and their first dog and their gambling problems and their bank accounts is in the hands of people trying to crack these systems, and it just takes one.
Noah: And one thing to clarify about that: it wasn't just people in the Department of Defense. They were probably impacted in an outsized way because a lot of them require a secret or top secret clearance, but you also need secret and top secret clearances in the civilian government. For example, the reason I was impacted by that, and my SF-86 was in fact stolen, wasn't because I was working in or for DOD at the time, but because I needed access to a certain Department of Homeland Security system to help me with cybersecurity and continuous monitoring, or at least that was the marketing of that system. Interestingly enough, do you know what that system ran on? The one I needed access to to help me protect my own systems at GSA? SolarWinds. So they were actually trying to give me access to a system that would itself be the proxy for compromise later. And you could see it easily. It was very obvious from our monitoring systems that as soon as our SF-86s were stolen, someone was trying to very quickly figure out, just from LinkedIn and other blog posts, who would likely have root access. And I absolutely saw a huge uptick in both social engineering attacks and just kind of brute force attacks on my accounts from China because my SF-86 was stolen.
What I also want to pull a thread on is that centralization of technology is always a double-edged sword. Why was CrowdStrike such a devastating failure? Why was SolarWinds such an appetizing target? It's because those systems, in order to run and provide the security and monitoring services that generate the value of having those systems to begin with, often have excessive privileges. This ties back to least privilege: you want it on the human side, and ideally, from a software architecture side, you want it on the digital side too. But that's difficult; you have to be really intentional. That takes time and effort, and sometimes it's easier just to give a software program or process very permissive settings in order for it to do its job. The more permissive the setting, and the more centralized that agent or software is across IT hardware and its integration with other pieces of software, the more appetizing the target becomes. What's happening here now with DOGE? They're centralizing technology power on the human level at the same time that they're trying to centralize technology access on the technical level. That just makes it an unbelievably juicy target. I can't imagine that the teams responsible for this in Russia, in China, in professional crime networks aren't running at this 24-7, because why shouldn't they? It's a very open question to me, given some of their political motivations, whether they'll find those vulnerabilities and sit on them, lay some landmines, lay some trojans down, and not activate them until a future administration, or whether they'll activate them in this administration. I think that's a big open question. Speaking of the exploits that the government itself can often find, hidden underground in its own systems: correct me if I'm wrong, but wasn't WannaCry's exploit based on a vulnerability that we ourselves, like the NSA, developed?
Tarah: Absolutely. The vulnerability itself was weaponized by something we call EternalBlue, which is the name for an exploit in Microsoft's Windows 7.
Noah: Windows 7, geez.
Tarah: Yeah, so Windows 7 was vulnerable for a long time, but the interesting thing about EternalBlue is that it was based upon a vulnerability that Microsoft had discovered a couple of months earlier and had already patched before WannaCry was even an issue. And WannaCry really exploded on May 12th, 2017. Ask me how I know. People in my industry know where they were on May 12th, 2017, the way people remember where they were when the Twin Towers fell or Kennedy was shot. You have visceral memories of that day because we were all on incident response. We were grabbing interns by the hair and, really, it was insane. And the nature of that attack was built around a vulnerability that was discovered and exploited and kept by the NSA, then stolen in a trove of data theft by a criminal group called the Shadow Brokers. I know a lot of people that were deeply central to that story. That's a different conversation for a different day. But suffice to say that that attack and the theft of the data from the NSA led to the NSA disclosing those vulnerabilities to the relevant companies. Microsoft quietly patched EternalBlue two months, 60 days, before WannaCry actually spread, before it was weaponized. And WannaCry is just the dumbest of all attacks. It was created by North Koreans. It was an APT behaving like some idiot kids screwing around in systems they didn't have any business screwing around in, just to see what would happen. And you know what? For those of you who saw this, it was the day that your computer screens turned red. There was a demand for cryptocurrency to unlock your systems, but the process for that cryptocurrency unlocking was non-functional. It just locked up systems.
Now, very fortunately, a young man known as MalwareTech, Marcus Hutchins, discovered a flaw in the code: it was looking to get a ping back from a domain that didn't exist, and if that domain did exist, the infection would cease spreading. He grabbed the domain, registered it, and now runs what we call the WannaCry sinkhole, which stops that infection from spreading. He still controls it. He, a very, very dear friend of mine, Jake Williams, and I have been colleagues for a while on this, and a couple of years ago we did the math and discovered that of the systems that were vulnerable to WannaCry in 2017, probably about 15 percent are still vulnerable, meaning they're still running unpatched Windows 7 right now. So this is the long tail of this vulnerability. And this came from the NSA withholding a vulnerability and keeping an exploit that is simple to use. I want you to think about the difference like this. Most vulnerability knowledge is like building a ghost gun: you have to purchase the equipment, figure out how to use it, get into the community that would teach you how to do it, get the parts, figure out how to make it work, and then build your own ghost gun. That's the thing a lot of us do. We do exploit building in information security, but even with that knowledge out there, there's still a gap between the vulnerability and the ability to build a functioning exploit. EternalBlue was handing an AR with the safety off to a 10-year-old. Ready to go. Just pull the trigger. Anybody can use it. You could use it. In three minutes, I could show you how to use it. That's it.
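The kill-switch logic Hutchins stumbled on fits in a few lines: the worm tried to resolve a hard-coded domain and, if the lookup succeeded, stopped spreading. This is a rough illustration of that check, not WannaCry's actual code; the function name is ours, and the real hard-coded domain is omitted:

```python
import socket

def kill_switch_active(domain: str) -> bool:
    """Mimic WannaCry's check: if the hard-coded domain resolves,
    the sinkhole is live and the worm halts instead of spreading."""
    try:
        socket.gethostbyname(domain)  # DNS lookup, as the malware did
        return True                   # domain exists -> stop spreading
    except OSError:                   # NXDOMAIN or no resolver available
        return False                  # domain unregistered -> keep going
```

Registering the domain flipped this check from False to True worldwide, which is why the sinkhole still matters for the unpatched Windows 7 machines Tarah mentions.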
Noah: I want to comment on one more thing, throw it to Julian, and maybe circle back to the FAA stuff, which is that the Trump I administration, I think you could very charitably say, did not have an overwhelming cornucopia of technical acumen. Again, independent of the ideological or process competency stuff. Trump I: not a lot of tech know-how at the top.
Tarah: I'm going to disagree with you when you're done. Go ahead.
Noah: I would love to hear that. Because what I'm getting at, from what I recall and what I've been reacquainting myself with here in the background, is that Trump and the White House's communication strategy during these top-line hacks was not great. I would give it a D at best, if not an F in some cases. That isn't to say I would give Biden that much better scores, especially with Salt Typhoon, I believe, as the test case. In general, I think government could be a lot better here. Trump I was particularly bad. Is there any reason to think, given the reforms that have happened at the Department of Homeland Security or DOD or in the intelligence sector since some of these top-line hacks, that we should expect better communications, more transparency, more action here? Are things better or worse or about the same?
Tarah: I think in Trump I, and this might shock you, I think that the technical acumen that was created specifically under Chris Krebs as the first head of the Cybersecurity and Infrastructure Security Agency was outstanding. Chris Krebs is not only a... I mean, I'm going to say this, and I don't want to sound like Mark Antony here or anything like that, but he's an honorable man, and he did an incredible job of making sure that it was technocracy that won in the creation of that organization. Now, there were a lot of elements of trying to get the word out, trying to make things actually happen, that are always a struggle when you're getting an agency up. But CISA, from its inception under Trump, has been by far the most respected, technically competent agency in the United States government. As far as my industry is concerned, no hesitation, head and shoulders above, and Chris Krebs started that.
The work that they did as they began communicating about election infrastructure and security, the collaboration that they did: absolutely outstanding. And it was almost accidental that it was that competent, because Trump woke up to the fact that his technologists were doing a good enough job that they could prove that the 2020 election was secure. That's the reason Krebs got fired. He built an agency so technically competent, with wonderful people who trusted him and would come work for him even under a Trump administration, that they actually collected relevant data, secured relevant election infrastructure, were able to assess and audit it competently, and determined that the election was as safe as imaginable. It was the safest American election we had had so far. Go ask Matt Blaze, a professor at Georgetown. A bunch of people did independent audits of CISA's audits and discovered exactly that. That's the challenge.
In fact, under Biden, I am not sure that we saw an appropriate continuation of all of that technical expertise. We saw a massive jump in the trustworthiness and willingness to work with CISA under Jen Easterly. They were very different leaders for... I'm speaking as highly as I am of Chris Krebs because we do have a lot of respect for him, I think, as an industry. Jen took it to the next level by integrating CISA across the country to provide election infrastructure and local cybersecurity expertise in a way the rest of the government didn't. And that would not, frankly, have happened without Trump creating CISA, accidentally, I think.
Julian: Apropos of firings at CISA: among the wave of people being placed on administrative leave and slated for layoffs are security experts at CISA and apparently elsewhere within DHS. Initially, at least, it's primarily folks working on election security and misinformation, but it seems very likely we're going to see additional cuts there. The administration is describing these as non-critical and provisional employees. Are the cuts that we're starting to see among cybersecurity professionals within the administration something to be concerned about?
Tarah: You know, yes is the answer. To elaborate at least a little bit on that: fuck yes. To continue to elaborate further: these are the people who are on the ground building the trust we're all talking about with local election officials. For people who have volunteered at a poll before, that is a moment where Democrat and Republican partisanship evaporates in the desire to protect everyone's ability to cast their vote. People are throwing themselves in front of every problem imaginable to make sure that people, no matter who they are, can vote. And that was building so much trust locally. It is very difficult not to see this as, first, retaliatory towards CISA after 2020, because Rand Paul is deeply convinced that CISA is corrupt and that the 2020 election was stolen. And on a second level, it's deeply difficult to believe that this isn't laying the groundwork to make sure that the data and trust that would show the security of American elections, based on the technical equipment we're now using, never gets accreted. It's like your dentist saying to you: I fired all of my dental hygienists, but everything looks pretty okay, and you can take care of your own teeth from now on. And you're like, but you're not doing the stuff they used to do, like the scraping and everything. And he's like, that's all right, here's an instruction manual, you're probably okay from this point on. And then a year later, he spends half as much time with you and charges you twice as much, and even though your teeth hurt, you're wondering: is it a dental thing? Have I not been doing this the right way? And you get kicked out of the office. It's difficult not to interpret that as a disimprovement in service, if nothing else. Does that make sense?
Julian: One thing we should probably clarify is when the administration says that the people it's laying off are provisional employees, I think maybe it's the reporting...
Tarah: Probationary employees, yeah.
Julian: Probationary employees. Perhaps some of the reporting on that is not as clear as it could be. Can you explain what that means?
Tarah: Sure. I'm the CEO of a small company that helps managed service providers, IT guys kind of in middle America, work with their small business clients and keep them safe and secure and compliant, and able to do things like pass the government cybersecurity checks so they can sell to federal government contractors and so forth. When I put an employee on probation, it means that they have done something wrong. However, in the government, there's this concept of a probationary employee, which just means that they're new to the job. Has anybody watched, like, NCIS, with Mark Harmon and a bunch of people, that old-school procedural television show? There was a new agent called a probie, a probationary agent. That's what I want you to think about: a probie on a police force or in a government agency, a new law enforcement officer. That's what we're talking about. We're talking about basically anybody in the first three years of their career, not people who have done wrong. In fact, as a probationary person in government, you're probably actually more likely to be following the rules, so that you can get to the point where you've actually got a long-term, secure job. What's happening is we're firing all the new talent.
Noah: Yeah. What's worse, it's completely illogical, because there is no standard for the probation time period. When an agency creates a job description and actually posts it, that is when it decides how long the probationary period will be. On average, it's about a year, but it can be longer, and agencies don't need one at all. There is no rhyme or reason to any particular staffer's journey through the probationary time period. We may have wiped out an incredible amount of talent, and it has nothing to do with actual efficiency and effectiveness in government. It is not tied to anyone's performance, or necessarily to the importance or unimportance of their job.
Tarah: I got to grab the yoke and wrench this plane around and talk about the FAA on this one because Trump over the weekend fired 300 air traffic controllers, the probationary ones. Did you all not know that?
Noah: Even after the crash?
Tarah: Yes, that's the problem. Here, I'll just go ahead and throw this link over to you folks. But be aware, Trump just fired a bushel basket of ATC employees. These are air traffic controllers, and the people being trained to take over the position of air traffic controller. As you all may recall, and as a pilot I can go into this further if you want, it was a lack of staffing at DCA's tower that very likely contributed to the problems that caused that accident a couple of weeks ago. There's no explanation for this behavior. The air traffic control profession is difficult to get into.
Julian: There's no logical explanation.
Tarah: Well, there's a logical explanation. It's just motivated by things I don't want to try to describe. The challenge here is, I'm a student pilot, Noah. As a busy cybersecurity professional, writing articles, doing some stuff on the side, it's taken me a few years to get my private pilot's license. I'm now at the point where I've done the thing. It's like I've graduated from law school and now I'm just waiting for somebody to schedule a bar exam for me, right? I've done all this stuff. I can fly the plane. Now somebody's got to sign off on me. Well, here's the problem. The FAA has a huge number of positions that are vacant or understaffed right now. And not only is Trump not increasing the budget for air traffic control to improve the pipeline and re-staff these agencies; if what he's trying to do is take the FAA private, there's a massive drop in aviation security that everyone in aviation believes we're about to experience. And the FAA has the shining example of how we fix stuff in a life-critical industry, the National Transportation Safety Board, the NTSB, the thing that the CSRB was supposed to be modeled on. And just ask me about my opinions in my US Senate testimony on that one in a minute. But the challenge we're seeing here is that those positions are going unstaffed.
There's a field north of me called Bellingham International. It's one of the ports of call if you're flying across the Canadian border. It's a littler airport. But right now, they've just decreased the number of days that a tower operator will be there. Also, that guy is cranky as hell, and I don't blame him. He doesn't have anybody coming to save him. He's kind of mean, honestly, on the radio sometimes. You can go listen to that radio. And you know what? He's tired, and probably kind of older. Right. Please don't. Oh, my God, I just told you that. But it's OK. He doesn't know who I am by my call sign. It's fine when I get up there. Bless your heart. There's no one coming to save these general aviation, smaller airport operators. And it's a fundamentally declining profession without that pipeline, which Trump just nuked all the way through. He just fired everybody in the pipeline for it. Where do you land planes? Where do we go? How are people learning this? Oh, and I wanted to get on to something else, too.
It sounds like, and I am repeating something a little more complex that the aviation community is talking about, there has been some form of announcement, or devastation, in the Oklahoma City medical certification office of the Federal Aviation Administration. Now, the medical certification office is the place you send your paperwork so they can say: yes, you are not blind, you are probably not going to have a heart attack, you do not have diabetes, you are allowed to fly a plane, here you go. There are third, second and first class medicals you can get. Obviously, if you are flying a passenger jet, you have a first class medical. Here is the problem. If your medical examiner wants you to have to talk to them and make sure something is okay, that paperwork used to take months. It was very bureaucratic and problematic to get through, but eventually you could figure it out. It sounds very much like that process has just ceased completely, at least in part because of some political moves on the part of the administration that are unnecessarily clogging up that pipeline, such as changing all of the documentation and requirements around something like HRT, very specifically targeted towards people who are women, transgender people, people who experience hormone imbalances. There's a lot of concern right now that aviation itself is going to experience a huge shortage of pilots as all of our medicals get held up in OKC. It's not just people who aren't your traditional captain-type flying an Alaska jet. There are a lot of people involved in aviation who have to maintain a medical. If those just cease getting processed, there are going to be a lot of planes sitting on the ground, and a lot of jobs impacted by that. So I care deeply about it. I could keep going, because there's a real meaningful crossover with the CSRB that I really want to talk about if possible.
Noah: Well, one thing I just want to remark on, just to connect it back to the cybersecurity side: these people apparently didn't even get the emails indicating they were fired from a .gov address. They got them from just a Microsoft 365 address called "Executive Order," right? We literally have no way to even verify whether these firings are intentional. I'm sure they are. I'm sure this is DOGE moving fast: they didn't bother to bind to the top-level domain, they just went into M365, created an email, and sent out a blast.
Tarah: Yeah, it kind of makes me want to speculate. As a person who has done this kind of security testing for a career, part of the basis of our profession is testing to make sure people's email is safe. There's a pretty convincing email address I could set up right at this moment and send to everybody at SeaTac just saying: you're all fired, go home. Because that process is so up in the air right now, how many of them would believe me? Right? Dear everybody: I wouldn't do that. I'm just letting you know. But somebody who is not me will do it.
Julian: I think maybe an understressed element is: We've been talking about the security vulnerabilities in terms of the technical systems that poorly vetted people have access to, or the commits that have not been properly tested being pushed to critical systems. But one of the things that's happening here is that the normal protocols for communication within the government, and for access, are being disrupted. So we've had a number of stories about the DOGE kids showing up and security going: “Who the hell are you?” And them saying: “Don't make me call Elon!” And I think I joked at the time, God, if I were a Chinese or a Russian spy, I would grab a teenager from the nearest high school, throw a bad suit on him, and say: let's see how many agencies you can get into saying, “I'm here from DOGE.” But partly it's because official communications are coming through unauthorized channels now. So a tweet comes out looking like it's from an official account, or an email comes from some address you don't know, but it purports to be from an authorized source. Can you talk about instances where that kind of attack surface has been exploited?
Tarah: I mean, that is the attack surface. It's not a hypothetical thing. That is the attack surface. We spend all of our lives as cybersecurity experts trying to get users not to click on suspicious shit in their inboxes. And now the suspicious shit is the stuff that impacts their jobs. We're never going to be able to run a meaningful phishing exercise ever again. Now, those by themselves are problematic for a lot of different reasons. But I want you to think about the number of people whose organizations were impacted in terms of security, who experienced data breaches, because three months into the COVID pandemic they received an email that said: hey, additional payouts for people affected by COVID are available here. This is your payroll. I know, we just spun this system up to try to make this work for you, but we're going to issue a bonus this week of $1,000 to help you purchase equipment at home. Do you know how many of those scams succeeded? It'd be great if we had a Bureau of Cyber Statistics on that one, certainly something that many people in the community have been asking for, for a long time. Adam Shostack, Steve Bellovin, Ed Felten, a bunch of people in US government and in the community have been asking for a reliable collector and disseminator of statistics, like a CDC for cybersecurity. Certainly go look at Adam Shostack's work on that and the public health of cybersecurity. But we don't have a way to know. I can just tell you anecdotally, just like Noah knows that the breach of his SF-86 led to increased phishing attempts and social engineering attempts, I can tell you anecdotally that businesses failed as a result of having their data stolen when people fell for those scams. And now we're training people to open email from suspicious sources. It's too easy, right? This is a playground for computer criminals. So this is a real challenge.
Julian: Yeah. I want to note, incidentally, a story I just remembered that is probably illustrative, in a nutshell, of the sort of unintended consequences here: apparently there was a kind of executive order demanding that a bunch of left-inflected terms be purged from federal websites. If you've got a page that says climate change, a page that talks about racial equity or transgender people, pull those pages down from public-facing websites. Apparently, one of the terms deemed lefty and verboten was “privilege.” A memo went around at NSA, and apparently at other agencies with cybersecurity responsibilities, explaining that they needed to pull down their public-facing pages discussing “privilege escalation.”
Tarah: That's correct. And not only public-facing pages, but internal agency documentation as well. That is so much worse. Absolutely. When we in the field talk about how we attack systems, the most commonly used term is privilege escalation. It's our jargon for “I got root on a system; I escalated privileges.” That's the name for the attack that gets us into systems. Every piece of documentation describing how to do a thing in information security offensively, and how to prevent privilege escalation defensively, would have been nuked from internal documentation at NSA. The kids are not all right.
Noah: I love that the free speech boosters in private industry, when they join government, immediately turn to Newspeak levels of control. Let's circle back and wrap up with what you wanted to talk about: the Cyber Safety Review Board, or the CSRB. Just to set it up for our audience here, this isn't like Underwriters Laboratories. This isn't like UL, the people who make sure your toaster doesn't explode and melt your face. What does the CSRB do? What is its purpose?
Tarah: The CSRB, as I said before, was created in the aftermath of SolarWinds. There was an executive order in May 2021 that directed that a board be created to look into incidents. It even specified that it be modeled on the National Transportation Safety Board. As a pilot, I am personally responsible for reading and understanding a book of regulations about yay thick called the FAR/AIM: the Federal Aviation Regulations and the Aeronautical Information Manual. The regulations are updated every single year, and sometimes more rapidly than that. But I have the responsibility to stay updated every year and to abide by those regulations as a pilot. We know those regulations matter because they get updated when the NTSB engages in an accident investigation, determines the cause of it, and figures out whether there was pilot error, whether there was an equipment failure, whether meteorological conditions were involved. That board is made up of experienced engineers, pilots, plane builders, people who spent years and years learning not only about aircraft, but also how to investigate those crashes. You're talking about somebody who maybe spent 20 years as a pilot and then 10 years as an accident investigator being appointed to the NTSB full-time. The board itself consists of just five members, and there's much more than that in terms of staff: additional personnel are brought in to examine what happened when an incident occurs. People nearby, the pilot involved in the incident, the air traffic controllers, the local airspace, all the relevant information is brought in, and a blameless investigation occurs until the NTSB can figure out what the cause of that crash was. The reason it matters so much that these people be dedicated, full-time specialists is that there's no politics involved in this process.
The reason it matters that these are technical specialists is because we in aviation believe them when they say something was the cause of an accident. We know how to prevent accidents caused by something we call CFIT, controlled flight into terrain, because the FAA has given us procedures to minimize those risks. You can never completely eliminate the risk of crashing a plane. You can minimize it to the point of near impossibility, but eventually something's going to go wrong if you don't have safe conditions. That is very much what happened in Washington, DC a couple of weeks ago. The airspace got too stuffed. There was an understaffed tower. Anyway, suffice to say, the NTSB began that investigation very quickly. Now, the challenge with the CSRB: last year, in January 2024, I, along with a couple of other people, testified in front of the US Senate committee called HSGAC, the Homeland Security and Governmental Affairs Committee. This is the committee that, for instance, Gary Peters, Richard Blumenthal, Josh Hawley, Maggie Hassan, and several other senators serve on, and they provide oversight of the activities of the Department of Homeland Security, under which the CSRB was constituted. One of the problems we really had was that the CSRB was basically constituted of government officials who had other full-time jobs and didn't know anything about cybersecurity.
Noah: Well, that second part seems important.
Tarah: There are a couple of really great exceptions to that. Rob Joyce, the former Director of Cybersecurity for the NSA: absolutely top-notch human being, would trust with national security 100 percent, five out of five stars, no notes, incredible person. Katie Moussouris: extremely accomplished technically and very, very good at explaining how cybersecurity impacts things. But a lot of the other folks, we're talking about, say, an Assistant Director at the FBI who sometimes does counterterrorism stuff and who has a tech unit reporting up to them. And anecdotally, a lot of the time these guys just sent a deputy to listen in on stuff. When it came to investigations, they also never investigated SolarWinds, right? They chose to investigate something called Log4j at the beginning, which wasn't an attack; it was just a vulnerability in an open-source logging library from Apache. Log4j was what we like to call a patch-your-shit vulnerability. There's no complexity there. The thing that we want to know is: how did a whole-ass process break down, and where do we improve process in the future to fix the risk we can fix? Not all of it, but what we can fix. So what we want is investigations into something like Equifax. Everybody remember in 2017, when all your credit data was breached? That, by the way, was blamed on one nighttime contractor who failed to patch one thing in Apache Struts, who was then fired by the company, and that was their internal remediation for it. 147 million Americans lost their credit histories, and it was blamed on one person not updating one machine. That's what we talk about when we say it's not just one guy's fault. There's no single point of failure in that one. I want to know the whole-ass process that led to a single machine having that much control, that led to outdated software sitting unpatched, that led to somebody untrained operating it. That's what we need.
And we need that from the NTSB, and they give us that. And we believe the NTSB when they tell us that something has gone wrong and there's a series of issues that need to be addressed, in order. Right now it's really difficult to have that same conversation about CSRB results. There was a pretty good investigation of Microsoft that came out last summer. There were certainly improvements over the two previous investigations they had done. But the CSRB in its life did three investigations, from early 2022 to when it was fundamentally disbanded by President Trump. He fired everybody on it, basically, who wasn't a Republican, I think.
Noah: So this is interesting. I didn't realize that. So instead of withdrawing the executive order that created it, that executive order is still in force.
Tarah: Correct.
Noah: Rather, there's just nobody working on the CSRB anymore.
Tarah: There's no one, as far as I can tell. The independent and technically capable board members have all been fired. So the challenge we're really having here is: how do we find out what's true about cybersecurity in this country? How do we know how many small businesses have been impacted by phishing attacks over the last five years? How do we know whether multi-factor authentication is a more effective method than asset inventory management for preventing data breaches in a medium-sized enterprise in this country? That's the body that should be doing investigations that give us an understanding of the real state of the world. Right? When the FAA tells me that I need to follow a regulation because it's going to massively reduce my risk, I believe them. Can I believe a lot of what's coming out now?
Noah: Well, one thing we should flag is that while the CSRB may have had a rocky start, it was showing signs of improvement. One reason why the NTSB has been so insanely and wildly successful over the decades, right, why we have the safe aviation sector we have, is that the NTSB has subpoena power. And that was one thing, I'm realizing, the CSRB did not have. Because that power needs to come from the legislature, right? Under traditional presidencies, the president can't wave a magic wand and give an entity subpoena power unless Congress does something first. And Congress had yet to give the CSRB subpoena power. Is there anything we should have spoken about today that we missed, to provide context or nuance on how DOGE is working with, or working around, tech in the public sector?
Tarah: I think the thing that I would try to explain to somebody who's not in tech and not in policy, and I actually tried to explain this to a family member a week ago: I said, you know, there's some real concerns about DOGE taking over the US Treasury. And my family member kind of laughed: DOGE isn't, Elon Musk isn't taking over the US government. You've got to calm down with all this hyperbole. And I realized in that moment that on a lot of levels, we in security and in technology are like doctors who have worked our whole careers to try to keep the people around us healthy. We're giving the best information we know how, and it's always being updated. And every once in a while, a patient comes back and says: but doc, last year you told me that eggs had too much cholesterol, and now you're telling me they're part of a healthy diet. We go: hey, our knowledge has been updated, but I'm still providing you the best information I know how, to keep you healthy. And we're going to get better as doctors over time at that. I'm going to keep updating you, because it's my responsibility to make sure that the people I'm responsible for are as healthy as possible, with all the knowledge and all of the guts and heart that I've got practicing my profession. And it's as if the American Medical Association all voted to be replaced by TikTok influencers. And now all of the information coming to my patients is about how a cayenne water cleanse and olive oil colonics and breathing oxygen are going to replace chemotherapy and pushups. And you can just do stuff that makes you feel good. And there's not really any way to know for sure. After all, remember how doctors are always telling you something a little bit different every single year? You shouldn't trust them.
We are having this experience right now where in cyber security, it's like we've watched this alien creature take over our profession and take over the decision-making processes of all of the patients that we've been working so assiduously to keep updated and healthy for decades and decades. And I don't know that I can overstate the devastation so many cybersecurity professionals feel right now.
There's a real level of hopelessness, and I want to both acknowledge that and counteract it by making sure that people who hear this, whether you're a cybersecurity professional who feels really alone right now or you're just a person sitting here thinking, God, it's not real, there's not a takeover of the US Treasury, I want to make sure that you understand that this is not only unprecedented, it's something so far outside the good practice of our profession, something so potentially damaging, that we never expected the value of protecting this information to come under question. There needs to be a real reckoning, and you need to at least begin asking the questions. If what you want to do is your own research, do it. But I want you to ask yourself: if you watched someone who was clearly unqualified start to run a system, or try to repair a system, that impacted the lives and health of people you cared about, wouldn't you at least want to make sure that they cared enough to try to do it the right way? Then listen to the people telling you that there's a real problem in the process, a problem that is both happening and not happening right now. We feel a sense of responsibility and care toward the people we try to protect, even when the information that we are digesting and passing along as cybersecurity experts is complicated. We update it, we give you analogies, we try to simplify it, but we feel that care. I desperately want to know, and I fear the result of finding out, whether or not the DOGE kids running the US Treasury software systems right now feel that sense of responsibility to the people they are supposed to be protecting.
Julian: That is fantastic, Tarah. Thank you so much for taking the time to join us and sharing your expertise with our audience. We look forward to, unfortunately, I suppose, requiring your insight in the future.
Tarah: I look forward to it. I absolutely look forward to it.
Noah: Tarah, where can people find you online? Do you want people to find you online? That's debatable.
Tarah: Come find me. What the hell? I'm @Tarah pretty much everywhere. Come find me on Bluesky. I'm at Tarah on InfoSec Exchange on Mastodon. LinkedIn is probably a great place to find me. I think I'm @TarahWheeler. Look, you're going to see my profile on LinkedIn, and I'm going to look like I've got a problem with you and with the world. That's me, red hair, and a problem with the world. If you want to reach out, please do so if you're a managed service provider, and you're struggling with what's going on right now, and how you explain to your small businesses why they should still keep the payroll data of their internal users safe when the government isn't doing it. I'm happy to have that conversation with you. Just reach out, redqueendynamics.com.
Noah: I think with that final statement, Tarah cut to a vital heart of this, which is that so much of this system, this approach, this discipline was based on norms, and those norms themselves are based on values. Now we're about to find out what happens when that foundation stone of values is yanked out from the bottom of the Jenga tower. One of the topics we talk about with Tarah is the access of a political appointee to a critical piece of digital infrastructure, just as a general principle. It's not something that happens very often. Thomas Shedd, the Commissioner who runs the Technology Transformation Service at GSA, the institution I used to work at as Infrastructure Director, asked for admin access, root access, to Notify.gov. Notify.gov is a shared service of government that gives government digital systems that don't already have a way to text people directly a platform to do that. Without reason or justification, he has asked for direct admin access, root access, into this system. This is an outright violation of FISMA, the Federal Information Security Management Act, which we talk about in the upcoming interview with Tarah. There are very few things that the actual law tells you are in violation of cybersecurity principle. In general, FISMA and all the other associated cybersecurity laws give the government incredible flexibility to determine its own security and privacy policies. Very few things are explicitly prohibited by the law itself. This is in general a standard pattern in modern government, whatever the domain, whatever the scope: very rarely does Congress specifically say, you shall do X on Y. They indicate a general goal, and then they let the agencies figure out how to approach that goal.
Julian: Which of course makes sense because cybersecurity best practices and standards evolve as the ecosystem evolves, and it's generally undesirable to hard code into statute, anything very specific about what those best practices should look like.
Noah: What is important about FISMA is the one thing it says in black and white, with abundant clarity and directness: after you establish what your security policy and procedures are, what the acceptable use of the system is, you must follow them. And if you do not, or if you believe there is an imminent threat of them not being followed, that is a security incident. And FISMA is specific about what happens when a security incident occurs. In this case, we have the Commissioner of TTS asking the Notify.gov team to specifically violate their own security policy and acceptable use procedures. And if it's an incident per FISMA, which this demonstrably is, it has to be immediately reported to the United States Computer Emergency Readiness Team, or US-CERT, which is where the US government collects and analyzes all incoming threats to US digital infrastructure. A federal judge yesterday denied an effort by 14 states to block Elon Musk and DOGE from accessing data systems or making personnel decisions at multiple federal agencies. And the TLDR of this is that it's not enough for there to be a possibility of harm, right? There needs to be proof of harm. And a lot of that proof of harm, or proof that they're actually violating the law, requires people to come forward with data, come forward with information. If a law does not have a statutory penalty and you can't prove harm, what happens, if anything? There is no FISMA jail, right? There is actually no, last time I checked, and I read this two days ago, FISMA fine, even.
Julian: Right. In principle, if there's no specified penalty, you could just get a court order, you know, ordering them to discontinue the prohibited conduct. And then the court using its inherent authority could impose contempt penalties for disobeying the order.
Noah: In that world, I wonder how much flexibility judges have on what the contempt fines or consequences are, right? Like, they can't haul somebody into jail, I'm assuming.
Julian: In principle, they can, sure, to compel compliance. If someone is refusing to comply with a court order, the court can, in principle, order someone to jail until they agree to comply. Question is, is any court really going to be willing to order the jailing of a senior administration official? Would that do any good if they can just be replaced with someone who will also not comply? It seems not that probable to me that we would see a judge willing to do that, given how successfully Trump and folks in his orbit have slow walked or just declined to comply with court directives in the past without facing particularly serious consequences. I'm not hugely optimistic about that kind of big gun actually being deployed.
Noah: So, it seems from my research that US states have laws that set minimums and maximums, under certain circumstances, for how long a judge can throw somebody in jail for contempt. At the federal level, it looks like there are no limits; it's totally up to discretion. And the laws giving the courts these powers are quite old, in fact.
Julian: Right, and there's a difference between contempt as a penalty for misconduct and contempt to compel compliance. In general, the phrase you often hear is that the person who is held in contempt to compel compliance “holds the key to their own cell.” So, the idea is, look, someone who is jailed to compel compliance can walk out any day they like, they just have to comply with the court's order. And so, in a sense, a lot of these normal kind of due process limitations on that kind of authority are lessened because the principle is essentially, if this person decides to stay in jail for weeks because they don't want to comply, that is, in essence, their decision, rather than the court arbitrarily deciding some span of time is going to be imposed. But again, it seems extraordinarily unlikely to me that we're going to see a court that is prepared, just given the track record, to take that kind of step of saying, well, either the head of this agency or Elon Musk is going to find himself in a jail cell pending compliance with the court's order.
Noah: I'm perhaps more bullish on that. I think when the violations are so clear and simple, and given that we are in a new environment, like yes, you are absolutely not wrong, Julian, in terms of the trajectory and backwards looking analysis. That is all spot on, right? But I think we're in a very different world now. Whether or not the judicial system will awaken itself, I think there are some indicators it's starting to, to the attacks on its constitutional authority writ large, right? And how those systemic authorities kind of, as we're discovering with Congress, only survive as long as you exercise them, right? Like sure, you have rights and authorities and powers. The reality is, if you don't use them, it's not really clear that you do.
Julian: I mean, the other problem here is, is, all right, suppose a judge is willing to say, well, you know, this official is going to be jailed to force compliance with a court order. Who's taking them to jail?
Noah: I was going to, yep, you beat me to it.
Julian: Because it's executive branch agencies that are the ones that essentially have to, at the end of the day, put a gun behind the court's orders. And, well, the executive branch can say, no, you're ordered not to execute this court order. Of course, I mean, that would launch us into full blown constitutional crisis. But at this point, I don't think we can rule out the executive saying that, which I think is an additional reason. Courts are likely to be extraordinarily cautious here, because I think they don't want to, in a sense, force a scenario where the executive plays that card, and the courts are essentially forced to confront that they have no inherent actual executive enforcement mechanism. They're dependent on the executive itself to, not just to comply with its orders, but to carry out physically the penalties they impose.
Noah: Yeah, because what are the pathways here? It goes to GSA: hey, you're clearly violating FISMA. Don't. They stop, because it's just not worth their time. That's honestly probably the highest-probability outcome, right? Yes, they want this admin power, but given everything they're working on, is this what they're going to trigger the constitutional crisis over? No. But let's say, especially with what we've been seeing this week, they go: you know what, we're going to max out everything. That goes up to the head of GSA. I don't think he's going to do anything. Then that goes to the president. The president says: no, go kick rocks. I'm not telling them to stop, right? Yeah, you're right, I don't think the courts are going to necessarily push somebody into contempt, even civil contempt. Because it turns out there's two different kinds of contempt, civil contempt and criminal contempt, and this would clearly be civil contempt. One reason that would be more likely, at least, even if I don't think it's going to happen, is that the burden of proof is not beyond a reasonable doubt; it's simply preponderance of the evidence. And if you're under civil contempt, everything you were saying about holding the key to your own freedom is true. So instead, what would normally happen, what the system, what the Constitution expects to happen, right, is that Congress goes: hey, hey, that's our law. We passed that, right? That was us telling you to do something, president. And you have a chance to do it now, because we're telling you to do it too. And if you don't, well, we're going to try and impeach you. But under their impeachment powers, it's only treason, bribery, or other high crimes and misdemeanors.
Julian: Those are antiquated terms. “High crimes and misdemeanors”: “misdemeanor” there means misconduct, not misdemeanor in the modern legal sense of, oh, you were jaywalking. And “high crimes” is, again, a sort of antiquated term referring to crimes that involve abuse of power. So a high crime is the kind of crime that a high official is uniquely situated to commit, by dint of having access to a position of power and being able to deploy that power corruptly.
Noah: I don't think this would pass muster as a high crime. And as for whatever they understood the word misdemeanor to mean at the time, it's never defined or specified in the Constitution.
Julian: That's the subject of- These are effectively political concepts. What is a high crime and a misdemeanor? I mean, essentially, whatever Congress decides rises to the level of warranting impeachment. It is basically within the purview of Congress itself to decide what kind of conduct qualifies. And I think that's very much by design, right? There isn't- You're never going to have a court come in and say: “No, Congress, you're wrong. This doesn't rise to the threshold for impeachment.” That's just Congress' domain to decide.
At some level, I think we all recognize that this is somewhat academic, that given the current composition of Congress, Donald Trump would really have to strangle a toddler on live television to actually be impeached and convicted by the requisite majority of the Senate.
Noah: Yeah. I think that's absolutely accurate.
Julian: I mean, one of the questions I've been ruminating on these past couple of weeks is, when we talk about “efficiency,” the Department of Government “Efficiency,” efficiency for whom, and toward what end? It strikes me that, whether or not this is part of some grand plan, it is at any rate a side effect: one lens through which to view efficiency is that the more stripped down and streamlined the bureaucracy is, the smaller the total number of people who need to be prepared to comply with directives for the government to continue functioning, at whatever level one finds acceptable for it to function. One of the trends we're seeing, as we discussed in our episode with Henry Farrell, is that the structural changes are, in a way, about centralizing control, in a way that requires less engagement with layers of bureaucracy, layers that include people with their own reputations and their own concerns about legal liability, who can throw a wrench into the process by saying: wait, wait, wait, I don't know if I can carry out my part in this.
Noah: Efficiency for whom? Let's answer that from the perspective of a person who definitely thinks he's the main character of this story. And it's perfectly reasonable for us to take him at his word. In the one and only X Spaces livestream that Elon has done since the inception of DOGE, which was about a week, week and a half ago now at this point, who can tell? A week and a half ago, 80 days, it's all compressing into the same perceptual environment right now. But on that stream, he said, and said repeatedly, with a few members of Congress on the line, that if he could wave a magic wand, he would simply invalidate not just the staffing of the administrative state but all regulations, 100 percent of regulations, remove them all at the same time, let society run for a few months, and then, as things break and problems arise, Congress would rearticulate more specific laws or give regulatory power back to the government to reinstate some number of those regulations. This is an incredibly radical perspective. It cannot be articulated within the context of Republican thought of the late 20th or early 21st century. It can't even really be found within MAGA circles itself. Not really. The only place it can really be found is in the neo-reactionary movement. And this, I think, is best explained by the positioning of incredibly wealthy, incredibly powerful, incredibly influential tech oligarchs who are now associated with Trump, with the Republican Party, with MAGA, in a way we didn't see in Trump 1. And I think the philosophy of the neo-reactionary movement is something we're going to be covering a lot more in the future, to help explain why they're saying things like that.
Julian: It's frankly bizarre, the extent to which the wealthiest people in Silicon Valley have had their brains captured by, I mean, really kind of absolute crackpots with blogs, many of whom we both read 15 or 20 years ago and sort of said: wow, what a nutter. But you have people with billions of dollars at their disposal taking this stuff absolutely seriously. I mean, maybe this is why it would be ideal if people with that level of wealth had studied some philosophy and history and political science in addition to technical subjects, because that kind of background tends to make it pretty clear that these folks are rambling cranks.
Noah: I'll put it out there. I know I, and you, would have to manifest quite a lot to make this actually appear in the world, but I'll put it out there right now: I wonder if one of the biggest impacts we could have here on WatchCats over the next six to 18 months, just to pick an arbitrary time period, is an ongoing series or a short documentary on the neo-reactionary movement. Because I think there are a lot of stories there that actually make it pretty well understood, but it's not all in one place, with an easy-to-understand beginning, middle, end, and so-what. And I think that's a hole that needs to be filled, probably sooner than later. In the meantime, the best way to support us is with a five-star rating on Apple Podcasts or your podcast app of choice. We're not picky. We saw we have a ton more subscribers than we have ratings. If you're enjoying the show and you want to see us make more of it, drop a rating and share the show with your friends.
Julian: Until then, I'm Julian Sanchez.
Noah: And I'm Noah Kunin, and this is WatchCats.