It has been a lot easier to pull teeth than to extract sense and meaning from the parade of data points we’ve been treated to in the last 15 months on Whatever We’re Calling This Thing. (Russiagate, Muh Trump, Deep State Attacks – pick your cultural pop-reference of choice.)

But we’ve reached the point at which two threads I’ve been presenting converge.  It’s time to lay out why they do, and what it means.

The latest evidence of a leak from inside the White House clarifies the moment.  What’s important about the leak is that it happened.  After all these months of vigilance, it still happened.

If you simply assume that that’s because neither Trump nor John Kelly is competent to run a staff, there’s not much I can do for you.  The media have been doing their best to promote that as a theme, after all.

But if you’re willing to consider that the leak probe, in fact, originates with people who were there before Trump took office and that the “leaks” may not be occurring the way most of us mentally imagine them, the whole picture looks different in a very significant way.  It looks coherent.

Trump has had two big problems in his national security apparatus since he took office.  They both map to the same cause.  One is that there is a career establishment, some of it very entrenched, that has been opposing the policy priorities and intentions of Trump and his top aides.

The other is that some of the same “career establishment” people have been participating in a campaign to promote that opposition publicly, whether through attacks on Trump and his personnel (i.e., on their character, intelligence, etc.), through leaks of classified intelligence, through “leaks” of things that aren’t even true (such as the allegation that Trump exposed highly sensitive Israeli intelligence to the Russians), or simply through speaking out in direct opposition to the president’s policies in public – something they would have been excoriated for doing to Obama.

Trump had to starve the State Department into a condition of de-staffed paralysis to make it incapable of sabotaging him that way.  If we recognize, as I believe we can, that Trump has all along regarded H.R. McMaster as a place-holder at the National Security Council, and that he’s been waiting to gain control of it himself, instead of having it snuggled under his arm as an adversarially-staffed viper waiting to bite, the timing of his change of national security advisers makes perfect sense.

Trump is ready to take control of his own foreign policy.  The latest leak may well have helped accelerate the McMaster-Bolton transition.  But that wouldn’t be because McMaster himself is a snake in the grass.  It would be because McMaster’s tenure was a compromise for Trump: a measure to keep his most venomous opponents in a comfortable stasis.

Beyond his foreign policy, however, Trump feels ready now to take the initiative and get control of his NSC staff.  I think it is accurate to say he understands that’s where the “leaks” problem is.  And even if he doesn’t know all the ins and outs, I suspect he has a system-level understanding that his leaks problem comes from the same set of conditions that enabled the Obama administration to come up with its “insurance policy” against a Trump presidency.

The short explanation is that there is a mechanism made available by the capabilities of the U.S. government, which, if used improperly, can be deployed to spy on, preempt, and defame a target, up to and including the sitting presidential administration.

Two important features of that mechanism are the converging threads I spoke of earlier.  One is technology that took a quantum leap in automation and pervasive availability under the Obama administration.  The other is an organizational application that we now know came together in the Obama administration’s final six months, with the “Russia” task force that was set up in August of 2016.

Information technology

The quantum leap in automation had more than one element.  I identify three major ones: the IT upgrade of the White House Office (along with the coincident emphasis under Obama on IT upgrades in other agencies of government); the streamlining and homogenization in particular of the U.S. intelligence community’s IT apparatus; and the spread of cloud computing in government IT – again, particularly in the intelligence community.

It’s worth taking a moment to make a couple of points about the Obama emphasis on upgrading IT, and the modernization of the White House Office.  There’s an underappreciated story behind it, which started with “fixing” the healthcare.gov website that failed so spectacularly with the Obamacare roll-out in 2013.

We won’t rehash the story here.  The gist is that the ad hoc team of IT workers assembled to fix healthcare.gov went on to supply most of the main figures in Obama’s White House cyber team: the one he formally created in March 2015 to replace the older consortium of multiple agencies that had procured and operated IT infrastructure for the White House Office.

From a larger perspective, one of the most significant aspects of this change wasn’t even specifically about the White House Office and its IT arrangements.  It related instead to the Office of Management and Budget’s supervision of IT strategy for the federal government as a whole.  By giving his White House IT team a charter embedded in that OMB role, Obama was able in the last two years of his administration to deploy the IT team, on a basis of agile ad hockery, and implement rapid IT transformations in several agency cases (including immigration documentation, refugee vetting, a VA application process, and borrower information for student loans).

Basically, his team bypassed the normal, lengthy design and procurement process to install “fixes” that reportedly delighted federal workers and customers, and saved money to boot.

There’s no need to doubt that this move did some good.  But it takes a lot on faith to skip procedural steps that were intended to guard against such dangers as having your IT system used against you.  And Obama’s White House IT team did exactly that: skip procedural steps.  By early 2016, Congress was calling them out on it.

Notably, one of the early victories of the incipient Obama IT team was shifting components of the healthcare.gov website’s functionality to cloud computing.  What cloud computing basically means is that instead of hosting all the data and applications on your own “enterprise” backbone, you, as an organization, contract for “cloud” services from a company that makes the computing power and data storage available on demand.  You save money by not having to maintain and manage the infrastructure.  You can also get upgraded capabilities faster, because someone else is making it his job to offer them as a service.
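To make that transaction concrete, here is a minimal Python sketch of the cloud-storage model, using AWS’s boto3 client as a stand-in for whatever service a given agency contracts.  The bucket and object names are invented for illustration; this is the shape of the arrangement, not anyone’s actual system.

```python
# Minimal sketch of the cloud model: the organization holds no storage
# infrastructure of its own. It hands objects to a contractor's service
# and trusts the contractor with them. Bucket and key names are
# hypothetical, for illustration only.
import boto3

s3 = boto3.client("s3")  # credentials come from the environment

# The "upload" side: the organization pushes a record to contracted storage.
s3.put_object(
    Bucket="example-agency-records",
    Key="enrollments/2016/applicant-12345.json",
    Body=b'{"name": "...", "status": "pending"}',
)

# The "download" side: any authorized account can retrieve it later,
# from anywhere the provider's endpoints are reachable.
response = s3.get_object(
    Bucket="example-agency-records",
    Key="enrollments/2016/applicant-12345.json",
)
record = response["Body"].read()
```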

Your data and organizational operations, on the other hand, are exposed to an outside cloud contractor.  Trust is the most essential element in the transaction.

This is why there was initial resistance from the Department of Health and Human Services to embracing cloud-enabled operations for healthcare.gov.  The website was going to be processing a whole lot of private information about Americans.  But the ad hoc fix-it team pushed ahead and counted it a major success to realize the gains in agility and affordability from shifting some elements of healthcare.gov to a commercially-contracted cloud.

We can consider this an informative example of the Obama team’s approach.  For context, keep in mind that the Obama intelligence community had already (in 2013) decided to embrace cloud computing for its highly-sensitive information, a year before the healthcare.gov fix-it team was pushing it for the Obamacare customer interface.

There was never much reporting on the Obama IT team’s activities outside of industry-focused media.  Most Americans probably haven’t been aware that the IT infrastructure of the White House Office underwent a significant overhaul in 2015 and 2016.  The description of its outdated decrepitude in 2014 certainly suggests that the improvements were past due.


But for our purposes here, the important point is one you probably wouldn’t think of, unless you were thinking about leaks.  The automation and digitization of the White House Office systems didn’t make it harder to leak information from the White House.  It made it easier.

Upgrading the infrastructure to 2016 standards made it compatible with automated means of transferring information, in ways it had not been before.  Where once it might have literally required being in the presence of outdated recording equipment and locally stored records to get hold of leak-worthy information, it could now be done through modernized, connected systems.  Digital telephony alone made a watershed difference in that equation.

Suspend judgment for a moment on whether a given administration would want to gather intelligence on the next one.  The point here is that in 2014, such intelligence-gathering would have met with formidable IT obstacles.

By 2016, the new WHO infrastructure had knocked the obstacles down.  It was no longer literal air gaps and analog equipment that stood in the way of moving information on the sly.  It now required proactive attention to security, to prevent the possibility of information being moved on the sly.

Making his first calls to world leaders in 2017. (Image: Screengrab of AP video, YouTube)

The intelligence community transition

There are three reasons the IC’s IT transition matters to our story.  One is that the increasingly streamlined, automated IC infrastructure reached into the White House (e.g., in the Situation Room and on the NSC staff).

The second is that it transformed the way an important process was accomplished: the process of unmasking the identities of U.S. persons in the National Security Agency database.

I have written about this at length before (see my explainer here from August 2017), and refer you to the earlier posts for full understanding and verification.

The points to keep in mind are that unmasking is no longer a procedure-intensive operation at the point of the user transaction itself and that the key controls on it, for practical purposes, are the permissions associated with users’ accounts on the intelligence community’s IT systems.

As a refresher: someone like Susan Rice who wants to unmask a U.S. person is very unlikely to be literally performing the operation herself.  Trusted members of the NSC staff would be empowered, through their user-account permissions, to unmask identities, invoking Susan Rice’s credentials as an unmasking authority.

It has been possible to a limited extent to perform computer-automated unmasking since the early 2000s.  But in the period after about 2011, the system design and connectivity were put in place to make the capability much more broadly possible, although not necessarily available.  The vision that guided Director of National Intelligence James Clapper was the axiom “tag the people, tag the data” – meaning that users’ access to cells of information should depend on their account permissions, and how those permissions matched the security tags on the data.

That concept, plus or minus some level of detail, is the way to understand what it took to unmask U.S. person identities (USPI) as the streamlining and automation of the IC’s IT infrastructure progressed.
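As a toy illustration of that axiom – not the IC’s actual implementation, just the logic of it – here is how account permissions and data tags gate access in such a model.  All tag names and accounts below are invented.

```python
# A toy model of "tag the people, tag the data": access is granted when
# the tags on a user's account cover the security tags on a cell of data.
# Tag names and accounts are invented for illustration.

def can_access(account_permissions: set, data_tags: set) -> bool:
    # The account must carry every tag the data demands.
    return data_tags <= account_permissions

# A staffer's account, provisioned with a hypothetical unmasking permission:
staffer = {"TS", "SI", "UNMASK_USPI"}

masked_report = {"TS", "SI"}                     # report with identity masked
unmasked_identity = {"TS", "SI", "UNMASK_USPI"}  # the underlying USPI

print(can_access(staffer, masked_report))      # True
print(can_access(staffer, unmasked_identity))  # True: the gate is the
# account's permissions, not a human review at the moment of the query.
```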

It’s essential to understanding the issue, in fact.  It’s why, as the standardized desktop environment for IC users exploded from 9,000 users in 2014 to more than 50,000 in 2016, the number of unmaskings exploded as well.  Here, as a reminder, is a picture of that unmasking explosion:

(ODNI transparency report graphic)

In the absence of automation and widespread availability of a common IT environment, there’s no way the level of unmasking could have increased as it did.  The intel agencies literally couldn’t have processed so many unmaskings on a multiple-humans-in-the-loop basis.  Automation was essential to the morally meaningful development: the dramatic increase in unmaskings, without apparently commensurate justification.

The cloud

The same principle – that technology fostered a morally meaningful development – is true of the third reason why the intelligence community’s IT arrangements matter.

Cloud computing (again, refer to the August 2017 explainer for more detail) did something exceptionally important in this drama.  Making an IT cloud available, for users to manipulate data in, enabled users to perform certain functions in a less closely watched computing environment.


Someone unmasking a U.S. person’s identity in the NSA database can’t make that action invisible to NSA, which is always watching like a hawk.

But on the other hand, the NSA administrators of the database, and of its integrity and use, don’t necessarily know what’s being done with the USPI after it has been unmasked.

And it’s what is done with the USPI that makes the unmasking lawful or unlawful.  NSA, making scheduled reports on unmasking to the FISA court, can account for the number of unmaskings, and which accounts performed them under what authority.  But it can’t guarantee what purposes other agencies have unmasked USPI for.  That’s up to the other agencies. (Handling unmasked USPI is governed by Executive Order 12333; see the explainer post.)
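A hypothetical sketch makes the limits of that accounting obvious.  The log fields and records below are invented, but the structural point holds: the audit trail captures who unmasked and under what authority, and says nothing about downstream use.

```python
# Hypothetical sketch of what an unmasking audit log can and cannot show.
# The agency can tally who unmasked, under whose authority, and how often;
# the log has no field for what was done with the identity afterward.
# Field names and records are invented.
from collections import Counter

audit_log = [
    {"account": "nsc_staffer_1", "authority": "APNSA", "target": "USP-0041"},
    {"account": "nsc_staffer_1", "authority": "APNSA", "target": "USP-0042"},
    {"account": "usun_staffer_3", "authority": "USUN",  "target": "USP-0041"},
]

# What the scheduled reports can account for:
by_authority = Counter(entry["authority"] for entry in audit_log)
print(by_authority)  # Counter({'APNSA': 2, 'USUN': 1})

# What they cannot account for: downstream purpose and handling --
# which is exactly what makes an unmasking lawful or unlawful.
```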

The IC cloud, contracted with Amazon Web Services in 2013 and implemented (as “C2S”) from 2014 to the present, is the component that would have enabled users to manipulate USPI out of sight of the auditing functions performed by NSA, and by local networks run as part of the CIA backbone.  The IC cloud isn’t hosted or managed by IT staffs at the NSC, or at ODNI, or the Defense components, or dedicated back rooms at client agencies like the FBI, DEA, or State Department.

The IC cloud is hosted and managed at an Amazon Web Services facility in northern Virginia.  To audit the IC cloud, keystroke by keystroke, you have to start there.

As laid out in the August 2017 explainer, the IC cloud is required by federal standards to be auditable.  But that’s not the same thing as saying that it is routinely audited for the potential use or storage of U.S. person information, unmasked and imported from the NSA database.


One of several massive Amazon Web Services data centers in northern Virginia, providing cloud computing services to government agencies. (Image: Google Street View)

And if USPI were shared among users in the cloud – potentially without the security tags that are supposed to follow it around – and then deleted, that would complicate any auditing that was done.

The threads converge

That, in broad strokes, is the summary of the technology migration that made it possible for unmasking and leaks to both become so prevalent.

That’s one thread.  The other is the collaboration thread, embodied and symbolized in the special “Russia” task force that was set up by Obama at the highest level of government in August of 2016.  Although that task force ostensibly had a legitimate purpose, its own members have expressed great disappointment that it never really did anything – at least not in the service of its official reason for being.

But there are two key reasons we need to focus on that task force.  One reason functions as a pivot point from the technology thread, because the two major effects we are ultimately talking about – surveillance of the Trump team, and leaking – depend on data-sharing and collaboration.

They depend on data-sharing and collaboration within a group, certainly, but those functions also have to be arranged, on the leaks side, with persons outside the group.  In either case, the cloud is the ideal mechanism for the transactions involved.

What makes it ideal for either purpose is that you can drop things in it, and someone else can come along and pick them up, without ever having to “meet” you detectably in any sense.  Inside the cycle of auditability – if there even is one – the cloud can function as a virtual and unwatched dead drop for the exchange of information.
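For illustration only, here is what that dead-drop pattern looks like in code, again using S3-style object storage as a stand-in.  The bucket, object name, and payload are all hypothetical; the point is that the writer and the reader never connect to each other, only to the storage service.

```python
# Sketch of the dead-drop pattern: two accounts transact only with the
# shared storage service, never with each other. Names are hypothetical.
import boto3

s3 = boto3.client("s3")
BUCKET = "shared-workspace"        # hypothetical shared cloud space
DROP_KEY = "scratch/tmp-7f3a.bin"  # innocuous-looking object name

# The drop: one account writes the payload and walks away.
s3.put_object(Bucket=BUCKET, Key=DROP_KEY, Body=b"payload ...")

# The pickup: a different account, at a different time, reads it --
# and deletes it, leaving little for a later audit to reconstruct.
payload = s3.get_object(Bucket=BUCKET, Key=DROP_KEY)["Body"].read()
s3.delete_object(Bucket=BUCKET, Key=DROP_KEY)
```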

We can assume that the members of the “Russia” task force in 2016 weren’t trying to hide things from each other – although we are told they took exceptional precautions to avoid the ordinary awareness level of workers on the NSC staff (e.g., shutting down a routine video feed from the Situation Room when the task force’s meetings were occurring).  The task force was, apparently, under orders to hide its activities from other people, even in the Obama White House, who were not “read in.”

But facilitating leaks to the outside – ultimately to the media, although not necessarily to them alone – could also be a function of the cloud.


Remember, the IC cloud is hosted and managed at an AWS facility miles away.  The question is not whether we know for sure that no information was ever dead-dropped to an outside party this way.  The question is whether we can guarantee that it couldn’t possibly be.

And that cannot be guaranteed.  Only a fool would say it couldn’t happen, and that it needs no looking into.  Don’t forget that the same man who owns Amazon Web Services owns the Washington Post.

The necessity of collaboration

This much, we have already explored in earlier articles.  But now that Trump is moving to secure control of his own NSC staff, it’s time to highlight the final reason why we need to focus on the 2016 Obama task force, in light of these converging threads.

The task force is so significant because its creation demonstrates something crucial about arranging surveillance of Team Trump – and using it for political rather than legitimate national security purposes.

The crucial reality is this.  To do that without leaving a flashing neon trail of wrongdoing, it would take the collaboration of multiple agencies.

This is something that has been visible to me almost from the beginning (call it February of 2017).  Initially, in fact, although I knew the FBI had to be involved to some extent, I was inclined to be more suspicious about the NSC and ODNI staffs.  (I have never thought it likely that there was complicit involvement by people from NSA.)  I thought, in the beginning, that any FBI or DOJ role was probably minimal.

In the year since, we have learned that DOJ and FBI personnel appear to have been right in the middle of it.  But we are in some danger now of over-focusing on their participation, and forgetting the extensive roles played by others: John Brennan and the CIA, James Clapper of ODNI, Susan Rice and the NSC staff, and even the State Department, including John Kerry and Samantha Power, the latter of whom was the ambassador to the UN at the time her credentials were used to authorize hundreds of unmaskings of USPI.

We can speculate in each of these cases about political motivation and level of enthusiasm for the collaborative effort of spying on Trump.  But the actual bottom line is that it wasn’t possible to do it and keep it effectively under wraps, without the involvement of all of these parties.

Obama meets with national security principals in the Situation Room in 2014. (Image: The Obama White House)

The reason is simple.  It takes such collaboration to circumvent the safeguards built into our surveillance laws.  Each party has functions it can properly perform, but which it is not supposed to go beyond.  And only one party, out of all the ones named above, can actually evade a true, meaningful audit of its activities, for long enough to keep this kind of enterprise going.  That party is the one at the pinnacle: the NSC (and implicitly, the Executive Office of the President itself).

The task of putting Trump and his associates under surveillance was broken into two major parts: gathering the data, and processing it.  The only agency that could lawfully gather the data was the FBI, under the DOJ’s authority.  And the FBI had to justify gathering the data to the FISA court.

Of equal importance: once it had gathered the data, the FBI was on a clock to do something lawful with the data – and would be detected pretty quickly doing something unlawful with it.  There’s a reason why the shocking spreadsheets of USPI were done for Susan Rice, and not for James Comey or Loretta Lynch.  They also weren’t done for John Brennan or James Clapper.

There may or may not have been an element of interest, enthusiasm, or profound complicity in Susan Rice’s very central role.  But there was an element of necessity.  She was the only one who could commission such spreadsheets, and the lawless analysis they represented, without inevitably awakening the alarms of career professionals with consciences and integrity in her organization – people from whom such activities could not long be hidden in agencies like the FBI or CIA.

Significantly, the NSC also had staffers whose user permissions on the IC’s IT system allowed them to perform unmaskings beyond the “two-hop” circle from an FBI target like Paul Manafort or Carter Page.  The FBI has such staffers too – but they owe integrity reports to the FISA court, and are at much greater risk of being audited for their actions.
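As a toy illustration of what “two hops” means in practice – the contact graph below is invented – a simple breadth-first walk shows how the permitted circle around a target is computed, and why anyone three hops out lies beyond it.

```python
# Toy illustration of the "two-hop" rule: starting from an approved
# target, analysts may reach contacts (one hop) and contacts-of-contacts
# (two hops), but no further. The contact graph is invented.
from collections import deque

contacts = {
    "target": {"a", "b"},
    "a": {"target", "c"},
    "b": {"target", "d"},
    "c": {"a", "e"},
    "d": {"b"},
    "e": {"c"},  # three hops out: beyond the permitted circle
}

def hops_from(graph: dict, start: str, max_hops: int) -> set:
    """Return everyone within max_hops of start (breadth-first)."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, depth = queue.popleft()
        if depth == max_hops:
            continue
        for neighbor in graph.get(node, set()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, depth + 1))
    return seen - {start}

print(hops_from(contacts, "target", 2))  # {'a', 'b', 'c', 'd'} -- 'e' is out
```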


A reconstruction

If I had to reconstruct how the surveillance scheme worked, I would suggest the following, based on what we know today.

In 2015 and some portion of 2016, targeted surveillance of Trump probably relied for its core of cueing and relational unmaskings on the surveillance warrants for Paul Manafort.

As 2016 wore on, the need was foreseen to refine the focus – to be more effective without having to cast a wider net – and keep surveillance going.  A likely basis appeared to be one that emphasized a particular connection: between a Trump team member, however peripheral, and Russia.  The handiness of this probably evolved naturally, to some extent, from the FBI’s prior encounter with Carter Page in 2013, as well as the familiarity of many of the DOJ, FBI, and State Department players with Russian organized crime.

It was an obvious hook, and Trump’s own business connections with Russians were only a small part of that.  Eventually, it became the justifying narrative for the whole enterprise.

Between the surveillance authority for Manafort and Page, the FBI could justify pulling a lot of identifying information on hundreds of people via the NSA data trove.  But the scheme had the larger utility of cueing others with unmasking permissions (e.g., staffers of Susan Rice, John Brennan, and Samantha Power) as to which discriminators they should use to most effectively unmask additional persons, beyond the ones the FBI could justify.  The staffers could put the information about those additional persons in circulation among the members of the collaborative surveillance enterprise, which included the FBI.

In other words, Susan Rice’s staff wasn’t just fishing when it created spreadsheets for her.  It was probably acting on cues derived from the FBI surveillance of Carter Page, and was in a feedback loop with the FBI.  But what was done with that information had to be done at the NSC staff.  The FBI couldn’t be caught doing it.

Neither could the CIA, of course.  It isn’t clear to what extent Brennan’s staffers were involved in the onward processing of unmasked USPI.  The same is true for Samantha Power’s personnel.  I tend to think Power’s unmaskings were done – probably either at State or in New York – because it was convenient to have such a comparatively unwatched node for unmaskings.

But for all of this activity, the obvious vehicle for follow-on collaboration at the worker and analyst level would have been the IC cloud – because no single organization was on the hook for auditing what people were doing in it, with respect to orchestrating FISA “surveillance,” or observing the data handling requirements of E.O. 12333 (which should have governed what was done with USPI in Susan Rice’s office, for example).

The existence of the interagency task force afforded top cover.  But it was flimsy cover for the long run, since some of the analyst-level activities contravened what the agencies were supposed to be doing.  Hence the need for the effective anonymity and forgetfulness of the cloud.

The interagency organization remains crucial, however.  The CIA and State Department, whatever other roles they played, were central to the effort to justify the all-important surveillance authority, which had to be obtained from the FISA court because there was no way to avoid the transactional record created by pulling data in the NSA’s IT systems.  Those data retrievals – the actual method of most modern “surveillance” – put the FBI on the hook to remain justified under the proper authority of the FISA court.

The CIA and State Department were both integrally involved in bolstering the Steele dossier, the central basis for justifying surveillance.  The CIA contributed supposed additional intelligence from foreign official sources, while State forwarded the “investigative research” done by Clinton crony Cody Shearer, as purportedly separate, corroborating information to back up the Steele dossier.

John Brennan’s task force of “several dozen analysts from the CIA, NSA, and FBI,” assembled in August 2016, became the analyst group behind the intelligence community assessment that was briefed to Congress in early December 2016, and presented in an unclassified form to the public in January 2017.  That assessment laid out the bones of the narrative that was used to justify the surveillance enterprise – and that turned into the basis for the Mueller investigation.

Turning the “resistance” into the “persistence”

The very last piece of this puzzle brings all the elements of the two major threads together.  What we know about what I’m about to outline is that every bit of it is possible.  We just don’t know if it happened.

How do you take the collaboration that may well have been the real enterprise of the Obama “Russia” task force in 2016, and keep it going in 2017?

First, you keep the surveillance authority on Carter Page going.  Beyond that, you need users at the NSC staff and the FBI (at a minimum) who still have the permissions to perform unmaskings, if necessary.  Such users will have to limit their use of unmasking permissions, because on the NSC staff, in particular, they no longer have the proactive cover of senior officials.  But they can probably still manage some amount of activity, at least in the first few months.

With Dan Coats and Mike Pompeo in the DNI and DCI roles, and Nikki Haley as UN ambassador, it is virtually certain that any unmaskings being done in their offices ceased as soon as their predecessors were gone.

But the other activities would enable a continuation of the surveillance enterprise at some level.

Meanwhile, the ability to leak the fruits of that surveillance enterprise would depend, in the simplest scenario, on the same element it would take to leak things from the new president’s Oval Office: someone – probably on the NSC staff – who has visibility on what’s going on in the White House, and can also move information outside of it undetected.

A digital pipeline, if it’s the right one, could actually be the least detectable way of doing that.  The individual wouldn’t necessarily need a digital backdoor into the systems that serve the president – although that would be handy.  It’s probably not meaningless, in that regard, that White House CISO Cory Louie was summarily dismissed in February 2017.  We just can’t be sure what that dismissal meant.

An alternative to digitally transferring data created in a presidential IT system would be entering it by hand into a system that could move it to its destination.

And as long as no one is alerted to the possibilities it represents, the IC cloud is the obvious vehicle for that.  It reaches into the White House, numerous NSC staffers have accounts with access to it, and its other end is in a faceless data facility in northern Virginia.

It’s also possible, of course, for an NSC staffer to meet face to face with long-time acquaintances and pass information on that way.  The acquaintances could then make the actual contacts with media reporters.  Perhaps that’s all that has been going on.

But in light of everything we have learned up to now, assuming it’s all that’s been going on is not the smart or prudent approach.  It took a team effort of government officials and a lot of technological evolution to get us to the situation we’re in now.  It’s more likely, not less, that the persistent leaks map to the same model and set of circumstances.

Leaks that continue, serving the same narrative that has been driving the “resistance” train from the beginning, are unlikely to be suddenly coming from freelancers.  We know enough now to be looking in the right places.

Cross-posted with Liberty Unyielding