Monday, March 21, 2016

More on Purple Teaming

I wanted to add a bit more context and explanation on Purple Teaming after publishing the Ruxcon slides, and after some Facebook and Twitter interactions on the topic.

What is Purple Teaming?

Currently, there are as many definitions for Purple Teaming as there are talks and blog posts on the subject but I'm going to throw mine in as well.

Purple Teaming is "conducting focused pentesting (up to Red Teaming) with clear training objectives for the Blue Team."

The clear training objectives (aka a plan to eventually get caught) for the Blue Team are what differentiate Purple Teaming from typical Red Teaming. By its very nature, Red Teaming is making a HUGE effort not to get caught. You are pulling out all the tips & tricks and big boy tools specifically to avoid detection. With Purple Teaming, you have a plan to generate an alert or event if the Red Team goes undetected, so the Blue Team can test their signatures and alerting and exercise their incident response policies and procedures.

It isn't a "can you get access to X" exercise; it is a "train the Blue Team on X" exercise. The pentesting activities are a means to conduct realistic training.

A couple of practical examples:

The Blue Team has created alerts to identify Sysinternals PsExec usage in the enterprise. At some point the Red Team would use PsExec to see whether the alerts fire and whether the Blue Team can determine which hosts were accessed or pivoted from using PsExec. The Red Team could also make use of the various PsExec alternatives (winexe, Metasploit's psexec module, Impacket, etc.) so the Blue Team can continue to refine and improve their monitoring and alerting.
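For context on what such an alert keys on: by default PsExec installs a service named PSEXESVC on the target, which surfaces as a service-installation event (event ID 7045) in the Windows System log, and clones like winexe and RemCom use similarly recognizable service names. Here is a minimal sketch of the matching logic a Blue Team alert might implement, assuming the events have already been exported as dicts; the field names and function name are my assumptions, not a reference to any particular product:

```python
# Sketch: flag PsExec-style lateral movement from exported Windows System
# log events. Event ID 7045 = "a service was installed in the system".
# The dict field names (event_id, host, service_name) are assumptions
# about how the logs were exported, not a real log schema.

PSEXEC_SERVICE_NAMES = {"psexesvc", "winexesvc", "remcomsvc"}  # PsExec + common clones

def flag_psexec_events(events):
    """Return service-install events whose service name matches a known PsExec-style tool."""
    hits = []
    for ev in events:
        if ev.get("event_id") != 7045:  # only service installations
            continue
        if ev.get("service_name", "").lower() in PSEXEC_SERVICE_NAMES:
            hits.append(ev)
    return hits

if __name__ == "__main__":
    sample = [
        {"event_id": 7045, "host": "ws01", "service_name": "PSEXESVC"},
        {"event_id": 7045, "host": "ws02", "service_name": "GoogleUpdate"},
        {"event_id": 4624, "host": "ws01", "service_name": ""},
    ]
    for hit in flag_psexec_events(sample):
        print(hit["host"], hit["service_name"])
```

Note that this simple name matching is exactly what the alternatives break: Metasploit's psexec module randomizes the service name by default, which is why rotating through the different tools forces the Blue Team to keep improving the detection rather than relying on one static signature.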

Another scenario would be where the Blue Team manager feels the team has a good handle on the Windows side of things but less so on the OS X/Linux side of the house. The manager could dictate that the Red Team stay off Windows infrastructure, both to identify gaps in host instrumentation and network coverage for *nix-type hosts and to force incident response on OS X or Linux hosts.

Another example could be to require that the Red Team not use freely available Remote Access Trojans and frameworks such as Metasploit or PowerShell Empire. Instead, the client could ask that the Red Team purchase (or identify a consultancy that already uses) something like Core Impact or Immunity's Innuendo, or find a consultancy with its own custom backdoor to spice things up.


Other Purple Teaming resources (in no particular order):



Anonymous said...

I think the idea of purple teaming is great. It promotes a better teamwork approach, rather than "us against them" when you have red & blue teams. I feel it also offers excellent benefit to the defense, as they learn more about what the offense is doing...and vice versa.

Anonymous said...

This is all Red Team. It came from the army, and the army uses it to run all kinds of adversary simulations.
Red Team exercise goals may vary, but it is still a Red Team exercise.

CG said...

I disagree. I was on the Army Red Team. We never gave two craps about training other people on how to catch us. There was a "Blue Team" in the same organization, but they focused more on teaching vuln management, and there was only high-level work on syncing the two missions.

Not sharing our craft is also partly why we have defenders not getting better, or the ones that are getting better throwing TONS of money at the problem. As red teamers, we come in and beat the shit out of people, and they are left trying to figure out what they should do, change, or fix to detect and respond better next time. Very rarely did I put anything on detection in my reports. Even more rarely did I know if I was caught along the way, or what their NSM stack looked like, so that I could make a useful observation to the client.

I'm sure there is some magic consultancy that does this for every client, but I don't know who they are. I have to go off my own experiences. In my experience, determining what the Blue Team needs work on and crafting attacks to force them to train on those topics has a positive outcome.

Anonymous said...

I completely agree with CG. I can't tell you how many times I asked how "agency" red teams gained access, to no avail, even when we went to the Senior Executive and General Officer level. It was pathetic. I questioned what their mission was...

Anonymous said...

I've been in the army in a combat unit and am now in infosec. We used Red Team to name the adversaries and Blue the defense. The purpose of the exercise was always to train both sides, even as the goals changed each time. What I am saying is that I agree with what you wrote about the need for tailored exercises and their value. I disagree with naming it Purple.

And no magic consultancies; I know of three that do this. The only trouble, as I see it, is that clients are mostly new to this, and you need to help them see why it is good for them.

CG said...

I was in for many years and read all those manuals too. I see what you are saying, but you are repeating shit you have heard and not talking about the reality in the real world. Or, to be more fair, the reality that I've seen and currently live in. I can't speak to your work reality because you've given zero details.

My reality is that Red Teams (in general) don't help the Blue Teams aside from throwing a smackdown on them in a "real world situation". This has tremendous value, but it is distinctly different from having objectives for the Blue Team to work on. Yeah, yeah, the military does this some, but I've seen it rarely in the commercial world.

You disagree with calling it Purple, but propose nothing else. You know of at least three consultancies that do this but refuse to name them. You're bringing little to this discussion and, thus far, nothing original. I've heard/read all of the above a hundred times.

However, if you'd like to actually continue this, feel free to email me or post under a real account, because I feel like you are on the cusp of having something valuable to impart on this subject but just haven't yet.


dre said...

Figured I'd chip in even if my views aren't your favorite. There is an interesting perspective that the TrustedSec team is engaging with --

However, I'm still sold on the terminology here --

There are a few inefficiencies with that terminology, some of which I commented on in that post. A vulnerability assessment is a counter deception practice. Vulnerability management is a totally different concept, one that should go away, especially with tooling moving towards cve-search, VulntoES, or future-relevant concepts in data engineering.

Purple teaming is not possible because red-teaming analysis (RTA) is the only possible technique that can be performed by analysts in an intelligence-led cyber operations, cyber defense, or cyber investigations program. RTA is a table-top exercise at best. However, adversary simulation is possible -- this is what most people refer to as purple teaming. If we really want to get technical with the terminology, I'll concede that purple teaming is possible, but it must be table-top, not active. Any red-team engagements must be done by external forces, not OPFORs in an active environment. Another potential exception to this rule is the MITRE-standardized Cyber Exercise, which includes components of what you possibly describe as purple teaming, but, is, in fact, definitely defined as a Cyber Exercise with its unique and already-standardized terminology. Why reinvent the wheel or why knock the wheel off and replace it with a hoop and say it's a wheel? It's a wheel.

CG said...
This comment has been removed by the author.
CG said...

Background on the deleted comment: I had a few questions about definitions in Dre's comment, then found the answers in the MITRE documentation, so I deleted it.


CG said...

"Purple teaming is not possible because red-teaming analysis (RTA) is the only possible technique that can be performed by analysts in an intelligence-led cyber operations, cyber defense, or cyber investigations program. RTA is a table-top exercise at best."

You are going to have to explain this more. I read the MITRE docs, and a table-top is basically an intellectual exercise with no real traffic being generated or alerts created. I think you are trolling me on this one...

dre said...

@ CG: the winterspite link explains that one cannot perform their own red team engagement, but one can perform an adversary simulation. If you look at the sections on Simulation and RTA in The Analyst's Cookbook Vol 2, you'll see direct mappings to my use of the terminology. I'd argue that winterspite is incorrect only in his use of Vulnerability Assessment and Adversary Emulation.

In other words, yes, Purple Teaming is EXACTLY that: a table-top exercise (no traffic, no alerts) similar to the one seen here --

If you are running actual traffic and alerts, this is referred to as an Adversary Simulation, a simulation, or a war game. In this situation, what you (and others) often refer to as red team is actually called the OPFOR, not "the red team" because they can't be a red team. Being a red team is reserved for special situations, namely red-team engagements (i.e., outside parties), RTA activities which usually result in intelligence products (i.e., FINTEL), and/or MITRE-standardized Cyber Exercises.

If you want your mind blown, the winterspite phrasing of Blue Team is excellent. My comment on his blog about vulnerability assessment is another mind-blowing event in the works.

No, I'm not trolling you.

gabeleblanc said...

Who gives a shit about proper 'terminology', and fuck MITRE, people give them way too much credit. I was an operator on the USMC Red Team, then the Army Red Team (CG was actually my replacement), and created a certified Red Team at SPAWAR. I now run the CNDSP side of the house, aka the Blue Team. I have created a 'Purple' team comprised of members of both the Red and Blue Teams, hence Purple.

Red Teaming is a function of CND, always was in the InfoSec sense; I'm not talking Marcinko stuff here. Red exists and emulates real bad guys to make Blue better. When Red writes a report, we (Blue) chop on it to add the IR/defensive stuff that can/should be done. The more we raise our game, the more they have to raise theirs. I teach my Blue Team red TTPs on a weekly basis. They then take that training, run it in our production environment, and turn to our data to find it; if nothing's there, that drives the creation of an indicator. Bottom line: Red working WITH Blue, as opposed to against, makes the world a little bit more purple. #purpleteam