Friday, July 6, 2012

Some Thoughts on "Secret Sauce"

There was a lot of interesting information exchanged at the SANS DFIR Summit, but one of the phrases I keep returning to in my mind was one used by Chris Pogue in response to a question posed during his "Sniper Forensics" class.  Chris posited, quite rightly, that companies are loath to share their "secret sauce".  As much as we as individuals are a community, we can't overlook the fact that many of us are employed by competing businesses.  And for those of us who have worked IP theft cases, it is obvious that Intellectual Property is a Big Deal.

That's quite a Catch-22.  The community as a whole needs people who are willing to share, but people in the community, by virtue of being employed in the field, can't share the "secret sauce".  What to do?


I think that at least part of the answer is easier than we are making it out to be.  Many people recognize the necessity of a company holding on to its own intellectual property.  But what actually constitutes "secret sauce"?  Could there be information to be shared that benefits the community while not harming, or perhaps even helping, DFIR employers?  I would argue that the items discussed below fall into this category:


Testing Results of Industry Tools
Tools are an important part of any examination.  While examiners should be aware of how data is stored natively, parsing vast amounts of information without an automated process would be patently silly.  Simply stated, we need tools.  And sticking with just one tool can be detrimental, since I know of no single tool that does everything needed for a complete exam.  The answer is to have a robust toolkit.
Testing your tools should be par for the course for an examiner, but with such a wide range of tools and data sets, the chances are high that testing cannot be done for every scenario.  This is where having a community all testing and running tools can be a great benefit.  A group of people running tests in different environments and with different data sets will cover far more ground than any one individual.  The trick is that the results need to be shared in order for us all to reap the rewards.
But how does an employer benefit in this scenario?  Beyond getting a more robust toolkit for their DFIR teams, having tools well known and accepted in the community can help testifying experts on the stand.  And most companies I know like the idea of getting a good, respected product, especially if it is easier on the wallet.

Industry Research

It isn't news that our field changes rapidly.  And staying up on the newest artifacts, data repositories, and gadgets is quite a daunting task, even for a whole community.  Some of the biggest DFIR advancements made in recent years came from people down in the trenches who started looking into something and then shared their results.  I honestly believe that the sharing of individual research is vital to keep our field moving forward.
But how does this sharing benefit an employer?  For one, employees who share knowledge can establish themselves as subject matter experts, and a name (either personal or company) attached to solid research can be invaluable.

Okay, so now that I've detailed what I think isn't "secret sauce", is there anything left over that is?  Certainly there is.  The steps of a highly planned, orchestrated team approach could be one example.  A report template could be another.  But even if these are examples of "secret sauce", it's the details that make them so, not the fact that they exist.  Recommending to the community that they create their own versions, without necessarily giving out yours, is yet another way of sharing.  (A perfect example of this is the "Sniper Forensics" presentation by Chris Pogue I mentioned earlier.)

But these are just the ramblings of one person out of many.  What are your thoughts on "secret sauce" and what it is or isn't?  Please share the ingredients so we can make our own non-classified community sauce.

6 comments:

  1. I believe the talk at the summit was about IOCs and how those constituted 'secret sauce'. I wanted to scream at how terrible an example that was, as most IOC data is trivially obtained by even a novice malware analyst, or even an automated sandbox.

    If a company thinks highly enough of IOCs to guard them closely, it probably means their analysts and research are really weak and they have nothing else to go on.

    I also thought it was funny when he said clients would actually consider who has what IOCs when choosing a company to use. I hope that was something made up quickly on the spot and that he doesn't actually believe it.
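    For readers outside the malware world, it may help to see what the "IOC data" in question actually looks like. The sketch below is purely illustrative: every hash, domain, mutex, and registry key is a made-up placeholder, and the matching function is a deliberately trivial stand-in for what a sandbox report or scanner would do.

```python
# Hypothetical sample of the kind of IOC data a sandbox run surfaces
# automatically. Every value below is a made-up placeholder, not a real
# indicator from any actual malware family.
sample_iocs = {
    "md5": "d41d8cd98f00b204e9800998ecf8427e",    # hash of the dropped binary
    "c2_domain": "update.example-malware.test",    # callback domain
    "mutex": "Global\\FakeMutex123",               # runtime mutex name
    "run_key": r"HKCU\Software\Microsoft\Windows\CurrentVersion\Run\updater",
}

def overlaps(observed: set, iocs: dict) -> bool:
    """Trivial matching: does any observed artifact equal a known indicator?"""
    return bool(observed & set(iocs.values()))

# An analyst comparing artifacts from an infected host against the list
print(overlaps({"update.example-malware.test"}, sample_iocs))  # True
```

    The point the commenter makes stands on its own here: nothing in that structure requires deep expertise to produce, which is why treating it as closely held "secret sauce" seems odd.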

    1. I think sometimes the decisions on what can and can't be shared are made at a higher level than those of us attending the conference. I'm starting to believe that part of the solution is educating companies on the benefits of having their employees be a bigger part of the community.
      I'm going to cop out here and not take sides on the issue of sharing IOCs, mostly because I believe that both sides have valid points. I'm going to go sing songs around a campfire now... :)

    2. There is a counterintelligence reason to consider when timing the decision to release an IOC. Releasing an IOC alerts the adversary that their method has been detected. I want to be reasonably sure the release of that information will not hinder my incident response/mitigation. We have to weigh that release of information to the adversary against the potential benefit to the community as a whole when deciding at what point to release an IOC's details. It is not just a matter of not wanting to release the details.

    3. I understand your point, Mal, but that rationale for delays was never discussed at the summit. All that was mentioned was 'competitive advantage', as if companies count IOCs before choosing a firm for IR or similar.

  2. nice posting. thanks for sharing

  3. A bigger problem that I see is not so much something as simple as an IOC, but the techniques software vendors use to obtain their results. For example, many analysis platforms provide a carving feature, but unless we know what algorithms are used to carve, we cannot determine how effective they will be, at least not without a lot of black-box testing. The same goes for mobile phones. A vendor might claim to be able to extract data from x model of phone but not provide any information about how that is actually achieved. This may mean the examiner has no option but to blindly trust the vendor when they say their tool is forensic.
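    To make the carving point concrete, here is a minimal sketch of one common approach, signature-based carving: scan for a known file-type header and footer and cut out the bytes in between. This is not any vendor's actual algorithm; it is an assumed, simplified illustration of the technique, and the limitations it bakes in (no fragmentation handling, no embedded-file handling) are exactly the kind of detail an examiner can't evaluate when the algorithm is undocumented. The JPEG start (FF D8 FF) and end (FF D9) markers used here are real; the toy "disk image" is fabricated filler.

```python
# Minimal sketch of signature-based file carving. Real carvers are far more
# sophisticated; this exists only to show what an undocumented "carving
# feature" might be doing under the hood.
def carve_jpegs(data: bytes) -> list:
    """Return candidate JPEG byte runs found between header and footer markers."""
    header, footer = b"\xff\xd8\xff", b"\xff\xd9"
    carved = []
    start = data.find(header)
    while start != -1:
        end = data.find(footer, start + len(header))
        if end == -1:
            break  # header with no closing footer: nothing more to carve
        carved.append(data[start:end + len(footer)])
        start = data.find(header, end + len(footer))
    return carved

# Toy "disk image": one JPEG-like run embedded in filler bytes
image = b"\x00" * 8 + b"\xff\xd8\xff\xe0JFIFdata\xff\xd9" + b"\x00" * 8
print(len(carve_jpegs(image)))  # 1 candidate file recovered
```

    Even this ten-line version raises the questions the commenter is pointing at: what happens with fragmented files, false-positive signatures, or footers inside other data? Without knowing the vendor's answers, black-box testing is all we have.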

    In my opinion this is one of the strongest arguments for open source software.

    On the other hand, developing a technique to acquire one model of mobile phone may take months. A product's ability to deal with that phone is then a market advantage, and the vendor should be rewarded for the time taken to develop that solution.
