It was reported recently that a security researcher found several exploitable vulnerabilities in a FireEye product. ‘I tried to work with them,’ he said, but he was apparently rebuffed or ignored, so: here you go, a 0-day. There are at least three sides to every vulnerability disclosure story, so I don’t particularly care about who said what when. What we all should be concerned about is the law that applies to all software, regardless of what it does for a living. That law?
Functionality trumps security.
People don’t think twice when a random commodity software product is found to have some horrendous vulnerability that makes it look like its code was produced by a band of monkeys rejected from the Shakespeare project, but when holes are found in code meant to keep your enterprise safe, that’s news.
It shouldn’t be.
I’ve been involved with enough security software projects to know that even the most security-minded people want their stuff to work first; only then do they lock things down. I don’t know that there is such a thing as a ‘secure developer’; there are just developers with varying levels of concern about security and different ideas about when that concern should be addressed. That any security product has holes in it should not be a surprise; the surprise is that disclosures like this are not more common.
In fact, I would not be surprised if the last portion of the year saw an increase in the number of flaws in security products being revealed publicly, with a corresponding increase in the level of hype. Much of that hype will be justified because – to draw on a popular security analogy – if someone sells you a brick wall, you expect it to withstand a certain level of physical damage; you do not expect to find out that key bricks are actually made of papier mâché.
Does that make the security company that sold you the software negligent? Well, does it work as advertised? Yes? Then the answer is probably ‘no.’ Remember: security products are not silver bullets. EVERYTHING you use has holes in it, and you need to prepare and respond accordingly. You don’t terminate your workforce because people are demonstrably the weakest link when it comes to security; you manage the problem and the associated risk. The same should be true for ALL the software you run, regardless of what it does for a living.
I know enough legacy-Mandiant people to know that they go to work every day trying to do the right thing, and this latest development is just another example of how thankless computer security is (regardless of who you work for). Like the philanderer who didn’t use Ashley Madison pointing and laughing at the guy who did, the hypocrisy factor is going to go through the roof. My suggestion: save your self-righteousness and channel that energy into tightening your own work and helping tighten up the work of others. Demonstrate that you’re about security, not about being famous.