Today, Microsoft confirmed an unpatched remote code execution vulnerability in Windows XP and Server 2003. The flaw was first reported to Microsoft on June 5th by its discoverer, Google employee Tavis Ormandy. Microsoft was forced to confirm the unpatched vulnerability today because Ormandy published the details yesterday under the banner of Full Disclosure, just five days after reporting his finding to Microsoft.
Now, I certainly do not consider myself an expert in the security field, but I am someone responsible for deploying vendor patches and monitoring the security of our systems. I personally find Ormandy's decision to disclose the details of this exploit before Microsoft could patch the issue suspect. We are now left in a situation where a security researcher has released exploit code a month or more before the vendor can analyze the details, then develop, test, and release a patch. How does that improve our overall security?
Microsoft, like most other large software vendors, advocates Responsible Disclosure of security flaws by researchers. As you are probably aware, under Responsible Disclosure the details of a vulnerability are kept secret until the vendor can release a patch or an acceptable workaround. Years ago this wasn't considered an option, because many companies sought to hide security flaws in their products. Over the last decade or so, most companies, Microsoft included, have realized that hiding flaws and vulnerabilities is no longer acceptable standard operating procedure.
So why did Ormandy disclose the full details of this vulnerability a mere five days after first reporting it to Microsoft? It's hard to determine his true motives, although he states that Microsoft would have ignored his warnings otherwise. Ormandy also cites Bruce Schneier's essay on Full Disclosure as a reason for moving forward with publication. Schneier does say that full disclosure "...is a damned good idea," but he also goes on to state the following:
So a bunch of software companies, and some security researchers, banded together and invented "responsible disclosure" (See "The Chilling Effect"). The basic idea was that the threat of publishing the vulnerability is almost as good as actually publishing it. A responsible researcher would quietly give the software vendor a head start on patching its software, before releasing the vulnerability to the public.

This was a good idea -- and these days it's normal procedure -- but one that was possible only because full disclosure was the norm. And it remains a good idea only as long as full disclosure is the threat.

Schneier essentially says it is best to use responsible disclosure, with full disclosure held in reserve as a threat if the vendor does not respond to the vulnerability report. As someone who deals with patching issues on a weekly or even daily basis, this is a position I am fully behind. It certainly beats pushing out rushed and partially tested workarounds. Note that the workaround Ormandy posted to the Full Disclosure mailing list does not actually work (again, rushed and untested).
Interestingly enough, Ormandy apparently believes in Responsible Disclosure in some cases, because he did not make public the recent security issues he detected in Adobe Flash.
Hopefully in the future, security researchers can put aside whatever biases they have against certain companies and use some discretion in reporting security vulnerabilities. It certainly helps make our jobs in IT a little easier.