Exposed? : Examining Secunia Unpatched Warnings – Part 2

This is Part 2 of my look at the perceptions and realities concerning disclosed, but unpatched vulnerability trends between Windows and Linux. You may want to read Part 1 first.  

UPDATE: Oh, and Part 3 with results will be posted on Friday.  I followed some comments on OSNews.com and noticed that folks seemed to think Part 2 was the final segment and that results had not been posted.  Sorry if I was not clear – the parts are fairly long, so I broke it into: part1/setup, part2/method, and part3/results.

Before I dig into methodologies, I’d like to thank Secunia for reaching out to me and for looking for ways to strengthen the “Please Note” section on each product’s statistics page. They had already said

“Please Note: The statistics provided should not be used to compare the overall security of products against one another.”

However, based upon our discussions concerning unpatched issues specifically, they’ve further clarified and strengthened their caveats. You can see the new wording here: http://secunia.com/product/1/?task=statistics, which includes:

“Please Note: The statistics provided should not be used to compare the overall security of products against one another.

It should also be noted that some operating systems bundle together a large number of software packages, and are therefore affected by vulnerabilities, which do not affect other operating systems that don’t bundle together a similar amount of software packages.
Additionally, the number of unpatched vulnerabilities for a product may be affected by the fact that, for certain products that consist mostly or solely of third party software (such as Linux distributions), Secunia tracks the number of issues fixed by the product vendor and not the issues reported in the third party software.  …”

I think the new caveat is much clearer, and let’s face it, mapping disclosures to Linux distros is not an easy task, especially given how many products Secunia tracks.

Tracking Disclosed But Unpatched is Hard for Linux Distros

Let’s take one example, CVE-2006-1342, to illustrate the challenge faced by Secunia and others who track vulnerabilities.  CVE-2006-1342 is an IPv4 vulnerability that I chose because it affected the Linux kernel 2.4.x.  Digging around, I find that Secunia issued advisory SA19357 on 3-23-2006 concerning this vulnerability, listed as affecting Linux Kernel 2.4.x and Linux Kernel 2.6.x.  (Good job!)  What about the distros?

Did it affect RHEL4, which is *based upon* kernel 2.6?  Hard to say, because Red Hat customizes their version.  What about SUSE 10? Ubuntu?  RHEL 2.1?  RHEL3?  Red Flag Linux?   I can say they are probably all affected, but I can’t say it definitively until the vendor acknowledges it – usually with a fix.  I do know that Red Hat has fixed this issue on RHEL2.1, but that they have not issued a fix on RHEL4.  Does that mean RHEL4 is impacted but doesn’t have a patch yet, or does it mean RHEL4 is not affected?  Probably Mark Cox knows, but I do not.
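To make the bookkeeping concrete, here is a minimal Python sketch of the state a tracker has to carry for each CVE on each distro. This is purely my own illustration – the Status categories and this snapshot of CVE-2006-1342 reflect my reading above, not Secunia’s or Red Hat’s actual data model:

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    FIXED = "fixed by vendor"           # vendor shipped a patch (implicit acknowledgement)
    UNPATCHED = "acknowledged, no fix"  # vendor confirmed the issue, fix still pending
    UNKNOWN = "no vendor statement"     # probably affected, but unconfirmed either way

@dataclass
class DistroExposure:
    cve: str
    status: dict = field(default_factory=dict)  # distro name -> Status

# Snapshot of CVE-2006-1342 as described above: RHEL2.1 has a fix,
# while RHEL4 has no vendor statement either way.
kernel_bug = DistroExposure("CVE-2006-1342", {
    "RHEL2.1": Status.FIXED,
    "RHEL4":   Status.UNKNOWN,  # unpatched, or simply not affected? Can't tell.
    "SUSE 10": Status.UNKNOWN,
    "Ubuntu":  Status.UNKNOWN,
})

# Only the FIXED and UNPATCHED buckets can be counted with confidence;
# every UNKNOWN entry is a judgment call for the tracker.
unconfirmed = [d for d, s in kernel_bug.status.items() if s is Status.UNKNOWN]
print(f"{kernel_bug.cve}: no vendor confirmation yet for {', '.join(unconfirmed)}")
```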

So, multiply that example by hundreds of issues in hundreds of distribution components, and it is pretty easy to see how hard it would be to keep an accurate list of disclosed, but unpatched, issues for Linux distros.  I certainly can’t hold it against Secunia for not being able to validate every issue on every distribution – it would be an insane amount of work.

On the other hand, the job isn’t nearly as hard for, say, Windows XP.  If someone discloses an issue affecting XP, it’s pretty easy to track that, and so, naturally, Secunia does track that for their customers.

So, to be clear, it seems to me that Secunia is making a reasonable best effort at tracking accurate Unpatched data, which you may recall was the standard I set in part 1.  I’m also comfortable saying that the Unpatched data is not tracked on a consistent basis between Windows and the Linux distros, though I understand why that is the case.  That said, the lack of consistency (understandable though it may be) makes it even more important to explore what an accurate and consistent comparison would look like.

Having acknowledged that it is a monumental task to get an accurate and consistent comparison today, I still think we can do more to shed light on an accurate comparison over past periods of time.

A Method for Tracking Unpatched Vulnerabilities

It turns out that I’ve been thinking about these issues for a while, and I wrote a paper called “How Exposed is My Software? A Method for Reasonable Estimation of Unpatched, Publicly Disclosed Vulnerabilities in Software”, IEEE Security & Privacy, to be published in 2007.  The paper leverages the full set of vulnerability and patch data available for Windows 2000 over a four-year period to propose and examine a methodology for charting the number of disclosed, but unpatched, issues in a particular software product.

When it is published in the spring, I will update this post with a link to the full paper.

Without giving away the full details, I’ll describe the methodology at a high level here as I intend to apply it to Windows XP and Red Hat Enterprise Linux 4 WS in part 3 of this series (a small code sketch of all three steps follows the list):

1. First, one must build a comprehensive list of fixed issues as released/validated by the vendor, and for each issue, track down when the issue was first disclosed to the public.  This is a big job, but feasible, given the vendor’s list of fixes.  For Red Hat, it is even easier, as Mark Cox keeps some of this data handy for you at http://people.redhat.com/mjc/metrics.html.

2. Next, one can examine the disclosure and fix data over past periods to develop “time to fix” statistics for the product.  Using statistical methods, this data can be used to chart histograms and cumulative distribution functions (CDFs) to establish how far back in the past one must look to get results with 95%+ accuracy.

3. Finally, one can look at any given day and use the data gathered in step 1 to identify vulnerabilities that were a) publicly disclosed prior to that day, and b) not yet fixed on that day.  Do this for each day over a period (e.g. a year) and one can produce a graph that charts daily exposure to disclosed, but unpatched, vulnerabilities.  The statistical results of step 2 then identify where the chart’s accuracy ends, establishing an accurate period of time for the charts.
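To make those three steps concrete, here is a minimal Python sketch.  The records, IDs, and dates below are invented placeholders (and the percentile math is deliberately crude) – the real work in step 1 is assembling this list from vendor advisories:

```python
from datetime import date, timedelta

# Hypothetical input: one record per vendor-fixed issue, with the date the
# issue was first publicly disclosed and the date the vendor shipped a fix.
# (Step 1 of the method is assembling this list; these IDs and dates are
# invented placeholders, not real data.)
fixes = [
    {"id": "CVE-XXXX-0001", "disclosed": date(2005, 1, 10), "fixed": date(2005, 2, 1)},
    {"id": "CVE-XXXX-0002", "disclosed": date(2005, 3, 5),  "fixed": date(2005, 3, 5)},
    {"id": "CVE-XXXX-0003", "disclosed": date(2005, 6, 20), "fixed": date(2005, 9, 1)},
]

# Step 2: "time to fix" distribution.  A crude empirical 95th percentile tells
# you how far back from the end of the data the daily counts can be trusted.
days_to_fix = sorted((f["fixed"] - f["disclosed"]).days for f in fixes)
p95 = days_to_fix[int(0.95 * (len(days_to_fix) - 1))]

# Step 3: for each day, count issues already disclosed but not yet fixed.
def exposure_on(day):
    return sum(1 for f in fixes if f["disclosed"] <= day < f["fixed"])

start, end = date(2005, 1, 1), date(2005, 12, 31)
series = [(start + timedelta(n), exposure_on(start + timedelta(n)))
          for n in range((end - start).days + 1)]

print(f"95% of issues fixed within {p95} days of disclosure")
print(f"Peak daily exposure: {max(count for _, count in series)} unpatched issue(s)")
```

The counting rule in step 3 is the crux: an issue contributes to a day’s exposure only if it was disclosed on or before that day and its fix shipped afterward, mirroring criteria a) and b) above.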

What Do You Think The Numbers Will Be?

I will honestly tell you I don’t know exactly what the charts will look like yet, as I have not done the work for Red Hat and Windows XP.  As I said above, I used Windows 2000 data for my methodology development and paper, so let’s ask a few questions and have some fun guessing:

  • Do you think either product will have days without any exposure?  (zero disclosed, but unpatched)
  • If yes (to the above), how many days of no exposure do you think they might have?
  • What do you think the highest exposure number will be?  (most disclosed, but unpatched on any one day)

I very much look forward to sharing the results with you later in the week!

Best regards – Jeff
