It’s now been over three months since I blogged about the status of the various Java SOAP stacks and the bug counts. When I first posted it, many people considered it a “wake up call” to the projects and were wondering how the projects would respond.
A lot has happened in the last three months, so I wanted to re-examine the state of each project.
The CXF team had a busy three months, mostly fixing bugs reported directly by users. They released 2.1.8 and 2.2.5 in November and 2.1.9 and 2.2.6 in January; between them, over 150 JIRA issues were closed, a mix of bug fixes, new features, and other improvements. Two books targeting CXF users were also published, providing additional documentation and tutorial material.
The Axis folks also had an exciting quarter. They released the Axis2 1.5.1 patch release, which fixed around 15 bugs, one of them a critical issue that was causing a lot of problems. Their most exciting news, however, was being promoted to a Top Level Project at Apache. Probably long overdue, but it’s nice to see them out of the “Web Services” umbrella and on more equal footing with the other projects at Apache. The process of moving everything to the new TLP areas is still ongoing, but a major “Congrats!” goes out to them for finally getting that done!
Obviously the big news for Sun is that the acquisition by Oracle is now complete. Time will tell whether that ends up a good thing or a bad thing for the Metro folks. In any case, they released their new 2.0 version in the last few months, which includes a number of new features such as JAX-WS 2.2 support.
As you can see, it’s been quite a busy three months for all three of the projects. However, I’m still interested in looking at how many “known bugs” exist in the projects. As of today (Feb 12, 2010):
So, of the three projects, only CXF reduced its bug count. That said, I certainly cannot fault the Metro folks. I would definitely expect an influx of bugs after releasing any “#.0” version, and I’m sure the uncertainty of the Oracle/Sun situation put a damper on things for quite a while.
The last table in my previous post showed what percent of bugs logged recently had been fixed. If you look at just the bugs that have been logged since my previous post on Nov 5, you see:
| Project | Bugs logged since Nov 5 |
| ------- | ----------------------- |
|         | 10 (90% resolved)       |
|         | 29 (59% resolved)       |
|         | 46 (31% resolved)       |
Last time, people complained about using raw bug counts. I agree that raw numbers are usually not meaningful on their own: projects with more users are more likely to have users that encounter (and report) bugs, the complexity of the projects differs, the feature sets differ, and how each tracking system is managed differs. However, it is the trends that the raw numbers expose that I think are important. Bug counts going down or staying steady is good. Projects fixing a majority of reported bugs in a timely manner is good. Projects providing new releases that fix reported bugs is good.
Of course, bug counts and releases are only a couple of the aspects that matter when selecting a project. Features are important (after all, if it doesn’t have a feature you need, it doesn’t matter how bug-free it is). Performance is important. Documentation is important. Ease of use is important. A friendly and responsive community is important. I’m sure people weigh lots of other factors when evaluating projects. I’d love to see CXF “on top” in all those categories, although I have to admit, I’m pretty bad at writing documentation.