Quality over Short Term Speed
As a manager I had a proud moment on Friday at our regular standup meeting. Everyone went around the room as usual, and since it was nearing the end of the sprint, the testers pressed on when the code would be ready in QA to start formal testing. The developers responded:
“We’ll be done Wednesday, like we said.”
The scrum master chimed in with:
“Well, could it be done Tuesday afternoon, or how about Wednesday morning?”
Again the developers responded:
“We’re going to be done by the end of the day Wednesday. That still gives you four days to test before the end of the sprint. It won’t be done until late Wednesday because we still have a whole JSF page, backing bean, service, DAO, and unit tests to write. On top of that, RAD is crashing on us all the time.”
The scrum master responded:
“OK, I was just checking to see if we had some wiggle room, sounds like we’ll test starting Thursday morning.”
I’ve been coaching my developers to always keep in mind everything they have to do, including unit testing and refactoring, and not to compromise quality when the push comes to get things done faster. Here was an example of a developer really standing up and asserting that the team wasn’t going to rush it and cut corners.
Intranet Portal Sprint Review #2
Stumbled my way through leading a Sprint review for our Intranet Portal project. I felt a bit unprepared, as I hadn’t even bothered to make nice slides of the 14 product backlog items we completed. (I just read off the 3×5 index cards I pulled off our Sprint board that morning.) To top it off, I hadn’t checked the site the day before, and it turned out the search feature still wasn’t working because we hadn’t set the indexing spider off crawling the site. Whoops, that was part of our Sprint goal.
Still, the review went remarkably well for one primary reason: we finally had the product owner, an editor in our Communications department, drive the actual review. The buy-in you get when your customer actually drives the demonstration is priceless. I hadn’t experienced this on one of our projects before, because I’ve never gotten to run a Scrum project officially. The business owners have continued to largely follow our broken waterfall methodology, and we’ve just run the project as a Scrum project sans a participating product owner.
On top of that, the product owner had only worked with Websphere’s Content Management system for a few days, and suffice it to say it is quirky and a poor example of software usability. The fact that he did a fine job of adapting to it gives us more faith that we’ll have some luck rolling it out to 30 or so content authors across the company, given a healthy dose of training and support.
The other nice side effect is that the team is gelling a lot better now than when we started two months ago. An earlier attempt at getting the intranet relaunched ended with the customers and the developers at almost polar opposites, truly distrusting each other’s motives. In two short months we’ve been able to repair things and we’re collaborating again, not fighting pitched battles over how many colors to use on each page.
Timeline for Websphere Portal Server 7.0
Websphere Portal Server is our current technology boat anchor. We’re stuck on Websphere Portal Server 5.1 until there’s a new release. Today we got the real picture on the release schedule from an IBM Portal Architect:
- Portal Server 6.0 late summer
- Portal Server 6.0.1.x maybe November 2006 (the stable one he recommended waiting for)
- Portal Server 7.0 early 2007
So possibly this time next year we can finally move up to Java 1.5 after waiting about 2.5 years.
UDDI is Dead?
In a meeting today with a number of IBM Websphere product architects, they didn’t quite come to the conclusion that UDDI is dead, but they did say that while UDDI remains an important spec, it’s just not meeting the needs of people trying to roll out SOA architectures.
IBM’s thinking here assumes that everyone is coming to the same conclusion, that UDDI just doesn’t do enough, so they’re building a more functional replacement. That replacement is called Websphere Service Registry and Repository Capability at this point. I’m not sure whether there are more WS-* specs or Websphere-branded products in the world.
At this point they didn’t have a whole lot of details about their product in particular, but they were willing to speculate on where UDDI is going. Their idea is that a few proprietary approaches will be developed by vendors, and then, after some experimentation, a spec process will happen and everyone can essentially agree on the UDDI replacement, or UDDI 2.0.
They didn’t really seem too far out in left field on this, as our company architect has been telling me for a while now that UDDI is pretty much dead on arrival. In addition, at a Birds of a Feather session on SOA at SD West 2006, I asked how many of the 50 participants were using UDDI. One brave soul from the State of Oregon raised his hand. He said they did have a UDDI registry set up and a few web services registered with it. He then explained that everyone using their web services had hard-coded the endpoints without using the UDDI registry at all.
Test Driven Development Doesn’t Mean Test First?
In a post to the errata for Agile Web Development With Rails, the commenter notes that the book barely touches on traditional TDD, where you actually write the test first:
#2327: The author enters several assertions before ever trying to run the test. As this section is on TDD, it might be better to get the test to pass at the assertion of the flash. Then add the next assertion. The reader would get a better feel of the flow of tdd (write a little bit of test, see it fail, write a bit of code to make it pass, see the test pass, refactor, repeat).
Dave’s follow up is enlightening:
(Dave says: I believe this is a confusion between Test First Development and Test Driven Development. TDD doesn’t require tests to be written first)
So again the argument crops up: what is Test Driven Development, really? My experience points to Kent Beck’s explanation:
Test-Driven Development (TDD) is the powerful combination of two techniques: test-first programming, in which the programmer writes automated tests in advance of implementation, and incremental design, in which the programmer continually improves the design of the software to match the actual requirements.
Or Scott Ambler’s:
Test-driven development (TDD) (Beck 2003; Astels 2003), is an evolutionary approach to development which combines test-first development where you write a test before you write just enough production code to fulfill that test and refactoring.
Or Wikipedia:
Test-Driven Development (TDD) is a computer programming technique that involves writing test cases first and then implementing the code necessary to pass the tests.
Or Bob Martin:
The steps:
- Write a test that specifies a tiny bit of functionality
- Ensure the test fails (you haven’t built the functionality yet!)
- Write only the code necessary to make the test pass
- Refactor the code, ensuring that it has the simplest design possible for the functionality built to date
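To make the red/green/refactor loop concrete, here’s a minimal sketch in Java with JUnit. The ShoppingCart class and its methods are made up purely for illustration, not taken from any of the sources quoted above: you write the test first, watch it fail (it won’t even compile until the class exists), then write just enough code to turn it green.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Step 1: write a test that specifies a tiny bit of functionality.
// ShoppingCart is hypothetical and doesn't exist yet when the test is
// written, so the test fails -- that's the "ensure the test fails" step.
public class ShoppingCartTest {

    @Test
    public void totalOfEmptyCartIsZero() {
        assertEquals(0, new ShoppingCart().total());
    }

    @Test
    public void totalIsTheSumOfItemPrices() {
        ShoppingCart cart = new ShoppingCart();
        cart.add(150); // prices in cents to keep the example simple
        cart.add(250);
        assertEquals(400, cart.total());
    }
}

// Step 3: write only the code necessary to make the tests pass.
// Step 4 would then refactor toward the simplest design for what's built so far.
class ShoppingCart {

    private int totalInCents = 0;

    void add(int priceInCents) {
        totalInCents += priceInCents;
    }

    int total() {
        return totalInCents;
    }
}
```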
I may have missed a memo somewhere, but TDD doesn’t compromise on test first. I think the only confusion is that the name Test Driven Development isn’t as clear as something like Test First Development or Test First Design. Of course, we seem to be headed towards approaches like Dave Astels’s Behavior Driven Development to help alleviate confusion like this.