Sprint Task Buckets

I really hate bucket tasks, but I haven’t found a satisfying way to avoid them. On every project we tend to have things like:

  • Defect Resolution: 16 hrs
  • Update Use Cases: 8 hrs
  • Update Rules Doc: 6 hrs

My best idea is to leave them off the plan and either cover them with the slack typically left in a project or discover them when we actually have to do the work. So far the teams have been more comfortable with the bucket approach. And at the end of the Sprint the bucket time mysteriously dries up.
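As a rough illustration of what the buckets cost, they carve a fixed slice out of Sprint capacity before any feature work is planned. The capacity figure below is hypothetical; the bucket hours are the ones listed above:

```java
public class BucketMath {
    public static void main(String[] args) {
        int capacityHrs = 240;     // hypothetical: 3 devs * 10 days * 8 hrs
        int buckets = 16 + 8 + 6;  // Defect Resolution + Update Use Cases + Update Rules Doc
        System.out.println("Bucket hours: " + buckets);
        System.out.println("Left for feature tasks: " + (capacityHrs - buckets));
    }
}
```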

One Small Acceptance Test

A small milestone today. A developer deployed our first ever FitNesse acceptance test on a real project. I sat down with him at his desk and clicked the ‘Test’ button. Soon after, I had four green bars. This was a fairly straightforward edit, just checking that a city field was contained in a certain list of acceptable cities. I added two cases: one for a city spelled out in lower case and one for a partial city name. Save the page, run the tests again: six green bars.

Should make the Sprint Review a lot more visually exciting.
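The validation that acceptance test exercises can be sketched in plain Java. This is a hypothetical reconstruction, not the project’s actual fixture: the city names and the case-insensitive matching rule are invented for the illustration.

```java
import java.util.Locale;
import java.util.Set;

public class CityValidator {
    // Hypothetical accepted-city list; the real project's list isn't in the post.
    private static final Set<String> ACCEPTED =
            Set.of("springfield", "shelbyville", "capital city");

    public static boolean isAcceptable(String city) {
        return city != null && ACCEPTED.contains(city.trim().toLowerCase(Locale.ROOT));
    }

    public static void main(String[] args) {
        System.out.println(isAcceptable("springfield")); // lower-case spelling: accepted
        System.out.println(isAcceptable("Spring"));      // partial name: rejected
    }
}
```

In FitNesse these two cases would each be a row in the test table, which is why adding them took the page from four green bars to six.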

From Story Points to Ideal Days

Dave Churchville recently expounded on ideal days versus story points:

Personally, I tend to prefer the ideal time units, since it’s easier to explain to customers, but I have heard reports that point systems have had good results as well after the initial confusion.

My problem with story points is the teams have never gotten over that initial confusion. I’ve used story points mostly since attending a convincing talk on ideal days versus story points by Mike Cohn at SD West 2005. We’ve done many ‘planning poker’ sessions with cards and story points. I think some developers enjoyed the game aspect a bit, but our estimates haven’t been that great.

Story points should work well if everyone on the team really gets the concept. The problem is that, especially early on, most of the backlog items you’re estimating are really ballpark guesses, since there is no historical information and generally only a high-level description of each item. So you end up with widely varying estimates based on unknowns, not much relative comparison. We write use cases, so it goes something like this:

Backlog Item #1: Maintain Batch Report Use Case.

Team: Umm, how about 5 story points.

Backlog Item #2: Convert all the legacy and historical data from the old schema to the new schema.

Team: Hmm, that sounds hard; how about 20 story points.

Backlog Item #3: E-sign Documents Use Case.

Team: We’ve never done that before. After some discussion, since everyone still feels it’s risky, maybe 100 story points.

Facilitator: OK, do we really think E-signatures are about as much work as doing 20 reports?
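The facilitator’s sanity check is just the relative arithmetic story points promise: a 100-point item should be about twenty times the work of a 5-point item.

```java
public class RelativeSize {
    public static void main(String[] args) {
        int reportPoints = 5;  // Maintain Batch Report Use Case
        int esignPoints = 100; // E-sign Documents Use Case
        // If points are truly relative, this ratio should match the work ratio.
        System.out.println("E-sign ~ " + (esignPoints / reportPoints) + " reports");
    }
}
```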

When we deal with ideal days these discussions tend toward better estimates, because everyone is familiar with time units. I still think story points are a great idea in theory, but I’ve fallen back to using ideal days because they haven’t played out as well in practice.

Unit Testing Service Component Architectures

Nice to see someone at IBM has finally taken a look at how you might test their wonderful SCA modules.

Turns out you have to go through many steps in WebSphere Integration Developer (WID), and it appears to rely on Cactus, which tends to be a real pain. Getting the whole thing set up involves several XML configuration files and plenty of clickety-click development. And they aren’t really coming from a test-first perspective, since you always lay out the process first using their visual modeler. I personally despise this whole style of development, but being able to force it into something of a test harness does make it a bit more palatable.

Getting to a green bar with an SCA module:

  • 47 steps, including:

    1. To import these modules into your workspace using WebSphere Integration Developer, select File => Import.
    2. In the Import dialogue, select Project Interchange, and then Next.
    3. Select the Arguments pane. Under VM arguments enter the following code (Figure 12):

       -Dcactus.contextURL=http://localhost:9080/MT_TestMailServiceJUnitWeb

       Note that this specifies localhost and port 9080; if your server or port for HTTP requests is different, you will need to adjust this string to match your choices.
    4. Select Run to initiate the test.

Getting to a green bar testing a POJO:

  • 4 steps.

    1. Write a failing test.
    2. Implement a class and method to fix compiler errors.
    3. Implement just enough code to pass the test.
    4. Run the test for a green bar.
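The four POJO steps can be sketched in plain Java with no container at all. This is a generic illustration rather than anything from the project; `Greeter` and `greet()` are hypothetical names, and a bare assertion stands in for a JUnit test:

```java
// Step 1: write a failing test (a plain-Java stand-in for a JUnit test).
public class GreeterTest {
    public static void main(String[] args) {
        // Step 2: Greeter and greet() were created just to fix the compile errors.
        Greeter greeter = new Greeter();
        // Step 3: greet() contains just enough code to pass this check.
        if (!"Hello, Pat".equals(greeter.greet("Pat"))) {
            throw new AssertionError("red bar");
        }
        // Step 4: run the test for a green bar.
        System.out.println("green bar");
    }
}

class Greeter {
    String greet(String name) {
        return "Hello, " + name; // simplest thing that passes the test
    }
}
```

No XML configuration, no server, no visual modeler: compile, run, green bar.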

Everyone Writing Tasks

I changed up the usual way I run the detailed part of our Sprint planning meeting by having everyone come up with tasks. For almost all prior meetings the routine was:

  • Stick selected backlog items for the Sprint up on the wall on giant stickies.
  • One by one lead the discussion on what tasks need to be done for each item.
  • Write each task down, come up with an hour estimate, and generally let someone claim the work or assign it to the team.
  • Repeat until finished with all backlog items.

Today at the suggestion of some of my developers on another project, I changed things up a bit:

  • Hand out large 5×7 stickies and pens to everyone on the team.
  • Stick the backlog items on the wall.
  • People wrote down their tasks, walked up to the wall, and pasted them under the backlog items.
  • Then I facilitated a walk through each backlog item, asking if we needed any more tasks and adding hour estimates to tasks that didn’t have them. In some cases team members wrote down new tasks as they thought of them throughout the discussion.
  • Repeat until finished with all backlog items.

Overall the experience was better, we finished a little quicker, and the tasks carried the names the people doing the work had chosen for them.

My hesitation to try this approach had come from one early Sprint on a project. I had handed out stickies to start the process, but almost everyone was new to Agile and they hardly wrote any tasks. To pull the tasks out at all, I had to stand up and lead the discussion, writing down each of the tasks myself. As I got adjusted to working this way, it simply became a habit.

It never hurts to reexamine and test your assumptions.