Cardboard Boxes and Modern Web Frameworks

My daughter spent an hour the other day cutting holes, drawing red bricks, and pasting grass along the bottom of a simple cardboard box. It’s a common story among parents of small children: despite spending a lot of money on a gift, the kid ends up enjoying the box and ignoring the toy.

I’ve noticed that many newer web frameworks share a common quality: they provide you with a nice box to build from. Typically:

  • A default structure so you know where to find the models, views, controllers, extra libraries, etc.
  • A command line for generating stubs, running tests, and deploying.
  • A plugin system for easily adding functionality, from swapping out JavaScript libraries to adding a security system.

Though there are probably others, I’m aware of this model being used by Spring Roo, the Play Framework, Griffon, Grails, and Rails. I don’t claim to be a historian on this, but Rails was my first experience with this style and I assume it to be the originator of the approach.

The benefits of this approach are obvious from the first time you start off with a Hello World tutorial. For one, you generally have a single download for the framework. After unpacking, you’re able to use the command line to generate your Hello World controller, find the view in a predefined location, add the Hello World line, and fire up the application from the command line. Passing the 5 minute test is pretty important if you expect developers to give your framework a chance.
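
To make that concrete, here’s a rough sketch of what those first five minutes look like with Grails. The commands are the standard Grails scripts, but the HelloController class is purely illustrative and not taken from any particular tutorial:

// A hypothetical first session with the Grails command line:
//   grails create-app hello        (generates the default project structure)
//   grails create-controller hello (stubs out the controller and its view directory)
//   grails run-app                 (starts the app on the embedded server)
//
// grails-app/controllers/HelloController.groovy
class HelloController {
  def index = {
    render "Hello World"
  }
}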

As a consultant working on numerous legacy code projects, there’s always the groan moment when you start looking into the code and realize it’s not obvious where to find things. I’ve seen libraries sprinkled about at random, key configuration files that were supposed to be moved into the user’s home directory, model classes mixed in with controllers, and a host of other inconsistencies. With a framework that reinforces the default structure, it becomes easy to find things and much easier for plugin authors to write plugins that make the framework more valuable.

Finally, the plugin system is a critical part of these new frameworks. Being able to add a security framework on the fly, swap out your testing framework, or simply add in a nicer date library really starts to make things feel like magic. Indeed, these frameworks are pushing more and more modularity into their plugin systems to allow the framework to evolve better over time.

I hope this trend continues in the world of web frameworks, as I really like a nice box to start a project from.

Mock With Spock

My default rule with mocking is to stick to stubs where possible. I don’t enjoy having to set up and verify interactions with mocks, but sometimes you have code where that’s exactly what you need to do. I’ve used many frameworks in Java over the years, from EasyMock to Mockito, but I was quite happy with how easy it was to do in Spock. I recently found myself having to build a test harness around some legacy code. The real world code was more involved, but it looked something like this:

public void addDefaultQuestions(Category category) {
  if (categoryDao.getCategory(category.getId()).getQuestions().isEmpty()) {
    for (Question question : category.getQuestions()) {
      if (question.isDefaultQuestion()) {
        categoryDao.addQuestion(question, true);
      } else {
        categoryDao.addQuestion(question, false);
      }
    }
  }
}

The method takes a category, checks whether any questions related to that category already exist in the database, and if not saves all of the category’s questions with a valid flag set to true or false. Not unusual in a typical corporate application, but I want to test two things:

  • I can add new questions to the database with the proper valid flag.
  • If the database category already has some questions then do nothing.

After walking through the Spock mocking documentation I had a pretty good sense of the approach. In Spock this is referred to as interactions, but it doesn’t follow the typical expect-run-verify pattern: you just verify what you need to, if you need to. And given a choice, I prefer not to have to verify the mock at all.

With this code I needed to mock the categoryDao, which used straight JDBC and made calls to the real database. That meant I needed a way to verify that the questions were added correctly through calls to the categoryDao, so I needed the power of an actual mock and not just a stub class.

The first test would show that I could save new questions in a category to the database:

def "should only insert new default questions"() {
  given:
  def question1 = new Question(defaultQuestion: true)
  def question2 = new Question(defaultQuestion: false)
  def category = new Category(questions: [question1, question2])

  CategoryDao dao = Mock()
  dao.getCategory(_) >> new Category()

  CategoryService service = new CategoryService()
  service.setCategoryDao(dao)

  when:
  service.addDefaultQuestions(category)

  then:
  1 * dao.addQuestion(_, true)
  1 * dao.addQuestion(_, false)

}

So the steps are:

  1. Setup a Category object with two questions.
  2. Create a mock dao.
  3. Define a method and its default return value on the mock DAO.
    1. We define the argument to getCategory() with the wildcard operator, the underscore, standing in for any id.
    2. Then with the right shift operator (>>) we define that a newly created Category object will be returned.
  4. Inject the mock into the service class we’re testing.
  5. Finally, we make verifications on the addQuestion() method by just stating the number of times we expect the method to be called with a given set of arguments, again reusing the wildcard underscore character.

You can even specify the particular order you expect by breaking the verifications into separate then: blocks. For this example the order wouldn’t matter, but in case it did, the when/then block would change to:

when:
service.addDefaultQuestions(category)

then:
1 * dao.addQuestion(_, true)

then:
1 * dao.addQuestion(_, false)

And to round out testing the legacy Java code, we need to test the negative example, where the method should do nothing if there are already questions in the database for the given category.

 def "should add no questions if questions are already in database"() {
    given:
    def question1 = new Question(defaultQuestion: true)
    def question2 = new Question(defaultQuestion: false)

    CategoryDao dao = Mock()
    dao.getCategory(_) >> new Category(questions: [question1])

    CategoryService service = new CategoryService()
    service.setCategoryDao(dao)

    when:
    service.addDefaultQuestions(new Category(questions: [question2]))

    then:
    * dao.addQuestion(_,_)

  }
So we can test for the negative case by just verifying that addQuestion was called zero times.

Grails Unit Testing: Mocking With MetaClass Stubs

On a recent project I ran into issues with testing controllers in Grails. Starting test first, I spent some early time figuring out how much support there was out of the box for unit testing Grails domain classes and controllers. I set up Spock as a plugin and plunged in. I was dealing with a legacy database where every table had a compound primary key, so many of the findBy-style operations that are well supported for controller testing were of no help. Often I found I needed to mock a call to a find or findAll with HQL syntax, or a criteria call. I knew metaClass mocking could work here, but I wanted to understand better how it would impact the rest of the code once I started replacing methods on the fly. It turns out Grails unit testing has built-in support for cleaning up metaClass hacking after every test with the registerMetaClass() method in GrailsUnitTestCase. Mrhaki has a good post on this. Here’s a Spock example:

import grails.plugin.spock.ControllerSpec


class ContractControllerSpec extends ControllerSpec {

  def setup() {
    registerMetaClass Contract
  }

 def "validateContract() returns true for an existing contract"() {
    given:
    mockDomain(Contract)
    def contract = new Contract(division: '33', unit: '99', contractNumber: 'C7777777').save(flush: true)
    controller.params.division = '33'
    controller.params.unit = '99'
    controller.params.contractNumber = 'C7777777'
    Contract.metaClass.static.find = { String query, Map namedArgs -> contract }

    when:
    controller.validateContract()

    then:
    "TRUE" == controller.response.contentAsString
  }

}

Grails has support for the same idea using mockFor(), but the syntax is much like EasyMock. The same test would look like:

import grails.plugin.spock.ControllerSpec


class ContractControllerSpec extends ControllerSpec {

 def "validateContract() returns true for an existing contract"() {
    given:
    mockDomain(Contract)
    def contract = new Contract(division: '33', unit: '99', contractNumber: 'C7777777').save(flush: true)
    controller.params.division = '33'
    controller.params.unit = '99'
    controller.params.contractNumber = 'C7777777'

    // EasyMock-like syntax
    def mockContract = mockFor(Contract)
    mockContract.demand.static.find(1..5) { String query, Map namedArgs -> contract }

    when:
    controller.validateContract()

    then:
    "TRUE" == controller.response.contentAsString
  }

}

For pure stubs I prefer the syntax of just using metaClass. I often don’t care about validating the exact calls to the dependent class, so I don’t really want a full mock. Note that with mockFor() you don’t need a call to registerMetaClass(), because it does this for you.

Developers and Desktop Databases

Typically development groups fall into one of two patterns:

  • Shared development database – every developer shares a single centralized development database often managed by a DBA group.
  • Desktop databases – every developer has an individual instance of the database.

The practice has evolved over time and developers tend to follow precedent on it. Historically, developers didn’t think of running a local database, primarily because of the resources involved. Installing and running something like Oracle locally used to be a significant drag on a development machine. You were already trying to run some sort of IDE that sucked up most of your machine’s resources, so running the database locally wasn’t a great option. If you were lucky enough to be running a fairly lightweight database you might think about running a local copy, but otherwise you just set up a connection and shared the dev instance.

As machines have gotten quite capable of running databases, IDEs, and even virtual machines all at the same time, the option to run a local copy of the database has become very real. As a result, a number of developers began to install and run local copies of the database. Running your own local copy quickly makes some development issues stand out:

  • You have to learn a bit more about how your target database software is set up. Suddenly it’s not just a set of connection parameters.
  • You have to think about the best way to coordinate changes to the database model. Making a lot of changes locally and applying them later to the central dev integration instance is going to end in a lot of angry emails about how you broke the build.
  • You can quickly experiment with local changes to the database model without impacting the rest of the team.
  • You have a truly complete development environment, so if you take your laptop home at night, work remotely for the day, or get called for troubleshooting in the evening, you can be effective without having to VPN into the dev database or drive into the office.

I think the long term trend is that most developers will run local instances of their target database. This is almost standard in startups and newer organizations. As a consultant I love having a complete development environment, since I can run and demo the project at a moment’s notice. The issues with integration are being addressed with approaches like database migrations in Rails and maintaining the DDL in source control, treating it like an equal member of the code. I’ve also seen partial steps towards this model with the idea of in-memory local databases used for unit/integration testing, where you use an ORM like Hibernate to build the database model on the fly whenever you need to run tests.
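
For the in-memory testing idea, here’s a minimal sketch of what that can look like in a Grails project, assuming the HSQLDB driver that older Grails versions ship with; the database name is just illustrative:

// grails-app/conf/DataSource.groovy -- in-memory database for the test environment.
// GORM (Hibernate underneath) rebuilds the schema from the domain classes on each run.
dataSource {
  pooled = true
  driverClassName = "org.hsqldb.jdbcDriver"
  username = "sa"
  password = ""
}
environments {
  test {
    dataSource {
      // "create-drop" builds the model on the fly and drops it when the run finishes
      dbCreate = "create-drop"
      url = "jdbc:hsqldb:mem:testDb"
    }
  }
}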

Spock Intro Tutorial

I gave a presentation on Spock, a very nice BDD framework in Groovy, a few months back to our Groovy Users Group in Sacramento. After using it on a real world Grails project over the last few months, it has grown on me to become my go-to testing framework for Groovy/Grails or Java projects. A typical specification looks something like this:

 def "a pager should calculate total pages, current page, and offset"() {
  when: "count, rows and page number"
  def pager = new Pager(count, rows, page)

  then: "should return correct total pages, the current page, and the offset"
  pager.totalPages == totalPages
  pager.currentPage == currentPage
  pager.offset == offset

  where: "you have a number of different scenarios"
  count | rows | page | totalPages | currentPage | offset
  100   | 10   |   1  |    10      |     1       |   
  950   | 100  |   5  |    10      |     5       |   400
  72    | 20   |   3  |    4       |     3       |   40
}

If that passed your 5 second test, take a look at a fuller introductory tutorial I put together.

A Gentle Introduction to Spock

And if you want to try executing real code, the project has a nice browser-based environment at Meet Spock.