6 Months with Vim


College Roommate: So you know vi?
Me: Of course, that would be command shift-Z-Z

This is a conversation I had about twenty years ago. For most of my Unix editing I got by with pico, and I found the modal nature of vi downright frustrating. Hence the importance of knowing the command to escape from modal hell.

At my current company vim was the default text editor; I even paired in vim during the interview. Over the years I’ve bounced between text editors and IDEs depending on the languages and environments, but one constant was that when I reached for a text editor it was always one with a GUI front end like BBEdit or TextMate. Despite working in vim most of the day I still default to MarsEdit for writing blog posts.

I jumped into vim with some lingering fear, remembering my dislike of its modal nature and the lack of any mouse support. I tried out VimGolf and spent some time trying to memorize basic navigation and selection commands. It came pretty slowly at first. I bogged down my pair quite a bit fumbling around with arrow keys or splitting the screen the wrong way. About two months in I had developed some basic proficiency, but I found myself wondering whether RubyMine or TextMate would be a better option.

At about the six-month mark I looked back and realized I was finally comfortable in vim; I found myself defaulting to it for most of my text editing even when I wasn’t working on code. I still have a few cases where I prefer a visual editor or IDE for tasks like:

  • blogging
  • email
  • exploring large projects
  • larger refactorings

Looking back, there were several things that accelerated my vim mastery. One was our switch to sharing a single base .vimrc, so that when you paired with anyone you had the same editor configuration. It also included niceties like macros for commenting, formatting, auto-completion, and running RSpec inline. After setting up that .vimrc I started to see how powerful a tool vim could be.

Another big step was learning to love ‘hjkl’ over the arrow keys. It took about a week of painful habit-breaking, but the speed difference was worth it. The mouse was already gone, so I might as well toss the arrow keys too. (I found the hack for disabling them on Stack Overflow.)

A critical change that I arrived at fairly late was remapping caps lock. Normally you rarely touch caps lock, but it sits in a very convenient position, especially compared to the escape key. I remapped mine so that tapping it sends escape and holding it down sends control, two of the most heavily used keys in vim. That made my fingers very happy.

Finally, the biggest accelerator of my learning was pairing with a number of developers who had more vim expertise than I did. Since the team had standardized on vim before I arrived, they were all far past me on the learning curve and could quickly show me how to do something or which keystroke to use as I fumbled along. I greatly appreciated their patience and willingness to let me bang something out that they could have typed in half the time.

I still don’t feel anything like a vim expert, but I do feel comfortable in a modal editor. Never thought I would be typing those words all these years later.

Jenkins: My Personal Bodyguard

I’ve been running Jenkins as my CI solution for years now, but for the first six months at my new job I used it in an entirely new way. The typical CI pattern is to run against check-ins to your master branch, executing all the unit tests and, depending on how sophisticated you are, integration tests, static analysis, or even performance tests.

In my current shop we had a standard CI build using CC.rb against check-ins to our master branch. It ran 5000+ specs sequentially, and if everything was green master was available for production pushes. That fulfills the standard case for CI just fine, but I quickly found it a little light for my needs. The issue was that we relied on feature branches for rolling out all of our stories, and while those stories were being worked there was no CI job at all. Another pain point: even with parallel specs and a nice 4-8 core processor, the full test run took 5-10 minutes, so it didn’t get run nearly as often as one might like.

Jenkins had been my default choice for CI in Javaland, but I was genuinely open to switching it out for a more Ruby-style CI server. I figured the first place to look was our already-running CC.rb implementation. I had tried CC.rb long ago and remembered liking it a lot better than the original CruiseControl, and with years of development since then I figured it had evolved. It turns out CC.rb is still a pretty basic CI solution and hasn’t had much love over the years: it runs a build, shows the console output, and provides a very basic web console. Coming from Jenkins, the lack of features and plugins was pretty shocking.

Given the Ruby community’s love for XP practices, I assumed there must be a better option available. I looked at numerous CI servers for Ruby, including CI Joe, Integrity, and Travis CI. All of them follow the same lines as CC.rb: a very simple CI server. Travis CI offered a bit more as a hosted open source solution, but it is completely tied to hosting your project on GitHub. None of these projects showed the depth of a solution like Jenkins.

So the solution was to run Jenkins as my own personal bodyguard. We were looking at swapping out the CC.rb implementation, but it wasn’t an immediate priority, so I dove in and set up Jenkins to run a build on my laptop. Instead of just building master, I added whatever git feature branch I was working on and hooked up an audible trigger to play James Brown’s “I feel good!” for successes and “Houston, I think we have a problem” for failures. I expected to use the Rake plugin to launch the build, but with rvm and bundler in the mix it turned out to be much easier to simply execute a script. My final build scripts looked something like the following:

bash -l -c "rvm use 1.8.7-p330@acme"
bash -l -c "bundle install"
bash -l -c "rake db:migrate"
bash -l -c "rake parallel:prepare"
bash -l -c "rake parallel:spec"

After finishing up a bit of functionality, I’d commit, and about 10 minutes later my laptop would exclaim “I feel good!”, or occasionally “Houston, I think we have a problem” to let me know I’d missed something. I had my own personal bodyguard build, and I could happily let it run in the background.
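
If you want to wire up something similar, a small wrapper around the spec run can handle the audio cues. Here is a rough sketch of the idea, not the actual Jenkins setup described above; it assumes macOS’s afplay and locally saved clips with the placeholder names shown.

#!/usr/bin/env ruby
# Hypothetical wrapper: run the parallel specs, then play a clip based on the result.
# The clip filenames are placeholders; afplay ships with OS X.
passed = system('bash -l -c "rake parallel:spec"')
clip   = passed ? "i_feel_good.mp3" : "houston_problem.mp3"
system("afplay", clip)
exit(passed ? 0 : 1)   # preserve the exit status so Jenkins still marks the build red or green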

This year we replaced CC.rb with Jenkins and dropped feature branches, so I’ve retired my laptop Jenkins, but I recommend the practice if you find yourself in a similar spot.

Stubbing Partials with RSpec in Rails View Tests

Working with my pair yesterday, we ran into an issue testing a view that pulled in several partials. In the interest of making progress we punted after about half an hour of trying to set up expectations on the partials and just tested the negative case, where we didn’t have to worry about the partials being called. Out of intellectual curiosity, and not wanting to leave the view lightly tested, I dove into how we could effectively test that the partials were called as expected.

Say you have a view like the following:

<div id="content">
  <div id="page_title" style="padding-bottom: 20px;">
    <h2>Books</h2>
  </div>
  <%= render :partial => 'tabs' %>

  <% if @books.empty? -%>
    <em>No Books</em>
  <% else %>
    <%= pagination_links = render :partial => 'shared/pagination', :object => @books %>
    <%= render :partial => "book_rows", :locals => {:books => @books} %>
    <%= pagination_links %>
  <% end %>
</div>

You want to test the conditional to verify that the partials displaying the list of books are rendered correctly. You don’t want to worry about concerns like fully setting up the books collection, and you might even want to stub out the call to the partial that displays the tabs at the top of the page.

The RSpec example group for this looks like the following:

require 'spec_helper'

describe "books.rhtml" do
  before(:each) do
    assigns[:books] = []
  end

  describe "with no books" do
    it "should display 'No Books' message" do
      render
      response.should have_text(/No Books/)
    end
  end

  describe "with books" do
    before(:each) do
      assigns[:books] = [Factory.build(:book)]
      template.stub(:render).and_call_original
      template.stub(:render).with(hash_including(:partial => "tabs"))
      template.stub(:render).with(hash_including(:partial => "shared/pagination"))
    end

    it "should render book rows" do
      template.should_receive(:render).with(hash_including(:partial => "book_rows"))
      render
    end
  end
end

The key find was using with(hash_including(...)), which I discovered via a helpful answer on Stack Overflow.

Cardboard Boxes and Modern Web Frameworks

My daughter spent an hour the other day cutting holes, drawing red bricks, and pasting grass along the bottom of a simple cardboard box. It’s a common story among parents, especially of small children: despite a lot of money spent on a gift, the kid ends up enjoying the box and ignoring the toy.

I’ve noticed a common quality of many newer web frameworks is that they provide you with a nice box to build from. Typically:

  • A default structure so you know where to find the models, views, controllers, extra libraries, etc.
  • A command line for generating stubs, running tests, and deploying.
  • A plugin system for easily adding functionality, from swapping out JavaScript libraries to adding a security system.

Though there are probably others, I’m aware of this model being used by Spring Roo, the Play Framework, Gryphon, Grails, and Rails. I don’t claim to be a historian on this, but Rails was my first experience with the style and I assume it was the originator of the approach.

The benefits of this approach are obvious the first time you start with a Hello World tutorial. For one, you generally have a single download for the framework. After unpacking, you’re able to use the command line to generate your Hello World controller, find the view in a predefined location, add the Hello World line, and fire up the application from the command line. Passing the five-minute test is pretty important if you expect developers to give your framework a chance.

As a consultant I worked on numerous legacy code projects, and there’s always a groan moment when you start looking into the code and realize it’s not obvious where to find things. I’ve seen libraries sprinkled about at random, key configuration files that were supposed to be moved into the user’s home directory, model classes mixed in with controllers, and a host of other inconsistencies. With a framework that reinforces the default structures, it becomes easy to find things, and much easier for plugin authors to write plugins that make the framework more valuable.

Finally, the plugin system is a critical part of these new frameworks. Being able to add a security framework on the fly, swap out your testing framework, or simply pull in a nicer date library really starts to make things feel like magic. Indeed, these frameworks are pushing more and more modularity into their plugin systems to let the framework evolve over time.

I hope this trend continues in the world of web frameworks, as I really like a nice box to start a project from.

DSLs In Action Review

As a software manager and developer I’ve followed the gradual adoption of DSLs as a mainstream technique. I’ve worked with numerous DSLs including:

  • Rails
  • RSpec
  • EasyMock, Mockito
  • Selenium
  • Grails, GORM
  • Rake
  • Ant (as ugly as an XML DSL is)

RSpec was a wonder at the time compared to JUnit 3, where I spent the bulk of my testing time. I loved the ‘should’ syntactic sugar and the clean English descriptions of the specs. Expressive tests are critical to maintaining a large codebase, and they practically require DSL implementations. Rails pushed the idea into a full-fledged web framework and made ugly implementations like JSF in Java land look like poor cousins. DSLs also helped drive the adoption of newer dynamic languages like Ruby and Groovy, because method chaining in a language like Java doesn’t make for a nice DSL.
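
For reference, here is a quick sketch of that ‘should’ style with a made-up ShoppingCart class; it is my own example, not one from the book.

# A made-up spec illustrating the RSpec 'should' syntactic sugar mentioned above.
# ShoppingCart is a hypothetical class with total and empty? methods.
describe ShoppingCart do
  it "should total the item prices" do
    cart = ShoppingCart.new([10, 20, 5])
    cart.total.should == 35
    cart.should_not be_empty
  end
end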

It has taken a while for the literature to catch up with the use of DSLs among developers; indeed, Martin Fowler has been working on his DSL book for 4-5 years. The great majority of DSL use comes through frameworks and libraries, so currently very few engineers write their own DSLs. DSLs in Action by Debasish Ghosh is an attempt to show some options for taking on DSL development on your next project.

The book defines two types of DSLs:

  • Internal DSLs – Rails, Grails
  • External DSLs – generated with tools like ANTLR

The book covers both types, with a greater focus on the embedded internal DSLs the reader is probably more familiar with. If it had focused more on external DSLs I think the audience would have been more limited, as most developers aren’t ready to jump into the complexity of writing their own external DSLs, or haven’t had time to look at a DSL workbench like JetBrains Meta-Programming System. I loved that the diagrams were hand-drawn in a whiteboard, UML-lite style, as that’s a trend I strongly encourage in technical books.

The code samples do a good job of illustrating the various strengths of different languages you can use for implementing DSLs. The example domain is a Wall Street trading firm. The Java example looks like:

Order o =
  new Order.Builder()
      .buy(100, "IBM")
      .atLimitPrice(300)
      .allOrNone()
      .valueAs(new OrderValuerImpl())
      .build();

Typical method chaining, and a bit verbose. Not exactly the code you hope to show to a technically savvy business user. The Groovy version is next:

newOrder.to.buy(100.shares.of('IBM')) {
  limitPrice       300
  allOrNone        true
  valueAs          { qty, unitPrice -> qty * unitPrice - 500 }
}

Much nicer, even if the curly brackets still bug me.

(def trade1
  {:ref-no "tr-123"
   :account {:no "c1-a1" :name "john doe" :type 'trading'}
   :instrument "eq-123" :value 1000})

The Clojure version: Lisp syntax, but very readable.

 val fixedIncomeTrade =
  200 discount bonds IBM
    for_client NOMURA on NYSE at 72.ccy(USD)

The Scala example has a nice syntax, parsing out all of the strings. I found the actual implementation classes for this example hard to follow, but the resulting syntax might be worth the pain, though you can accomplish something similar in Groovy or Ruby.
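
As a rough illustration of that last point, here is how a similar order might read as an internal DSL in Ruby. This is my own hypothetical sketch, not an example from the book.

# Hypothetical Ruby sketch of a similar order DSL; my own illustration, not from the book.
class Order
  attr_reader :quantity, :instrument, :attributes

  def initialize(quantity, instrument)
    @quantity   = quantity
    @instrument = instrument
    @attributes = {}
  end

  # Evaluate the block in the context of the new order so bare words
  # like limit_price read as part of the language.
  def self.buy(quantity, instrument, &block)
    order = new(quantity, instrument)
    order.instance_eval(&block) if block
    order
  end

  def limit_price(price)
    @attributes[:limit_price] = price
  end

  def all_or_none
    @attributes[:all_or_none] = true
  end

  def value_as(&rule)
    @attributes[:value_as] = rule
  end
end

order = Order.buy(100, 'IBM') do
  limit_price 300
  all_or_none
  value_as { |qty, unit_price| qty * unit_price - 500 }
end

order.attributes[:value_as].call(100, 300)   # => 29500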

Note that this book assumes a polyglot programmer. The code examples include Java, Ruby, Groovy, Clojure, and Scala. If you’re still mostly a Java developer this probably isn’t the book to start with, as the syntax, especially in the Clojure and Scala examples, will make it difficult to follow. I have a long background in Java, Ruby, and Groovy, and reasonable exposure to Lisp-like languages, but I found the Scala examples and syntax jarring. It did help solidify my feeling that I just don’t like the syntax and complexity of Scala, at least on the surface. (To be fair, I probably need to spend some time with it; maybe it grows on you.)

The book does a good job of walking you through examples of implementing DSLs in various languages and the approaches you might take. In daily development the DSLs you implement will most likely be embedded ones, but the book also handles external DSLs and the complexity involved. Even the author admits, at the end of a chapter on implementing external DSLs with Scala’s parser combinators, that:

If you have reached this far in reading the chapter on parser combinators, you have really learnt a lot about one of the most advanced applications of functional programming in language design.

Overall I found the book helpful in expanding my understanding of DSLs, and I will probably reference it the next time I need to build a small DSL in Groovy or Ruby on a project.