Saturday, March 25, 2006

Teddy Bear code reviews

I have been a big advocate of peer code reviews for some time now (see my previous post on peer code reviews), and I feel very lucky to have spent the past year and a bit consulting at a company that firmly believes in them. As my time at this company winds up, I am faced with the prospect of consulting for a company that does not share the same attitude towards peer code reviews, and I've been wondering what my approach should be.

My first thought is to attempt to convince the managers at whatever company I end up at that peer reviews are a good idea: explain the benefit to the overall development life cycle, and show them the graphs of the cost of fixing bugs at various points throughout the software development life cycle. In my opinion this should be enough to convince any decent manager that an extra two hours in development is better than the twenty hours of test/dev/management time if a bug makes it into system test, or worse, if it makes it into the wild and is reported by a user. However, being experienced with the way management works, and knowing that not all managers listen to common sense, and that some think developers simply shouldn't write bugs in the first place, I've been trying to think how I will maintain my code quality in such an environment.

I'm not sure how well known the concept of teddy-bearing has become, but the urban myth starts with a development team whose team lead was, as a developer, pretty useless. The developers, however, found he was great for one thing. If you were stuck on a programming problem, you could call him over and ask him to have a look. Because he was so useless, the developers would have to step him through the bits of the code that were causing the problem. In the process of explaining the problem at a level the team lead could understand, the developer would quite often see his or her mistake and be able to fix it right there in front of the team lead, which made the team lead feel immensely important.

When this team lead eventually moved on to bigger and better things, the team was at a loss to know how they would fix these hard-to-find bugs, until one day one of the developers who was stumped on a problem picked up a teddy bear that was lying around the office, sat it on his desk, and started explaining the problem to it. They found that the teddy bear was every bit as effective as the team lead, with the added advantage of better people skills. The teddy bear became an office legend and was invited to all the office parties, where legend has it he told far better jokes than the old team lead.

From this the concept of teddy-bearing was born. The term is used whenever you explain your code to someone in order to find a problem, and end up finding it yourself with no real constructive input from the person you asked for help.

So what does that have to do with code reviews? Well, I'm a big fan of the whole subversive style of process improvement. If management won't buy source control, install Subversion and maintain your own repository; if the team doesn't believe in unit testing, download NUnit and write your own; if there is no nightly build process, download NAnt and set one up yourself. So if there are no code reviews, buy yourself a teddy bear, give it a name, and sit it on your desk. Colleagues will probably think you are a little unusual, but you're a geek; people generally think you're more than a little unusual anyway.
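To make the "write your own tests" part a bit more concrete, here is roughly the level of thing I mean. It's a minimal sketch in Python's unittest (standing in for NUnit, since I'm not going to paste a whole .NET project into a blog post), and the discount function and its numbers are made up purely for illustration:

import unittest

# A made-up function under test -- it stands in for whatever small
# unit of real code you want covered.
def discount(price, percent):
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100.0)

class DiscountTests(unittest.TestCase):
    def test_half_price(self):
        self.assertEqual(discount(80.0, 50), 40.0)

    def test_full_discount_is_free(self):
        self.assertEqual(discount(80.0, 100), 0.0)

    def test_invalid_percent_raises(self):
        self.assertRaises(ValueError, discount, 80.0, 150)

if __name__ == "__main__":
    unittest.main()

Nothing fancy, and nobody has to approve it; it just sits alongside your code and quietly does its job, which is exactly the point of the subversive approach.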

When you get to the stage where you are completing a chunk of work, sit the teddy bear beside you and talk it through your code. A lot of the psychology of a code review will be there, and in reality you will have one of the toughest reviewers you can possibly have: yourself.

Wednesday, March 01, 2006

QSR Releases NVivo7

The company I have been consulting for over the past year and a bit, QSR, has today released the product we have been working on: their flagship product, NVivo 7.

I'd also like to add that it was released on time, and without stressing out the entire team for months leading up to the release. Just goes to show what good project management and appropriate resource allocation can do.