Saturday, May 24, 2008

Heron, Gentil Heron

This is a follow-up to my last rant about what an unhappy Java developer I was on Leopard. I have just had my first week of work with Hardy Heron running on my MacBook Pro, and I am so glad I went through the few hours of setup and configuration: this bird really flies!

Gone is the feeling of clunkiness and resistance Leopard was giving me: things flow so easily in Ubuntu that I actually forget about the OS. Of course, I turned off all the fancy-schmancy visual effects and activated just what is needed to jump between applications and desktops with a few keystrokes.

I can now enjoy standard Java JDKs, too! It is a pretty good feeling to be back in the year 2008.

In the migration, I lost Entourage, which is in fact probably a blessing. I now use Thunderbird connected to Exchange over IMAP and keep a Firefox tab open to the Outlook Web Access 2007 (pre-Web 2.0) interface to access my calendar.

The only application I will surely miss is the excellent OmniGraffle Pro. Any decent alternative on Linux?

So after this trial, I decided, Gentle heron, that I will not pull the feathers off your head.

Friday, May 23, 2008

Unit Tests: No Future?

Industry expert Andrew Binstock has just posted an entry in his blog titled "Is the popularity of unit tests waning?", where he discusses the staggering state of the practice of unit testing.

Andrew asks the crucial question of why we ended up in this situation. I do not pretend to have the answer, but here are some patterns I have observed which, I think, could decrease the appeal of unit testing.

Bad unit testing practices

It is very easy to write fragile unit tests, for example by using stubs when mocks would be enough, or by creating time-sensitive tests that work intermittently or stop working after a while. Similarly, it is common to see integration-like tests creep into the realm of unit testing, making these tests fragile, slow and dependent on external resources.
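To make the time-sensitivity point concrete, here is a hypothetical sketch of my own (the Session class and its names are not from Andrew's post, and it uses today's Java for brevity). The fragile version of its test sleeps and hopes the wall clock cooperates; the robust version injects a fake clock the test controls completely.

```java
import java.util.function.LongSupplier;

// Hypothetical example: a session whose expiry depends on the current time.
// Injecting the clock, instead of calling System.currentTimeMillis() directly,
// is what turns a flaky, sleep-based test into a deterministic one.
class Session {
    private final long createdAt;
    private final long ttlMillis;
    private final LongSupplier clock;

    Session(long ttlMillis, LongSupplier clock) {
        this.ttlMillis = ttlMillis;
        this.clock = clock;
        this.createdAt = clock.getAsLong();
    }

    boolean isExpired() {
        return clock.getAsLong() - createdAt >= ttlMillis;
    }
}

class SessionTest {
    public static void main(String[] args) {
        // Fragile version: new Session(50, System::currentTimeMillis) followed
        // by Thread.sleep(60) passes on a quiet machine and fails under load.
        // Robust version: a fake clock the test advances explicitly.
        long[] now = {0L};
        Session session = new Session(1000, () -> now[0]);

        assert !session.isExpired() : "fresh session must not be expired";
        now[0] = 999;
        assert !session.isExpired() : "one millisecond early";
        now[0] = 1000;
        assert session.isExpired() : "expired exactly at TTL";
    }
}
```

The same injection trick applies to any test that currently depends on real time, real files or real network: replace the external dependency with one the test owns.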

These bad practices tend to decrease confidence in unit tests, which can then lead to a progressive reduction in their usage. For example, a programmer may decide to stop writing tests for DAOs altogether after struggling with poorly written ones.

Bad software design

There is a tight relationship between good code design and testability. I have already blogged about how increasing test coverage can lead to a better design; unfortunately, not all developers are ready to revisit their design to make their code more testable.

To be fair, our languages and frameworks often force us to write code that is hard or uninteresting to test. Who wants to write unit tests for infamous JavaBeans getters and setters? Not Allen Holub, for sure!

Other programming idioms, like equals/hashCode, often exhibit a high cyclomatic complexity: writing complete tests for those would be tedious, unless one uses test-generating products like Agitator, from the late Agitar, or Jtest, from Parasoft.
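To see where the complexity comes from, consider a minimal hypothetical value class (mine, not from any of the cited tools): even a two-field equals() carries an identity check, a null check, a class check and one comparison per field, and the equals contract (reflexivity, symmetry, consistency with hashCode) multiplies the cases a complete test suite must cover.

```java
// Hypothetical two-field value class: count the branches a complete
// test suite would have to exercise.
class Point {
    final int x;
    final int y;

    Point(int x, int y) {
        this.x = x;
        this.y = y;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;                                // identity
        if (o == null || getClass() != o.getClass()) return false; // null / wrong class
        Point p = (Point) o;
        return x == p.x && y == p.y;                               // one branch per field
    }

    @Override
    public int hashCode() {
        return 31 * x + y; // must be consistent with equals()
    }
}
```

Five branches for two fields, before even testing symmetry or the hashCode agreement: it is easy to see why generating these tests looks more attractive than writing them by hand.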

Test benefits blindness

Management tends to be blind to the benefits of unit testing and very aware of its costs. A project I know has been deemed to have gone overboard with unit testing; at the same time, this project has the code base that is the most maintainable, flexible and fun to work with. Something seems to prevent businesses from seeing the added value of solid test practices, as if it were all about some obscure, geeky self-satisfaction (like writing "perfect code" for its own sake).

No Future?

Is there any hope, then? I think that, as with most of our problems on this little planet, education is the key. Whether we look at software engineers or business managers, there is still a great lack of education about unit testing, how it works and why it pays back.

Wednesday, May 21, 2008

SVN? VoilaSVN!

If you are using Subversion as your source control management system, there is now a way to go further with it and turn it into a full-fledged project and knowledge management system.

Indeed, Arcetis has just released the first version of VoilaSVN Enterprise Edition, which "provides the tools to successfully manage your projects, to coordinate all your resources and capitalise on the knowledge of your team".

VoilaSVN leverages GWT to deliver a smooth web interface, which is pretty nice for a tool in which you will have to enter and mine data.

There is also a free edition, if you have simpler needs. So give it a try and, voila!, see how far you can go with Subversion.

Friday, May 16, 2008

Just Read: Release It! and ThoughtWorks Anthology

If you intend to write software that lasts and fares well during its journey, give yourself a hand and read Release It!. From actual situations, the author derives very concrete recommendations for writing scalable and operable applications.

As far as I am concerned, reading this book has been an exhilarating experience. Though on a much smaller scale, I have experienced the same pains and come to similar conclusions as the author. Reading some pages brought sheer moments of excitement, very much like: "I have been saying the same thing!".

All in all, I am grateful that Michael Nygard has written such an authoritative book on this crucial matter, for now no one will be allowed to say "I did not know" anymore.

The ThoughtWorks Anthology does a pretty good job of offering insights and real-world feedback from top-notch ThoughtWorkers on a variety of software development subjects (coding, designing, building, QA). For a collection of essays from different authors, the overall uniformity of both style and content has been pretty well maintained.

As a software developer, I have found Neal Ford's "Polyglot Programming" and Jeff Bay's "Object Calisthenics" to be the most compelling pieces of this book.

At the end of the day, the real interest of this anthology resides in the fact that a small consulting firm has decided to share its sheer passion for software in a direct and hype-free manner. How many of the big ones out there could do the same?

Saturday, May 03, 2008

I don't know... yet!

In movies, computers know everything. Just ask the computer and you will get an accurate answer right away. For us, developers who have to deliver the promises made by Hollywood, programming software that gives correct answers immediately is very easy. Unfortunately it is also very costly.

Giving the correct answer generally translates into querying the one source of truth of an application: the database. This is easy.

Sometimes the database is far away from the access layer the caller interacts with, whether it is a web tier or a service tier. No problem! It is very easy to ask the caller to hold his breath until a long chain of synchronous calls gathers all the correct data needed and finally presents it as a glorious reply. Usually the caller has enough breath to wait until he has to come up for fresh air again (some call this a time-out).

All this is fairly easy but unfortunately very costly. Contention to access the centralized source of blessed truth increases as more and more callers want to reach it. If, for the sins of the application, it becomes successful, the number of callers holding their breath will increase dramatically.

There are times when the application would love to have callers with smaller lungs, so they could time out faster and free the threads they hold while sitting idle. But callers are not so equipped: the dozens of seconds they wait are an eternity for a server. Instead of being glued waiting for a remote service or database to deliver the ultimate answer, the application would love to afford replying:
I don't know... yet!

At least, this imprecise but immediate answer would allow the server to release threads almost as fast as they come. This would give the application the luxury of replying later, whenever possible...
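A minimal sketch of this "reply later" idea, with entirely hypothetical names and today's Java for brevity: the first request for a value returns empty immediately and schedules the slow lookup in the background; subsequent requests find the cached answer, and no caller thread is ever held waiting.

```java
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Hypothetical service: quote() never blocks the caller. On a cache miss it
// replies "I don't know... yet" (an empty Optional) and lets a background
// thread fetch the value for later calls -- eventual consistency in miniature.
class EventualQuoteService {
    private final ConcurrentMap<String, String> cache = new ConcurrentHashMap<>();
    private final ExecutorService backend = Executors.newSingleThreadExecutor();

    Optional<String> quote(String symbol) {
        String known = cache.get(symbol);
        if (known != null) {
            return Optional.of(known); // a real answer, computed earlier
        }
        backend.submit(() -> cache.put(symbol, slowLookup(symbol)));
        return Optional.empty();       // "I don't know... yet"
    }

    // Stands in for the remote database or service call.
    private String slowLookup(String symbol) {
        return symbol + "=42";
    }

    void shutdown() {
        backend.shutdown();
    }
}
```

A caller polling quote("ACME") first receives an empty reply, then, once the background lookup completes, the actual value; in both cases the server releases the thread immediately, which is the whole point.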

Of course, such a paradigm shift is anything but transparent for the caller: he must learn to deal with imprecision. He must embrace asynchronism. He must survive eventual consistency.

We now have the tools, from the web tier to the back end. Can we succeed in this paradigm shift?

I don't know... yet!