Tuesday, April 10, 2007

Yet Another Amateur Opinion

So, here I am, reading The Braidy Tester, where I frequently lurk, because, well, I like the way the guy thinks. Recently, he posted an article that I agreed with, so I finally mustered up the courage to kick down the closet door and post a response.

The basic premise of the article (which you can find here) was that we should occasionally entertain the notion of organizing project teams around features instead of skill sets. I agreed. In my response, I said:

Going against the grain is scary but sometimes vital in our industry. It takes courage, and always involves risk. But calculated risk isn't always a bad thing.

I'm not a fan of blindly adhering to "established best practices," "the Next Big Methodology (TM)," or established team models. Just because something is a best practice at Acme Corporation doesn't mean it's a best practice at Real World Inc. The business model is different, the staffing patterns are different, the problem domains are different, *everything* is different. You have to be willing, at some point, to accept the idea that a best practice for *them* isn't necessarily a best practice for *you.*

Be flexible. Do something different. Experiment. Find out what actually works, and makes you more effective. Make the leap. At the end of it all, if you find out that it didn't work, you'll at least come out of the experiment knowing something that you didn't know before. And the acquisition of knowledge is never a wasted effort.

(Note that that isn't a vanity quote; it's reproduced here for clarity.)

The operative words in that post are "blindly" and "calculated." However, a subsequent poster referred to my opinion as "amateur."

Pecker-waving aside, how does one define an "amateur opinion"? I doubt very highly that the poster has any idea what my level of experience is. I also doubt that he read my post carefully, because he skipped right over the key words, blindly and calculated.

For whatever it's worth, I'm always willing to accept the fact that there are innumerable people out there who know far more than I do. They have scads more experience than I do. But the last time I checked, you didn't need a license or a degree to have an opinion.

So, despite the fact that I apparently lack the credentials to post an opinion, I'll simply refer the reader to the 1st Amendment. And then I'll ask the reader to read what I said, and then think about what I said, and then ask about my experience before labeling me an amateur.

I assure you, sir, I am not. I am familiar with the phenomena that have led me to this opinion.

I have worked for several companies where a methodology or practice was adopted simply because it was popular at the time; no thought was given to whether or not it was suitable for the environment. This was a failure on the part of management: they should have conducted a proper risk assessment beforehand to determine its feasibility and suitability. An estimate of the ROI would have been handy. Instead, utter chaos ensued, schedules slipped, projects failed, and employees left because there was a sense that no one knew what the hell was going on and no one was in control.

Exceptions to the rule? Undoubtedly. But realistic examples nonetheless. It happens. But it shouldn't.

Many of the mainstream, established methodologies and best practices are popular and widely implemented because they do, in fact, work. But they don't always work everywhere. I refer you to the old adage, "There is no silver bullet." You have to find the one that works for you; you don't just pick one out of thin air (at least, I hope not): you do your homework, and find one that is suitable for your business model, and maximizes the return on your investment. Maximum return for the least amount of risk.

Case in point: at one firm where I've worked, post mortems were ruled out as a bad political move. The development team was so small that the only one who could provide meaningful input was the customer, and management didn't want the customer critiquing the software development process. Yet a post mortem, done right, is a valuable tool and an established best practice for improving your overall process.

What about code reviews? If you've only got one developer in your company, they're not feasible because that developer can't be truly trusted to review his own code objectively. Who's going to do it? Are you going to outsource it?

Truly small companies can't always adopt methodologies and practices designed to support larger development teams. It doesn't make any sense. Companies with extremely short development cycles can't adopt the methodologies appropriate to those who have the time for BUFD (big up-front design), and so forth. You have to be selective. And sometimes, you have to develop a custom methodology that works specifically for your firm, one that borrows bits and pieces from other established methodologies.

Trying to shoe-horn your company into a methodology that doesn't fit will only give you calluses in the most uncomfortable places. I am certainly not advocating that every company in the world should do its own thing. I never once advocated that. What I did do was denounce blind adherence to the Next Big Methodology and to best practices that aren't well suited to your business model. When the need arises to deviate from what everyone else is doing because what they're doing doesn't work for you, it makes sense to entertain the notion of doing something different. Even then, you should only adopt a different way of doing things if the risk involved is manageable and acceptable.

But hey, if that kind of an opinion makes me an amateur, so be it. I can live with that.

—Mike
