Robert Hanson, who built the very useful GWT Widget Library, has an interesting post where he asks:
Let’s say that you are a developer, and you have been spending the past year or so really getting to know a given technology. Now you are being told that the technology you are using is inferior to this “other” technology. You take a look and realize that it might be best to switch. A year later you finally have a good understanding of the tool, and use it with great skill. Then someone tells you about this “other” technology.
How many of us built our own MVC frameworks only to move to Struts, then maybe on to Spring MVC. Sure, there are some improvements made in each technological step, but since you are spending most of your time really getting to know a product you often spend little time getting the most out of it. This is compounded by the fact that you often use several of these products at the same time, adding to what you need to learn.
So what is a dog to do? Although you are moving forward, you never quite catch the tail. Should you just stop moving forward, or run faster or slower?
Personally, I think there is a middle ground. As a developer, you need to keep up with broad trends and tools, because they can make you much more productive. The problem is that you don’t know how much more productive you will be until you’ve used the technology or tool for a while.
However, just because there is a new tool around doesn’t mean you have to use it. In fact, if you have an existing technology that does the job, you shouldn’t abandon it just to move to the new one. There’s always a cost analysis to do, because learning a new technology is not free. Your time is worth something.
This cost analysis is something developers should learn to do and appreciate, because it is exactly the process most companies need to go through before deciding to implement or build new software. Like a developer, a company may believe that a new technology or system will help it, but be unsure how much it will help and how much it will cost. And just as for a company, a developer’s decision to learn and use a new technology is never solely a technology decision.
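To make that concrete, here is a back-of-the-envelope sketch of the kind of cost analysis I mean. The numbers (ramp-up hours, hours saved per week, hourly rate) are purely hypothetical; the point is that learning time is a cost you can estimate and weigh against the productivity gain you expect.

```java
// Back-of-the-envelope cost analysis for adopting a new technology.
// All numbers below are hypothetical; plug in your own estimates.
public class AdoptionCostSketch {
    public static void main(String[] args) {
        double hoursToLearn = 80;        // ramp-up time before you're productive with the new tool
        double hoursSavedPerWeek = 3;    // estimated productivity gain once you've learned it
        double hourlyRate = 75;          // what your time is worth

        double learningCost = hoursToLearn * hourlyRate;       // up-front investment
        double weeklySavings = hoursSavedPerWeek * hourlyRate; // recurring payoff
        double weeksToBreakEven = learningCost / weeklySavings;

        System.out.printf("Learning cost: $%.0f%n", learningCost);
        System.out.printf("Weeks to break even: %.1f%n", weeksToBreakEven);
    }
}
```

With made-up numbers like these, the break-even point lands somewhere around six months, which is exactly the kind of answer that should make you pause before chasing the next shiny framework.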
There are many ways to minimize the risk of learning a new technology: prototype, read the documentation, be conservative, or consult someone who’s already an expert in it (which means they’ve already made some of the mistakes). Each of these has benefits and drawbacks. Prototyping takes more time than the others. Reading documentation is great if the documentation exists and is accurate, but it may not teach you as many lessons as actually using the technology would. Being conservative means you’ll probably miss out on some productivity improvements, but you’ll also sidestep some time sinks. Consulting an expert is great, if you have access to one and know what questions to ask.
I think the answer to Robert’s final question is intensely context-sensitive. It depends on the following five considerations, among others:
- how crucial the new technology is to your productivity (i.e., if you are a Java business developer, learning GWT might be lower on the list than learning Spring)
- how easy you think it will be to learn
- whether you can be paid to learn it
- how much spare time you have
- whether you have a project to use the new technology on