Is the American tech worker obsolete?

I’ve been doing some thinking about offshoring recently. For those of you not in the IT industry, ‘offshoring’ is jargon for the outsourcing of white-collar jobs to developing countries like India and China. There’s been a lot of talk about this phenomenon: Salon (1 and 2), The Economist [sorry, it’s a pay article], InfoWorld, and CNET News.com have all commented recently. In addition to reading these articles and others like them, I’ve also talked to folks in the software industry, as well as friends who work in the aerospace and packaged goods manufacturing industries. And I’ve come to the conclusion that a certain amount of offshoring is inevitable, since the labor cost differential is so great.

However, like any other business fad (or any fad, for that matter), it has external costs that simply are not fully understood. Among these are:

1. Loss of customers
The constant mantra of folks who are trying to tell IT workers how to adjust to losing their jobs is ‘retrain, retrain’. But retrain for what? What job isn’t offshorable? Nursing is the only one that comes to mind. If all virtualizable (i.e., able to be done without face time) service jobs where there is a price differential get pushed overseas, that’s millions of jobs lost. The American economy has been the engine for most of the world’s growth over the last 10 years, sucking up exports from other countries to the tune of billions of dollars. What happens when the consumers of the USA don’t spend money, either because they are out of a job, or afraid of losing their job soon? It takes a visionary like Henry Ford to realize that if you grow the number of folks who can buy your products, you grow your business.

2. Loss of future business leaders
If you export big chunks of your IT department, perhaps keeping only senior folks who can help manage external projects, then you win in the short run. However, eventually those folks will retire (ask NASA). And if you haven’t brought any entry level talent in, where will you go to replace these needed folks?

3. Difficulty of managing far flung teams from different cultures
This is difficult in two ways. One is the logistical aspect. If you want a teleconference, you have to adjust either your schedule or theirs. And if the country is on the opposite side of the globe, possibly both parties need to be in the office at an awkward time. The second difficulty is related to quality. Just as Japanese products in the 50s and 60s were thought cheap and low quality, some of the work done overseas today is of a lower standard than expected. This is true of all software of course, but it’s harder to control the quality when you write a spec and throw it over the wall.

In short, offshoring is in its infancy. More of it is coming down the line, but, as the hidden costs are discovered, the benefits of local teams will come back into focus. I can do much of my consulting work from my house, yet I still find that most companies want me to come into their office. Why? To some extent it’s control (should be careful not to blog my way out of a job here), but it also is because communication, in each direction, is easier with someone who’s on site. There’s extra effort expended in any kind of virtual communication. And by meeting me, the customer and I build a relationship, which is perhaps more important for trust in business than anything else.

I also want to address Michael Yuan’s comments:

“3. Coding is a dying profession in the long run. If the jobs have not been outsourced to developing countries, the new generation of model-focused automatic code generation tools will eliminate the need for basic coders anyway. The jobs that have a future are system designing and architecting (the real engineering jobs). I think the ability to design end-to-end systems using whatever tools available is an important skill for the future.”

I’ve commented enough, I hope, on the idea that all coding work will or should be outsourced to developing countries. But I think that the idea that ‘the need for basic coders’ will be eliminated due to improvements in ‘automatic code generation tools’ is foolish for a number of reasons.

1. If you don’t understand basic coding on a system, then when the automatic tool doesn’t do what you need it to, you’re screwed. Basic coding is not something you can pick up at school–you need to be out in the real world, working on crufty systems that are too expensive to change and older than dirt, to really appreciate what automatic tools can and cannot do, and what systems can and cannot do. In addition, easier code generation means more code out there, not less. And the easier the code is to generate, a la Visual Basic, the more likely the person doing the generation won’t get it right, or won’t document it, or won’t understand why certain behavior is exhibited. In this case, someone who has a deep, visceral understanding of the system and language will be called in.

2. Yuan talks about architecture and system design being the ‘real engineering jobs.’ I’ll agree that those are more stimulating and challenging than coding, and also have a higher value add. But again, how many folks are system architects right out of school? I certainly wouldn’t want to work on a project that had been designed by someone with exactly 0 years of real world experience. If you abolish entry level positions, you eat your seed corn, as I mention in my comments about offshoring above.

3. There is no silver bullet. The hard part isn’t writing the software, it’s determining what software needs to be written.

4. Someone’s got to write the tools. I remember an old science fiction story, by Isaac Asimov I think, about a world of taped learning. You took a test at 18 that determined what your skills were, and then you listened to a set of tapes that taught you all you needed to know. The protagonist was distraught because the test just didn’t seem to fit him–he was outcast and ridiculed for not knowing his profession. Of course it turned out that the folks that couldn’t be tape educated were exactly the folks who were able to write the tapes. Someone’s got to write the tools. (I don’t pretend, however, that anywhere near the same number of folks are needed to write tools as to use them.)

Michael and others are certainly correct that things are going to change in the software world. It’s not going to be possible to just know a language–you have to be involved in the business and maintain relationships. But change is nothing new for IT workers, right?

In short, it appears that Neal Stephenson was correct. I leave you with an excerpt from Snow Crash:

“When it gets down to it–talking trade balances here–once we’ve brain-drained all our technology into other countries, once things have evened out, they’re making cars in Bolivia and microwave ovens in Tadzhikistan and selling them here–once our edge in natural resources has been made irrelevant by giant Hong Kong ships and dirigibles that can ship North Dakota all the way to New Zealand for a nickel–once the Invisible Hand has taken all those historical inequities and smeared them out into a broad global layer of what a Pakistani brickmaker would consider to be prosperity–y’know what? There’s only four things we [America] do better than anyone else

music
movies
microcode (software)
high-speed pizza delivery”

And none of the four above are guaranteed (except perhaps the high-speed pizza delivery).

Happy New Year.

Java Tidbits

Quick, take this true/false quiz and test your Java(tm) knowledge:

1. Private members can only be accessed by the object which owns them.

2. The contents of a finally block are guaranteed to run.

Everybody knows that both of these are true, right?

Actually, each of these is false. In the first case, access control in Java is enforced at the class level, not the object level, so this program shows that one object can twiddle with the private members of another object of the same class:

class PrivateMember {
   private int i;
   public PrivateMember() {
      i = 2;
   }
   public void printI() {
      System.out.println("i is: " + i);
   }
   // Legal: access control is per class, not per object, so one
   // PrivateMember can modify another PrivateMember's private field.
   public void messWithI(PrivateMember t) {
      t.i *= 2;
   }
   public static void main(String[] args) {
      PrivateMember sub = new PrivateMember();
      PrivateMember obj = new PrivateMember();
      obj.printI();      // prints "i is: 2"
      sub.messWithI(obj);
      obj.printI();      // prints "i is: 4"
   }
}

and this program shows that finally blocks don’t run if the JVM exits:

class TryCatch {
   public static void main(String[] args) {
      try {
         System.out.println("in try");
         throw new Exception();
      } catch (Exception e) {
         System.out.println("in catch");
         System.exit(0);   // the JVM halts here...
      } finally {
         System.out.println("in finally");   // ...so this line never prints
      }
   }
}
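
As an aside, if you really do need some cleanup to run even when the JVM is shut down via System.exit, a shutdown hook is one way to get it. Here’s a minimal sketch (the class name is just for illustration):

class ShutdownHookDemo {
   public static void main(String[] args) {
      // Shutdown hooks run when the JVM terminates normally,
      // including via System.exit().
      Runtime.getRuntime().addShutdownHook(new Thread() {
         public void run() {
            System.out.println("in shutdown hook");
         }
      });
      System.out.println("in main");
      System.exit(0);   // the hook above still runs
   }
}

Running it prints “in main” followed by “in shutdown hook”.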

Book Review: Sixth Column

If you’re looking for an introduction to Robert A. Heinlein’s vast corpus of fantastic science fiction, don’t read Sixth Column, read The Moon is a Harsh Mistress. If you’re a Heinlein junkie who’s read all his other stuff and you want a quick, fun read covering the typical Heinlein themes (the able man, war, gee whiz technology, “long live democracy”), then you’ll definitely want to pick up this book.

The basic premise is: the USA has been invaded by “PanAsians,” and the government effectively destroyed. Having subjugated India, the “PanAsians” know how to tie down the USA–lots of labor camps, citizen registration and public executions as punishment for any rebellion. But they also have learned not to interfere with their subjects’ religion(s). One small military base, a research laboratory, has escaped destruction; luckily the plucky soldiers have an able commander and lots of technology the invaders simply can’t match.

From there, it’s just a matter of time. The reader gets to watch how these men build a movement, screw with the “PanAsian” leadership, and eventually free the USA. Of course the technology is hokey and the dialog can be a bit offensive, but it’s realistic (yeah, I think slurs are allowable if they’re marching your family off to the labor camps). This book was written in 1949 and reflects some of the paranoia that Heinlein later gave voice to in Expanded Universe.

But, it’s a fun, quick read and if you like Heinlein, you’ll probably like it. It’s no classic, but not every book can be.

J2ME development advice

Here is some advice for any of you thinking about doing development for J2ME. I’ve written before on J2ME considerations, but the project I’m working on is a bit further on now, and I thought I’d share some hard-earned knowledge.

1. Get Windows.

Make sure you have access to a Windows box running a modern version of Windows. While Sun’s Wireless Toolkit supports Linux and Solaris, and IBM’s WebSphere Device Developer supports Linux, no other emulators do (nor do they work under Wine).

Whether you want to support Motorola, Nokia, or NEC, you pretty much need Windows to run the emulators. And an emulator is crucial, because it allows you to develop an application rapidly. Testing on as many emulators as possible also means that your application will be as portable as possible. However, when you get something that is close to finished, you’ll need a real phone to test it on.

2. Get (a) real phone(s).

While emulators can tell you a lot, they certainly can’t assure you that an application will run on a real phone. One project I worked on had an emulator that ran the app just fine, but the app was locking up on the real phone. It turned out that the networking code needed to run in a separate thread. There are many things that differ between an emulator and a real phone. Installation of a midlet is different (and differs between phones as well). Instead of accessing a file on disk, you have to use OTA provisioning (sure you can emulate that with the WTK, but that’s just an emulation. I’ve run into issues with DNS that just don’t show up on the emulator). Also, as mentioned above, the networking capability differs, and the connection is much slower. The amount of memory that you can use while developing on the desktop is effectively unlimited (some of the emulators let you monitor the amount used, but I don’t know of any that lets you limit it), but phones have hard limits. Don’t you want to know what happens when you try to use 101KB of memory on a device that only has 100KB? The limitations you face on user interface are also more real on a phone, when you can’t use the keyboard to enter your username or the backspace key to fix errors. For all these reasons, you should get a phone as soon as you can.
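
To make the threading lesson concrete, here’s a minimal sketch of the pattern that fixed the lockup described above: do the HTTP fetch on its own thread rather than in startApp() or a command handler, then update the screen when it finishes. The MIDlet name and URL below are placeholders, and error handling is pared down to the bare minimum.

import java.io.InputStream;
import javax.microedition.io.Connector;
import javax.microedition.io.HttpConnection;
import javax.microedition.lcdui.Display;
import javax.microedition.lcdui.Form;
import javax.microedition.midlet.MIDlet;

public class FetchMIDlet extends MIDlet {
   private final Form form = new Form("Fetch demo");

   protected void startApp() {
      Display.getDisplay(this).setCurrent(form);
      // Blocking network calls in startApp() or a CommandListener can
      // freeze a real handset even when the emulator seems happy,
      // so hand the work off to a separate thread.
      new Thread(new Runnable() {
         public void run() {
            fetch("http://example.com/status.txt");   // placeholder URL
         }
      }).start();
   }

   private void fetch(String url) {
      HttpConnection conn = null;
      InputStream in = null;
      try {
         conn = (HttpConnection) Connector.open(url);
         in = conn.openInputStream();
         StringBuffer buf = new StringBuffer();
         int c;
         while ((c = in.read()) != -1) {
            buf.append((char) c);
         }
         form.append(buf.toString());   // show the result
      } catch (Exception e) {
         form.append("error: " + e);
      } finally {
         try {
            if (in != null) in.close();
            if (conn != null) conn.close();
         } catch (Exception ignored) {
         }
      }
   }

   protected void pauseApp() {
   }

   protected void destroyApp(boolean unconditional) {
   }
}

On an emulator the blocking version often appears to work fine, which is exactly why this kind of bug tends to surface only on the handset.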

3. Explore existing resources.

A couple of good books: Enterprise J2ME is a great survey book, with some very good ideas for building business applications (with large numbers of users) but not a whole lot of nuts and bolts. Wireless Java: Developing with J2ME, Second Edition is a good nuts and bolts book (it explains how to do your own Canvas manipulations, for example). Check out what else Amazon suggests for other ideas.

A couple of helpful URLs: Fred Grott’s weblog, MicroJava, the EnterpriseJ2ME site, and Sun’s site, of course; the JavaRanch Saloon is pretty helpful too.

The various carrier websites are useful, if only to find out what kind of phones you want to target: AT&T, Sprint, T-Mobile, Nextel. (Verizon in the USA is BREW only.)

4. Have fun.

Comments on “Mobility is more than J2ME (and the job market for 2004)”, pt I

Michael Yuan had a pretty inflammatory post recently. Here are the first wave of my comments on this interesting topic.

“1. The move to mobility is inevitable in the enterprise. The IT revolution has to reach hundreds of millions of mobile workers in order to realize its promise. There is no other way. However, the real question is how and when this will happen. With the IT over-investment in the last decade, this might take several more years.”

Agreed. Allowing mobile users to access the corporate datacenter is an inevitability. When it does happen, however, it certainly won’t have the sexiness or big bang of the internet revolution. In fact, it’s much more an evolution than a revolution. Folks already have access to corporate databases right now; the mobile revolution simply combines the portability of paper with the real time nature of laptops. However, letting knowledge workers such as sales people and truckers have real time information on such a cheap, reliable device really will change the nature of the business. But we won’t see sock rabbits or dot.com millions, since such changes will favor existing businesses.

“2. When enterprises move to mobility, a key consideration is to preserve existing investment. Fancy flashy J2ME games will not do it. The task is often to develop specialized gateway servers and J2ME integration software to incorporate smart mobile frontends into the system. That requires the developers to have deep understanding of both J2EE and J2ME. I think that the “end-to-end” sector is where the real opportunities are in the next several years. That is also what “Enterprise J2ME” is all about. :)”

Now, don’t be so quick to judge. Gamers pushed the boundaries of the PC in terms of computing power, and I wouldn’t be surprised to see the same thing happen on the MIDP platform. That said, I’m not a gamer. However, I still have issues with the idea of folks paying to play a game on a cell phone. I play snake, but that’s a simple, free game, and I’m certainly not dedicated to it. Fred Grott claims that MMORPGs are going to drive J2ME game development–I just don’t see folks doing that when you can get a much, much richer experience from the XBox or Game Cube or PC sitting at home.

And I don’t see why Michael ties J2ME and J2EE so tightly. The whole point of web services is to decouple the server and the client. I don’t see any reason why you couldn’t have a J2ME client talk to a .NET server, or a BREW client talk to a J2EE server. To me, the larger issue with the mobile revolution is the architecture of the J2ME applications, since I think that such small, non-networked, memory-constrained applications (with either extremely limited portability or extremely limited user interfaces, take your pick) are going to be a world apart from the standard Java developer’s experience (which is HTML generation, not Swing).

I’m going to leave his third point for another post, as the outsourcing issue is…worthy of a separate discussion.

Book Review: The Mother Tongue

Well, I’ve figured out how Bill Bryson writes his hugely amusing tomes. Yup, I sussed out the formula:

1. pick a topic of interest that folks don’t know much about

2. research it well, finding both fundamental facts and interesting tidbits

3. present the research in a conversational manner, dropping witticisms left and right.

Let’s see how this applies to the latest Bryson book I’ve read: The Mother Tongue: English and How It Got That Way.

Point 1: Not many folks, least of all in America, know much about our language. This is kind of astonishing, given that it’s our primary means of communicating (trust me, I spent several weeks in small Swiss towns, and I can tell you from experience that language is the main method of communicating. Charades isn’t as fun when you want something to eat). But, other than the most common word (the) and letter (e), and the fact that English doesn’t have gendered words, I didn’t know much about English.

Point 2: This book shines here. Did you know that the word tits hasn’t changed since the 10th century (page 215)? Or that Japan buys as many copies of the Oxford English Dictionary as the USA, and more than Britain (page 195)? Or that one sound (yi) in the Pekinese dialect of Mandarin stands for 215 words (page 86)? To support these sometimes absurd-sounding claims, we get some footnotes and an eight-page bibliography.

Point 3: As always, Bryson is prepared to take a pot shot at any ludicrous statement or proposition, and the English language provides plenty of those. Some of my favorites:

1. “…It is difficult to escape the conclusion that the British have such distinctive place-names not because they just accidentally evolved, but rather because the British secretly like living in places with names like Lower Slaughter and Great Snoring” (page 205).

2. Quoth a congressman, “If English was good enough for Jesus Christ, it’s good enough for me” (page 195). Bryson doesn’t even bother to comment on this, letting the absurdity scream for itself.

3. When examining the dictionary of Samuel Johnson, Bryson points out that Johnson defines “oats as a grain that sustained horses in England and people in Scotland” (page 153). He lets several of Johnson’s tart remarks speak for themselves, but he also examines why this English dictionary (which wasn’t the first) had such an impact.

Just the names of the chapters give you an idea of the scope of this book. From “The World’s Language” to “Spelling” to “Names” to “Swearing,” Bryson leaves little out. Even the discussion of crossword puzzles and palindromes (hardly exciting stuff), in the “Wordplay” chapter, doesn’t bore.

Sure, it’s a formulaic book, but a damn good one.

Webhacking with CVS

In the latest edition of 2600, there’s an article about webhacking with CVS. The basic premise of this article is that if you do a cvs checkout of your static HTML site to your webroot, you let folks with inquisitive minds and an understanding of CVS know more than you intended about your IT infrastructure: every checked-out directory contains a CVS subdirectory, and its Root and Entries files get served up just like your HTML, revealing repository paths, usernames, and file listings. Read the article for more information.

However, it’s easy enough to defeat. The answer is to use the cvs export command, which generates exactly the same files as a checkout, except without the CVS directories. Rolling out updates to a site via this command means, of course, that any changes you make to files in the web directory will be blown away. But it could be argued that it’s a good thing to force everything to go through CVS. It also means that you can’t make incremental updates as easily. It’s still possible, but you just have to check out the source to some other place and copy the file over manually. Another option, which lets you do updates more easily, is rsync --cvs-exclude, which copies the working tree while leaving the CVS directories behind.

Using either of these solutions makes it a bit tougher to move content to the website. But it makes things a whole lot more secure.

Business Process Crystallization

I’m in the process of helping a small business migrate an application that they use from a Paradox back end to a PostgreSQL back end. The front end will remain written in Paradox. (There are a number of reasons for this–they’d like to have a more robust database, capable of handling more users. Also, Paradox is on the way out. A Google search doesn’t turn up any pages from corel.com in the top 10. Ominous?)

I wrote this application a few years ago. Suffice it to say that I’ve learned a lot since then, and wish I could rectify a few mistakes. But that’s another post. What I’d really like to talk about now is how computer programs crystallize business processes.

Business processes are ‘how things get done.’ For instance, I write software and sell it. If I have a program to write, I specify the requirements, get the client to sign off on them (perhaps with some negotiation), design the program, code the program, test it, deploy it, make changes that the client wants, and maintain it. This is a business process, but it’s pretty fluid. If I need to get additional requirements specification after design, I can do that. Most business processes are fluid, with a few constraints. These constraints can be positive: I need to get client sign off (otherwise I won’t get paid). Or they can be negative: I can’t program .NET because I don’t have Visual Studio.NET, or I can’t program .NET because I don’t want to learn it.

Computerizing tasks can make processes much, much easier. Learning how to mail merge can save time when invoicing, or sending out those ever-impressive holiday gift cards. But everything has its cost, and computerizing processes is no different. Processes become harder to change after a program has been written or installed to ‘help’ with them. For small businesses, such process engineering is doubly calcifying, because few folks have time to think about how to make things better (they’re running just as fast as they can to stay in place) and also because computer expertise is at a premium. (Realizing this is a fact, and that folks will take a less technically excellent solution if it’s maintainable by normal people, is what has helped Microsoft make so much money. The good is the enemy of the best, and if you can have a pretty good solution for one quarter of the price of a perfect solution, most folks will choose the former.)

So, what happens? People, being more flexible than computers, adjust themselves to the process, which, in a matter of months or years, may become obsolete. It may not do what they need it to do, or it may require them to do extra labor. However, because it is a known constraint and it isn’t worth the investment to change, it remains. I’ve seen cruft in computer programs (which one friend of mine once declared was nothing but condensed business knowledge), but I’m also starting to realize that cruft exists in businesses too. Of course, sweeping away business process cruft assumes the same risks as getting rid of code cruft. There are costs to getting rid of the unneeded processes, and the cost of finding the problems, fixing them, documenting them, and training employees on the new ones, may exceed the cost of just putting up with them.

“A computer lets you make more mistakes faster than any invention in human history – with the possible exceptions of handguns and tequila.” -Mitch Ratcliffe, Technology Review, April 1992

A computer, by virtue of being able to instantaneously recall and process vast amounts of data, also crystallizes business processes, making them harder to change. In addition to letting you make mistakes quickly, it also forces you to let mistakes stand uncorrected.

Book Review: The Alchemist

This book, by Paulo Coelho, is, like all fables, written on many levels. It is ostensibly the story of a shepherd in Spain who, unlike so many people, follows his dreams. He does get a little help from the supernatural, but many of the story’s most interesting thoughts come from his musings on nature. His travels take him across the Mediterranean into Africa, where he meets several archetypal characters (the Man Afraid of Change, the Waiting Woman, the Wise Shaman, the Warrior Chief, the Cynical Fool), learns about himself and his dreams, and finds his destiny.

An interesting way to look at this story is to ask the question: who is the title character? Alchemy is such a potent idea–the changing of one element into another has had a grasp on the human mind for as long as we have known about elements. But, of course, alchemy has secondary meanings–an alchemist transforms. Is the boy an alchemist, for transforming himself and the lives of those around him? Is God the alchemist, for transforming the destinies of humanity? Is the reader the alchemist, for taking the fable and transforming its words into something personally meaningful?

My favorite part about this book was its gritty reality. I like epics, but there were no sweeping vistas and no ubermensch heroes in this book. Everything the boy does (and we never learn his name) is something you and I could do. I guess that’s the point of the book.

Update: As ihath commented you do learn the boy’s name. It’s revealed on the first page. But, as I remember, it’s not used much throughout the book, maintaining the everyman nature of the story.

“The Alchemist” at Amazon.

Book Review: The Long Dark Tea Time of the Soul

Updated 2/25/2007: Added amazon link.

Douglas Adams is amazingly whimsical. If the Hitchhiker’s Guide to The Galaxy didn’t convince you of that, the Dirk Gently novels will. Gently is a detective, but no Sherlock Holmes. No, rather than ruling out the impossible to leave only the improbable, Gently prefers to believe the impossible, because it makes so much more sense than the improbable. He solves his case through ingenuity, luck, and a belief in the interconnectedness of all things.

A highlight for me is Dirk’s method of finding directions. He just follows someone who looks like they know where they are going. This, he says, doesn’t always get him to where he wanted to go, but almost always gets him to where he needs to be. If only we all had such faith!

This book is the second of two about the private eye. I don’t want to give away too much of the story, as it is definitely a mystery, but it covers some of the same ground as American Gods in a much less sinister manner. Everything has a reason and a rhyme in this book, even if at first encounter, an event makes no sense, neither to the characters nor the reader. While the ending is a bit abrupt for my taste, if you like whimsy, you’ll get an ample helping with this book.

Link to this book on Amazon.