I have long been an advocate of not immediately upgrading software. Most people upgrade simply because they're expected to or because they assume that newer stuff must mean better stuff -- it doesn't. This "newer is better" mentality is the driving force behind our economy, and advertisers are squeezing every bit of attention from the younger generations, molding them into the perfect consumer. Many times consumers are presented with no choice other than to accept what is being sold. For example, if they purchase a new computer it's going to come pre-installed with the "latest and greatest" -- that is, the latest and greatest piece of junk.
Software is a tool, designed to help you get something done. If the software already does its job as expected, why change it? There are clear differences between changes that affect usability and those meant to increase stability (the stone wheel, when first invented, worked as a wheel, but later improvements increased its practicality and usefulness). The problem is that software companies need to make money -- if they created perfect software that got the job done and never needed to be upgraded, where would their profits come from? The driving force known as greed creates competition between software companies, who then rush to get their latest and greatest software out before their competitors do. The end result? Buggy, rushed software that adds as little "help-you-get-stuff-done" functionality as possible.
I mean really, how necessary are the added features of Windows Vista and Apple OS X Leopard? When Windows 2000 was upgraded to Windows XP, I recognized lots of extra unnecessary junk in XP. Windows 2000 ran smoother and faster than XP (and still does!) on all the machines I installed them on. However, there were certain features of Windows XP that became a requirement for business use, namely Remote Desktop. So I've accepted that Windows XP is probably the best option for those business users who need a Windows machine. But Windows Vista? I have yet to find one thing about Vista that makes it a better option than Windows XP.
People have come to accept that their computers will become obsolete within a few years -- but why? The only reason a system becomes obsolete is that the software you wish to run on it needs better hardware. If properly taken care of, the hardware will last a very long time. But perfectly working hardware is useless if the software you need runs slowly, if it runs at all. The solution? Write better software that requires less processing power. But wait, that's not advantageous to software and hardware vendors. The more they can sell you, the better. And after all, "newer is better", right?
I use Apple OS X Tiger on my MacBook Pro, and even though Leopard has been released, I don't plan to upgrade. The only thing that would force me to upgrade is some piece of software requiring the newer operating system. But that's a whole other issue -- software vendors updating their software to work with newer operating systems and dropping support for the older ones. I'm discovering that a great portion of the Mac user base tends to jump on the newest OS X release, which causes developers to drop support for the previous version. Ugh!
When I use vi, cat, grep, ssh, or any other Unix command, I don't wonder if they're going to be compatible with my system. I don't fear that running one of those commands will crash my entire operating system. Why? Because they're small, proven pieces of software that do their job and do it exceptionally well. That's the way software should be -- it should just work.
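That "do one job well" quality is also what makes these tools composable. As a quick illustration (the log file and its contents here are made up for the example), small commands chained with pipes can answer a real question without any monolithic application:

```shell
# Build a tiny sample log (hypothetical data, purely for illustration).
printf 'ERROR disk full\nINFO boot ok\nERROR net down\nINFO login\n' > /tmp/demo.log

# Each tool does exactly one job: grep filters the ERROR lines,
# cut pulls out the second field, sort orders the values, and
# uniq -c counts how often each one appears.
grep '^ERROR' /tmp/demo.log | cut -d' ' -f2 | sort | uniq -c
```

Any one of those commands could be swapped out or extended without touching the others -- the opposite of the all-in-one upgrade treadmill.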
The comments in a recent Slashdot article prompted me to write this post. Read the comments and you'll see how many people are sick of software vendors releasing crappy software.