Making the right choice for your large-scale corporate needs is, to say the least, very important. And it's tougher than it used to be. For years, Unix servers from IBM, Sun, and HP have been the logical and popular choice for large applications. The power of a Unix server, with up to 64 processors, together with the scalability and stability of a Unix operating system, has created a whole generation of IT professionals who would rather fight than switch. But, my, my – how things can change...
And then there is the issue of power. It used to be that Unix offered 64-bit chips and Intel only 32 bits. That made a monster difference in memory addressability for managing very large databases. Last year, however, Intel finally got its Itanium 64-bit chip to market. And with the upcoming release of its successor chip, McKinley, Intel is expected to deliver a formidable, although relatively untested in the real world, 64-bit chip.
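That addressability gap is easy to quantify: a flat 32-bit address space tops out at 4 GiB, while a full 64-bit space reaches 16 EiB. A minimal sketch of the arithmetic (plain math, no vendor-specific chip details assumed):

```python
# Maximum byte-addressable memory for a flat address space of the given width.
def addressable_bytes(bits: int) -> int:
    """2**bits distinct addresses, one byte each."""
    return 2 ** bits

GIB = 2 ** 30  # gibibyte
EIB = 2 ** 60  # exbibyte

print(addressable_bytes(32) // GIB)  # 32-bit ceiling in GiB -> 4
print(addressable_bytes(64) // EIB)  # 64-bit ceiling in EiB -> 16
```

For a database that needs to keep tens of gigabytes hot in memory, the 4 GiB ceiling is the whole story: the 32-bit chip simply cannot address it.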
Microsoft has apparently done its part to improve the stability of the Windows operating system with NT. Waning are the concerns you used to have about how far you could scale it reliably on Intel-based servers. Alas, you will still have to deal with software upgrades to Microsoft products and perhaps endure rebooting with seeming regularity. Of course, you could decide to run Unix on your Intel-based servers – although its open-source cousin, Linux, has been stealing the show, primarily because you can download it free, Red Hat has done a decent job with support services, and you can tweak the code to better suit your requirements.
If you find yourself wrestling with what’s best for you, you may yearn for the simplicity of days gone by – when Unix held the keys to the castle for very large-scale server needs. But you can bet that the flexibility offered by today’s products is worth the additional time and effort you’ll spend making your decision.