I couldn’t resist pointing out this article about dropping Linux in favor of Windows. In the mid-2000s I made a concerted effort to learn and use Linux. I even went as far as running it on my laptop and taking that laptop to some Microsoft executive presentations (done with OpenOffice, I might add) to make a point. The laptop was one that the OEM (Dell) had refused to update to support Windows XP, but Linux ran just fine on it. However, my experience was much as David Gewirtz reports is still true today for Linux servers: you get a Linux distribution, you craft the system you want from the pieces-parts, and then you either never change a thing or you put in an enormous effort to make any change (even a security update) while keeping it running.

My conclusion on the client side was that Linux was not something I’d ever give my mother, cousins, etc. to run. And while as a “hacker” I loved the idea that I could customize the server to my heart’s content, it always seemed to me that the total cost of ownership was much higher than Windows’. After all, it was many years ago that hardware and software became cheap (dirt cheap, actually) while labor costs skyrocketed. So a solution that lowers software costs further, at greatly increased labor costs, doesn’t make economic sense.
The one place where Linux makes enormous sense to me is in embedded systems. There you can apply expertise up front to customize the system so your product runs exactly as you desire, and then pump out copies in volume. In other words, it’s leveraged just like any high-volume software business.
So here we are, 6-7 years later, and I would have expected Linux to have matured into something much less labor-intensive for an IT shop to run on servers. Based on Gewirtz’s experience, if anything it’s gotten worse.