A long, long time ago in an operating system not so far away, UNIX was renowned for its bizarre commands. Oh, how the VAX VMS people would mock the elegantly named “biff”, the UNIX mail notification tool so called because the author’s dog Biff happened to chase the postman. Or “awk”, the powerful text manipulation tool named after its authors Aho, Weinberger and Kernighan. Or “bash”, the Bourne-Again shell, the GNU project’s punningly named replacement for Steve Bourne’s original sh. Not to mention the endless parade of two-letter commands: fc, ls, du, df, wc, cc, ld – I could go on and on.
Of course, I don’t know who the VAXherds were to mock UNIX; as sensible as VMS commands appeared to be, they could all be abbreviated to four letters. This meant I didn’t have to waste valuable keystrokes on the ANALYZE command when typing ANAL served perfectly well. And let us not forget the VAX vacuum cleaner advertisements which proudly proclaimed “nothing sucks like a VAX.” Anyway, this is all moot; where’s VMS now? And do DECUS count their membership on two hands or just one?
Nevertheless, the point is that UNIX – and, by the early ’90s, Torvalds’ spectacular Linux, putting power in the hands of the PC-buying public – was a command-line driven system. It used shells, or CLIs – command-line interpreters. We wizards wove wondrous works, hands flying over the keyboard. We scoffed at the mouse-dependent Windows users, whose exasperatingly slow and painful pace frustrated us. You could eat your own arm off in the time it took a Windows user to click in the username box and type, move to the password box, click and type, move to OK and click once more. “Just use tab and enter,” we would think while smiling pleasantly.
Yet all was not well: the WIMP revolution had begun. The average person on the street wanted to use their computer – not master it, but simply produce the outputs they had in mind. And fair enough, too. I’d certainly be lost if I had to know how my car worked. So, despite the coolness of having a free and powerful operating system, Linux was pegged as “an expert’s system.” The perception arose that it demanded arcane commands typed at a prompt, and that without them it was useless. The initially small driver base wouldn’t have helped matters much, but there’s no doubt whatsoever that the requirement to execute ls -alf and ps -ejf steered away regular punters.
By contrast, Windows 95 hit the stores and people were blown away by its solutions to problems which never existed in Linux anyway – like being freed from the 8.3 file name convention. It was Microsoft’s first GUI operating system (Windows 3.1 being, of course, an application that ran on top of MS-DOS).
Initially, Linux was no threat to Microsoft, and the pundits had been predicting the death of UNIX for decades. Yet, just as the Redmond giant eventually realised it had underestimated the Internet, so too it came to see the emerging threat of Linux.
The spin juggernauts steamed into action. “Linux is not a desktop system,” they said. It wasn’t user friendly. It didn’t have a wealth of productivity applications.
The marketing types patted themselves on the back, but then gasped as it dawned upon them that Linux was a stellar server operating system. It ran headless without a blink. It handled multi-user sessions. Indeed, it had inherited UNIX’s rich heritage as the platform the Internet ran on.
Thus the TCO – total cost of ownership – argument was given life to counter this. “Sure,” the prophets of FUD would say, “Linux may save you $150 over Windows NT/2000, but you will find a gazillion Windows administrators to every one Linux administrator. And all they do is eat pizza all day anyway. And they don’t pick up the crumbs.”
The arguments began flying. Linux would cost more money in the long run, because staff required more training. There were fewer people with the appropriate technical talent. It was easier to find emergency Windows-centric staff than Linux people. The overall message was that Linux was a risk, because Linux was more complex. It didn’t primarily use a GUI. People couldn’t just click “Next” all the way through an installation. (And I have to say, two of my biggest peeves in the Windows world stem from the “Next”/“Next”/“Next” method of program installation. I just can’t stand Windows operating systems left with the wrong regional settings. And I just hate Microsoft Office installations where you have to keep supplying the disc because all the “install on first use” options were left untouched.)
I taught Linux operating system fundamentals, including creating user accounts by understanding the /etc/passwd, /etc/shadow and /etc/group files. Yet a remarkable transformation was under way.
UNIX did have a GUI; MIT’s X Window System had long been around, with countless window managers. Yet it was just eye-candy. Everyone still used xterms to do the real stuff, even reading mail through elm (or pine if you were a newbie). The original Linux ran happily in 2MB of RAM, but if you wanted the optional X11 component then 4MB, even 8MB, might be needed – so it was usually ignored on low-spec hardware.
But then computers became more powerful – the bloat of Windows drove ever more capable PC hardware at ever lower prices – and Linux users no longer had to pay much attention to how much memory their machines had. The punchy pair of GNOME and KDE appeared on the scene and rose to prominence.
I started to receive criticism. “Why are you talking about /etc/passwd when all you need is this widget?” the spotty-faced sprogs of the Windows 95 generation would ask. Before my eyes, I almost became a dinosaur myself – entire suites of simple-to-use wizards, tools, applets, gadgets and widgets became mainstream fare.
With nary a blink, the hacker mystique of Linux was eroded; it turned into a system that was a doddle to use. Fire up a live CD and make sure it works on your hardware. Install it, and follow the bouncing ball through the setup process. Launch your apps or control panels from the on-screen menus and icons. Sure, terminal windows were still there, but you didn’t need them.
This was, of course, a good thing. Any IT professional believes in making things easier for the end user. Any Linux enthusiast embraces moves that foster adoption of their favourite OS.
But then a funny thing happened. The trend reversed. A new generation of Linux users who grew up in the world of Windows discovered the power of the command line. “This is so cool,” they gaped. “You can pipe commands together. And look how quickly my sed script transforms this text.”
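It was easy to see the appeal. A one-liner along these lines (the log file path is purely illustrative) replaces an awful lot of pointing and clicking:

    # Three small tools snapped together with pipes: pull out the error lines,
    # strip the noisy prefixes with sed, then tally the distinct messages.
    grep -i 'error' /var/log/syslog | sed 's/^.*kernel: //' | sort | uniq -c

Each program does one small job and the pipe hands its output to the next – the same philosophy that spawned all those terse two-letter commands in the first place.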
In a remarkable turnaround, a demand arose for meaty technical content. Sure, anyone could make a new login via a system tool that prompted for all the details. But what was really going on under the hood? My recent article “UID and GID: the basics of Linux user admin” met with much positive feedback and, as the web site hits show, at least one university has linked to it from a computer science course page. Suddenly, the new breed of Linuxphile was hungry to know how the system worked and enthused by the unbridled control the command line gave them.
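To give a flavour of what “under the hood” means here, the account a GUI wizard creates ultimately boils down to a few colon-separated entries and a couple of commands. A rough sketch – the user fred, UID 1001 and group 100 are purely illustrative:

    # /etc/passwd holds one line per account, seven colon-separated fields:
    # name:password-placeholder:UID:GID:comment:home-directory:shell
    fred:x:1001:100:Fred Bloggs:/home/fred:/bin/bash

    # The real password hash lives in /etc/shadow, while /etc/group maps
    # group names to GIDs and their members:
    users:x:100:fred

    # A point-and-click user manager does little more than this:
    useradd -u 1001 -g users -c "Fred Bloggs" -m -s /bin/bash fred
    passwd fred

Once you can read those files, the widget stops being magic – which is exactly the kind of understanding the command-line crowd was rediscovering.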
But here’s something even more unexpected: Microsoft, making a mockery of their original TCO argument, have now returned to the command line.
Microsoft dallied in the past with MS-DOS and later hoped VBScript (VBS) would catch on. Well, it certainly did – with virus writers and script kiddies, leaving VBS almost universally blocked by mail clients and systems admins worldwide. Now they’re back on the command-line and scripting bandwagon with PowerShell.
PowerShell is touted as a really expressive, really formidable command-line shell environment for the power user and admin alike. It’s downloadable for Windows Server 2003 but really saw the light of day in Exchange 2007. Anyone who’s used the latest incarnation of Exchange will know that PowerShell is the real control mechanism for Exchange; the Exchange console is just a facade. It doesn’t offer the full functionality, and what it does do translates into PowerShell commands. Microsoft aren’t hiding it: add a user and it tells you, in effect, “OK, here’s the PowerShell command I’m going to execute to do it.”
PowerShell is going to come thicker and faster. SQL Server 2008 and Windows Server 2008 are both due out soon, and both will be PowerShell-powered. Rumour even has it that Windows Server will fire up PowerShell and then prompt you to drive the installation from the command line.
Given how few people in the world can claim solid PowerShell experience, the TCO argument now swings strongly in favour of Linux. But that aside, ask any Microsoftie what the inspiration for PowerShell was: it’s a blatant bash derivative, and even the help screens are obviously modelled on Linux man pages. They’ll tell you straight up.
I bet those “I hate keyboards” Windows admins wish they’d spent more time with Linux now; the Microsoft roadmap involves a steep learning curve for anyone who can’t punch their way out of a command line. And even in the Linux camp, the power of the CLI is coming back into vogue.
Such is the way of the CLI/GUI circle of life. Thus endeth the lesson.