I've been following with great interest a series of articles centred around what has become one of the most ubiquitous features of modern OS design: the shutdown dialog.
Joel Spolsky started it all off with his review of Windows Vista's take on the feature, which looks like this:
He argues that every single one of Vista's shutdown options is superfluous and recommends a single log-off button, named "b'bye".
When you click b'bye, the screen is locked and any RAM that hasn't already been copied out to flash is written. You can log back on, or anyone else can log on and get their own session, or you can unplug the whole computer.
Inevitably, you are going to think of a long list of intelligent, defensible reasons why each of these options is absolutely, positively essential. Don't bother. I know. Each additional choice makes complete sense until you find yourself explaining to your uncle that he has to choose between 15 different ways to turn off a laptop.
Granted, Microsoft's new OS does take the idea of 'user choice' a little too far with "Switch User", "Log Off", "Lock", "Restart", "Sleep", "Hibernate" and "Shut Down" all listed in a bizarrely ordered menu, accessed from a little button next to other buttons which repeat the "Shut Down" and "Lock" options.
That said, I'm certain that this provision of user choice is more a result of bad design than of any particular desire to give users control over their machines. The fact is, I can think of a defensible reason for each of these options, and it boils down to this: it's my computer and I'll do what I want with it.
The Apple approach
Arno came along to defend the concept of fewer options, citing his work at Apple. Apparently, Mac OS X only has "Restart", "Sleep" and "Shut Down" options. But Arno wasn't even happy with this.
How often do you need to manually set your computer to Sleep? I just close the lid of my MacBook and it goes to sleep: a simple mechanical, physical interaction: no need for a software command.
I'd always been somewhat wary of software engineers who 'decided' that a feature wasn't necessary because you could bend down and press a physical button instead. Sure, in some situations there may well already be a mechanical way to perform a task, but, especially in the world of remote access, that cannot always be guaranteed. I believe it's a battle between optimistic practicality and what is 'right', and what is 'right' is that users should have full control of an OS they are running.
I was, however, dismayed to learn that Arno doesn't agree with this, either.
Some will argue in fact that I sometimes go too far in my quest to simplify. I frequently argue that it is the job of the software designer to make choices on behalf of the user. That's what designing is all about.
I hate when software designers make choices on behalf of the user, unless they provide a way for the user to overrule that choice. It's what Microsoft does and, as I decided several months ago, it's infuriating.
Speaking of Microsoft, Moishe Lettvin decided to jump on the bandwagon with a reply piece explaining to Joel how the Vista shutdown feature was designed.
It's chaotic, to say the least: a tangle of concurrent development across various badly-named product branches. Microsoft is falling victim to Gates's and Ballmer's dream of complete software unity.
Allow me to quote Moishe:
In small programming projects, there's a central repository of code. Builds are produced, generally daily, from this central repository. Programmers add their changes to this central repository as they go, so the daily build is a pretty good snapshot of the current state of the product.
In Windows, this model breaks down simply because there are far too many developers to access one central repository — among other problems, the infrastructure just won't support it. So Windows has a tree of repositories: developers check in to the nodes, and periodically the changes in the nodes are integrated up one level in the hierarchy. At a different periodicity, changes are integrated down the tree from the root to the nodes. In Windows, the node I was working on was 4 levels removed from the root. The periodicity of integration decayed exponentially and unpredictably as you approached the root so it ended up that it took between 1 and 3 months for my code to get to the root node, and some multiple of that for it to reach the other nodes. It should be noted too that the only common ancestor that my team, the shell team, and the kernel team shared was the root.
So in addition to the above problems with decision-making, each team had no idea what the others were actually doing until it had been done for weeks.
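Moishe's description of integration latency can be sketched as a toy model. The periods below are invented for illustration; Moishe only says that the integration period grew unpredictably toward the root and that his node sat 4 levels below it, so the numbers are chosen merely to be consistent with his "1 to 3 months" figure.

```python
# Toy model of the tree-of-repositories integration latency Moishe describes.
# Hypothetical integration periods (in days) at each level, doubling toward
# the root: index 0 is the leaf node's level, index 3 feeds the root.
periods_up = [4, 8, 16, 32]

def days_to_root(periods):
    """Worst case: a change just misses each upward integration window."""
    return sum(periods)

def days_to_sibling_leaf(periods):
    """A change must reach the root, then flow back down the other branch.

    Assumes (purely for illustration) that downward integration runs on
    the same periods as upward integration.
    """
    return days_to_root(periods) + sum(periods)

print(f"Worst-case days for a change to reach the root: {days_to_root(periods_up)}")
print(f"Worst-case days to reach a sibling leaf: {days_to_sibling_leaf(periods_up)}")
```

With these invented periods, a change takes up to 60 days (about two months) to reach the root, squarely inside Moishe's 1–3 month range, and roughly double that to become visible to a team on another branch, which is why the shell and kernel teams could work blind to each other for so long.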
The end result of all this is what finally shipped: the lowest common denominator, the simplest and least controversial option.
This is backed up by the evidence given in Vista's Wikipedia article as to just how many 'cool' new features were eventually scrubbed, leaving Vista as an operating system upgrade consisting of a new kernel, a new graphical theme and vastly inflated system requirements.
As Joel says:
Every piece of evidence I've heard from developers inside Microsoft supports my theory that the company has become completely tangled up in bureaucracy, layers of management, meetings ad infinitum, and overstaffing. The only way Microsoft has managed to hire so many people has been by lowering their hiring standards significantly. In the early nineties Microsoft looked at IBM, especially the bloated OS/2 team, as a case study of what not to do; somehow in the fifteen year period from 1991 – 2006 they became the bloated monster that takes five years to ship an incoherent upgrade to their flagship product.
So I learnt a fair bit about two organisations, albeit nothing I didn't already suspect.
Microsoft is structured abysmally and is on the verge of some serious managerial difficulty (as evidenced by the nightmare that was Vista's development), and Apple employees are still arrogant as hell:
After all, Arno said:
This also goes to show that the design process at Apple is not exactly perfect either 🙂
He was overruled on some of his opinions, and therefore the design process is further from perfect? Sounds to me like a case of someone with strong ideas who considers his way 'the' way.
For what it's worth, it also sounds like the dev process at Apple is fairly well balanced: at least it successfully weeded out some of Arno's more heavy-handed proposals for removing user control.