Continuing a discussion of desktop UI from the previous post. To be clear, most of what follows applies equally to Windows and the Linux desktops; my point is that Apple popularized these ideas and the others—misguidedly—still follow Apple’s lead.
Compared to Microsoft, I don’t especially begrudge Macs their existence, and I do recommend them over Windows to naive users who don’t have anyone to set up and maintain their boxes for them. Still, it annoys me how many people have drunk the Apple Kool-Aid and believe that OS X has anything on the basic Windows experience beyond gloss (and an ‘it just works’ quality earned only by a tightly controlled hardware and software landscape); just about everything in the OS X interface beyond a few features is just arbitrarily different from other desktops. Macs wouldn’t annoy me so much except that their influence perpetuates, through fashion, stupid graphical interface design ideas which both Microsoft and the Unix desktops have slavishly followed. Contrary to popular opinion, Macs are not the be-all/end-all of usability; in fact, Macs have long perpetuated some erroneous thinking about usability. There are several things seriously wrong with the desktop/windows metaphor, and Apple is responsible for most of them.
1) the metaphor of files and directories as physical objects
It’s not 1985 anymore.
Compared to the detailed list view of files, the icon view is a paragon of form over function. Not only should icon view not be the default folder view; there shouldn’t be an icon view at all. It’s flat-out stupid. The browsability of a list of words is far superior to that of a grid of words (let alone stupid non-grid piles of icons). Yes, some users have trouble aiming the cursor, so give them an option to make the lines a bit fatter. Problem solved.
Seriously, is there any point to files/folders as icons other than gee-whiz factor?
(Thumbnail view is fine, but it’s really a totally different thing.)
2) icons

Most things just can’t be conveyed pictorially (see earlier post, Why Icons Suck), especially with pictographs 1cm in diameter. To their credit, Apple has finally realized this and focused on making bigger icons. Still, Apple started this thinking, and many Apple apps still perpetuate an illiterate, push-button design that values gloss over function.
3) allowing the desktop background to be used as a place to put icons and files rather than just a place to put wallpaper
- First, this creates clutter because it encourages people to lazily dump their files there (I’m guilty of this myself).
- Second, it confuses learners. I remember trying to explain Windows 95 to someone: ‘So the stuff on my desktop is in the C:\Program Files\Desktop\ directory on my computer…but isn’t My Computer on the desktop?’ Apple gets around this by keeping most users out of the real file system hierarchy altogether, but the desktop-as-folder still produces virtual-space disorientation: files on your desktop are also in your Finder. So much for the metaphor of physicality.
- Third, the existence of the desktop as a working surface necessitates a way to get at it quickly, like shortcut keys and OS X Exposé. This added complexity wouldn’t be necessary if you got rid of the damn thing in the first place.
- Fourth, the desktop serves cross-purposes: Apple and Microsoft encourage users to put application shortcuts as well as files on the desktop, so users face the silly choice of whether to put application shortcuts on their desktop and/or in the start-menu/dock, and then later they have to remember where they put them, or at least make an arbitrary choice of which to use.
4) free-floating windows & 5) drag-n-drop:
Free-floating windows work against two basic facts:
- In my experience, just about all real work is done with full-screen applications, but Apple seems to have something against letting you maximize windows. Just look at the screenshots here: http://www.apple.com/macosx/features/expose/ . Apple clearly thinks it’s neat to have non-maximized windows scattered artistically around the desktop, wasting screen real estate, because it looks neat. [update: duh, bad example; that of course is the Exposé feature in the shot; see the first shot here for a better example]
- In the cases where I do want windows side-by-side, the scheme still takes a lot of work to get them that way, especially if I want to use my screen space well. And even once I have, say, two windows nicely side-by-side, switching between those windows and a third window is bothersome because I then have to make two dock/taskbar selections to switch back to the two windows.
Nearly as bad, Apple’s single-menubar design violates the classic rule against stateful interfaces: if users momentarily forget which window is active, they’ll be stymied when they go up to the menubar. (The way to mitigate this is to make the active window stand out glaringly, so that it’s hard not to notice which window is active, but the standard OS X themes don’t make the active window stand out well at all.)
What offends me most about the whole free-floating scheme is that it violates a principle: our interfaces should make the small annoying decisions for us. Allowing windows to float around overlapping each other violates that rule because it requires the user to make annoying, arbitrary choices about how to get at the windows they want: shall I move or minimize windows to get at the windows underneath? Shall I alt-tab directly to the window I want? Shall I use Exposé? Shall I use the taskbar/dock?
The real promise of putting applications in free-floating windows—the whole point of the idea—is that users will be able to put applications side-by-side. This goes hand in hand with drag-and-drop because, for drag-and-drop to be efficient, the source window and destination window should both be in view. In practice, this is usually not the case because arranging windows side-by-side takes too much work to be worth the bother.
6) window activeness
I want to click on something in a window. Do I need to click on the window first, then click the thing, or can I just click the thing? I have to think about which window is active to know.
7) sub-windows and pop-up windows
It’s bad enough that applications are free-floating amongst each other, but then sub-windows of an application are allowed to free-float independently of each other, such that the windows of an application can be confusingly split off from each other by the windows of other apps. Naive users are confused by this arrangement, and as a knowledgeable user, I’m simply bugged that I’m being presented with such a useless possibility.
Then we have further free-floating window pollution in the form of pop-ups and dialogues. Many people have pointed out how annoying it is to have your focus stolen by these mini-windows or to lose one behind other windows. Less commonly cited, though equally detrimental, is how their lack of attachment to a fixed space in the application degrades the user’s mental layout of the application. It’s always best for things to live in their unique place and stay there rather than to exist outside of space-time, as pop-up messages and dialogues seem to do.
8) redundancy between the menubar and toolbars
In most apps, the menubar is considered too cluttered and inaccessible by its designers, so shortcut icons for a selection of menu items are placed below it, e.g. the back button is a menu item in a web browser but also an icon below the menubar.
In simple applications, like web browsers, this redundancy is not so bad, but as the application gets more complex and the number of convenience icons grows (think Word or Photoshop), the redundancy becomes a nuisance to newbie and experienced users alike. For newbies, the preponderance of overlapping choices is confusing; for experienced users, having to make the arbitrary choice of whether to look in the menubar or the mess of icons becomes mentally taxing.
The basic flaw with Apple design philosophy is that it chases the chimerical ideal that users shouldn’t need to know anything about how something works before using the system, i.e. ‘users should only have to know what they already know’. This made some sense back in the ’80s when computers were still totally mysterious and had limited functionality, but computers are such a presence in most people’s lives today that it’s absurd to think users should never learn the most basic concepts of their use.
Imagine a hypothetical Apple iCar: users step into an enclosed space with no windows and press a button that takes them where they want to go; iCar users have no conception how the iCar does what it does—for all they know, it teleports rather than traverses through space. The iCar may sound fine and dandy, but its users are totally at a loss when the system fails to work perfectly, as real-world systems inevitably do.
So ironically, Apple and Microsoft interfaces don’t live up to the ideal of discoverability because their solutions encourage naive users to remain naive and think of every interaction with their computer as ad hoc, i.e. perform such-and-such ritual to do x. When the rituals change even slightly, users are left totally helpless.
Even experienced users can run into this helplessness. For instance, I tried helping my sister use her Olympus digital camera with her Mac mini, but when I plugged it in, it wasn’t detected—no error message, friendly or otherwise, no anything; Google searches turned up absolutely nothing. Apparently, Macs ‘just work’ except when they don’t, in which case there is no appeal, even for experienced users.
When I say users should learn basic computer concepts, I’m talking really basic, such as a working conception of file system hierarchies and what a program is. Apple and Microsoft have tried hard to obfuscate these basic ideas. Apple thinks they created a desktop metaphor in which people don’t have to learn such basics, but they’re only half right because their metaphors don’t scale, e.g. the ‘files as objects’ metaphor works fine when you only have a few files, but more and more non-technical users are using more and more files, and they’re making a mess of their drives as a consequence. Similarly, the more uses we have for our computers, the more it matters that users rely upon a conceptual understanding rather than a memorized set of particulars. An OS interface, then, should be judged on both the learnability and the generality of its concepts, not the gloss with which today’s finite set of computing tasks has been presented.
A better desktop (and better desktop apps) would embrace a few design principles:
- Software shouldn’t be totally transparent to naive users the way Apple thinks; rather, the mechanisms of discovery should simply be well-established and omnipresent so that, once learners have been educated about them, they can apply them consistently to the unfamiliar programs they encounter, exploring as needed.
- Interface redundancy is a nuisance and makes software look more complicated than it is, thus discouraging users. A good interface doesn’t force users to constantly make small, random decisions the way that giving users 5 ways to delete/move/copy files and 5 ways to close programs does. (Redundancy between mouse and keyboard is a special case.) A good desktop would adopt a Pythonic, one-way-to-do-it philosophy.
- When thinking about use cases, be conscious of keyboard and mouse use. In general, there should be a keyboard-only way of doing most tasks, a mouse-only way, and perhaps a mouse-keyboard combo. In particular, strive for one-handed usability where possible [no jokes here!]. However, the best designs often take a strong stance on whether a particular task is best accomplished with the keyboard and/or the mouse.
- Contrary to Apple’s opinion, features should not be cut just because they’re not instantly graspable by novice users. The real solution is to put advanced features out of sight while still making them discoverable.
- The greatest virtue of GUIs is that they generally have great discoverability, while CLIs have virtually none. Still, there is a lot to be said for the syntactic power of the command line. I don’t foresee most users doing any scripting, but I think there is a wider userbase for scripting out there if it can be made simpler than AppleScript (Pygeon may be a candidate here).
My next post will present ideas on how to actually achieve these goals (for real this time!).