So, I have had the opportunity to compare, on the same computer, Windows 7 and Mac OS X ‘Snow Leopard’. Both operating systems are fully patched and current. While my personal preference is Windows 7, I tried to be fair and open-minded about OS X. I REALLY wanted to like it. The jury is still out on liking, though. I have problems with it, namely performance. After running just a few days, the operating system became slow and unresponsive. To be fair, it could have been either iPhoto or iMovie having a memory leak. I don’t know enough about either to make that kind of judgment. All I know is that I had to restart the computer to get it to respond normally. My biggest complaint, however, applies to Windows 7 as well, and it got me to thinking (that, and an email from a friend; thanks, Sam).
What got me thinking, and is shared by both operating systems? Intuitiveness. Yes, the one thing the Mac OS is supposed to be known for and where Windows 7 made a big push. Unfortunately, both fall flat. As for Linux, I’m not even going to discuss it; it is far from mainstream, and, yes, I know all about Android. I don’t care. Ubuntu, et al., are simply NOT for Joe or Jane User. Period.
Let’s start the discussion with Windows. Windows 7, in many ways, was a tremendous leap for Windows. It took the best from Vista and other versions of Windows and rolled them into an easier-to-use and nicer-looking version of Windows. Aero Snap is a great addition and makes for easy side-by-side or top-and-bottom comparisons. Live Taskbar previews are awesome. The ability to hover over an object in the taskbar and SEE what it is doing is a huge benefit. Alas, Microsoft giveth and Microsoft taketh away. Perhaps Windows’ biggest and most intuitive feature was removed and replaced by an orb. Yep. I am talking about the START button. A brand new user, someone who has NEVER used a PC before, at least knew what to click (provided they KNEW how to click, that is) to begin using the computer. It was simple: you click START, then you see a menu and, amazingly, there is a “PROGRAMS” menu option. How’s that for intuitive? Yes, all of that is still there, differently, but there. Except, that is, for the START button. Microsoft replaced it with a Windows logo orb. So, now when that shiny new computer starts up, the new user is presented with a plain Windows desktop, a nearly empty taskbar, and that silly ORB. What to do? Well, most will certainly move the mouse around and click on things, so they will happen upon the orb unless, of course, they know someone who has used it before and can guide them.
Once past the orb ordeal, the user will, eventually, start an application. They are likely to be presented with a window, a menu, and the rest of the application. Most windows will have the minimize, maximize, and close buttons. Those will be foreign to the new user. The idea of maximizing a window will be unknown. I can’t tell you how many times I have seen someone using Word in a tiny little window and heard them complain about it. Why? They didn’t know that they could make the window bigger. The mouse cursor changing shape lets those of us who do this every day know that, when it changes, we can generally do something. The uninitiated may not even notice the cursor changed, let alone know they can do something more.
Mac OS X fails on all of these points too; in fact, with OS X it is even worse. When it starts, you see a rather full dock. Again, the user will likely just start clicking things and will, inevitably, start an application. Probably not what they wanted, but they will do something. So, how do they actually start something that is not on the dock? Where’s the START button or the ORB? Nowhere. They do not exist. No, you click on a FOLDER called APPLICATIONS. Then you see a nasty big box pop up with more pictures. The user can probably figure out, from the text, which one they wish to start, but it is not that intuitive. And the bouncing icons are likely to amuse, but probably won’t let the user know that the icon is bouncing because the application wants their attention. Ditto in Windows 7: just because a taskbar icon is lit up does not mean the user will either notice or care.
There are a lot of cues in both operating systems that, to the uninitiated, mean nothing, leaving the user clueless. Both operating systems are also inconsistent in their presentation and in how they capture input.
With a quick tutorial, users can get around both operating systems with a certain comfort level, and relatively quickly as well. However, the point is that they should not have to. I think this is one reason why devices like the iPhone and iPad have become such a success. You can pick up an iPad and start using it right away. Oh, sure, there are some things about the iPad that are not crystal clear, but Apple did a really nice job with ease of use in iOS. Just touch something. That’s it. Simple and elegant. Unfortunately, neither Windows nor Mac OS X is built to be used that way. And since devices such as the iPad and iPhone are now mainstream, I am not sure we need our desktop computers to be dumbed down to the same level. A certain amount of complexity is not necessarily a bad thing.
BUT…maybe the desktops should have a ‘novice’ mode where you do away with menus and such things. Microsoft is on the right track with the Ribbon UI, but even that is a bit too complex. They have a good feel for the most common functions. Put those up front and hide everything else. The UI should also have the intelligence to figure out that the user is getting smarter and adjust accordingly. I don’t think this would be too difficult to implement. Allow the user to choose, when they first set up the computer, a novice, normal, or power-user mode and set the UI accordingly. And hide the lingo. Terms like ‘file system’, ‘bytes’, and ‘hard disk’ should not be required for the user to manage the computer. They don’t care about them. All they want to know is how to continue working on that letter or change that spreadsheet. Apple, again, is on the right track with the iPad. The iPad does not have a user-discoverable file system. That bothered the hell out of me at first. I don’t care about it now. My apps just know where my stuff lives, and that is all I care about. Oh, once in a while I wish I could override that, but, overall, I just don’t care. It works. Period.
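To make the idea concrete, here is a minimal sketch, in Python, of how such an experience-level UI might work. Everything in it is hypothetical: the `ExperienceLevel`, `MenuItem`, `visible_menu`, and `maybe_promote` names are my own invention for illustration, not an API in Windows or OS X.

```python
# Hypothetical sketch of a 'novice mode' UI: each menu item carries the
# lowest experience level that should see it, and the UI 'notices the
# user getting smarter' by counting how often advanced items get used.
from dataclasses import dataclass
from enum import IntEnum


class ExperienceLevel(IntEnum):
    NOVICE = 1   # only the most common, plain-language actions
    NORMAL = 2   # the usual defaults
    POWER = 3    # everything, lingo included


@dataclass
class MenuItem:
    label: str                   # plain-language label shown to the user
    min_level: ExperienceLevel   # lowest level that should see this item
    uses: int = 0                # how often the user has picked it


MENU = [
    MenuItem("Open my letter", ExperienceLevel.NOVICE),
    MenuItem("Print", ExperienceLevel.NOVICE),
    MenuItem("Page setup", ExperienceLevel.NORMAL),
    MenuItem("Manage file system", ExperienceLevel.POWER),
]


def visible_menu(level: ExperienceLevel) -> list[str]:
    """Hide every item above the user's current experience level."""
    return [item.label for item in MENU if item.min_level <= level]


def maybe_promote(level: ExperienceLevel, items, threshold: int = 10):
    """Crude adaptive rule: once the user has picked advanced items
    often enough, bump them up one experience level."""
    advanced_uses = sum(
        i.uses for i in items if i.min_level > ExperienceLevel.NOVICE
    )
    if advanced_uses >= threshold and level < ExperienceLevel.POWER:
        return ExperienceLevel(level + 1)
    return level
```

A novice would see only “Open my letter” and “Print”; a power user would see everything, jargon and all. A real implementation would obviously need far more nuance, but the core mechanic really is this simple.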
Apple’s iPad is pretty easy to master, but it is not perfect. All one has to do is open the Settings app. This thing is a monster. It is confusing and a royal mess. Setting up the WiFi is also a pain in the rear if your WiFi network has ANY kind of security. In fact, WiFi on ANYTHING is a pain in the rear. Having to deal with all of those settings is enough to turn most people away. I fully understand why there are so many private yet open WiFi networks: it is dead simple to set them up. Add any kind of security, and you suddenly need a Ph.D. in networking. Ridiculous.
I don’t have the answers to these problems, but my experience with the Mac Mini did make me think about this stuff. Listening to and answering the questions of friends and family also made me keenly aware that this stuff really is still pretty complicated. It also points out that the best computer in most homes isn’t a traditional computer at all. No, I’m talking about the video game console. Yes, they are getting more complicated (the PS3 is terrible: the menu system is just dreadful), but just to play a game? Brain-dead simple. Controlling the games, while more complicated than it used to be, is still pretty easy. Us old farts may take issue with the number of buttons on most controllers, but almost anyone under 40 can pick up a controller and start playing. For the rest of us, well, that’s why Nintendo brought out the Wii: perhaps the easiest console to play. Ever. It has the simplest of controls, a simple menu, and is very easy to play. That is the heart of its appeal. (It’s also why many ‘hardcore’ game fans hate it.) Computers would be much more accessible if they followed the Wii model. But Apple’s iPad is a great start.