So it’s finally happened. I bought a Mac. I’ve fallen to the dark side, bribed by the shiny glass and aluminum (and the semi-serious promise of cookies). However, now that I’m past the shock and surprise, I’m left with a new set of thoughts on computing and where it’s headed. I’m not sure I fully subscribe to Apple’s line of thinking on the subject, but I must say, even if they’re ultimately really bad for me, these cookies are awfully tasty.
First things first, let’s think about the overall experience. It’s very obvious that unlike many other companies, Apple doesn’t think of consumers in aggregated demographics–they think of users as individual people. While this type of thinking spawns a narrow view of computing and what it means to be a user (we’ll get into that in a minute), on the flip side it means that Apple has taken care to provide for the user’s experience from start to finish. While the average Windows PC feels like a loose amalgamation of various services, my MacBook Air has felt like one continuous experience from first boot-up to typing this post. I’ve been known to mock the catchphrase from time to time, but it really does just work. Within an hour, I’d learned 80% of the ins and outs I’d need for a good experience, and I genuinely enjoy my computer.
Let’s talk for a minute about the more controversial features of OS X Lion–namely, full-screen applications. I’m still not sure they make sense for the desktop environment–it seems a little silly to have a single application take up an entire 27" monitor. However, on this tiny screen, Apple has it pretty nicely figured out. My 11" Air really doesn’t lend itself to using multiple applications simultaneously, so it’s nice to have multiple application screens that I can swipe back and forth between. When it comes to the ultraportable form factor, this really does beat anything that current Windows-based computers can offer. I should add that I’m looking forward to this changing in the future, when Metro applications in Windows 8 gain traction.
Second, let’s talk about “natural scrolling” and auto-correct. Both are pretty subtle features, and both appear on any competent smartphone. While I initially found the scrolling silly, I eventually got used to it. I don’t think it entirely makes sense in an environment where I’m not directly touching the screen, but I adapted quickly. In all honesty, if scrolling had worked this way from the start, I don’t think anyone would have thought twice about it. As for the auto-correct, it’s very subtle, catching only simple typos. Even so, I’m not sure I really like it–it feels like the computer is attempting to express my thoughts for me. I’ve left it on, but the minute it gets in my way, I’m turning it off.
The discomfort I feel with this beautiful yet cloistered environment revolves around two themes. The first is control. One of the biggest things I’ve noticed in my jump to the new platform is that I have much less control over the system. By default, I don’t have root access. The Finder does its level best to obfuscate where my files actually are. Some of the system’s automated behaviors (system standby when I close the lid, auto-brightness, and so on) are either counter-intuitive or impossible to change. This computer really gives off the vibe that it’d be best if I didn’t mess with it and just used it as directed.
And at the end of the day, what’s wrong with that? The end user never really has full control over an operating system, nor should they. Should I really be worried about root access if the OS gives me enough permissions by default? Is it a big deal if the Finder won’t show me the contents of my drive the way I’m used to seeing them? How important are the use cases where I’d want to alter my computer’s behavior? I can’t really come up with good answers for any of these questions. The simple fact is that these were things I was used to in my Windows 7 environment, and it feels like a step backwards not to have them. I’m sure some DOS user with a fine-tuned environment wasn’t so chuffed when Windows 95 hit the scene, either. This is a secondary concern, honestly–it’s a natural consequence of the evolution of computing.
The second issue is a little more serious–competition. While the Mac experience is actually really nice, it’s not necessarily compatible with anything else. I can really only get the full effect of this computer if I use OS X and iOS devices exclusively. In that case, everything about my computing environment, from the hardware to the operating system to the applications I run, exists at Apple’s pleasure. Further, once I settle into this environment, I can’t really leave–many of Apple’s services are proprietary and incompatible with their counterparts on other platforms (iWork, iTunes (AAC), AirPlay, Time Machine backups). Again, this is for a fairly good reason–the proprietary protocols work well together. However, it means that as Apple gains more and more market share, it and it alone will be deciding the pace of innovation going forward. While I’m happy with the Apple of the present, if Apple ever goes the way of Microsoft, it could be disastrous for computing.
Fundamentally, my objection lies in the collision of two different desires–the desire for a good computing experience, and the desire for better products through competition. While Apple’s approach supports the former, it flies headlong into the latter. I can’t argue with what I’ve got here in front of me, but I worry that betting on Apple to keep up this pace of innovation indefinitely isn’t necessarily a good thing.