chipotle: (Default)
[personal profile] chipotle

Against my better judgement, I’m going to write about the iPad. It’s been long enough that everyone’s already formed an opinion, I suspect; I’m going to start off by throwing a bit of cold water on some of the opinions I’ve been seeing.

The iPad is software upgradeable. Duh, you say? Well, given how much of the sneering going on is about things that are software-based, apparently this needs to be hammered home. The iPhone didn’t even have downloadable apps when it first shipped. In fact, it’s likely the iPad software demonstrated wasn’t finished yet: the iPad home screen is unusually—some might even say suspiciously—bare, and the keyboard accessory has a mysterious unlabeled white key on it. And while Apple has dissed multitasking on the iPhone because they think it makes phones unstable, the iPad is not a phone.

The iPad is not just a “consumption” device. I understand only Apple nerds probably sat through the whole presentation, but we saw a word processor with style sheets, multi-column layout and text flow around irregular objects; a spreadsheet program with well over a hundred functions, full charting capabilities, a “flat database” functionality that the desktop version doesn’t have, and a really interesting dynamic keyboard; a full-featured presentation program that looks like it’s easily the rival of PowerPoint for creation, not mere playback; and a fairly sophisticated-looking paint program. And the Omni Group has announced they’re porting all five of their major productivity apps to the iPad. If you want to claim that you can’t imagine ever using a device with a virtual keyboard for doing “real” work, fine—but don’t assume that everyone else will come to the same decision.

Policies are like software. One of the recurring complaints is that the iPhone and iPad ecosystems are “closed,” in that you can only install software from Apple’s App Store. And the recurring response to that is to parrot Apple’s line about why that’s not such a bad thing, and to point out that the evils of this are routinely overstated for maximum dramatic effect. Only a fraction of a percent of submitted apps fail to make it to the storefront. The real problem is that the review process is inconsistent and poorly defined, and this is because the review model doesn’t scale even to the point it’s at now. This can only get worse. Apple’s corporate culture is certainly one of obsessive control freakishness, but it’s not one of stupidity. There’s no guarantee that this will always be Apple’s policy. (It’s also worth mentioning that Mobile Safari on the iPhone supports so many HTML5 features, including local storage, that Google was actually able to re-implement Google Voice as a web app. Just saying: don’t write that power off.)

Data is not like software. I want to violently shake the “information wants to be free!” types who launch into tirades about this. The rights being managed on digital media are the rights of the copyright holders, and the copyright holders on music, movies, and books are not Apple. Also, anyone who writes something that implies Apple software and devices only work with Apple-provided media needs to be face-punched. Seriously. Yes, you generally have to work with iTunes to manage content on Apple devices, but iTunes and iPods have always played unprotected MP3 and AAC audio and H.264 and MPEG-4 video. Which are open standards. As in not Apple’s. Yes, it’d be great if those DivX files of the current season of “Lost” you no doubt legally acquired from BitTorrent would play on the iPod without conversion, but inconvenience is not lock-in. (They will play in iTunes, since all it needs is a plugin. And if you’d rather jam a fondue fork up your nose than use iTunes, there are alternatives.)

Hatred of Apple is largely irrelevant. If the iPad concept takes off, there will be alternatives, just like there are alternatives to the iPhone. If the iPad concept doesn’t take off, there probably won’t be, but nobody outside Apple will care.

So do I think the iPad concept will take off? Yes, because I don’t think the iPad concept is “iPod Touch with a bigger screen.”

The Mac, back in 1984, was the first mass market computer with a GUI and a mouse, with windows and a desktop metaphor and an insistence on WYSIWYG everywhere. Apple had tried this before with the Apple Lisa, which was more or less just a cheaper version of the Xerox Star—cheaper being very relative, since the Lisa was around $10,000. The Mac reimplemented those ideas from the ground up and slashed the price to… well, merely high. While a few people saw it as revolutionary, a lot of people thought it was frankly nuts.

Except all the metaphors Apple was pushing won. And it did revolutionize not just the computer field, but the publishing and design fields. The Mac never set the world on fire in terms of sales, but the operating system you are using right now—whatever operating system you are using—is the way it is, at least in part, because of Apple.

And love or hate them, Apple has a track record of shoving the industry forward. In addition to all that toy GUI stuff, the Mac did everything else its own way, from using RS-422 (“LocalTalk”) serial ports and SCSI hard drive interfaces to using 3.5” floppies at a time when everybody else was still using 5.25” disks. LocalTalk never left the Mac market, but 3.5” floppies became the standard… and were still the standard when the iMac came out in 1998 and said, “Screw floppies, you don’t need ’em.” That was also met with derision, of course—as was the iMac’s cavalier dismissal of standard RS-232 serial ports, PS/2 connectors and parallel ports, eschewing it all in favor of this crazy thing nobody else was using called USB.

And now Apple is trying to shove the industry forward again. They’ve been talking about “computing appliances” for years, going back to Jef Raskin and through the Knowledge Navigator concept. And now, finally, they’ve got something they think is worth bringing to market.

If you want to write that off as rampant Apple fanboyism, so be it, but read Fraser Speirs’ “Future Shock” piece for a good explanation of why I’m making that assertion. Or to put it in a more nerdly fashion: the iPad demo probably used clips from “Star Trek” for a good reason.

This is, as someone else put it, the first incarnation of the thing in the movies. In 2007 I wrote about ubiquitous presence, a variant on ubiquitous computing—the idea that networked computers will eventually be so much with us that they fade into the background the way electricity and indoor plumbing do for us today. We’ve had “smartphones” for a few years, but the iPhone redefined the category—even if you believe (and you may be right) that the Droid or the Nexus One or something else coming out in the next year is a better smartphone.

The iPad is setting out to establish a category. It may fail; in 1999 there was an awful lot of buzz about “internet appliances,” which I chiefly remember as the chimera that killed BeOS. Maybe nobody really wanted computer-as-appliance—but maybe the infrastructure just wasn’t there for that then, and maybe it is now. And in the most insanely optimistic outcome the iPad will still have all the limitations of Things That Come First. The Catch-22 is that while we can see some of the potential limitations now, we won’t really know them until we have the Things That Come Next in our hands.

I was in on the ground floor for the original introduction of microcomputers, the TRS-80s and the Apple IIs and CP/M. And honestly, that was pretty cool. I wasn’t really in on the Mac, though; I didn’t like mice or windows and I was happy with the command line. I missed that inflection point until it was quite obvious it had passed. It looks to me like this may, may, be another inflection point. If this is the next Thing That Comes First since, well, the original Mac—I think I’d like to be in on the ground floor again.

Date: 2010-02-06 18:33 (UTC)
From: [identity profile] chipotle.livejournal.com
If you think Apple -- and even Adobe, really -- have done nothing innovative and that they've only been successful through marketing, then you have a very different view of computing history than I do. Then again, you paint yourself, more or less, as someone who thinks user interfaces got to about where they needed to be a quarter-century ago and everything since has been driven by marketing and flimflammery. :)

30 years ago in computing you picked between competing views of the universe presented by CP/M, the Apple II, the Atari, the various Commodore computers; you got someone else's vision of what a computer was supposed to be like. Today, whether I'm using a Windows machine, a Linux machine or an OS X machine, I can put what I want on any of them very easily, and with all respect, someone who tells me that's not the case is someone who really hasn't learned much about whichever platform they're dissing. I personally find Windows the most restrictive and OS X the least, for the fairly straightforward reason that I run a lot of Unix software but also want the option to run commercial software that exists on the Mac but not Linux/BSD. I can get Apache, MySQL, Postgres, Emacs, etc. etc. for the Mac; I can't get OmniOutliner and Scrivener for Ubuntu. The notion that my Mac is somehow held hostage to someone else's vision of how computers should work is, with all respect, simply false.

The iPhone and iPad are different animals; they're computing appliances, not general purpose computers, and I think a lot of the negative reaction to them comes from evaluating them in terms of things they're not. A Mazda Miata is going to do very poorly if you judge it by the standards appropriate for heavy duty pickup trucks. And if what you actually need or want is a pickup truck, a Miata is a terrible choice--but that doesn't make the Miata bad at being a small, spry roadster.

Date: 2010-02-07 00:01 (UTC)
ext_39907: The Clydesdale Librarian (Miktar's plushie)
From: [identity profile] altivo.livejournal.com
Well, now, I don't demand a return to CP/M or DOS. It's true though that one of the things I absolutely hated about the Mac when I was forced to use one for several years was the total lack (at that time) of any command line interface. Obviously, that was before Apple made the paradigm shift to a Unix-style kernel. More important to me though was the difference between cooperative multitasking (the style of both Mac and Windows back then) and pre-emptive time slicing (the style of Unix and the Amiga at that time, and, not long after, the basis of Linux).

I don't mind a graphical display as long as it isn't dragged down in speed by excessive and useless ornamentation and "effects" but for most tasks I really hate mice. Being forced to move my hand off and back onto the keyboard dozens of times in order to perform simple tasks makes me want to throw the computer out the window. I prefer to keep my hands on the keyboard. I'm a reliable, fast touch typist. The mouse is OK when you need it for efficiency, as when cropping images for instance. "Drag and drop" vs. a verbal command, though, has no appeal to me.

Commercial software generally disappoints me. I find it limits me to an interface designed by people who have totally different perceptions and needs than my own, and is often so bogged down by its own copy protection and licensing routines that a lighter and more nimble open source application can run rings around it.

Yes, I understand the difference in function and applicability between, say, a netbook and a graphics workstation. However, I am likely to find that both of them are far more usable than the iPhone, iPod Touch, or iPad. I find touch screen interfaces to be the most irritating and least precise of all. They also generally require straitjacketed designs in order to be at all functional, since touch commands are necessarily imprecise.

My distaste is not limited to Apple products, either. For example, the Kindle is just as unpleasant in my opinion as any of Apple's recently lauded productions, and furthermore suffers from the same monopolistic mentality as regards content and distribution.

Date: 2010-02-07 04:17 (UTC)
From: [identity profile] aliasisudonomo.livejournal.com
While he wasn't totally accurate on every point, if you've never read it, look up Neal Stephenson's "In the Beginning... Was the Command Line".

If you have, then you understand that, in a very real sense, a graphical interface is not designed for you. The skills needed to manipulate the various abstract symbols and concepts that make up computer programming do not particularly overlap with the skills most people have.

The strength of Apple - and the weakness of most open source apps - is they don't let the programmers handle the UI. This inevitably results in a programmer-annoying UI, but one that other people can use and even enjoy using.

Date: 2010-02-07 05:29 (UTC)
From: [identity profile] chipotle.livejournal.com
I certainly remember it myself. IIRC, Stephenson wrote it on BeOS; at the time he was very much a huge BeOS fan:

The ideal OS for me would be one that had a well-designed GUI that was easy to set up and use, but that included terminal windows where I could revert to the command line interface, and run GNU software, when it made sense. A few years ago, Be Inc. invented exactly that OS. It is called the BeOS.


The reason many (not most, but many) BeOS fans ended up migrating to OS X is pretty much because it's the only other operating system that really meets those criteria. In some ways it's still not as elegant as BeOS was, but it's... well, the best way of putting it for me is that OS X can get out of my way better than any other operating system I've used, and that's really important to me in day to day use. This is why, I confess, I often find conversations like the one with Altivo baffling: I've been using computers an awfully long time myself and have a lot of experience with a lot of various UIs. Any assertion of the form "Macs are so restrictive and make you do things their way" is so far removed from my experience it's hard for me to grok that we're talking about the same machine.

(I should note that I got pretty good at getting Windows 2000/XP to get out of my way, too, and was a perfectly happy user of Win2K for quite some time. It and FreeBSD just both required a greater amount of fiddling to get me to my happy place.)

Date: 2010-02-07 13:27 (UTC)
ext_39907: The Clydesdale Librarian (radio)
From: [identity profile] altivo.livejournal.com
Note that when I say Macs are restrictive I am referring to the Mac I knew, which is certainly not today's Macintosh. It is the MacOS 6 and MacOS 7 environment, which is long dead and unlamented.

I was not familiar with Stephenson's commentary on operating systems until now. (I found his fiction unreadable and haven't paid much attention to him.) I looked up the essay and found at least the beginning of it on the web. Certainly what he has to say at the beginning is in agreement with my own subjective experience. My derision for Apple is built of many factors, but there are three elements that I find repellent. One is the "hermetically sealed mechanism" as he puts it, another is the absurdly high price in actual dollars, and the ultimate coffin nail is the emphasis on a point and click environment that I despise, one that was originally forced upon me with no alternative at all other than to deal with it on its own terms. This is rather like being dumped into a world in which one must speak Inuit in order to survive after having spent a number of years becoming fluent in six other languages, all of which are much more widely spoken and understood.

Ultimately I have chosen to use Linux for the past ten years, not because it is lacking in warts but because it does what I want and is affordable. Apple products are not cost effective to my way of thinking, and Microsoft products are too unreliable to contemplate. I am forced daily to support users who know only Microsoft's paradigm and understand that only dimly, along with the occasional Apple user who is convinced that everything else is so inferior as to be unworthy of their notice. This has done nothing to improve my opinion of either environment.

The ease with which I personally use Linux stems no doubt from a background in managing Unix-based systems and particularly IBM's AIX. I understand why people say it's not for them, and though Ubuntu has gone a long way toward making Linux actually competitive with OS X and Windows, in doing so it has acquired many of the same qualities I dislike most intensely.

Yes, I'm a curmudgeon. I drive a manual transmission car, which most people seem to consider an alien or antiquated concept. I prepare and eat food from fresh ingredients and do not use frozen convenience products or buy restaurant fast food meals. I do not have digital television (or watch television, for that matter) and though I have a cell phone it spends at least 99% of its time turned off.

Date: 2010-02-07 13:38 (UTC)
ext_39907: The Clydesdale Librarian (radio)
From: [identity profile] altivo.livejournal.com
Indeed, the GUI is not designed for me. I resent being forced to use it against my will. And therein lies my deep-seated dislike for Apple, whose products forced me to use a GUI without any alternative in order to make a living for several years.

I don't care for Stephenson's fiction and was unaware of his dissertation on operating systems. I found part of it on the web and you are correct in guessing that I would agree with him, though it seems pretty dated now and needs to be brought up to date on a decade more of developments. Many of those developments are bad, in my opinion, not necessarily because of what they are but because of the effect they have on users.

Date: 2010-02-07 14:10 (UTC)
From: [identity profile] aliasisudonomo.livejournal.com
http://garote.bdmonkeys.net/commandline/index.html

This is the complete essay, along with comments from another author to update it to 2004.

However, the bit that I considered his real point - not the distressingly technical minutiae but the fundamental bedrock of the essay - was the 'distillation' concept, or Morlocks/Eloi as he put it: the whole section on Disney World. Nobody has an infinite amount of time to learn everything, so being able to skillfully boil a topic down to the essentials is an ability much in demand. I consider myself a technically-adept user. I am working towards an Information Science degree. I can program well in C, COBOL, Java, and an assortment of lesser scripting languages.

Despite all of this, a good 80% of my computer time at home is well covered by "Start Firefox", "Start a game", and "Start my IM client". My life would not be noticeably improved by typing the commands in as opposed to clicking an icon. I do not think I am atypical in this regard, and you could really simplify things past even that point into something more akin to the PS3/PSP interface and cover most of what I do, and sell the result to more people... which is exactly what happened.

The upshot is you aren't a lone outcast, and even if the hacker-friendly system becomes the exception rather than the rule, it'll still EXIST. The Morlocks of the world have money too, after all. It's not a coincidence that the very first thing Apple's own team of programmers did with the original Macintosh was add in a proper command line interface.

Date: 2010-02-07 15:44 (UTC)
ext_39907: The Clydesdale Librarian (Miktar's plushie)
From: [identity profile] altivo.livejournal.com
That allusion to Wells' Morlocks and Eloi is much too close to the truth in my view. That's exactly what I mean about what the oversimplified and "directed" interface does to the end users. We have to remember what the Morlocks and Eloi really were. The Eloi were helpless sheep, the prey (and food) of the Morlocks and utterly dependent upon them for survival. Indeed, the corporate world of both Apple and Microsoft is something like that of the Morlock, as both strive to take advantage of the ignorant Eloi for their own profitability. And that's the bottom line, literally. Profit wins out over quality, appearance over function, and they continue to invest heavily in promoting this situation as "good."

What I see as a result is lines of people trooping into the library wanting to file their income taxes or handle their banking transactions online, with no conception of the inherent security issues of doing that on a public machine over the public internet... AND who walk away from the terminal to use the rest room, leaving their bank account logged in and vulnerable on the screen. Why? Because they have no idea of the mechanisms that underlie what they are doing, and they've been taken in by heavy publicity campaigns to encourage this behavior as "easier" and "faster."

I see corporations requiring ALL job applicants to file a resume electronically over the web, even if the job for which they are applying is housekeeper or bus driver and has no connection to IT at all. They are turning away their best candidates, who would do the job well and be happy doing it. Those candidates have never used a computer in their life, and find it frustrating and incomprehensible IN SPITE of the supposedly wonderful GUI that is supposed to make it easy as pie. (Hint: many of them are utterly disconnected from the idea of typing on a keyboard, reading instructions or responses from a screen, or using a mouse or other pointing device.)

I do not see how this is improving anything for anyone other than a few fat cats who make cash profits that they don't need. Meanwhile we have the computer and entertainment giants (who are more and more the same people) thundering "You need a thneed" at everyone continuously, and making sure it's true by creating situations that are untenable for anyone who isn't carrying around a megapowered computing appliance connected by ultrawide bandwidth to a network filled with garbage. This is not progress, it is devolution.
