Yesterday, after many days of trying and failing to find one in stock, I went out to the Bellevue Square Apple Store, and bought my MacBook Air. The box alone is something to see. It's like a jewelry case. Thick box top that lifts off, and the Air is nestled into a display tray, ready to be admired. Lift that out, and the accessories are down in the second tray. It's the most technoerotic computer packaging I've ever seen.
It is a truly beautiful machine. It doesn't even seem to be a computer, it's just a wafer-thin keyboard with a wafer-thin screen attached. If you're not using the ports or power, there are no ugly breaks in the line of the case. While the MBP is a fine piece of hardware, and makes a great portable desktop computer system, it's still a giant technical-looking block of cable ports and slots and stuff. The Air looks like a magical artifact, like the Glaive, or Excalibur (maybe Thomas Malory gave Jonathan Ive some design advice). And of course, picking it up, you know it's magic; 3 pounds isn't even noticeable, it's like a stack of paper.
While all the media trolls were whining about missing ports and how the Air doesn't give them handjobs and $200 kickbacks, they missed one: there's no anti-theft cable slot on the Air. This would actually be a real problem on the MacBook and MacBook Pro, because those are always going to be tethered to ethernet cabling and power and a half-dozen life-support cords, so taking it with you when you go somewhere is a pain. The Air doesn't care. Unplug the MagSafe power cord, and take it. I'd still rather have one, but enough to break the lines of the machine? Maybe not. I should take a closer look at what they do for security at the Apple Store (besides having very polite armed guards at the door).
The keyboard is more solid and has a crisper feel than the MacBook's mushy, wiggly cheap plastic toy keycaps, which is good--I prefer the MacBook Pro's keyboard, but this is actually quite pleasant. The screen is almost too bright. I dropped the brightness a bit after I found the world outside the screen was getting darker. The VGA-quality iSight is pretty lame, but it does work with iChat videoconferencing, even with special effects (I iChatted myself across the room with Bonjour, got some wicked feedback). The little teeny tinny speaker under the arrow keys is awful. To sound good, you will need headphones or external speakers (audio jack, unless you like wasting your USB port on sound).
Performance is okay. I'm able to run a basic Eclipse Europa (NOT MyEclipse), and work on my hobby project. Second Life gets <10 fps fullscreen with all settings at minimum, in a sim where the MBP gets >30 fps windowed with everything maxed out, but it does run. It's not unusable, just really slow at the 3D graphics (which you'd expect, since it has a crappy Intel onboard graphics thing). For anything that doesn't do a lot of graphics, it's a pretty snappy machine. It's still a 64-bit, Intel Core 2 Duo, even though it's only 1.6GHz, and 2GB RAM is as much as I have in my MBP (yeah, I should upgrade it to 4GB... Not real motivated to spend money on it now).
vSide runs great, though. Right after I bought it, I couldn't wait to get home to pop it out, so I went to a Tully's and used their electricity and wifi, and went into vSide to goof off while waiting for the battery to charge, and found a music listening station to see if there was any good music... They had Feist, playing "1 2 3 4", the song in the iPod Nano ad. Apple is everywhere!
.Mac is a life-saver with this device. All of my personal information just syncs across. As long as I'm using IMAP, I don't care which machine I run email on (my POP accounts are going to be a problem until I set them up to forward to another account that can do IMAP). iDisk lets me move files into a common space that all my machines can access. I already used it for that somewhat, but now I'm going to be using it as my primary "drive". Guess I'd better upgrade to 20GB. Alternately, I could just put more stuff on my little portable USB drive, but that's more stuff to carry around. Meh.
NetNewsWire could be better at .Mac syncing. Half the time, it fails. When it succeeds, it's really slow (okay, I have 1100 feeds, guess it might take a while). I guess I could go back to Bloglines, but I do like the NNW experience better. This is, of course, why they released NNW for free: to get people like me to buy NewsGator.
No idea yet how I'm going to set up Time Machine (Apple's is nice, but I prefer the original); my big firewire drive won't connect to the Air. I may have to wait for the Time Capsule gadget (I don't own an AirPort Extreme yet, so it's a practical solution to the networking problem anyway), and back up onto my USB media drive until then.
I don't have a bag small and light enough for the Air; my WWDC07 laptop bag is great for the MBP, but about as heavy as the Air and 10x more volume. I shopped around for a bag, and there's nothing. While using a padded manila envelope is an amusing idea, I actually need a bit more protection and handles and/or a strap. I'd love to have the Crumpler Winston Fleece in a smaller size. Instead, I bought a cheap SwissGear Angle that's still too big by several inches, and weighs half as much as the Air. MADNESS! I'm sure there'll be some nice cases in a few weeks or months, but until then, it's going to be hard to carry the Air correctly. Ideally, the case should be just a layer of neoprene with a zipper, and a carrying strap that can be tightened to use as a handle or loosened to use as a messenger bag. And it should be black with aluminum-colored trim.
So, worth the $1799? Yes, absolutely. I don't know that I'd buy the $3098 64GB SSD version, regardless of how much faster the drive is; 16GB less space would be crippling on such a small device. For me, this machine only makes sense as a travel computer, but it's SO good for that task that anything else seems archaic, it makes every other laptop look as lame as the iPhone makes every other smartphone.
Anyone who develops web pages woke up to some nasty news about Internet Explorer 8 today, announced by a Microsoft developer in an article on A List Apart. The one-stop response is John Resig's X-IE-VERSION-FREEZE post, and see also Surfin' Safari's "we don't need it" response.
IE6 and IE7 don't render web pages correctly, so Microsoft won't have IE8 render them correctly, either. They want developers to insert a new tag in their page <head>:
<meta http-equiv="X-UA-Compatible" content="IE=8" />
This will make the page render using the marginally-less-incompetent IE8 rendering engine. If you leave it off, it renders like IE6 or IE7. If IE9 ever comes out, the page will still render like IE8, so you'd either have to change the content to "IE=9" on every page you ever wrote, or use the forward compatibility content, "IE=edge". Naturally, being Microsoft, they discourage the use of forward compatibility. If you were so stupid that you still worked at MS, you certainly wouldn't want the world to get any more useful and functional in the future, after all.
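For reference, the forward-compatible form they discourage is just a different content value, and Microsoft says the same directive can also be sent as an HTTP response header, so it can be set site-wide on the server instead of pasted into every page. A sketch:

```html
<head>
  <!-- Opt in to the newest rendering engine IE ships, now and in the future -->
  <meta http-equiv="X-UA-Compatible" content="IE=edge" />
</head>
```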
So Microsoft has opted out of following web standards in the future; they want to condemn the entire world to render like IE6 did, even when they "upgrade" their browser.
Well, enough of this. The answer is not to write version-specific code and leave the browser to stagnate; MS already tried that with IE6, and it's rotten. The answer is to write pages according to the HTML specification, test on real browsers like Safari and Firefox and Opera, and if Microsoft is so incompetent that they can't handle the standards, they should be ignored. Put in "IE=edge" if you feel generous. I suppose I will for work, but never for my own pages.
The longer-term solution is to stop enabling Microsoft. Option 1 is to simply redirect IE users to a download page for a real browser. Option 2 is to write an ActiveX plugin for IE that will let us embed WebKit to render pages, much like the Tamarin-on-IE7 hack. Then we can shove a proper rendering engine down at people who are still foolish enough to keep using IE.
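For Option 1, you don't even need server-side browser sniffing: IE's own conditional comments, which every other browser treats as ordinary comments, can bounce IE users to a download page. A minimal sketch (the URL is a placeholder, not a real page):

```html
<!-- Inside <head>: only IE parses conditional comments, so only IE redirects -->
<!--[if IE]>
<meta http-equiv="refresh" content="0; url=http://example.com/get-a-real-browser" />
<![endif]-->
```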
- Everyone competent has already left for Google or Amazon or startups. I live in Seattle Eastside and get to regularly see the kind of sub-room-temperature IQs who still work there. I have actually seen these people using shit-brown and puke-green Zunes. Yes, seriously!
I saw Cloverfield last night, and I regret it. I must warn you all to avoid this film, don't waste your time and money, and do not encourage this bullshit.
If this had been filmed with a traditional camera on steadicam, it might be a reasonably good monster movie with a better-than-average character story. Sadly, however, hack pseudo-reality TV writer J.J. Abrams watched The Blair Witch Project FAR too many times, and tried too hard to imitate it.
The entire film is shot from the POV of a hand-held camera owned by one of the characters. When the characters are running, which they are for most of the film, the camera shakes constantly, veering off to the sides or down at the ground instead of UP at the action, so nothing can be seen. Even when at rest, the character doesn't hold the camera straight. It's nauseating to watch.
Blair Witch kept the running to a minimum, had a high-res camera on a tripod as well as the videocamera, and despite the "found footage" premise, was competently shot. Cloverfield is the worst-shot, most aggressively audience-hostile, incompetently-filmed piece of footage (I cannot even bring myself to call it a "movie") that I have ever witnessed. There are home movies of 5-year-olds having a birthday party which are better filmed.
The story itself is fine; I liked the characters and they mostly acted with some sense, though why Marlena follows a group of near-strangers into certain doom instead of escaping is unclear, but she is mentally damaged by that point. The monster looks good. The baby monsters/parasites/low-level enemies are an unnecessary distraction to the story, but are well-executed. The special effects and the military presence are great. The product placements for Nokia were very aggressive, and clearly the film should have just stopped for a Nokia commercial rather than shove excess placements in.
I was immensely happy to see that the ending didn't flinch away from what really happens to anyone stupid enough to stay in a city where a giant monster is attacking.
But for the filming, this film deserves to die, and J.J. Abrams needs to be busted down to making commercials for Nokia.
"separate fanbois from cash," as DarkMark said of it yesterday. And yes, it does that. And I need one. My back needs me to get one.
I currently have a MacBook (5.0 lbs), and a MacBook Pro 17" (6.8 lbs).
I use the MBP for software development, Second Life, and as my connection point to the Net, because I always have it on me. This kinda sucks, because it's HEAVY. It's HUGE, too, but it fits nicely in my WWDC 2007 bag. I'm more concerned with the effect on my back of carrying 7+ lbs (once I add the power supply, because it has maybe 2 hours of battery life, MAYBE).
The MB used to do this job minus the software development and Second Life, and also served as my writing tablet. But I can't carry both the MBP and the MB (12 lbs?!?), and the weight difference isn't much, so it made more sense to carry the slightly heavier one that was so much better.
The MacBook Air is 60% of the weight of the MB, and 44% of the weight of the MBP. I could happily carry the Air all the time, assuming it met my needs... The price is high compared to a MacBook, but not unreasonably so, compared to other ultra-light laptops. The sealed memory, disk, and battery are fine, since I'd just take it back to an Apple Store in event of emergency anyway. This is why you should always buy AppleCare.
What I actually need, day to day away from my desk, is Safari, Mail.app, NetNewsWire, BBEdit, and iTunes. MyEclipse is too heavy for use on the MB, and the MBA has the same performance. MochaCode, though, is pretty freaking amazing: light and fast, albeit still in pre-release. Xcode runs fine on the MB, and as I move to more and more Objective-C, I care less and less about the Java stuff. When I need to do serious Java development, I can do that at work (increasingly, that's the only Java code I want to touch) on the company's iMac workstation, or at home on my MBP.
My one concern at this point is how to sync my iTunes library. It's too big for any one drive, especially a dinky little 80GB drive. I have separate libraries for music and TV/movies now, but at least they're on one computer and don't have to sync. Actually, I could just use the iPod all day, and keep my actual music library at home. Ultimately, I should put that on some network-visible system, and use Back to My Mac to find it.
The larger implication of this gadget is that my world splits into "permanent storage at home" and "portable ubiquitous wireless device". The only remaining problem is the lack of a cell-based Internet connection, so it's only live near a wifi access point. I've tried setting up my Treo as a bluetooth modem, and have always failed utterly. This is the fault of the Treo, Palm, and Sprint. I hope Palm festers and dies, at this point; they're so grossly incompetent they deserve to suffer on the way down. Sprint's network is pretty good, though their customer service is atrocious. If only Apple would let me hook up an iPhone to a MacBook Air as a bluetooth modem...
During the MacWorld Expo 2008, Twitter died. Too many people hammering it with updates, too many people reading it. And in posts to their blog, MacWorld and Why We Are Focused on Engineering and Operations, Twitter points at high traffic as the cause.
Except it's not; the traffic just exposed the inherent weakness in the system. The cause of the failure is that Twitter is written in Ruby on Rails. It worked fine for a few hundred people and their friends, so they scaled it up to a few million, and it has been an unmitigated disaster, getting slower and slower no matter how hard they whip the servers. They can keep adding hardware, but with the poor design of Rails and the inherent slowness of Ruby, that only delays the inevitable. Alex Payne, one of the Twitter developers, gave an interview which explains just how bad the Rails situation is.
This is the Rails trap. Setting it up is so easy! You can roll out a basic site in a day! But when you need to add unusual (for Rails) features that aren't just database-to-web-forms scraping, or when you need more performance on the front-end boxes, or, Why forbid, change the way you interact with the database, then you're in for a world of hurt. And that hurt is an order of magnitude worse than the hurt you'll get by doing it "right" in the first place.
I certainly won't argue that building a big webapp in Java with JSP/Servlets or JavaServer Faces and maybe Hibernate is easy, because it's not. It takes a good week to have a basic site that does something, and that's if you know what you're doing (for an average Java programmer, a month of studying Marty Hall's Core Servlets books and building test applications should be enough to get up to speed). But adding more functionality that works to the JSP or JSF site is pretty easy after that, and it'll be tens to hundreds of times faster than an equivalent Rails app, and can support far more users, and won't require you to write entirely new database drivers to handle caching.
For a site that's meant to support just a small group of people with light usage, Rails can be okay, though it'll always frustrate you when you intend to expand it. A software service like Twitter is the archetypical example of something that would be vastly better off if based on a more robust platform.
Tim Bray's predictions for 2008 include the rise of Ruby on Rails. I think not.
The best reality check is to search on Dice.com (or other job site):
java: 15018 jobs
c++: 7363 jobs
c#: 7017 jobs
perl: 5220 jobs
php: 2176 jobs
python: 1233 jobs
ruby: 637 jobs
delphi: 139 jobs
smalltalk: 44 jobs
lisp: 22 jobs
objective-c: 16 jobs
haskell: 2 jobs
Java is the dominant language for solving problems that are worth paying money to solve. Nobody else is even close: Java alone has more listings than the next two languages combined.
Ruby on Rails is excruciatingly slow, and requires far more hardware to scale up than other tools. Using a tool that makes it easy to get started but costs more and causes more pain down the line is not good business sense. The disaster that has been Twitter trying to scale up is going to be repeated over and over, until people quit being penny-wise and pound-foolish, and learn to invest a little in more serious technology.
This might take a while, and certainly Ruby's going to get more popular in the next year, but in the long haul I think it's headed down again, sharply. In my experience, managers are extremely tolerant of senior engineers recommending and using whatever language or framework or technology they like--that's the point of being a senior engineer--but by the end of the development cycle, 12-18 months later, they had better have a working product, or they're out of a job.
The Rails hype machine started up nearly 2 years ago. Now we're seeing a lot of the apps built on it reach the market and creak and groan under the weight of even modest usage; they collapse utterly under Slashdot-type loads. This is going to lead to a lot of fired senior engineers who took a chance on the wrong language/framework, and Ruby will quietly sink back down to its natural place around Lisp's popularity.
I wrote this blog and site in PHP, and I've written tens of thousands of lines of PHP code (not counting the HTML); I can speak from hard experience now that PHP is shit. It's an awful language, with horrible syntax and semantics that must have been dreamed up by a madman. It's insecure, and the PHP team doesn't care. It is sheer folly to perpetrate new code in PHP, use something better if you value your data and time.
My experience with Ruby is more limited; I've done the tutorials and read Why's Poignant Guide to Ruby (which at least has good cartoons and chunky bacon, pity about the language), and tried building stuff in it, and got annoyed by its lack of new features and its ugly, Innsmouth-look, "dear god what abomination did you crossbreed with?!?" syntax compared to beautiful, graceful, simple Python. But I've read the project postmortems like everyone else. Ruby isn't going to fix your problems for you, it just gives you new problems.
While I love Python for quick problem-solving and for writing beautiful code, there's no way I'd ever again use it for an app that would be run by more than a half-dozen people in the world. It's faster than Ruby, but still 10-100x slower than Java. If it were even half Java's speed, that'd be okay, but it's just not there. Even if speed were not an issue, writing big tools in Python sucks. Compilers turn out to be really handy at catching stupid mistakes, and everyone makes stupid mistakes.
There's nothing magical about what dynamic languages do. If you want dynamic web code in Java, make some JSP pages, and use JSTL with the <sql:> tags, or wrap your database code inside JSP beans. This works at least as well as PHP, maybe better, and will have several orders of magnitude better performance than an equivalent site in Ruby, because it's really compiled into pure Java servlets.
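As a sketch of what that looks like (the DataSource name and table are made up for illustration; the taglib URIs are the standard JSTL 1.1 ones):

```jsp
<%@ taglib prefix="c"   uri="http://java.sun.com/jsp/jstl/core" %>
<%@ taglib prefix="sql" uri="http://java.sun.com/jsp/jstl/sql" %>

<%-- Query a container-configured DataSource; no hand-written JDBC plumbing --%>
<sql:query var="posts" dataSource="jdbc/BlogDB">
  SELECT title, posted FROM entries ORDER BY posted DESC
</sql:query>

<ul>
<c:forEach var="row" items="${posts.rows}">
  <li><c:out value="${row.title}"/> (<c:out value="${row.posted}"/>)</li>
</c:forEach>
</ul>
```

The container compiles the page into a servlet class once; every request after that runs compiled Java, which is where the performance gap comes from.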
[update 2008-01-11: added Perl statistics.]
An open letter to Rob Enderle and IT Business Edge:
I would like to draw your attention to Rob Enderle's latest article:
Falling for the Dan Lyons Apple Hoax: Implications and Portents
In it, he discovers that he was reacting to a fake article on the Fake Steve Jobs blog, which had been revealed to be a fake days before his original post; even if he couldn't recognize parody when he read it, if he'd bothered to check back at FSJ before publication, he'd have discovered that he'd been fooled.
But his incompetent pretense of journalism is not the problem here; everyone laughs at Rob Enderle, and treats him like the perpetually-wrong joke that he is, and no harm done. However, his explanation points to a bigger problem, one of true malfeasance:
I clearly was drinking way too much eggnog to see the joke for what it was.
Now, we could take that as self-deprecating humor, as parody... Except that we know that Enderle is not capable of parody, not capable of self-awareness, and not capable of writing, especially when drunk. So clearly this must be taken seriously.
You have an alleged analyst filing reports while drunk, while clearly too incapacitated to understand simple parody. This is the behavior of an alcoholic; today he's filing bad tech analysis, tonight he's driving drunk and killing people.
For the good of everyone, I urge you to stop publishing Rob Enderle, and perhaps you can persuade him to enter an alcohol treatment program.
Thank you, and let's all hope for the best outcome for Rob and his long-suffering family.
Developers can get a developer preview now at the Apple Developer Connection.
Yes, it's 64-bit Intel only. No, it doesn't run Eclipse (because their SWT library is only 32-bit right now). But you can run Eclipse in Java 5, and use Java 6 for the compiler and app runtime, and it works. I'm sure some of the whiny Java dorks will be complaining about this anyway. I'm not with them.
[Update] As I predicted, the whiny Java dorks are, in fact, complaining, saying things like "What????? Only 64 bit???? You have GOT to be freaking kidding me!!!! My MacBook is a 32 bit Core Duo one.
The arrogance is just dripping off left and right, I don't know HOW people put up with it. I don't."
Really, this is why we can't have nice things in Mac-Java-land. Even when we get them, most of the Java devs bitch and whine like little girls. They bitch and whine for a pony, and when they get it, they're upset that it's not the right color. YOU GOT A PONY, YOU WHINY BITCH! SHUT UP! If I was Apple, I'd have stopped supporting Java entirely, it's just not worth dealing with these whiny little bitches.
Closure is the aspect of communications design that causes the greatest problems. The concept is best explained with an analogy. The user is at point A and wishes to use the program to get to point B. A poorly human-engineered program is like a tightrope stretched between points A and B. The user who knows exactly what to do and performs perfectly will succeed. More likely, he or she will slip and fall. Some programs try to help by providing a manual or internal warnings that tell the user what to do and what not to do. These are analogous to signs along the tightrope advising "BE CAREFUL" and "DON'T FALL." I have seen several programs that place signs underneath the tightrope, so that the user can at least see why he failed as he plummets. A somewhat better class of programs provide masks against illegal entries. These are equivalent to guardrails alongside the tightrope. These are much nicer, but they must be very well constructed to ensure that the user does not thwart them. Some programs have nasty messages that bark at the errant user, warning against making certain entries. These are analogous to scowling monitors in the school halls, and are useful only for making an adult feel like a child. The ideal program is like a tunnel bored through solid rock. There is but one path, the path leading to success. The user has no options but to succeed.
The essence of closure is the narrowing of options, the elimination of possibilities, the placement of rock solid walls around the user. Good design is not an accumulative process of piling lots of features onto a basic architecture; good design requires the programmer to strip away minor features, petty options, and general trivia.
-Chris Crawford, De Re Atari: Appendix B: Human Engineering, 1982
A couple more thoughts on the Kindle.
- It's not obvious from the demos, but the screen has 4-level grayscale, like the old Sony Reader PRS-500. That's slightly less terrible, but still grossly inadequate. The new Sony Reader PRS-505 has 8-level grayscale, and is still a little jaggy and dull. It appears to be just as dim a display as the old Sony, and not as bright as the new one, which means it's very dark and hard to read in anything but perfect lighting.
Outside of a dog, a book is man's best friend.
Inside of a dog, it's too dark to read.
After seeing a couple more video demos, I'll add to my litany of contempt:
- The scroll bar controlled by a paddle for choosing menu options is one of the worst user interfaces I have ever seen in my life with technology. We have these things now called "touch screens". On the iPhone/iPod touch, you just touch the screen to do something, or flick the screen up or down to shift the page, and it moves like a physical thing. Scroll bars were cool and new on the Macintosh 128K in 1984, and paddle controllers haven't been cool since Pong, and the combination just stinks. No, I don't have one of those primitive 20th Century scroll wheels on my mouse; I have a Mighty Mouse with a mini-trackball for scrolling, and I now expect all small-screen displays to respond to touch and drag. Wake up and join the 21st Century.
- There's no way to select a single word for dictionary lookup or clicking to a URL. You select an entire line with the paddle and it lists everything, slowly.
- The next/prev buttons and the keyboard are incredibly sluggish and unresponsive. You can literally watch it stop and think half a second per keypress, or a few seconds for a new page or a popup dialog, before responding. This is a thing that would drive anyone who isn't on heavy sedatives to start using heavy sedatives. Good interfaces must respond instantly, at least within 1/10th of a second, or people become increasingly frustrated and begin to hate your product. Sluggish response produces loathing. I've done user testing and watched this process turn ordinary, happy people into raving maniacs. I expect Amazon will have a lot of returned Kindles that have been thrown into walls.
It's obvious to anyone who has worked at Amazon (I served an 11-month tour of duty inside the 'Zon) why the Kindle is so awful. The psychology of every company is set and shaped by the psychology of the founders. Jeff Bezos runs things on the cheap; he still acts like he's eating ramen at a struggling startup.
When he started Amazon in an empty warehouse, he couldn't afford real desks, so he bought some door blanks and 2x4s, and made some cheap, nasty desks out of them. To this day, all desks and conference room tables in Amazon are Door Desks, made custom for Amazon to remind you to be frugal and not spend any money on anything that doesn't help the company meet next payroll, to hell with aesthetics or the long-range future.
Similarly, the Amazon infrastructure is lashed together from cheap-ass Linux servers that just fail over and get replaced, rather than buying anything quality, because that wouldn't be "frugal".
Amazon doesn't pay people to do site technical support, they just issue the engineers pagers and make them do it. It's not like you've got work to do, or wanted to sleep or something, right?
You don't even want to know what the software is like inside, but it's the same principle, exacerbated by Amazon's obsessive Not Invented Here culture, which leads to Amazon-created/-modified incompatible versions of every tool.
So naturally, the Kindle is cheap-ass white plastic, with no aesthetic design or usability engineering, just whatever the cheapest possible components were, lashed together by an engineer who hasn't been let out of his ugly, fluorescent-lit lab for years. Spending money on artists and designers, or buying slightly better components, would violate the Door Desk Principle.
Mike Arrington was right in his video with Scoble: the Etch-A-Sketch is a better device. Despite the equally primitive controls and nearly equivalent display, the Etch-A-Sketch is fast and responsive, and just plain works. The Etch-A-Sketch knows what its technological limitations are and still produces a good user experience. The Etch-A-Sketch even has style; it's bright red, because it's for children who are attracted to bright colors, and it has a smooth case and big chunky knobs for awkward young hands. The Kindle is trying to be something that it just can't be with the crippled technology Amazon chose and Amazon's inability to make classy devices.
The Amazon Kindle is out, and it's about as useful as any previous e-book reader. Which is to say, not at all.
- The Kindle does not support PDF books. Full stop. Without PDF, the device is useless. Almost all existing non-DRM'd ebooks are sold in PDF format. You can supposedly convert your PDFs into the proprietary Amazon format, but it'll cost you $0.10 per book, for something you already own, and destroy the existing page layout.
- It's $400, plus $10 per DRM-crippled Amazon book, plus $2/mo per RSS feed. They don't charge for the network access because they've already extracted your wallet.
- It's hideously ugly. It looks like some cheap office supply tool, like a barcode printer, not a $400 piece of electronics that you'd want to curl up with day in and day out for the rest of your life. Since becoming a Mac, iPod, and Nintendo DS owner, I won't own an electronic device that ugly. Aesthetics matter, and this thing has none; it makes the shit-brown and puke-green Zunes look tasteful.
- The keyboard is unnecessary, large, and awkwardly placed. The scroller can only be manipulated with the right hand while the left hand holds it. This is unlike a print book, where you can hold a paperback and thumb through it with one hand, ambidextrously. The angled front will dig into your hand and be hard to hold on to. The usability is, shockingly, even worse than the nonexistent aesthetics.
- The screen is monochrome. 16-level grayscale would have been enough for decent font antialiasing, and a 256-color web palette would have been far superior. As it is, it'll be quite unpleasant to read on (the Sony reader has an identical screen, and it was terrible). Compare this to an iPhone, which has similar DPI, but has bright, sharp color and the Mac's perfect font rendering. I'd far rather read on my little iPod Touch, if only it had local file storage.
- Luddite print fetishists are addicted to the smell of rotting wood pulp and the feel of leather or hard rotting wood pulp. Sony understood that and put a good cover on their e-book reader, but it's not enough. If you want the fetishists to convert, you'll have to wrap it in a few sheets of paper, or perhaps some artificial scent dispenser. I really don't think they know or care about paper as such, they just fetishize the smell and feel.
So why is anyone pushing it? Well, the big-name bloggers and newspapers are pushing it because they expect to get 30% of the $2/mo fee for subscribing to their RSS feeds. PAYOLA. I cannot believe that anyone who isn't getting a kickback from Amazon honestly likes this device.
It has been noted that I'm... intemperate, let's say... with bad design, and an obsessive fanboy for good design. When people identify too strongly with the systems I say have bad design or no design at all, like Linux, they take it very poorly indeed, and think it's a personal attack. It is rarely personal, and even if you have mal-designed one of these programs I scorn, it's just strong encouragement for you to do better. I don't wish you ill, I just want you to learn from your mistakes. Of course, I have only good will to those who share my madness...
I'm finally starting to see a method in this madness, and to organize my thoughts about it. So, below the break, a few thoughts on user interface design.
There are three ways to write a GUI program. Because they are most commonly associated with specific platforms, I'll call them the Unix Way, the Windows Way, and the Macintosh Way, but any can appear on any platform.
The Windows Way
First, check MSDN, see if Microsoft has written a library for your task. If not, write a library. Don't bother with tests, or any interface, you just want a bunch of code. Then throw together some dialogs, maybe in Visual Basic, and push the buttons to see if they work. Don't verify any results, or rethink the design at all. Ship it.
Sure, the app is unusable trash, it's full of bugs, eats your HD, and loads viruses in place of your family photos, but it was easy to produce, huh?
Most Java apps, I would note, are written the Windows Way. Except Java doesn't really have Visual Basic, so the programmers make the most bare-minimum GUI possible, using the default unspeakably hideous "Metal" look and feel. I remember back in JDK 1.1, when Swing first got the Metal theme, and I thought it was a programmer joke that'd gone too far, and surely they'd never release it with that theme, but I was wrong. Never attribute to excessive humor what can be attributed to bad taste, I guess.
The Unix Way
Write a command-line app that takes text on standard input or a filename as an argument, with a minimum of 20 command-line options. Make sure it works perfectly on the 2 or 3 regression test cases you set up, but don't worry about anything else.
Now throw together a dialog box with a field or checkbox for every command-line option, and a big GO button. If you use Tk for this, you can make the most absolutely hideous interface possible on every desktop, and thereby drive people to the command line instead. Since the command line is superior, this is the best outcome. Ship it as a bundle of source code, a makefile, and a GPL license. You don't want anyone using your program who can't compile C code.
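The whole genre fits in a dozen lines. A deliberately minimal sketch in Python (a hypothetical program, using the era-appropriate optparse module):

```python
import optparse  # the 2007-era standard library choice

# Step one of the Unix Way: a command-line app with a pile of options.
parser = optparse.OptionParser(usage="usage: %prog [options] FILE")
parser.add_option("-v", "--verbose", action="store_true", default=False,
                  help="narrate everything")
parser.add_option("-o", "--output", default="-",
                  help="output file, '-' for stdout")
# ...imagine 18 more options here, one per future dialog-box checkbox...

opts, args = parser.parse_args(["-v", "input.txt"])
```

Step two, the Tk dialog with a checkbox per option and a big GO button, is left as an exercise.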
Sure, it's unusable trash, but at least it works, right?
(Note: In my former life as a Linux-based Java/Python programmer, I was guilty of this myself. RandPod is a hack I threw together in an hour to get music onto my Treo, before I got an iPod. Nobody should use RandPod unless they're totally desperate.)
The Macintosh Way
Think about what you want the user to be able to accomplish, and what kind of screen interfaces they might interact with. Design this in Interface Builder (even if you're going to write it in some other language), but don't wire it up to any code yet. Just play with it in IB.
If you haven't already, go read Jakob Nielsen's web site useit.com and the Nielsen Norman Group books, especially Usability Engineering, which is, I would say, the best book on the science of measuring usability ever written.
When you're ready, show it to people who aren't programmers (programmers are not people, we think in a different way than humans, and are unsuitable for testing), and get their opinion, and PAY ATTENTION. You can't just take them at their literal word, because users aren't quite speaking the same language you are, but you can translate. My first pass of translation from user-ese to designer-ese is:
"I can't figure out how to X" means I need to bring feature X or the object it manipulates to the top of the interface, it's almost certainly buried too far down.
"X sucks because it's not like MyFamiliarProgram" means I should make my app even less like MyFamiliarProgram so there's no confusion.
A good app pays attention to what the users want to do, and drives them like a piledriver into the app to accomplish their task. Once you pick a path, you should have no choice but success or returning unharmed to the start, not wandering aimlessly through the app with the task half-finished. A great app does that and has one strong vision behind it, but until you develop a consistent style, the users know better than you do when an app is pleasant to use or not.
Okay, so you've tested the UI. Throw out your current NIB files, and rebuild the UI and hook it up to some prototype code, that just uses test data. Now run your usability tests again, and incorporate any changes needed.
You're getting close, now. You can start implementing real functionality. Big chunks of code. If you set up your app with proper Model-View-Controller separation, it's easy to write unit tests which are driven by a test harness and not the "real GUI", so you should do that.
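As a sketch of what that separation buys you (hypothetical names, with Python standing in for whatever your app is written in): the model knows nothing about the view, so a test harness can drive it directly, and no "real GUI" ever has to open.

```python
class PlaylistModel:
    """Pure model: no GUI toolkit imports, just data and rules."""

    def __init__(self):
        self.tracks = []

    def add_track(self, title):
        # Validation lives in the model, so every view gets it for free.
        if not title.strip():
            raise ValueError("track title must not be blank")
        self.tracks.append(title)

    def track_count(self):
        return len(self.tracks)


# The test harness drives the model directly -- no window ever opens.
model = PlaylistModel()
model.add_track("Shipping It")
assert model.track_count() == 1
try:
    model.add_track("   ")
except ValueError:
    pass  # rejecting bad input is the model's job, not the view's
```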
When the entire app is hooked up, you can finally begin the last round of user testing, to make sure nothing slipped through. When the users have the reactions of "OMG I'm totally addicted, here, take my credit card and my first-born child", you can ship. If not, go back to work.
The Despot Sociopath Theory
Today, "miclorb" in #javaposse said "I have noticed good UI designs usually have some despot sociopath behind it. Someone with a single vision who just pushes and pushes." Certainly that has the ring of truth to it. We all laugh about Steve Jobs and his "Reality Distortion Field" and call him "Pope Steve" when he makes big pronouncements to the faithful, but "sociopath with impeccable taste" is probably the best description there can be of him. Jean-Louis Gassée is similarly an obsessive perfectionist, and made Newton and later BeOS have singular design visions. Every designer (or manager of designers) I've seen who was any good was that obsessed and mad.
What drove me insane, made me the annoying, obsessive design freak I am now, was using Linux. I'd had years and years of being indoctrinated by the Macintosh Human Interface Guidelines and the Atari ST design guidelines, then OS/2 and IBM's VERY focused attention to user interface flow (it didn't have to be pretty, just responsive and consistent).
Even then, it didn't really sink in until I had to use Linux day in and day out, and got more angry, HULK SMASH! angry, every day at how awful it was. Edit /etc/X11/XF86Config with some parameters you can only find by reading the source code for the video card driver, then recompile your kernel, and maybe you can get video working. Repeat for sound, but you can't get two channels of sound, even though the card supports it. Now pick one of the dozens of window managers, pick one of dozens of themes, and try to run a few apps. Oh, but this one's KDE, this one's GNOME, this one's Motif, and soon you're out of memory, and they all look and act completely different. And they're all designed the "Unix Way", so they suck.
The problem is the lack of organization to the Linux desktops. The kernel is a despotic kingdom, and while I'm no more a fan of monolithic kernels than Andrew S. Tanenbaum is, it's a consistent piece of software. But on the desktop, there is nobody saying to the Linux developers, "This is what a Linux app looks like, this is how it should act. This app is friendly, this app sucks." There are no market pressures, because it's impossible to make a living selling Linux software when people will just recompile it for free. Sometimes Linus will say "GNOME sucks, I just tell people to use KDE", but that's not really guidance.
When I switched to Mac OS X, suddenly everything worked, and worked in a consistent, user-friendly manner. I turned over a new leaf, and started making my software no longer suck to use. It's a long, slow process to learn UI design, but the end result is worth it: making insanely great software, not crappy software.
Where Do We Go Now?
Part of the solution is simply training. Learn how to make usable software. It's painful at first, especially when you start getting user feedback and it crushes your ego, but when you make usable software and people tell you they love your software, it's the best feeling in the world.
Part of the solution, though only secondary to actually learning usability design, is technological. Some platforms are not viable for making good, modern, usable software. You can fight the local idioms and do so despite them, but why not go with the good stuff in the first place?
Obviously, the first and best choice is Mac OS X native apps, written in Cocoa, in Interface Builder and Xcode. Hit the Apple Developer Connection, get a free Web membership, start reading the docs and sample code, and try it yourself. Interface Builder is worth using even if you never ship a Cocoa app.
Desktop Java apps and Java applets have recently staggered, zombie-like, back to life (I never quit making them, but I'm a freak), and Java cell-phone apps on platforms like the BlackBerry have done extraordinarily well. Whether Sun can do anything about the ugly look-and-feel of Swing and the AWT remains to be seen. JavaFX is a neat framework for making GUIs, except that A) it's a nerdy programming language, not a graphical designer like a real UI designer would want, and B) it has no human interface guidelines. JavaFX may turn out to be worse than Flash for usability, but there are good designers who are embracing and using it, and it can be used for good instead of evil.
It's too late for Linux. It's poisoned, full of garbage apps, and the native population are almost entirely programmers without any taste or design sense, so they won't appreciate it if you do write good apps.
Windows seems to deliberately go against every principle of good user interface design more aggressively and hideously with every new version; Vista is full of shiny eye candy, but it's rotten and poisonous to actually use. It's presumably possible to make good apps for Vista, but few people ever try, and even fewer of them succeed. Microsoft's current strategy is to push desktop Windows apps into C# and .NET, but they offer no guidance or tools for making anything pleasant to use. They also push "web apps" of a sort with Silverlight, which is just another Flash-type thing, without even Flash's limited attention to usability.
Mozilla wants to revitalize their old XUL technology. This is not a terrible idea, and good, usable software can be built with XUL. But XUL is unpleasant to develop in, and is not as powerful as modern AJAX and web toolkits. Why tie yourself to slow, bloated Firefox?
I started playing Pirates of the Caribbean Online last week, and it's an extremely fun, simple little MMO (for Mac and that other platform) about killin' British Navy guys wit' their fancy red coats an' all, and shootin' undead pirates what ain't even alive no more, and sinkin' ships of the British Navy, the East India Trading Company, and the undead. It be, in short, a whole rum barrel full of awesome. Arr!
But one behavior of online players weirds me right out. Not the obsession with talkin' like a pirate, that be perfectly normal, landlubber! No, it be when some pirate ye have ne'er fought alongside or spoken to wants ye to join their crew, or be their friend (even if, as in POTCO, it's just as a "Pirate Friend", not a "True Friend"). That's just way too intimate, way too fast. If we fight in the forest together and I see that ye be a true pirate with cannonballs of solid brass, if'n ye know what I mean, I'll offer me dialog box of friendship to ye. But ye're just askin' for some treacherous devil like me to stab ye in the back, steal your coin pouch, and make off wit' yer girl if ye ask a stranger to be yer friend. What be ye thinkin'?
Oh, and last night I learned voodoo powers. I can now make someone be attacked by a swarm of bees just by waving a voodoo doll in their face. Muahahahahaha!
If you have your firewall active on Leopard, WHICH YOU SHOULD, DO IT NOW, Skype will only launch once. Second time, it'll just die.
It turns out that Skype modifies itself, and Leopard's firewall sees that as a virus and won't let it run again. The solution is simple: Run Skype from the DMG every time, and it can't modify itself. You'll have to hit "Allow" on the "launch this scary new app" and "bypass firewall" dialogs, giving you the Vista experience, but at least it works.
Skype's soi-disant "technical support" has a different suggestion. They say you should just turn off your firewall!
The best response to people this maliciously incompetent would be to quit using Skype, but sadly it's our office IM system. <sigh>
For reference, the firewall settings live under the Apple menu | System Preferences | Security | Firewall; select "Set access for specific services and applications".
In Leopard, select someone you're iChatting with, and hit Buddies | Ask to Share X's Screen.
Once they accept, you are now sharing control of their screen. That's it! Your screen is minimized in the corner, where you can flip back to it (drag & drop? I didn't try that), whatever.
This is, with no exaggeration, magic.
I've used VNC, Remote Desktop, X11, and so on for years and years. Sharing a screen to fix someone else's computer or see what they're seeing was a total pain in the ass. Until now.
Leopard Day came (I went to the Bellevue Apple store to get an iPod Touch and a free Leopard t-shirt!), and I started my Leopard download from the ADC (yes, Leopard is "free" for developers, but $500/year "free")... And a mere 12 hours later, I had a DMG... With an invalid checksum. I was heartbroken. Crushed. Thankfully, yesterday I was able to download it at work, got a good copy, and am now running Leopard. Still, it's a serious problem that consumers got Leopard before the developers did (some developers will wait a week or more for the November Dev DVD to arrive). Apple, please address this in the future. Developers pay serious money for ADC and WWDC passes because they need to know the future before the consumers.
The "Upgrade" process worked almost flawlessly for me, despite this being a heavily-used development machine. Make a good backup (I know, without Time Machine that's asking a lot...) and try it. If it fails, the worst that can happen is you have to erase and install, but you still have a backup.
All of my apps Just Worked in Leopard. I shouldn't really be surprised by this, but it's a pleasant change from the beta seeds, which were not so great at running every random thing.
In Console.app, I saw a lot of "com.apple.launchd (com.apple.dyld) Throttling respawn: Will start in 60 seconds". See this MacRumors thread for a solution:
sudo update_prebinding -force -root /
and then reboot.
Xcode 3.0 initially refused to install. I've had 2.4, at least one beta, and 2.5 on this box, so that's unsurprising. Running the developer tools uninstaller script to clean out the old installs, and then re-installing Xcode 3.0, worked nicely.
I was able to subscribe to the Apple documentation set this morning and get an update from them, instead of having to download the dmg from developer.apple.com as before, so finally that's working. From now on, Xcode should just update itself on a regular basis; I'll have to check back and see if that's still true in a month, but it looks promising.
Those who've used Terminal in previous seeds know how great the tabbed Terminals are: no more running 'screen' and having to hit ^A<ESC>^B to scroll back, etc., just use it like a normal tabbed app.
'ls' has been upgraded: it now shows "@" after the permissions if the file has metadata, and you can use
xattr -l FILENAME
to list the metadata and contents. I can't find a proper man page for xattr, but 'xattr -?' gave basic instructions. This is actually kind of a big deal; with these tools, metadata is now easy for developers to find and work with.
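For programmatic access, here's a hedged sketch in Python. Caveat: os.setxattr and os.getxattr are the Linux spellings, and were added to Python well after this era; on the Mac you'd shell out to the xattr tool above (or use the getxattr(2)/setxattr(2) C calls). The guards cover platforms and filesystems that don't expose user xattrs.

```python
import os
import tempfile

# Create a scratch file and hang a piece of metadata off of it.
fd, path = tempfile.mkstemp()
os.write(fd, b"hello")
os.close(fd)

if hasattr(os, "setxattr"):  # the Linux spelling; not exposed on the Mac
    try:
        # The "user." prefix is required for unprivileged xattrs on Linux.
        os.setxattr(path, "user.comment", b"tagged from Python")
        value = os.getxattr(path, "user.comment")
    except OSError:
        value = None  # filesystem mounted without user-xattr support
else:
    value = None  # on the Mac, shell out to xattr(1) instead

os.unlink(path)
```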
'ant', 'mvn', and 'svn' are now installed standard. This will make it quite a lot easier for developers to set up a new machine and get working; a stock, unmodified Leopard has all the tools you actually need. And oh, yeah:
Python 2.5.1 (r251:54869, Apr 18 2007, 22:08:04)
[GCC 4.0.1 (Apple Computer, Inc. build 5367)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import turtle
A standard, current production version of Python! YEE-ha!
Now, the one dark spot: Java. Java 6 is not on Leopard. I believe, from things said by Apple engineers, that there were some issues that kept it from an initial release, but it'll be out soon. In the mean time, just be happy that Java 5 works. They cleaned up the Swing look & feel quite a bit, they made it 64-bit, seems pretty stable; I've been working in MyEclipse last night and this morning, and had zero problems.
I really wouldn't suggest deploying for Java 6 yet, anyway, but it is inconvenient for developers to not be able to develop against it. Still, this is going to be a short-term problem, and freaking out or saying you're leaving Mac (as James Gosling has, in favor of Solaris, of all the ludicrous choices possible), that's an overreaction. Patience, people.
If you are going to complain, please complain in Radar, where Apple will read it and may even give a damn, NOT on the goddamn email@example.com mailing list, which is for technical issues. This post by the infamous Hani Suleiman summed it up nicely.
The Java "community"'s hysterical reaction to only having Java 5 has been so immature, so totally ignorant, that I'm seriously reconsidering my use of Java; I have been for a while, since native Cocoa apps are far superior technology for desktop apps and games, but this is driving another nail in the coffin. I don't want to be associated with these people in any way.
Hey, Whiny Java People: Why didn't you whine when Vista shipped without any JVM at all? (or does it still have J++ 1.1?)
- Me: 29 hours to go.
- min: got your tennis shoes so pope steve can pull a heaven's gate?
- Me: If Pope Steve promised us a spaceship, it wouldn't be some preposterous suicide pact. It'd actually be a real spaceship, with real aliens.
They might require us to submit to anal sex to get on-board, but fuck, it's a real spaceship.
I've finally worked out what I think is a reasonably robust logging system for my debugging code, which I can disable when I'm ready for production:
In the project's Debug build configuration, set preprocessor macro DEBUG=1
// DLOG takes a format argument and 0 or more args: DLOG(@"%d", x);
#ifdef DEBUG
#define DLOG(fmt, ...) NSLog(@"%s: " fmt, __PRETTY_FUNCTION__, ##__VA_ARGS__)
#else
#define DLOG(fmt, ...) do {} while (0) /* compiled out of release builds */
#endif
Now I can use
DLOG(@"foo"); instead of
NSLog(@"foo"), and the console contains:
2009-02-03 04:05:06.789 AwesomeApp[6109:a0f] -[AwesomeAppDelegate someMethod]: foo
__PRETTY_FUNCTION__ inserts the current classname and method selector.
##__VA_ARGS__ is a GCC extension that removes the preceding comma when no arguments are given, so DLOG(@"foo"); compiles cleanly.
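For comparison, here's the same trick sketched in Python (a hypothetical helper, not part of the macro above): grab the caller's name from the stack, and make the whole thing a no-op when DEBUG is off.

```python
import inspect

DEBUG = True  # flip to False and dlog() goes silent, like the release build

def dlog(fmt, *args):
    """Rough Python analogue of the DLOG macro."""
    if not DEBUG:
        return None
    # Like __PRETTY_FUNCTION__, minus the class name.
    caller = inspect.stack()[1].function
    line = "%s: %s" % (caller, (fmt % args) if args else fmt)
    print(line)
    return line

def some_method():
    return dlog("x = %d", 42)
```

Calling some_method() prints "some_method: x = 42" to the console.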
Chris Stone wrote a while back:
The GPL effectively prohibits any sort of commercial use. With version 3 due out soon, it gets even more restrictive because of the Microsoft/Novell patent tax pact. The BSD and MIT licenses do not prohibit commercial use. That means that it is possible for someone to make money off of them, i.e., to eat, buy clothes, buy plasma TV’s.
This is the enlightened view. The MIT, BSD, and X11 licenses are the only true "open source" licenses. They are clear, unambiguous, and allow you to share your work in the spirit of scientific research. When others improve on your work, they are free to either continue contributing to that body of knowledge, or go make some money from their unique contribution.
I don't feel bad when someone uses my BSD-licensed code to make money, I feel happy. I intended to give it away, so I'm going to honestly give it away, without lying to you and attaching strings to the gift.
The GPL is bigoted hate speech from a bitter dork who was upset that all his Lisp buddies were leaving the AI lab and getting real jobs at Symbolics, so he first tried to destroy them with an inferior copy of their work, and when that failed, he made it his life's work to destroy the commercial software industry. The GPL is solely about jealousy: they aren't making any money, so they hate anyone else who is making money. It's the same mental disorder that makes hippies hate all corporations and businesses, because they're dirt-poor and stupid, so they resent anyone else who isn't.
The GPL is discriminatory. It is biased against anyone who wants to actually produce commercial software and make a living from it. It is difficult, bordering on impossible, to make money from GPL software. Red Hat, Novell, and a handful of others sell service, because Linux is so hard to use and maintain that most people need service to use it. But if they tried to sell licenses without support contracts for Linux, they'd be crushed by Ubuntu giving it away for free. The most successful business model Novell's found has been getting bribed by Microsoft. Neither of these companies makes any new software, they just repackage someone else's software, and then try to extort money for it. The GPL has led directly to extortion.
The GPL is a bait-and-switch. It shows you functional code that might very well solve your problem, and then says, "Oh, no, you can't actually use this, because you work for a living."
It's difficult, sometimes impossible, to use GPL software in many interesting ways. Readline is a totally useless library for many projects, because the GPL license is poison; if it were LGPL or BSD, it would be ubiquitous. A normal, sane mind would be enthusiastic about that, about seeing their tool be used and make others happy.
And Stallman's not getting more sane with age, either. In a Groklaw interview, he says:
"Q: One final question. We're seeing more and more devices, and I'm thinking specifically of games consoles -- I know that my kids have one in the house -- where there is no --"
"Richard Stallman: I wouldn't. You have to learn how to say no to your kids."
"Q: That's true, that's true, I wouldn't deny it. Now, there is no free software at all for devices like this [correction: Yellow Dog supports some console(s)]."
"Richard Stallman: That's why there is no possible ethical way you could use one, and so you shouldn't have it."
Great. Now he's calling everyone who plays videogame consoles unethical. Is there no end to this blackguard's insults against people who just want to use some fucking software? As Henry II asked about Thomas Becket, "Will nobody rid us of this turbulent homeless loser?"
But then, let's look at all the great games which were written as GPL... Oh, wait, there aren't any. If you're a totally obsessive GNUtard (or the pitiable child of such a GNUtard), you can't play any new games, only crappy ripoffs of commercial software. There are no equivalents to Nintendo or Square Enix in the GNU world. Even id Software, who always release Quake and Doom on Linux, don't do it as GPL. (Some developers, including id, do release end-of-life software years later as GPL, just as I do with BSD, but those games weren't written and released under the GPL initially). And why is that? Because if you release a game as GPL, someone else will give it away for free, and you'll go out of business. There will then be no more new games. At least with commercial software, you can fight the pirates with the law. But if you GPL your game, then the pirates are protected by the law.
Chris's thought that the GPL causes slow-downs in development of open source is exactly right. When have you EVER seen a truly innovative piece of GPL software? Everything in GPL is a bad copy of some other software that was developed under a commercial license or a true open source license like BSD. Worse, GPL software damages and even drives out commercial competitors; it doesn't have to be any good, it just has to consume resources, like rabbits in Australia or pigeons in any city.
- Linux is a copy of Unix. BSD Unix is years more advanced than Linux, and Mac OS X (which is based on BSD Unix) is 10-20 years ahead of Linux.
- gcc is just another C compiler, and not a very good one. The Intel compilers compile significantly faster and produce faster and more memory-efficient code from the same source. I'm sure Borland's compilers are still faster and more efficient than gcc, too. There used to be many others, but the widespread availability of a shitty but "free" gcc has poisoned the market. There are alternative CPUs for which gcc is the only real compiler, but that's not a positive feature, that's a tragedy.
- KDE and GNOME are hideous, difficult, and unstable desktop environments. I'm appalled that these are what pass for a desktop environment on Linux. While I have few kind words to say about Windows, at least their desktop is better than KDE. There's no comparison at all to Mac OS X. GNOME isn't even basically functional... GNOME is one of the worst pieces of software I have ever seen in my life.
- The GIMP is... almost decent. It's not innovative in any way, it's still an inferior copy of Photoshop, but at least for once a GNU program isn't a complete piece of shit. I include it in this list because it shows the best possible result for a GPL program: not a complete piece of shit, but still a ripoff.
- The FSF is trying to make "Gnash", a replacement for Flash 7, and it's apparently as attractive and functional as the name makes it sound. Adobe already has a free version of its Flash 9 player for Linux. Not that I understand why they bother; they get nothing but hate from the FSF and a lot of the Linux community for providing Flash and Acrobat, even though they give away free client software.
- OpenOffice.org is scarily ugly and barely functional. Now, it's interesting that Sun's found a way to use GPL as a weapon against all other office suites, and put out this crappy free version and then charge for the slightly less terrible StarOffice version. But it's ultimately just another MS Office clone. Compare that to, say, Apple's iWork suite (Pages, Keynote); Pages is graceful and attractive and works in a very different way from Word. It's certainly not as complex, and that's a virtue. That's something that would never happen with GPL software.
Yes, there is bad closed-source software, too, every Microsoft product being the canonical example... But normally the market weeds them out, and for every bad closed-source commercial product, I can show you a dozen crappy GPL equivalents.
Because he lives in the Bizarro universe where black is white, up is down, and cats and dogs live together, Stallman doesn't even care about functionality or originality:
"Write letters to the editor whenever you see a newspaper or magazine praise non-free software, by judging it according to shallow criteria, only caring what job it would do and what's the price and not caring whether it respects your freedom."
Functionality is not shallow. Functionality is the purpose of software. To get the job done. To do it cost-effectively, efficiently, reliably, in an easy-to-use and attractive manner.
Whether or not you can modify a piece of software is meaningful only to a handful of programmers. It makes no difference at all to the users. To a working programmer, GPL software is useless, because you can't include it in your software at work. So the only ones who find GPL software's "freedom to modify" useful are bored college students and useless hippies.
Ultimately, the GPL is about restricting the rights of programmers to do as they wish with the software they write. Someone who loves liberty would allow and encourage every programmer to release their software under terms that they find acceptable for their own needs. For some people, that'll be commercial; for some, BSD. But if he had the political power, Stallman would put a gun to the head of every programmer and force them to use the GPL, and would put a gun to the head of every user and force them to only use GPL software. His motivation is to steal your software and make it part of the FSF, so that all new software development ends, all commercial software goes out of business, and finally the demon of jealousy screeching in his head can stop.
Stop giving a crazy person power over you. Don't use the GPL.