.................with apologies to Alistair Cooke
Friday, 29 January 2010
Wednesday, 27 January 2010
All in all, the iPad turned in a pretty exciting product debut. I don't think Apple will have any trouble selling these things, and it can't be a very good day at Amazon, Sony, or Barnes & Noble's executive suites.
While not all the final info is out yet, there were a few major omissions from the iPad hardware. Here's the highly desirable stuff that turned out to be missing:
- No camera, which means no video conferencing. No quick shots for blog posting. No videos.
- No Verizon. The AT&T pricing looks good, but is it really unlimited or is there a 5GB ceiling? Many users are pretty desperate to get away from AT&T, so it was surprising Apple went for another partnership with them.
- No notifications. Not a word was said about them. They might be in there, since the iPad clearly runs iPhone apps (and what iPhone app doesn't notify you these days?) but nothing was demoed.
- No enhanced multitouch. As far as we can tell, it works the same as the iPhone -- no dynamic tactile interface, no pressure-sensitive screen, nothing special that we know about yet.
- No TV content. Of course there are the iTunes deals, but Apple has apparently been scrambling around to make some DVR deals as well. So far, nothing.
- No multitasking. Perhaps the biggest disappointment: no streaming media apps while punching out a document in Pages. No MLB video running in a corner while you read your mail, or pulling up a PDF while chatting with a friend.
I think the iPad will be a superior device, and will sell like the proverbial hotcakes. Apple will certainly extend and enhance the iPad over time, but it would have been great to see some of these things in the initial release.
Anything else we missed that they missed?
Saturday, 23 January 2010
Tech Reflections - Digital Muse for Beat Poet - NYTimes.com: "Why I Take Good Care of My Macintosh
By Gary Snyder
Because it broods under its hood like a perched falcon,
Because it jumps like a skittish horse and sometimes throws me,
Because it is poky when cold,
Because plastic is a sad, strong material that is charming to rodents,
Because it is flighty,
Because my mind flies into it through my fingers,
Because it leaps forward and backward, is an endless sniffer and searcher,
Because its keys click like hail on a boulder,
And it winks when it goes out,
And puts word-heaps in hoards for me, dozens of pockets of gold under boulders in streambeds, identical seedpods strung on a vine, or it stores bins of bolts;
And I lose them and find them,
Because whole worlds of writing can be boldly laid out and then highlighted and vanish in a flash at ‘delete,’ so it teaches of impermanence and pain;
And because my computer and me are both brief in this world, both foolish, and we have earthly fates,
Because I have let it move in with me right inside the tent,
And it goes with me out every morning;
We fill up our baskets, get back home,
Feel rich, relax, I throw it a scrap and it hums.
Copyright Gary Snyder, used by permission"
Thursday, 21 January 2010
Some people want the Apple Tablet to run Mac OS X's user interface. Others think its UI will be something exotic. Both camps are wrong: The iPhone started a UI revolution, and the tablet is just step two. Here's why.
If you are talking hardware, you can speculate about many different features. But when it comes to the fabled Apple Tablet, there are basically three user interface camps at war. On one side there are the people who think that a traditional GUI—one built on windows, folders and the old desktop metaphor—is the only way to go for a tablet. You know, like with the Microsoft Windows-based tablets, and the new crop of touchscreen laptops.
In another camp, there are the ones who are dreaming about magic 3D interfaces and other experimental stuff, thinking that Apple would come up with a wondrous new interface that nobody can imagine now, one that will bring universal love, world peace and pancakes for everyone—even while Apple and thousands of experts have explored every UI option imaginable for decades.
And then there's the third camp, in which I have pitched my tent, who says that the interface will just be an evolution of an existing user interface, one without folders and windows, but with applications that take over the entire screen. A 'modal' user interface that has been proven in the market battlefield, and that has brought a new form of computing to every normal, non-computer-expert consumer.
Yes, people, I'm afraid that the tablet will just run a slightly modified version of the iPhone OS user interface. And you should be quite happy about it, as it's the culmination of a brilliant idea proposed by a slightly nutty visionary genius, who died in 2005 without ever seeing the rise of the JesusPhone.
This guy's name was Jef Raskin.
The incredible morphing computer
Raskin was the human interface expert who led the Macintosh project until Steve Jobs—the only guy whose gigantic ego rivaled Raskin's—kicked him out. During his time at Apple, Raskin worked on a user interface idea called the 'information appliance,' a concept that was later bastardized by the Larry Ellisons and Ciscos of this world.
In Raskin's head, an information appliance would be a computing device with one single purpose—like a toaster makes toast, and a microwave oven heats up food. This gadget would be so easy to use that anyone would be able to grab it, and start playing with it right away, without any training whatsoever. It would have the right number of buttons, in the right position, with the right software. In fact, an information appliance—which was always networked—would be so easy to use that it would become invisible to the user, just part of his or her daily life.
Sound familiar? Not yet? Well, now consider this. Later in his life, Raskin realized that, while his idea was good, people couldn't carry around one perfectly designed information appliance for every single task they can think of. Most people were already carrying a phone, a camera, a music player, a GPS and a computer. They weren't going to carry any more gadgets with them.
He saw touch interfaces, however, and realized that maybe, if the buttons and information display were all in the software, he could create a morphing information appliance. Something that could do every single task imaginable perfectly, changing mode according to your objectives. Want to make a call? The whole screen would change to a phone, and buttons would appear to dial or select a contact. Want a music player or a GPS or a guitar tuner or a drawing pad or a camera or a calendar or a sound recorder or whatever task you can come up with? No problem: Just redraw the perfect interface on the screen, specially tailored for any of those tasks. So easy that people would instantly get it.
Now that sounds familiar. It's exactly what the iPhone and other similar devices do. And like Raskin predicted, everyone gets it, which is why Apple's gadget has experienced such a raging success. That's why thousands of applications—which perform very specialized tasks—get downloaded daily.
The impending death of the desktop computer
Back in the '80s, however, this wasn't possible. The computing power wasn't there, and touch technology as we know it didn't even exist.
During those years, Raskin wanted the information appliance concept to be the basis of the Mac but, as we know, the Macintosh evolved into a multiple purpose computer. It was a smart move, the only possible one. It would be able to perform different tasks, and the result was a lot simpler than the command-line based Apple II or IBM PC. It used the desktop metaphor, a desk with folders to organize your documents. That was a level of abstraction that was easier to understand than typing 'dir' or 'cd' or 'cls.'
However, the desktop metaphor still required training. It further democratized computing, but despite its ease of use, many people then and today still find computers difficult to use. In fact, now they are even harder to use than before, requiring a longer learning curve because the desktop metaphor user interface is now more complex (and abstract) than ever before. People 'in the know' don't appreciate the difficulty of managing Mac OS X or Windows, but watching some of my friends deal with their computers makes it painfully obvious: Most people are still baffled by many of the conventions that some of us take for granted. Far from decreasing over time, the obstacles to learning the desktop metaphor user interface have increased.
What's worse, the ramping-up in storage capability and functionality has made the desktop metaphor more of a blunder than an advantage: How could we manage the thousands of files that populate our digital lives using folders? Judging from my own folder organization, we can barely manage, if at all. Apple and Microsoft have tried to tackle this problem with database-driven software like iPhoto or iTunes. Instead of managing thousands of files 'by hand,' that kind of software turns the computer into an 'information appliance,' giving you a specialized interface to organize your photos or music.
That's still imperfect, however, and—while easier than the navigate-through-a-zillion-folders alternative—we still have to live with conventions that are hard to understand for most people.
The failure of the Windows tablet
As desktop computing evolved and got more convoluted, other things were happening. The Newton came along, drawing from Raskin's information appliance concept. It had a conservative morphing interface and a touch-sensitive screen, but it ended up being the first Personal Digital Assistant and died, killed by His Steveness.
Newton—and later the Palm series—also ran specialized applications, and could be considered the proto-iPhone or the proto-Tablet. But it failed to catch on thanks to a bad start, a monochrome screen, the lack of always-connected capabilities, and its speed. It was too early and the technology wasn't there yet.
When the technology arrived, someone else had a similar idea: Bill Gates thought the world would run on tablets one day, and he wanted them to run Microsoft software. The form may have been right, but the software concept was flawed from the start: He tried to adapt the desktop metaphor to the tablet format.
Instead of creating a completely new interface, closer to Raskin's ideas, Gates adapted Windows to the new format, adding some things here and there, like handwriting recognition, drawing and some gestures—which were pioneered by the Newton itself. That was basically it. The computer was just the same as any other laptop, except that people would be able to control it with a stylus or a single finger.
Microsoft Windows tablets were a failure, and they became a niche device for doctors and nurses. The concept never took off at the consumer level because people didn't see any advantage in using their good old desktop in a tablet format that was even more expensive than regular laptops.
The rise of the iPhone
So why would Apple create a tablet, anyway? The answer is in the iPhone.
While Bill Gates' idea of a tablet was a market failure, it achieved one significant success: It demonstrated that transferring a desktop user interface to a tablet format was a horrible idea, destined to fail. That's why Steve Jobs was never interested. Something very different was needed, and that came in the form of a phone.
The iPhone is the information appliance that Raskin imagined at the end of his life: A morphing machine that could do any task using any specialized interface. Every time you launch an app, the machine transforms into a new device, showing a graphical representation of its interface. There are specialized buttons for taking pictures, and gestures to navigate through them. Want to change a song? Just click the 'next' button. There are keys to press phone numbers, and software keyboards to type short messages, chat, email or tweet. The iPhone could take all these personalities, and be successful in all of them.
When it came out, people instantly got this concept. Clicking icons transformed their new gadget into a dozen different gadgets. Then, when the app store appeared, their device was able to morph into an unlimited number of devices, each serving one task.
In this new computing world there were no files or folders, either. Everything was database-driven. The information was there, in the device, or out there, floating in the cloud. You could access it all through all these virtual gadgets, at all times, because the iPhone is always connected.
I bet that Jobs and others at Apple saw the effect this had on the consumer market, and instantly thought: 'Hey, this thing changes everything. It is like the new Mac after the Apple II.' A new computing paradigm for normal consumers, from Wilson's Mac-and-PC-phobic step-mom to my most computer-illiterate friends. One that could be adopted massively if priced right. A new kind of computer that, like the iPhone, could make all the things that consumers—not professionals, or office people—do with a regular computer a lot easier.
This was the next step after the punch card, the command line, and the graphical desktop metaphor. It actually feels like something Captain Picard would use.
Or, at least, that's how the theory goes.
Potential UI problems they need to solve
For the tablet revolution to happen, however, the iPhone interface will need to stretch in a few new directions. Perhaps the most important and difficult user interface problem is the keyboard. Quite simply, how will we type on the thing? It's not as easy as making the iPhone keyboard bigger. You can read our analysis of the potential solutions here. The other issues involved are:
• How would Apple and the app developers deal with the increased resolution?
• How would Apple deal with multitasking that, in theory, would be easier with the increased power of a tablet?
• Where would Apple place the home button?
The resolution dilemma
The first question has an easy answer from a marketing and development perspective.
At the marketing level, it would be illogical to waste the power that the sheer number of iPhone/iPod touch applications gives to this platform. Does this mean that the Apple Tablet would run the same applications as the iPhone, just bigger, at full screen?
This is certainly a possibility if the application doesn't contain a version of its user interface specifically tailored for the increased screen real estate. It's also the easiest one to implement. The other possibility is that, in case the application is not ready for the extra pixel space, it may run alongside other applications at the iPhone's native 320 x 480 pixels.
Here is a totally made-up example of home-screen icons and apps running on a tablet at full screen:
However, this would complicate the user interface way too much. My logical guess is that, if the app interface is not Tablet-ready, it would run at full screen. That's the cheapest option for everyone, and it may not even be needed in most cases: If the rumors are true, there will be a gap between the announcement of the device and the actual release. This makes sense, as it will give developers time to scramble to get their apps ready for the new resolution.
Most developers will want to take advantage of the extra pixels that the screen offers, with user interfaces that put more information in one place. But the most important thing is that JesusTablet-tailored apps represent an opportunity to increase their sales.
From a development point of view, this represents an easily solvable challenge. Are there going to be two applications, one for the iPhone/iPod touch, and another one for the tablet? Most likely, no. If Apple follows the logic of its Mac OS X resolution-independent application guidelines—issued during the Worldwide Developers Conference in June—the most reasonable option could be to pack the two user interfaces and associated art into a single fat application.
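If Apple does go the fat-application route, the decision at launch reduces to picking the best-fitting set of interface resources for whatever screen the app finds itself on. Here's a minimal sketch of that selection logic in Python; every name, resolution, and bundle path below is invented for illustration, and this is in no way Apple's actual API or the tablet's real specs.

```python
# Hypothetical sketch of the "fat application" idea: one bundle ships
# two sets of interface resources, and the right one is chosen at
# launch based on the device's screen size.

IPHONE_RESOLUTION = (320, 480)
TABLET_RESOLUTION = (768, 1024)   # a guess at the rumored tablet screen

UI_BUNDLES = {
    "phone": {"min_size": IPHONE_RESOLUTION, "art": "art-320x480/"},
    "tablet": {"min_size": TABLET_RESOLUTION, "art": "art-768x1024/"},
}

def pick_ui_bundle(screen_width, screen_height, bundles=UI_BUNDLES):
    """Return the name of the richest UI bundle that fits the screen."""
    best_name, best_area = "phone", 0
    for name, bundle in bundles.items():
        w, h = bundle["min_size"]
        if w <= screen_width and h <= screen_height and w * h > best_area:
            best_name, best_area = name, w * h
    return best_name

# A phone-sized screen falls back to the phone interface...
assert pick_ui_bundle(320, 480) == "phone"
# ...while the bigger screen picks up the tablet-tailored resources.
assert pick_ui_bundle(768, 1024) == "tablet"
```

The real machinery would live in the OS's resource loader, but the decision itself really is this simple: ship both interfaces, measure the screen, and load the richest bundle that fits.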
How to multitask
Most rumors are pointing at the possibility of multitasking in the tablet (and also in iPhone OS 4.0). This will bring up the challenge of navigating through running apps that take over the entire screen. Palm's webOS solves this elegantly, but Apple has two good options in its arsenal, both present in Mac OS X.
The app switch bar or a dock
They can implement a simple dock that is always present on the screen, or that is invoked with a gesture, a button press, or a tap on a screen icon. This is the simplest available method, and it can also be made flashy and full of eye candy.
Exposé is one of those features that people love in Mac OS X, but that only a few discover on their own. Once you get it, you can't live without it. I can imagine a tablet-based Exposé as an application switcher. Make a gesture or tap a corner, and all running applications neatly appear in a mosaic, just like in Mac OS X, except that they won't have multiple windows. The apps could be updated live, ready to be expanded when you touch one of them. Plenty of opportunity for sci-fi'ish eye candy here.
A gesture makes sense for implementing Exposé on the tablet—as you can do on the MacBook Pro—but they could also use their recently patented proximity-sensing technology. In fact, I love this idea: Make the four corners of the tablet hot, so icons appear every time you get a thumb near a corner. The icons—which could be user-customizable—could bring four different functions. One could close the running application; another could call Exposé and bring up the mosaic of all running applications; a third could invoke the home screen, with all the applications; and a fourth, perhaps, could open the general preferences, or bring up a set of Dashboard widgets that show instant information snippets, like in Mac OS X.
Here's an illustration—again, totally hypothetical—of what this sort of Exposé interface might look like:
The trouble with the home button
The physical home button on the iPhone and the iPod touch plays a fundamental role, and it's one of the key parts of the interface. Simply put, it's what lets you exit applications and return to the home screen. On the small iPhone, it makes sense to have it where it is. On this larger format—check its size compared to the iPhone here—things are not so clear.
Would you have a single home button? If so, would you place it on a corner, where it could be easily pressed by one of your thumbs as you hold the tablet? On which corner? If you add two home buttons, for easier access, wouldn't that confuse consumers? Or not? And wouldn't placing a button affect the perception of the tablet as a horizontal or vertical device? This, for me, is one of the biggest—and silliest—mysteries of the tablet.
What if Apple decides not to use a physical button at all? As I pointed out in the Exposé idea, the physical button could easily be replaced by a user-definable hot corner.
Revolution Part Two
With these four key problems solved, whatever extra Apple adds—like extra gestures—is just icing on the iPhone user interface cake that so many consumers find so delicious. The important thing here is that the fabled Apple Tablet won't revolutionize the computing world on its own. It may become what the Mac was to the command-line computers, but the revolution already started with the iPhone.
If Apple has interpreted its indisputable success as an indication about what consumers want for the next computing era, the new device will be more of the same, but better and more capable.
Maybe Apple ignored this experience, and has created a magical, wondrous, unproven, completely new interface that nobody can imagine now. You know, the one that will bring universal love, world peace and pancakes for everyone. I'm all for pancakes.
Or perhaps Steve Jobs went nuts, and he decided to emulate el Sr. Gates with a desktop operating system.
The most logical step, however, is to follow the iPhone and the direction set by Raskin years ago. To me, the tablet will be the continuation of the end for the classic windowed environment and the desktop metaphor user interface. And good riddance, is all I can say.
Wednesday, 20 January 2010
Friday, 15 January 2010
A very funny post:
For reasons that would take too long to explain here, I moved to New Zealand about six months ago. I brought my life with me, including, among goods and chattels more varied than I had realized, my trusty Mac mini, which has been doing sterling duty as a Web and mail server for a year or more. My life also includes a wife and daughter, and they, not surprisingly, came with me too.
This has been an almost entirely unqualified success. The people in New Zealand are friendly, the food is astonishing, and the wine is spectacular. But, even in God's Own Country, nothing is perfect. New Zealand is a truly splendid place to live in many, indeed almost all, regards. But for a techie - and I am, quite unashamedly and unabashedly, one of that number - there are definite quibbles, of which by far the largest is bandwidth, or the lack thereof.
When I lived in America, I was undeniably spoiled, as many Americans tend to be. Life, however shallow it may have been in other regards when one lives in Florida, was certainly easy from a connectivity point of view. My home office had a broadband connection with, as I simply took for granted, took for my birthright, unlimited data. I could slurp down, and throw up, all the data I wanted. The Internet was mine, all of the time.
But when we signed up for our New Zealand connection, we were stunned - stunned, I say! - to discover that the Internet, in New Zealand, is a highly limited and finite resource. We went from 'all you can download' to 'you get 20 GB a month, you'll pay $100 a month, and you'll be grateful for it' in the time it takes to fly from Los Angeles to Auckland (which is, now I come to think about it, a horrendously long time). This was a most atrocious imposition for the Internet junkies that my wife and daughter had become (not me, though, of course - I was far too virtuous, too self-restrained). For all that New Zealand had to offer, the narrowness of its Internet pipes was simply intolerable.
We opted for the 'double your data' option (and the additional $30 per month that wasn't optional), but we still find ourselves limited to 40 GB per month. I check the online usage-meter every few days (using, in the process, a few more precious bytes; oh, the cruel, vicious, bitter irony!), and issue imprecations to Wife and Daughter, reminding them that Facebook is a luxury, not an absolute necessity; they, as addicts always do, try to justify their endless status-checking as being entirely reasonable, indeed essential. I calculate the bandwidth usage of Skype and of YouTube; I flinch when I see Daughter download another Mary-Kate and Ashley movie from iTunes (that's not really a bandwidth issue; that's just on general principles -
I'd cringe if that were happening if we had a free and entirely unlimited T3 connection direct to the trans-Pacific backbone). I have developed new and careful Internet habits: I use the 'Open link in new window' option if I think there's any possibility that I might want to visit a second link from the same page, to avoid potentially having to load the original page a second time, and Apple Mail no longer checks automatically every minute - each check uses several dozens of bytes, I'm sure, and they all add up. I even avoid visiting Japanese or Chinese sites, conscious of two-byte character sets using more than their fair share of bandwidth.
I check my Google Analytics numbers with conflicted emotions: every page view for our various blogs and online presences is, on the one hand, a cause for celebration - more visits, more revenue, more Internet fame and glory. On the other hand, those page views are also an occasion for more hand-wringing, since they were served up from my Mac mini, over my desperately and mercilessly limited Internet connection. I post photography from the beautiful country we now call home, but wince when I see that I've had visits to my site. Even the very act of visiting the Google Analytics Web site eats up a handful of kilobytes that I can scarce afford. Even writing this article is a painful experience; while the catharsis of venting about the
primitivity of our connection is undeniably therapeutic, every adjective, every atom of invective, every single character I devote to letting the world know how abjectly deprived we are is one fewer byte that can be used elsewhere.
The reason for this caution is simple. As soon as we reach our allocated 40 GB - think about that for a second; it's only a gig and a third per day, and the lovely and talented Mrs. McCabe, with whom I share everything, including my bandwidth, is a Web designer - a Gollum-like finger, somewhere in a dungeon buried deep in darkest Auckland, reaches out in the gloom, flicks a switch, and says 'It's dial-up for you. Your bandwidth is mine, it's mine, my precioussss.' And that's it. We're reduced to an Amish connection, one so slow it would be more efficient to hand-write packets of data and strap them to the legs of carrier pigeons. Web pages load - if they load - in minutes, rather than seconds. YouTube is a pipe dream. Downloads, well,
downloads don't. There has been much discussion around the blogosphere in the last month about when the first decade of the 21st century will end. Here in New Zealand that discussion is academic - we're still, at least in terms of Internettery, stuck back in the 1990s. My connection today is so slow that I half-expect to hear the dolphin-screech of a modem actually dialing in to Vodafone as I try to connect, and I'm grateful that I'm not on deadline for this article. Looking at the cave paintings of Lascaux would represent a faster data transfer than the one I'm hobbled with right now.
I have, I would like to stress, been more than diligent in my attempts to figure out where our precious data might be going. My first thought was Skype, given that Daughter spends much of her time video-chatting with friends back in the Northern Hemisphere. I installed iStat Menus; as far as I could tell, a two-way video conference was using only around 120 KBps. But Vodafone's (for they are our current Internet provider) online 'check your usage' tool was reporting that there were days when we used as much as 6.5 GB of data. The day we reached this number (our record so far, by the way) was a school day - I doubt, then, that Daughter's Skyping can be the culprit (she would have
needed 15 hours of non-stop chatting, and while she's good, even she's not that good).
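The arithmetic behind that estimate checks out. At the 120 KBps that iStat Menus reported, a 6.5 GB day really would take about 15 hours of continuous video chat. A quick back-of-the-envelope check in Python, using the article's own figures (decimal units assumed; binary units shift the answer only slightly):

```python
# How long would a 120 KBps Skype call need to run to move 6.5 GB?
rate = 120 * 1000            # bytes per second, per iStat Menus
record_day = 6.5 * 1000**3   # bytes on the 6.5 GB record day

hours = record_day / rate / 3600
print(round(hours, 1))       # → 15.0 hours of non-stop chatting

# And the 40 GB monthly cap really is about a gig and a third per day:
print(round(40 / 30, 2))     # → 1.33 GB per day
```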
I suspected that it might be my server. I was reluctant to give up running my own server after moving to New Zealand because I've localized a handful of my domains - mccabe.net.nz, threelions.co.nz, astralgraphics.co.nz - and it's hard to find U.S.-based hosting services that handle .nz domains. I host my personal site, stevemccabe.net, as well as my clients' sites, through a European hosting-and-reselling service, but they don't offer anything in the Kiwi domain space, so I've bought my domains through GoDaddy. I've become familiar with GoDaddy's DNS setup system, and so, frankly, it's just convenient to register with them and then host myself. That said, GoDaddy's pricing structure for hosting is Byzantine beyond belief (I've had
clients in the past want me to set up their sites on GoDaddy - oh, the power of advertising, especially if it involves scantily clad ladies with large chests - and I now make it a condition of service that I provide hosting as well as design) and life was so much easier when I knew that I had all the Internet connectivity I wanted.
So I looked at the traffic stats on my server. This was a bittersweet experience because on the one hand, no, I wasn't ploughing through my data, which was good, but on the other hand, this meant that my sites weren't getting the traffic I would have liked. Still, at least that was another possible culprit struck from the list.
I issued the sternest of imprecations to my girls, and, to all intents and purposes, stopped using the InterWebs. But no matter how much we cranked back our usage, we still found that we were using - or, at the very least, we were being reported as using - at least several hundred megabytes a day.
It was time to talk to Vodafone. I contacted them several times, and received several different bogus explanations: I had viruses (_ahem_, my network is Apple-only), I had moochers (WPA2 password, a house built of brick, a large garden) - basically, it was my fault, one way or another. It certainly couldn't be Vodafone's fault. I pushed a little further. I was told to install a data tracker - I was even sent Vodafone's recommended monitor, SurplusMeter. I installed it across my network, and it reported, of course, that I was using monstrous amounts of data. The reason was simple - it meters not only wide-area, but also local-area network traffic. My iMac, for example, was
pushing through megabyte after megabyte, even though I had no applications open at all. Well, none that would use the Internet.
Except iTunes. But I wasn't downloading anything. What I was doing was streaming music to my AirPort Express. SurplusMeter was recording every last packet that went out of the data port it was charged with monitoring - in this case, my AirPort card. I called Vodafone again, and explained that the numbers SurplusMeter was reporting were meaningless. They said I should shut down my local network for a day and see what my numbers were like. I did - and on that day my wife's iMac managed not to report a single bit going in or out. Not bad for a Web designer who telecommutes between New Zealand and Florida.
Vodafone's next suggestion was that we had a line fault. This was a possibility - I live in a very old house (we think it's pre-war, but we're not sure which war; my money's on the Boer War) - and one of the call-centre people I spoke to noticed that, while a DSL modem typically reconnects four or five times a day, mine had already reconnected over a dozen times - and I still hadn't finished my first cup of coffee. They assured me that they would look into this, but in the meantime I'd need to disconnect my phone line (a service, mind you, that I pay for) for a day in case there was a problem with my DSL filters. This may, or may not, have been the problem; I have no way of knowing. Maybe they're still running tests. At the very least,
they haven't replied.
Finally, I wrote to Russell Stanners, CEO of Vodafone NZ, at the end of last month. A week or so later, I got a phone call from Vodafone, and, after a long chat, the rep who called me (also called Russell; hmm...) agreed to waive the $199 early termination fee and release me from the one-year contract that we would have been bound to until June 2010.
We're switching to TelstraClear. I'm not doing this because they're particularly brilliant, but because they do one thing that Vodafone don't - instead of dialing us back to pecking-out-bits-on-a-Morse-Code-tapper speeds, they'll keep on selling us more gigabytes. I'm willing to pay for a service (especially a service that I actually receive), but the idea that I only get my 40 gigabytes, and, regardless of whose fault it is, that's it, I'm cut off like a naughty schoolboy, well, that really chafes.
So now we're waiting. Our Internet connection went back to last-millennium speeds after only a fortnight this month, so we're struggling - some evenings we can't tell whether we're offline, or just really slow. And although I signed up to TelstraClear over a week ago, I just had a phone call from one of their reps letting me know that, because of the Christmas and midsummer holiday backlog, they won't flip our switch for another week.
I'll be emailing this article off to TidBITS World HQ shortly. I have no idea when they may get it. The Word document that contains this piece is 41 KB, which, at my current Internet speeds, could take until March to send. It might be quicker for me to save it to a CD, swim to California with the disc between my teeth, walk across the country, and hand it to Adam Engst personally.
[Bio: Steve McCabe is a Mac consultant, tech writer, and teacher who moved, for reasons that have but the most tangential connection to this article, to New Zealand in April 2009. He writes about his adventures in New Zealand, he blogs about tech, and he is currently rebuilding his personal Web site.]
Copyright © 2010 Steve McCabe. TidBITS is copyright © 2010 TidBITS Publishing Inc. If you're reading this article on a Web site other than TidBITS.com, please let us know, because if it was republished without attribution, by a commercial site, or in modified form, it violates our Creative Commons License.
Wednesday, 13 January 2010
I’m not sure what to say about the devastating earthquake in Haiti, other than to encourage you to help with a donation. (The form includes an Amazon payments option — super easy if you have an Amazon account.)
Another good option: Doctors Without Borders.
(Via Daring Fireball.)
Tuesday, 12 January 2010
Google senior vice president David Drummond:
We have decided we are no longer willing to continue censoring our results on Google.cn, and so over the next few weeks we will be discussing with the Chinese government the basis on which we could operate an unfiltered search engine within the law, if at all. We recognize that this may well mean having to shut down Google.cn, and potentially our offices in China.
He also revealed that Google was the victim of a large-scale security attack last month, aimed at getting access to the Gmail accounts of Chinese human rights advocates.
Good for Google.
(Via Daring Fireball.)