A Glimpse At the Future
I first saw the movie Minority Report in 2002. Overall, just an OK movie. What struck me as absolutely amazing, however, was the manner in which all these 'people of the future' used their computers. They didn't sit at their desks pounding away at keyboards while moving a little cursor around their screen. Instead, they manipulated what the computer threw their way as if these digital creations were physically present. It's as if their computers could on-the-fly construct something with real volume and mass ... then you reach out and touch it.
If you haven't seen the movie or don't recall what I'm describing, watch the clip below.
I walked away from the movie thinking, Why in the world doesn't this technology exist? Up to that point in time there had been no significant change to a computer's form-factor since Steve Jobs and his buddies whipped up the Apple I in 1976. Sure, our computers have gotten faster, flat screens made them look sexier, laptops made them (somewhat) portable and the internet gave us amazing new ways to use them. But at each iteration you always used the computer the same way: typing on a keyboard and pointing with a mouse.
Change Is-A Commin' ... At the Speed of Flowing Lava
I felt that change was just around the corner and that these 'futuristic' computers were bound to start popping up soon. Not wanting to miss the next wave of innovation, I monitored all related developments closely.
That proved to be as exciting as watching grass grow. There looked to be no real advancements in HCI (Human-Computer Interaction) models with any potential commercial success. The keyboard and mouse just would not budge.
Dethroning the keyboard and mouse proved to be an ultra-marathon instead of a sprint, but I definitely wasn't alone in cheering on the effort. The computers depicted in Minority Report were influenced by active research being done at Microsoft. Over time, the whole concept even got a name: NUI - Natural User Interface (as opposed to the traditional GUI - Graphical User Interface). Communities sprang up on the internet (such as the NUI Group, started in 2006), and people with Fry's tabs and an ample supply of duct tape started building their own next-generation computers.
As interesting as these developments were, they weren't exactly something you could easily explore in hopes of creating the next exciting commercial offering. These were things better left to ultra-geek hobbyists and PhDs with corporate and academic R&D budgets.
2007 - First Signs of Life
In 2007 things started to heat up a bit. While computer interaction using 3D gestures was still far, far away, computer interaction using 2D gestures started making its way out of the labs. Microsoft announced their Surface Computer with no mouse or keyboard in sight (see it in action in the video below). At ~$10K, though, Surface did not target your average gadget-obsessed consumer. It started out and remains to this day a 'pie-in-the-sky' device for most (all sane) people. I have yet to see one in the wild.
Then there was the iPhone.
With hindsight being 20/20, the iPhone is, in my opinion, the most significant step to date towards changing how we interact with computers. Oddly enough, I didn't see it that way at first. I wasn't looking for a change in how we interact with cell-phones; I was looking for a change in how we interact with our PCs. Turns out they are one and the same. Cell-phones and PDAs became smart-phones, smart-phones became super-smart-phones, and now I can do 80% of what I did on my laptop on my iPhone. The iPhone brought the first step towards those Minority Report computers to the masses - Multi-Touch.
2009 - I Make My Move
Being too short-sighted to start developing iPhone apps in 2007 (after all, they were just phones. Right? Wrong!) ... I waited ... and waited. I finally stopped waiting in the second quarter of 2009 after sensing what I thought was the series of events that would lead to this much anticipated paradigm shift: Windows 7 Beta and .NET 4.0 Beta became available for download.
In case you're a casual Windows user: Windows 7 has native OS support for multi-touch input, and .NET 4.0 offers an API for developing applications capable of responding to multi-touch input (both borrowing technology from Microsoft's Surface efforts). With native multi-touch support enabled in an OS with such a huge market share, I figured it was just a matter of time (2 years max) before everybody and their mother had a multi-touch monitor. I dumped two grand on a prototype multi-touch monitor, a library of books on .NET and WPF, and started cranking out code.
Throughout 2009, everyone was talking about an Apple tablet. With Windows 7 on the horizon, other computer vendors (Dell, HP, Asus, Lenovo, etc.) started promising tablets as well. MS was leaking info about their own tablet computer, the Courier. For God's sake, even Michael Arrington (blogger extraordinaire) was going to build one! 2010 was branded as The Year of the Tablet and I was feeling pretty good. I had a well thought-out product concept and progress on the prototype was moving along well. All I needed was to have the software ready when multi-touch monitors feeding Windows 7 PCs were on everyone's desks.
The iPad Changes Everything
January 2010 came. Apple announced the iPad. Everything changed.
All this talk about tablets in 2009, yet no one really figured out what the hell a tablet really is. I certainly didn't pretend to know exactly what a tablet would look like or how it would function. I just knew one was coming and had a product in mind that leveraged multi-touch to help remote knowledge workers communicate more effectively.
No one knew except for Apple. With the iPad, Apple told everyone what a tablet is: A highly portable, relatively affordable device whose primary purpose is personal content consumption and (according to a new study) gaming. A tablet is something you leave on your coffee table, not something you take to work. Consumers agreed and Apple sold an iPad every three seconds. When all the other would-be tableteers got the memo, they went back to the drawing board.
The first thing they cut out was Windows. Such a bloated, general-purpose OS had no place on a light-weight device expected to have a 10+ hour battery life (I can go days without charging my iPad). As for usability, no matter how many multi-touch bells and whistles MS adds, Windows 7 is still bursting at the seams with a 'click-me' interface. Windows 7 (in my opinion a very good keyboard/mouse PC OS) can't even begin to compete with the tight iOS/hardware integration necessary in a mobile device and available in the iPhone and iPad. Even MS hopped on the band-wagon by pulling the plug on the Courier. 2010 did not end up being The Year of the Tablet, but rather The Year of the iPad.
After seeing the iPad, it's all blindingly obvious. I wish it would have been so obvious when the iPad was just a rumor ...
Back to the Drawing Board
Once multi-touch applications on Windows stopped making sense, I found myself staring at eight months' worth of work that I could only label as a 'learning experience'. Two days after the iPad announcement, I drove to the nearest Apple store and started acquiring the tools of a new trade: iPhone OS development. I jumped on Amazon, ordered a new library of books on the iPhone SDK, Objective-C, and the App Store ecosystem, and started climbing what would prove to be a surprisingly steep learning curve. It's now been 7 months and there's still much to learn.
Coming to an iOS Device Near You
I'm a handful of weeks away from submitting my first iPhone app to the App Store. If you own an iPhone and enjoy playing golf, you're gonna love it. I'll then take all the bumps and bruises from this effort and start working on an iPad app that I'm very excited about. If the iPhone app gets enough traction, who knows, maybe I can even make this a day-job rather than my 9:00 PM to 3:00 AM shift.
All I know for sure is that tablets are here to stay and if you allow yourself to daydream a bit you can start to see some pretty amazing new ways to use and play with all that silicon based horse-power.
Stay tuned.
Here's something to ponder:
Active Contact Lenses: put them on and start controlling the screen, blink left eye - select, blink right eye - open context sensitive menu, open your eyes like you just saw Martians land in your backyard - zoom in, narrow your eyes like in "what the heck is Alex talking about" - zoom out... you get the idea