Wednesday, December 26, 2007

Hardware Leads in 2007

In some ways you could say 2007 was a year for hardware.

Twenty years ago, Bill Gates and others in the computer world made the bold statement that the future was about software: that the strange world of command and control, embedded deep in the guts of the wires and boxes on our desktops, was really just an intermediary to the software that makes us more powerful.

For the next ten years we stagnated in an ocean of beige boxes, monitors, boxy peripherals, and unimaginative hardware.

And since then, software has dominated the evolution of the computer world. Computer conferences and trade shows were filled with the latest and greatest "next revision" and version updates and what they meant for end users.

Then, a few years ago, the cutting-edge innovations were wifi and distributed computing.

However, 2007 could well be regarded as the year of hardware. Apple's iPhone, iPod Touch and touch screen technologies could be named the biggest product advancements of the year. And touch screen technology could well be at the leading edge of a whole new wave of development and change in our computing experience.

Sure, people will tell you that the iPhone wasn't the first to have a touch screen, or that Apple didn't lead the way. Enthusiasts have talked about touch screens since before the Newton and the Palm Pilot. Ever since the advent of the mouse, techies have envisioned a keyboardless future, yet the typewriter-style input method proliferated rather than died as some predicted in the 1980s.

Yet the touch screen technology employed on the iPhone and iPod Touch makes these devices truly innovative and redefines how users interact with them, rather than working poorly or feeling like an afterthought.

When Jobs squeezed his fingers together to zoom in, tapped to enlarge, rotated to change orientation, and flicked his finger to scroll, everything about the touch screen and its interface suddenly came alive with potential for average users, and the marketplace was magically infused with new life. A new field called haptics made this tactile virtual reality an innovation for the future rather than a mishap of the past.
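Seen from today, a minimal sketch shows how that gesture vocabulary maps onto code. This uses UIKit's gesture-recognizer API, which Apple only published well after 2007; the view controller and handler names are hypothetical, and this is an illustration of the idea, not Apple's actual implementation.

```swift
import UIKit

// Hypothetical view controller wiring the gestures Jobs demonstrated
// onto a single photo. Illustrative only; not Apple's 2007 internals.
class PhotoViewController: UIViewController {
    let photoView = UIImageView(image: UIImage(named: "demo"))

    override func viewDidLoad() {
        super.viewDidLoad()
        photoView.isUserInteractionEnabled = true
        view.addSubview(photoView)

        // Pinch to zoom: fingers squeezed together or spread apart.
        photoView.addGestureRecognizer(
            UIPinchGestureRecognizer(target: self, action: #selector(pinched)))
        // Rotate with two fingers to change orientation.
        photoView.addGestureRecognizer(
            UIRotationGestureRecognizer(target: self, action: #selector(rotated)))
    }

    @objc func pinched(_ gesture: UIPinchGestureRecognizer) {
        // Scale the photo by the incremental pinch factor.
        photoView.transform = photoView.transform.scaledBy(
            x: gesture.scale, y: gesture.scale)
        gesture.scale = 1  // reset so each callback applies only the delta
    }

    @objc func rotated(_ gesture: UIRotationGestureRecognizer) {
        // Turn the photo by the incremental rotation angle, in radians.
        photoView.transform = photoView.transform.rotated(by: gesture.rotation)
        gesture.rotation = 0
    }
}
```

Even in a sketch, the appeal of the design is visible: the gesture maps directly onto a transform of the thing on screen, with no menu or mode in between.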

Scroll wheels and QWERTY keyboards might finally become a less dominant means of inputting data to your computer, music player, cell phone, or palm device. Fortunately for us as computer users, hardware manufacturers are thinking outside the beige box: they will stop picturing computing as someone sitting at a desk, hands on the keyboard, eyes looking straight ahead, and start fitting the hardware to us rather than fitting us to the technology.

And the iPhone was not the only new-wave product using a touch screen to emerge in 2007. Microsoft's multitouch computing table, while not selling as many units worldwide as the iPhone and iPod Touch, is clearly part of this new trend. There have been rumors that Apple is developing a multitouch Mac with a larger screen than the iPhone but with the same direct interaction of tap for click, pinch for zoom, and flick for scroll that exists on the iPod Touch and iPhone.

In his coy way, Jobs has hinted at the future of the Mac interface by saying there are no “verbs” in the iPhone interface, alluding to the way a standard mouse or stylus system works. In menu-command systems, users select an object, like a photo, and then separately select an action, or “verb,” to do something to it.
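A toy sketch makes the distinction concrete; every type below is invented purely for illustration. In the menu-command model the noun and the verb are separate steps, while in direct manipulation the gesture itself is the action:

```swift
// Menu-command model: select a noun, then issue a verb as a second step.
enum Verb { case rotate, zoom, delete }

struct MenuDrivenEditor {
    var selection: String?                  // the currently selected object
    mutating func select(_ object: String) { selection = object }
    func apply(_ verb: Verb) {              // the separate "verb" step
        guard let object = selection else { return }
        print("Applying \(verb) to \(object)")
    }
}

// Direct manipulation: noun and verb fused into a single gesture.
struct DirectEditor {
    func pinch(_ object: String, scale: Double) {
        print("\(object) zoomed by \(scale)")  // one step, no named verb
    }
}

var editor = MenuDrivenEditor()
editor.select("photo.jpg")   // noun first...
editor.apply(.zoom)          // ...then the verb

DirectEditor().pinch("photo.jpg", scale: 2.0)  // noun and verb in one motion
```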

Technology is in the works for wall-sized touch screens that could start appearing in office buildings and conference rooms, or be designed into flat-screen TVs, gaming consoles, and home movie theaters. There is even talk of kitchen refrigerator touch screen magnets that will connect to the internet via wifi. Bath mats and shower stalls next?

Wednesday, December 5, 2007

Apple 23 Years Later

When the first iPhone came out, I was amazed at how much computing power Apple had created for a person's pocket and how, as an electronic device, it has no comparison. And yet there is still something remarkably Apple about the iPhone. It brings back all the same thrills and chills of the original Macintosh, unveiled with the famous commercial during the 1984 Super Bowl.

As with the first Macintosh, I was also astonished at how much Apple left off the iPhone. When the Mac first came out, it was crippled by a single disk drive and no internal storage. When the iPhone came out, it was using the antiquated EDGE protocol for connecting to the internet. Finally, in 2008, Apple will upgrade the iPhone hardware to 3G.

When the first Mac came out, it was limited to 128K of RAM, and the iPhone clearly has too little storage: the 4GB model was discontinued almost immediately, and 8GB is far too small at a time when 16GB (already available on the iPod Touch) and 32GB are becoming widely available in flash-based devices. No doubt the new iPhone coming in January 2008 will have 16GB, and hopefully a 32GB version as well.

When the Mac first arrived, many complained about its one-button mouse (one can argue this philosophically), and the iPhone is limited by having no video capture and no Flash or Java; most vociferously, the iPhone is crippled by its exclusive contract with AT&T. As Walt Mossberg exclaimed, "Break the lock," and now European countries have joined a chorus of complaints against the way Apple has sought to limit the technological freedom of its users.

However, Apple triumphed over all these drawbacks, with new releases and bleeding-edge technology, by making computing devices that completely redefine how we use them. Apple makes devices with jaw-droppingly cool features and easy-to-use interfaces that are simple yet powerful tools.

Monday, December 3, 2007

Beautiful Harmony - Apple and Microsoft


The first application I ever bought for the Mac was Microsoft Word, and it cost me $24 from MacWarehouse. My friends at Microsoft dispute that Word cost so little in those days, but most Mac applications could be had for a 20 dollar bill. That's the truth, Ruth.

When I entered the world of Apple Mac consulting, I quickly became an Apple/Microsoft expert. I taught courses in major accounting and law firms on how to use Microsoft enterprise software on a Mac, and I set up databases and networks. Together, Apple and Microsoft invented a radically new user interface that was graphical, used standard file formats, and made computing easy for millions, rather than the complicated, code-based, command-line computing of a few techno-geeks.

But it wasn't easy. First there was the ignorance of so many in the executive suites and middle management who didn't really believe Microsoft applications even ran on a Mac. Not only did they run on a Mac, they were invented on the Mac: Excel was written first, and specifically, for the Mac. Accountants can't believe this when you tell them.

Then, second, came the wholesale hostility between Apple and Microsoft. Yes, there were developers inside Microsoft who loved and emulated the Macintosh, using its interface as a foundation stone for working out their own ideas, adding and subtracting, improving on it yet never quite measuring up to the Mac GUI. But in particular divisions, and among the leadership at Microsoft, Apple became the enemy, and Steve Jobs often did (and still does) thumb his nose at the folks up in Redmond, Washington. Jobs' main complaint is that Microsoft has no taste.

For a time it was hard to be both a Microsoft and an Apple lover without apology. But I do recall the days when every new product announcement, on either platform, was exciting and welcomed in the computing world as a sign of innovation. Apple would release new products and they would be insanely great. Microsoft would bring out its apps with incredible new features that further empowered the user, and people thought: go faster, go bigger, play harder. All within a footprint of disk and RAM so tiny it is almost unbelievable by today's standards.

Saturday, December 1, 2007

Buy a Mac

In 1984 I bought my first Macintosh computer: the odd-looking beige box with a nine-inch screen called the Macintosh Plus. The crazy machine did not have a hard disk, only a single non-standard 3.5" 800 KB floppy drive and a one-button mouse, and its most astonishing feature at that time was an incredible 1 MB of RAM. One megabyte! Who would ever need more?

One of the two Steves who founded Apple Computer, Inc., Steve Jobs likened the Macintosh to a toaster: a people's appliance that would be easy to use yet highly functional and purposeful. The Mac would be an insanely great tool, but a simple one to use as well.

My good friend Mike had convinced me to buy this scary toy of a personal computer against my better judgement. After leaving my full-time job for life as a work-from-home freelancer in New York City, I had narrowed my search to two other computers that were, at the time, better hardware: the Atari 1040 ST and the Apple IIGS. These desktop machines had big color monitors, better audio capabilities, and some serious gaming potential.

But Mike insisted: "Buy a Mac!"

I asked, "How can you compare a Macintosh, with its nine-inch B&W screen, to an Atari color monitor?"

Mike's answer: "Atari is dead!" and "Nobody is writing software for the Atari TOS" (which stood for the Tramiel Operating System).

"But, but... Apple is the only company writing software for the Mac, isn't it?"

"No, he answered on an inside tip, " A company in Seattle called Microsoft is going to revolutionize the software industry and they are writing an application called Excel for the Mac. I have beta copies of Works and Word. And soon I will be getting a Mac program called File."

"What do I care about spreadsheets? I want a computer that will allow me to make and produce graphics, compose and play music, and possibly edit video or at least act as a controller for the tape deck. Even Commodore is promising these capabilities."

At the end of the day, I bought the Mac. Probably the most abstract and unprovable reason Mike gave, the one that finally convinced me not to buy the Atari, the Apple IIGS, a Commodore, or even an IBM PC XT, was:

"Mac is the future of computing."

And Mike was right.