The G4 Cube, AirPort technology, the iMac G4, the 12″ PowerBook, iTunes for Windows, the G5, the iPod mini, Safari for Windows, the first MacBook Pro, the iPod touch: these are very different products, belonging to different product lines and different strategies. But they have one thing in common: all of them were introduced as… One More Thing.
For more than a decade Steve Jobs topped off his already impressive presentations with a final jolt, using a technique made famous by actor Peter Falk in his role as Columbo: with a carefully staged offhand remark at the end of the show, he reclaimed the audience’s attention and ended the presentation with a bang.
Not all of Jobs’s presentations included a “One More Thing” or used those exact words, but many did. The technique proved so popular and iconic that in 2005 the phrase was used as the motto for one of Apple’s Special Events.
Jobs first used this technique in January 1998, at Macworld San Francisco. (more…)
Apple considered it to be its biggest opportunity since the introduction of the Macintosh and a chance to reinvent itself. But after ten years of development, more than 100 million USD spent, and five years on the market selling just 300,000 units, it was clear that the Newton was not a new device “for the rest of us” and definitely not Infinite Loop’s future.
The project, started in 1987 as a pen computing platform and refocused on a smaller size and scope after a 1991 pitch by Product Marketing Manager Michael Tchao to Apple’s CEO John Sculley, was launched in 1993 and killed in 1998 by the new interim CEO, Steve Jobs, who discontinued the last products to use Newton technology, the MessagePad 2100 and the eMate 300. [For the record, five months earlier Jobs had stated in an email that the eMate had a “bright future”, and it looked like both the State of Texas and Australia were planning to adopt the device to replace students’ textbooks and aging PCs, respectively.]
The Newton had failed on the market, and Apple was betting all of its resources and money on the evolution of the Macintosh and a new NeXT-based operating system. Tchao had already left Apple in 1994, as had the Newton’s main developers, Steve Capps and Walter Smith, who in 1996 sought refuge at Microsoft. (more…)
In July 1993 Apple introduced two very special Macintosh models, the Centris 660AV and the Quadra 840AV. Although seemingly belonging to different product lines and featuring radically different cases and expandability options, they shared most of their technology and, even more importantly, their raison d’être.
The two Macs sported an unprecedented integration of audio, video and voice, setting a new market standard. The “AV” moniker after the model numbers meant that professional audio and video input and output capabilities were already included by Apple, with no need for third-party expansions.
Both the Centris 660AV (code-named Tempest) and the Quadra 840AV (code-named Cyclone) were the first Macintoshes to support 16-bit, 48 kHz stereo audio, and could record and play back sound at CD quality. They also had S-Video and composite video inputs and outputs, so one could use them to digitize video from a camcorder or other source and route their video signal to a TV set or video recorder. They were also the first Macs to support Apple’s speech recognition (PlainTalk) out of the box.
All of these amazing capabilities were made possible by new and more powerful hardware, which included custom circuitry to handle the AV features. (more…)
In May 2001 Apple announced its intention to become the first computer vendor “to move to an all LCD flat panel display pro lineup”.
The company discontinued its last CRT (Cathode Ray Tube) display, the Apple Studio Display 17″ ADC, which had been introduced less than a year before, in July 2000, and replaced it with the Studio Display 17″ LCD. This new LCD, aggressively priced at just 999 USD, completed a lineup topped by the 22″ Apple Cinema Display, sold at 2,499 USD. On the low end was the Apple Studio Display 15″ LCD, now upgraded to the same “plexiglass” design as the others and offered at 599 USD.
From the tone of the press release one would be led to believe that Apple had completed its transition to LCD and abandoned the older, less efficient display technology. In fact, this goal was still far off, and it took quite a few years before all Macintosh products were actually CRT-free.
In June 2004 Grady Booch interviewed Bill Atkinson and Andy Hertzfeld for the Computer History Museum. His last question for the creators of the Macintosh was about how we would use computers in the future:
Booch: [...] Predicting the future has always been impossible, but what’s your hope, and vision for what computers- and what’s it like at 20?
Atkinson: The first thing is, get rid of the keyboard and the mouse, and replace them with a smooth plate called a touch plate; it’s responsive to gentle, stroking touch. We can enter characters by touching on a character and stroking through others around it, and we can enter whole- instead of typing i-n-g, we’re typing “ing” – we’re entering text at a higher level. Shift key and the whole thing becomes a touching-pointing device. Forget this teeny little scratch pad on your portable. So, one is the touch plate. Another avenue is that as computers get smaller and ubiquitous, working with computers has to enter into a dialogue, not this thing where you’re commanding the computer, but where actually you’re having a conversation. The computers have to get to where they actually understand the flow of a conversation. It’s not just a speech-recognition thing. It’s a language understanding problem. And when it does, we can have, like in the Ender series, a little crystal in our ear that’s listening to everything we’re listening to and whispering little things in our ear, and it’s connected into the wireless net.
Booch: You’re referring to Ender’s Game?
Atkinson: Yes, Ender’s Game and the whole series of Orson Scott Card books. I think the smaller we go, the more we need to rely on audible interfaces. And I think that programming needs to be a conversation of what it is you’re looking for. You say, well I want to do this, and then that’s ambiguous, but whatever you’re talking about, this other entity is asking you some questions about it, and you refine it.
Booch: Very cool. Andy, what are your thoughts?
Hertzfeld: Oh, a lot of what Bill said. I think that clearly the next really big frontier in user interface is going to be the language understanding. That really is as much of a leap as the graphic user interface was. There are lots of problems to be solved but it’s pretty fair, as Bill said, for the mobile application, the keyboard. I’ve been experimenting with little ideas in that space myself. The voice recognition software is really good today, but it doesn’t make sense to use it at your desktop because you can still do it better with the keyboard and the mouse. But suddenly, when you’re standing up, the keyboard and mouse are useless. Of course, computers are going to be everywhere. You’re going to need to have the computer on your person. So, the speech recognition I think is important. But I would say the next really important thing is getting the software industry on a level playing field, a place that’s really open to innovation. I think the way that’s going to happen is if the shared infrastructure becomes available and is owned by the community of its users and developers instead of a psychotically driven business force. And that’s going to happen, and it’s going to happen over the next year. So, it’s not the far years. I think once we have a level playing field open to innovation, we can start to really explore the possibilities better. And then I’ve been thinking about agent-type user interfaces. Bill’s into that too. But the graphic user interface, there’s always direct manipulation. If you’re steering it at the steering wheel, you can only do things while you’re at the steering wheel. Eventually we’re going to want to set up policies where the tireless computer can execute our policies for us, continuously, especially with the network.
Atkinson: Yes, drive me to work and let me know when we get there.
Hertzfeld: I think there’s a very fertile area right now in exploring and getting it right – like Apple more or less got right with the Mac – getting the agency-user interfaces right. Things that can happen while you’re not directly controlling it will be a frontier in the next few years.
In July 2010 Apple introduced, as a standard accessory, the Magic Trackpad, “the first Multi-Touch trackpad designed to work with your Mac desktop computer [which] uses the same Multi-Touch technology you love on the MacBook Pro [and] supports a full set of gestures, giving you a whole new way to control and interact with what’s on your screen”.
Also in 2010 Apple bought Siri, Inc., and in October 2011 it integrated Siri, the company’s intelligent voice assistant, into the iPhone; Siri “lets you use your voice to send messages, schedule meetings, place phone calls, and more” without looking at the screen. Thanks to the “Eyes Free” feature, Siri can also be used in a car, activated through a voice command button on the steering wheel, letting users ask questions without taking their eyes off the road.
In 2012 Siri was also made available on other iOS devices such as recent iPod touch and iPad models.
It’s also worth mentioning that at least since October 2010 Google (where Hertzfeld now works) has been testing the Google Driverless Car, and as of September 2012 three U.S. states, Nevada, Florida and California, have passed laws permitting driverless cars.
Note: all pictures are “Courtesy of Apple”.