The future of personal computing according to Bill Atkinson and Andy Hertzfeld

Filed under: People, Software

In June 2004 Grady Booch interviewed Bill Atkinson and Andy Hertzfeld for the Computer History Museum. His last question for the creators of the Macintosh was about how we will use computers in the future:

Booch: […] Predicting the future has always been impossible, but what’s your hope, and vision for what computers- and what’s it like at 20?
Atkinson: The first thing is, get rid of the keyboard and the mouse, and replace them with a smooth plate called a touch plate; it’s responsive to gentle, stroking touch. We can enter characters by touching on a character and stroking through others around it, and we can enter whole- instead of typing i-n-g, we’re typing “ing” – we’re entering text at a higher level. Shift key and the whole thing becomes a touching-pointing device. Forget this teeny little scratch pad on your portable. So, one is the touch plate. Another avenue is that as computers get smaller and ubiquitous, working with computers has to enter into a dialogue, not this thing where you’re commanding the computer, but where actually you’re having a conversation. The computers have to get to where they actually understand the flow of a conversation. It’s not just a speech recognition thing. It’s a language understanding problem. And when it does, we can have, like in the Ender series, a little crystal in our ear that’s listening to everything we’re listening to and whispering little things in our ear, and it’s connected into the wireless net.

Booch: You’re referring to Ender’s Game?
Atkinson: Yes, Ender’s Game and the whole series of Orson Scott Card books. I think the smaller we go, the more we need to rely on audible interfaces. And I think that programming needs to be a conversation of what it is you’re looking for. You say, well I want to do this, and then that’s ambiguous, but whatever you’re talking about, this other entity is asking you some questions about it, and you refine it.

Booch: Very cool. Andy, what are your thoughts?
Hertzfeld: Oh, a lot of what Bill said. I think that clearly the next really big frontier in user interface is going to be the language understanding. That really is as much of a leap as the graphic user interface was. There are lots of problems to be solved but it’s pretty fair, as Bill said, for the mobile application, the keyboard. I’ve been experimenting with little ideas in that space myself. The voice recognition software is really good today, but it doesn’t make sense to use it at your desktop because you can still do it better with the keyboard and the mouse. But suddenly, when you’re standing up, the keyboard and mouse are useless. Of course, computers are going to be everywhere. You’re going to need to have the computer on your person. So, the speech recognition I think is important. But I would say the next really important thing is getting the software industry on a level playing field, a place that’s really open to innovation. I think the way that’s going to happen is if the shared infrastructure becomes available and is owned by the community of its users and developers instead of a psychotically driven business force. And that’s going to happen, and it’s going to happen over the next year. So, it’s not the far years. I think once we have a level playing field open to innovation, we can start to really explore the possibilities better. And then I’ve been thinking about agent-type user interfaces. Bill’s into that too. But the graphic user interface, there’s always direct manipulation. If you’re steering it at the steering wheel, you can only do things while you’re at the steering wheel. Eventually we’re going to want to set up policies where the tireless computer can execute our policies for us, continuously, especially with the network.

Atkinson: Yes, drive me to work and let me know when we get there.
Hertzfeld: I think there’s a very fertile area right now in exploring and getting it right – like Apple more or less got right with the Mac – getting the agency-user interfaces right. Things that can happen while you’re not directly controlling it will be a frontier in the next few years.

In July 2010 Apple introduced, as a standard accessory, the Magic Trackpad, “the first Multi-Touch trackpad designed to work with your Mac desktop computer [which] uses the same Multi-Touch technology you love on the MacBook Pro [and] supports a full set of gestures, giving you a whole new way to control and interact with what’s on your screen”.

Also in 2010 Apple bought Siri, Inc. and in October 2011 integrated its product, the intelligent voice assistant Siri, into the iPhone. Siri “lets you use your voice to send messages, schedule meetings, place phone calls, and more” without looking at the screen. Thanks to the “Eyes Free” feature, Siri can also be used in a car, activated through a voice command button on the steering wheel, letting users ask questions without taking their eyes off the road.
In 2012 Siri was also made available on other iOS devices such as recent iPod touch and iPad models.

It’s also worth mentioning that at least since October 2010 Google (for which Hertzfeld now works) has been testing the Google Driverless Car, and as of September 2012 three U.S. states (Nevada, Florida and California) have passed laws permitting driverless cars.

Note: all pictures are “Courtesy of Apple”.


Friday 26 October 2012, 1:05 pm

