The innovation that propelled Apple into a globally dominant, high-revenue, and culturally transformative titan was not a portable computer or an audio player; it was the iPhone. In 2007, it seemed to materialize fully formed: brilliantly envisioned, self-assured, and intuitively clear in its concept.
Yet, behind the curtain, the iPhone as we now know it owed its existence to more than audacious risks, meticulous attention to specifics, ingenious design, and a forward-thinking outlook; there were also initial missteps, eleventh-hour modifications, and several fortunate occurrences.
To begin with, the device Apple initially aimed to develop was not a phone. It was a tablet.
Cross-functional groups at Apple are perpetually exploring nascent technologies. “There’s hundreds of little startups that are just poking around, doing stuff,” observes sensors VP Myra Haggerty. “Sometimes someone’s like, ‘Hey, come look at what we’re working on!’ Then you go into some random lab somewhere, and they’re doing this really cool thing. ‘What could we do with this?’”
Consider, for instance, Duncan Kerr’s projector demonstration.
In 1999, Kerr, a British designer with diverse design expertise—spanning engineering, technology, industrial design, and interface prototyping—had joined the studio of Jony Ive, Apple's chief of industrial design.
In early 2003, he initiated Tuesday gatherings with interface designers and input engineers to investigate novel methods of engaging with computers; after all, the established “point-and-click” mechanism was a quarter-century old. Kerr’s team tested technologies such as camera-driven systems, three-dimensional sound, haptics (vibratory sensations), and 3D displays. “We’d invite research people in, or companies who had some curious technology. We did a lot of demos, tried stuff out,” he states.
Kerr was particularly captivated by the concept of manipulating on-screen elements with fingers. However, sketching concepts on paper could only take the team so far. He, alongside interface designers Bas Ording and Imran Chaudhri, sought to construct a tangible multi-touch display to further their investigations. Enter the iGesture NumPad mouse/touchpad.
It was a sleek, dark trackpad, measuring 6.25 x 5 inches, produced by a Delaware firm named FingerWorks. Wayne Westerman, a pianist who suffered from repetitive strain injury, had, with his professor John Elias, devised a series of keyboards that demanded merely a feather's touch. Because they could recognize and track multiple finger inputs concurrently, they could also interpret movements traced on the surface, substituting for traditional mouse functions. For "Open," for example, one could rotate their fingertips on the surface as though unsealing a jar.
In late 2003, Apple contracted FingerWorks to construct an expanded model of their multi-touch pad: 12 x 9.5 inches, a more accurate representation of a computer screen’s dimensions. Kerr’s team established a testing apparatus in the design studio of Infinite Loop 2. They positioned an LCD projector on a tripod, shining directly down onto the trackpad. A sheet of white paper was affixed atop it so that the projector’s image—generated by a nearby Power Mac—would appear vivid and distinct. Then the exciting phase commenced: devising methods to interact with the visible components. One could glide a finger to relocate a symbol in the projected image. Separating two fingers allowed for the expansion of a diagram or picture. Employing both hands, users could select, manipulate, and resize items. It was extraordinary.
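The interactions described above—dragging with one finger, spreading two fingers to enlarge an image—reduce to simple geometry once the hardware can track multiple touch points. As a rough illustration (the function names and structure here are purely hypothetical, not FingerWorks' or Apple's actual code), the scale factor implied by a two-finger pinch or spread is just the ratio of the current finger distance to the starting finger distance:

```python
import math

def pinch_scale(p1_start, p2_start, p1_now, p2_now):
    """Return the zoom factor implied by a two-finger pinch/spread.

    Each point is an (x, y) tuple. A value > 1.0 means the fingers
    moved apart (zoom in); < 1.0 means they pinched together.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    d_start = dist(p1_start, p2_start)
    d_now = dist(p1_now, p2_now)
    if d_start == 0:
        return 1.0  # degenerate case: fingers started at the same spot
    return d_now / d_start

# Fingers start 100 px apart and end 200 px apart: a 2x zoom.
print(pinch_scale((0, 0), (100, 0), (0, 0), (200, 0)))  # → 2.0
```

Single-finger dragging is even simpler: the icon's position is offset by the touch point's displacement each frame. The hard part in 2003 was not this arithmetic but reliably sensing and disambiguating several fingertips at once.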
In November 2003, Kerr’s team presented the demonstration to Ive, who, in turn, showcased it to Steve Jobs. Everyone who witnessed the multi-touch demo adored it, proclaiming its visionary nature. Its precise application, however, remained uncertain.
In late 2005, Jobs attended the 50th birthday party of a Microsoft engineer, the husband of a friend of his wife, Laurene. Over dinner, the man expounded to Jobs on how Microsoft had charted the future of computing by developing a stylus-controlled tablet: mobile, robust, and independent.
“But he was doing the device all wrong,” Jobs later remarked, as recounted in Walter Isaacson’s book Steve Jobs. “This dinner was like the 10th time he talked to me about it, and I was so sick of it that I came home and said, ‘Fuck this. Let’s show him what a tablet can really be.’”
