Alan Kay (Disney Imagineering Fellow) – Alan Kay Interview (Jan 1990)
Chapters
00:00:00 Early Perceptions and Experiences of Computing
The Size and Perception of Early Computers: Early computers were large and expensive, primarily used for calculations. Some believed they would always remain large, while others at Bell Labs saw the potential of transistors as switches. The idea of computers becoming small gained traction in the 1960s.
Batch Processing Experience: Batch processing resembled a factory or railroad system. Users had limited access to their results, with runs scheduled once or twice a day. This institutional approach to computing was common at the time.
Impact on People: Despite the limitations, people were generally satisfied with the technology. Business computing was mainly done with punch card machines, creating a significant gap between the potential of computers and their actual usage. The imposing physical size of early computers evoked quasi-religious reactions, giving rise to diverse responses and a mythology around the machines.
Potential for Partnership: Towards the late 1950s, people began to recognize the potential for computers to become partners rather than mere tools.
00:03:49 Early Conceptions and Challenges of Personal Computing
People-Centered Visions of Computing: The early visionaries of computing explored various concepts beyond amplifying intelligence. John McCarthy, a notable figure at MIT, proposed timesharing the computer and envisioned a future with networked information utilities. McCarthy also suggested the idea of an “advice taker,” an intelligent agent that could assist individuals in finding information.
Early Computing Costs and Timesharing: In the late 1950s, computers were extremely expensive, costing several million dollars. Timesharing emerged as a solution to address the high costs by allowing multiple users to share a single computer’s resources. Timesharing facilitated debugging processes, a notoriously tedious task in programming.
Anecdotes and Historical Context: Grace Hopper, a pioneering computer scientist, is famously associated with the first computer “bug,” which was literally a moth found in a relay. A more recent anecdote highlights the ongoing challenges of debugging, even with modern technologies like laser printers.
00:06:29 Early History of Time Sharing and User Interaction
Early Mouse Problem: A real mouse had moved into a computer’s circuit board, causing confusion and misunderstanding among the machine’s users.
Basic Concept of Timesharing: Timesharing aims to optimize computer usage by dividing the available processing power among multiple users over extended periods. This approach addresses the slow typing speeds and frequent debugging needs of humans. It enables incremental and interactive computing, reducing the need for extensive planning.
Challenges of Timesharing: Providing individual and exclusive access to users in a multi-user system was initially a challenge. The high cost of computers led to the concept of rolling in multiple jobs and allocating processing time in small increments to each job.
User Illusion and Reliable Response Time: Timesharing systems aimed to create the illusion of exclusive access for each user. Achieving reliable response time proved challenging in practice, leading to frustrations in personal computing.
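The time-slicing idea summarized above can be sketched in a few lines of Python. This is purely illustrative (the scheduler, job names, and quantum are not from the interview): each job gets a small fixed slice of processor time in turn, so every user sees steady progress and perceives “exclusive” access.

```python
# Minimal round-robin sketch of timesharing (illustrative, not historical code).
from collections import deque

def round_robin(jobs, quantum):
    """jobs: dict of name -> remaining work units; quantum: units per turn."""
    queue = deque(jobs.items())
    schedule = []
    while queue:
        name, remaining = queue.popleft()
        ran = min(quantum, remaining)          # run for at most one quantum
        schedule.append((name, ran))
        if remaining - ran > 0:                # unfinished jobs rejoin the queue
            queue.append((name, remaining - ran))
    return schedule

# Three users sharing one processor, two work units at a time:
print(round_robin({"ann": 3, "bob": 5, "cid": 2}, 2))
```

Because slices are short and rotation is fair, each user experiences an interactive, incremental session rather than a once-a-day batch run.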
JOSS System at RAND Corporation: JOSS was a single-language timesharing system that offered a user-friendly experience. It provided a warm and intuitive feeling, inspiring users to want a similar experience from other computing systems.
Desired Computing Features in the Late 1950s: Clear-thinking individuals in the late 1950s envisioned computing as small, interactive, and real-time. The emphasis was on incremental and collaborative computing. The concept of electronic mail emerged early at MIT.
Ivan Sutherland’s Contribution to Computer Graphics: Sutherland’s invention of modern computer graphics in 1962 revolutionized computer interaction. It influenced perceptions and set a new standard for user experience. His work continues to inspire modern computing and interaction design.
Sutherland’s Background and Motivation: Sutherland was a graduate student at MIT, advised by Claude Shannon and Marvin Minsky. He was influenced by J.C.R. Licklider’s vision of the computer as a symbiotic partner. The TX-2 computer at Lincoln Labs played a significant role in his research.
Sketchpad: A Revolutionary Drafting System: Ivan Sutherland’s creation, Sketchpad, transformed computer graphics and laid the foundation for modern computer-aided design (CAD) systems. Sketchpad allowed users to sketch objects, which the program would then automatically complete and adjust, making the design process more efficient and precise. It introduced concepts like windows, icons, and direct manipulation of objects through a graphical user interface (GUI).
Sketchpad’s Simulation Capabilities: Sketchpad’s problem-solving capabilities extended beyond drafting. Users could create simulations of real-world phenomena by defining constraints and parameters, such as forces and dimensions. The program would then calculate and animate the results, providing insights into the behavior of complex systems.
Sketchpad’s Lasting Impact: Despite its limitations compared to modern systems, Sketchpad’s influence is still felt today. It inspired generations of computer scientists and engineers, shaping the development of computer graphics and user interfaces.
Doug Engelbart’s Vision of the Future: Doug Engelbart’s work was inspired by Vannevar Bush’s Memex concept, envisioning a future where information could be easily accessed and interconnected. Engelbart proposed a system that would allow users to navigate through large amounts of information using multiple screens, pointing devices, and hyperlinked trails.
Engelbart’s Invention of the Mouse: Engelbart’s pursuit of a better pointing device led to the invention of the mouse in 1964. The mouse proved to be a more efficient and user-friendly alternative to light pens, which were commonly used at the time.
The Impact of Engelbart’s Demonstration: Engelbart’s large-scale demonstration of his system in 1968 was a pivotal moment in the history of computing. The demonstration showcased the power of graphical user interfaces, windowing systems, and hypertext, captivating the audience and influencing the direction of future research and development.
00:19:10 Engelbart's Pioneering Vision of Personal Computing
Engelbart’s Demonstration: Engelbart showcased a system with a mouse and a black-and-white display, demonstrating how to point and interact with information on the screen. He presented early forms of hypertext, allowing users to click on items and see related information pop up.
Collaborative Work: Engelbart introduced live collaborative work: he and Bill Paxton, located 45 miles apart, used mouse pointers on the same screen simultaneously, demonstrating a groundbreaking vision of shared workspaces.
Reception and Impact: Engelbart’s demonstration received a standing ovation and won the best paper award at a major computing conference. It had a significant impact on the ARPA community and influenced the future of personal computing. Despite its revolutionary nature, Engelbart felt that its impact was limited due to his strong belief in his specific vision, which did not align completely with the broader computing community’s goals.
Ivan Sutherland’s Sketchpad System: A revolutionary computer-aided design system that allowed users to create and manipulate 2D drawings with a light pen. However, it had a steep learning curve, requiring users to memorize hundreds of commands and keystrokes. The system offered no transitional path toward document production and desktop publishing.
Sketchpad’s Drawbacks: It was not a particularly good simulation of paper, as Sutherland intentionally avoided this goal. The system was complex and challenging to learn, acting as a barrier to widespread adoption.
The LINC: Considered by Alan Kay the first personal computer because of its compact size. It was designed to fit on a desk, low enough for users to see over it, unlike the larger systems of the time. The LINC was a significant step toward the development of modern personal computers.
Alan Kay’s Influence: Kay’s early experiences with Sketchpad and the LINC informed his subsequent work on the Flex machine, an early prototype of a personal computer. His focus on learnability and user-friendliness would later shape the design principles of the Dynabook and the graphical user interface (GUI).
00:24:54 Early Attempts at Personal Computing: From Mainframes to Personal Automobiles
The Flex Machine: The Flex machine was designed for biomedical research and was intended to be programmable and constructible by non-computer scientists. It had a display, display editors, and a user interface that included multiple windows and icons. Despite its technological success, it was a sociological disaster due to its rejection by non-computer users.
Early Attempts at Personal Computers: The Flex machine and other early attempts at personal computers played a role comparable to the Model T in the computing landscape. The contrast between institutional computing as a railroad and the liberating nature of the personal automobile shaped the design and conceptualization of these early systems.
Drawbacks of the Car Metaphor: The car metaphor, while apt in some ways, had limitations. Learning to drive takes months and is restricted to adults, so the metaphor leaves out children entirely.
A New Metaphor: A better metaphor for personal computing emerged from seeing the Grail system, which utilized a tablet and hand character recognition instead of a keyboard. This system inspired Alan Kay to pursue a graphical user interface that was more accessible and intuitive for non-computer users.
00:31:37 From Engelbart to Kay: The Evolution of Personal and Intimate Computing
The GRAIL System and Intimate Computing: The GRAIL system at RAND Corporation allowed for direct interaction with information on the screen, resembling a musical instrument’s intimacy and closeness.
Seymour Papert’s LOGO and the Power of Easy-to-Use Languages: Seymour Papert’s early work with LOGO enabled children to write programs, highlighting the significance of combining computer power with easy-to-use languages.
The Notion of Computers as a Medium: Kay emphasizes the concept of computers as a medium, akin to pencil and paper, rather than mere machines. This perspective encourages early and widespread use, similar to how children use pencil and paper from a young age.
The Projection of Integrated Circuits and the Possibility of Notebook Computers: Kay became enthusiastic about the potential of integrated circuits and Moore’s Law, projecting the feasibility of notebook-sized computers in the future.
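The kind of projection described here can be made concrete with simple compound-growth arithmetic. The doubling period and year span below are assumed round numbers for illustration, not figures from the interview:

```python
# Back-of-the-envelope Moore's Law projection (assumed numbers).
def density_growth(years, doubling_period=2.0):
    """Factor by which density grows if it doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Over 20 years, doubling every 2 years gives ten doublings:
print(round(density_growth(20)))  # 1024, i.e. roughly a thousand-fold increase
```

This is the arithmetic that made a notebook-sized computer look inevitable: a machine filling a room in the late 1960s could plausibly shrink a thousand-fold within two decades.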
The Grocery List Test: Kay raises the question of whether a computer could be used for mundane tasks like carrying a grocery list, akin to the versatility of paper.
Media as Intermediaries and the Trade-Off between Direct Experience and Amplification: Kay explores the concept of media as intermediaries between humans and direct experience, noting the trade-off between alienation and amplification.
The Existentialist View of Alienation and Power: Kay acknowledges the existentialist perspective that gaining power over the world through technology comes at the cost of alienation.
Computers as Mechanisms and the Beauty of Complexity: Kay argues against the negative connotation of the word “machine,” emphasizing the scientific view of machines as beautiful and intricate mechanisms.
The Linguistic Properties of Machines and the Simulation of Anything: Kay highlights the linguistic nature of machines, capable of processing symbols and simulating various aspects of reality.
The Nature of Markings in Books and Computers: Markings in books and computers serve as carriers of descriptions, and both mediums share a wide range of possible markings.
The Unique Properties of Computers: Computers are unique in their ability to read and write their own markings at very high speeds, a self-reflexivity typically observed only in biological systems. This rapid manipulation of descriptions within a computer creates a pocket universe.
Complementary Roles of Computers and Scientists: In contrast to the 19th-century view of a universe crafted by God, computers allow scientists to create theories and bring them to life.
Simulating Different Universes Through Computers: Scientists can use computers to simulate universes with different laws of physics, such as an inverse cube law of gravity, to explore the implications and behaviors of these alternative realities.
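A minimal sketch of what such a simulation looks like (illustrative only; the integrator, constants, and initial conditions are assumptions, not from the interview): a body orbiting under an inverse-cube force law instead of the familiar inverse-square law.

```python
# Toy two-body simulation under an alternative force law (illustrative).
def step(pos, vel, dt, power=3):
    """One Euler step; power=3 gives an inverse-cube attraction toward the origin."""
    x, y = pos
    r = (x * x + y * y) ** 0.5
    a = -1.0 / r ** power            # acceleration magnitude ~ 1/r^power
    ax, ay = a * x / r, a * y / r    # direction: toward the origin
    vx, vy = vel[0] + ax * dt, vel[1] + ay * dt
    return (x + vx * dt, y + vy * dt), (vx, vy)

# Start at unit radius with unit tangential speed and watch the orbit evolve.
pos, vel = (1.0, 0.0), (0.0, 1.0)
for _ in range(1000):
    pos, vel = step(pos, vel, 0.001)
```

Changing the single `power` parameter switches between universes, which is exactly the point: the computer lets a scientist try out laws of physics that nature never implemented.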
The Unique Nature of Computers: Computers exist as both physical objects subject to the laws of physics and as virtual worlds capable of simulating and exploring concepts beyond our own universe.
Simulation and Meta-Media: The computer’s unique ability is to simulate, embodying all other media and becoming the first meta-media. It can simulate physical media (like paper) and physical processes.
The Book as an Analogy: The invention of the computer is comparable to the invention of writing or printing in terms of cultural significance. The book developed in stages: the invention of writing, Gutenberg’s printed imitation of manuscripts, and the portable size introduced by Aldus Manutius.
The Analogy to the Book: Drawing parallels between the history of the book and the emergence of computers can be insightful. Similarities exist in the stages of development, from invention to widespread acceptance.
The Computer as a Lost Object: Today, we are at a stage similar to the period before Aldus Manutius, when books were still too valuable to be lost. The true value of computers will emerge when they, like books, can be casually lost.
Literacy and Computeracy: Literacy involves the ability to access, create, and interpret information in a particular medium. In the context of computers, “computeracy” refers to the ability to understand and use computers effectively. Just as there are different levels of literacy in traditional media, there are different levels of computer literacy, including the ability to read, write, and understand different styles of computer-based information.
The Three Major Parts of Literacy: Reading and accessing information prepared by others. Creating and expressing ideas through writing or other forms of creative expression. Understanding and navigating different styles and genres of representation, such as Shakespearean plays or different essay formats.
Comparison to Traditional Media: The development of literacy in traditional media, such as print, took centuries to evolve. In contrast, the early phases of computer development have progressed much more rapidly.
Historical Analogies: Martin Luther’s efforts to translate the Bible into German, creating a new language (High German) to make it accessible to the people, can be seen as an analogy to the need for creating new languages and interfaces to make computers more user-friendly. The invention of the printing press by Johannes Gutenberg is seen as a parallel to the development of the computer, as it made it possible for more people to access information and knowledge.
Xerox PARC: Xerox PARC was a research center established by Xerox Corporation in the 1970s. It was a hub for groundbreaking research and development in the field of computing, including the development of the graphical user interface (GUI) and the laser printer.
00:54:46 Xerox's Long-Range Research Center: A Gathering of Brilliant Minds
Origins of Xerox’s Research Center: Xerox aimed to expand beyond office copiers and invest in cutting-edge research. Jack Goldman, the new chief scientist, recognized the need for a long-range research center.
Bob Taylor’s Leadership: Bob Taylor, a former ARPA funder, was hired to establish a miniature ARPA community within Xerox. Taylor sought out top researchers from the ARPA community, including Alan Kay.
Impact of the Mansfield Amendment: The Mansfield Amendment restricted ARPA’s funding for far-out research projects. This led many talented researchers to join Taylor’s team at Xerox.
Gathering of Top Talent: Researchers Alan Kay personally counted among the top 100 computer scientists in the world were gathered at Xerox. These researchers had strong opinions and diverse backgrounds but shared a common vision for the future of computing.
Collaboration and Shared Resources: The researchers at Xerox worked on joint hardware projects and shared resources. They built their own hardware and developed various software designs. The psychology of this group was often perceived as arrogant due to their strong beliefs and expertise.
Legacy of Xerox’s Research Center: The research conducted at Xerox has had a lasting impact on the development of personal computing. Many of the ideas and technologies developed at Xerox have been adopted by the industry.
00:59:29 Origins and Early Designs of the Alto Computer
Xerox PARC’s Mission: Taylor’s goal was to realize ARPA’s dream of man-computer symbiosis. Research interests spanned interactive computing, artificial intelligence, and agents.
Projects at PARC: Practical projects included Gary Starkweather’s laser printer, Bob Metcalfe’s Ethernet, and Chuck Thacker’s Alto workstation. Kay’s group explored the Dynabook design, contributing to the development of icons, multiple overlapping windows, and networking capabilities.
Machine Design Considerations: Aiming for a balance between stand-alone power and time sharing. Estimating the required processing power: initially targeting around 10 MIPS, but the Alto’s actual performance ranged between 1 and 3 MIPS.
Human Factors and User Interface: Extensive study of human perception, fonts, animation, and real-time music to inform the machine’s design and software capabilities.
Focus on Children as Users: Kay’s desire to create a tool for children, recognizing the challenges of designing systems for adults. Children’s unique perspectives and lack of preconceived notions about computing made them valuable test subjects. The goal was to create a system that would avoid subjecting children to more school or training.
The “Forcing Function” of Children: Designing for children presented unique challenges that forced the team to consider human mentalities and cognitive development. Inspiration came from Piaget’s and Bruner’s theories on the evolution of mentality in early childhood.
Piaget’s Stages of Development:
Breaking down cognitive development into three stages: 1. Object manipulation and interaction. 2. Visual thinking and representation. 3. Logical and abstract reasoning.
01:06:41 Cognitive Psychology's Impact on User Interface Design
Piaget’s Stages of Cognitive Development: Piaget’s water pouring experiment demonstrates how children’s judgments are dominated by visual appearances. In the preoperational stage, children lack conservation, which is the understanding that quantities remain constant despite changes in appearance. In the concrete operational stage, children can make inferences based on facts and logic, rather than relying solely on visual cues.
Bruner’s Multiple Mentalities: Jerome Bruner proposed that humans have separate ways of knowing about the world: kinesthetic, visual, and symbolic. These mentalities are distinct and not well integrated, leading to potential discrepancies between visual and symbolic representations.
Implications for User Interface Design: Bruner’s theory suggests that user interfaces should consider multiple ways of knowing. Kinesthetic interactions, such as moving objects on a screen, can provide a more intuitive and engaging experience. Recasting knowledge in different forms, such as visual and symbolic representations, can make it accessible to learners at different developmental stages.
The Dynabook’s User Interface: The Dynabook’s user interface was designed to support multiple projects and allow users to move seamlessly between them. This approach aimed to provide a kinesthetic and intuitive experience that leveraged the user’s intelligence and creativity.
Leverage in User Interfaces: Leveraging a user’s intelligence involves designing interfaces that amplify their capabilities and empower them to accomplish more. Successful user interfaces create a mirror that reflects and amplifies the user’s intelligence, making them feel more capable and engaged.
01:17:11 Theatrical User Interfaces: The Art of User Illusion
User Interfaces as Evokers: Alan Kay likens user interfaces to theatrical performances: evokers that present complex concepts in a simplified, relatable manner, allowing users to reconstruct their own understanding.
Art as a Lie that Tells the Truth: Kay draws inspiration from Picasso’s quote, “art is not the truth, art is a lie that tells the truth,” suggesting that user interfaces are a form of art that evoke the capabilities of a computer in the user’s mind.
Illusion of Exclusive Use: Kay discusses the concept of user illusion, where users perceive themselves to have exclusive use of a computer, much like in timesharing systems where users believe they have dedicated access to the machine.
Objects on the Screen: Kay mentions the idea of objects on the screen, highlighting the illusion created by user interfaces that make users perceive digital representations as tangible objects.
Theatrical Magic and Foreshadowing: Kay emphasizes the importance of foreshadowing in theatrical magic, drawing parallels to user interface design. He suggests that user interfaces should set up expectations that align with the intended functionality, ensuring a cohesive and intuitive experience.
01:19:25 The Birth of the Xerox Alto: A Landmark in Computing History
The Design of the Minicom: Alan Kay initially conceptualized the Minicom, a portable interim machine roughly the size of a suitcase. He intended to equip it with the first version of Smalltalk and use it for various experiments, but the project was rejected by a Xerox executive.
The Shift to Minicomputers: After the rejection, Kay explored acquiring multiple minicomputers, such as Data General Novas, to compensate for the lack of resources. He aimed to reproduce his experiments with fonts, bitmap displays, animation, and music on these minicomputers.
The Collaboration with Chuck Thacker and Butler Lampson: Chuck Thacker and Butler Lampson approached Kay with an offer to build him a machine using the funds he had. Kay provided them with his experiments on fonts, bitmap graphics, animation, and music, emphasizing the need for high speed to handle these tasks.
The Birth of the Alto: Chuck Thacker, known as a hardware genius, led the development of the Alto, incorporating a bitmap display and surpassing Kay’s initial performance expectations. The Alto emerged as a groundbreaking machine, at once resembling a Mac and a workstation.
The Significance of the Bitmap Display: The use of a bitmap display was controversial due to the high cost of bits at the time. Before 1971, memory was composed of expensive magnetic cores, limiting the size and practicality of digital displays.
The Introduction of the Intel 1103 Chip: The release of the Intel 1103 chip in the early 1970s marked a turning point in computing hardware. With 1,000 bits on a single chip, it significantly reduced the cost and size of memory, enabling the development of more powerful and affordable machines.
The Impact on PARC: The Intel 1103 chip became the foundation for building large time-sharing machines at PARC, including the Alto. Despite the cost constraints, the Alto’s design paved the way for future advancements in personal computing.
Display Technology and the Alto: The bitmap display, consisting of individually addressable pixels, was not invented by the Alto team; the idea had been discussed before and existed in small versions, with television as a familiar analogue. Bitmap displays appealed because they were general-purpose and could display any kind of image without prior knowledge of its content.
Development Process: After experimentation and calculations, the Alto team decided to adopt a bitmap display for its versatility. The first Alto display was considerably larger than the original Macintosh and had a resolution of 500,000 pixels, matching the size and resolution of the proposed Dynabook.
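The memory arithmetic behind the controversy is easy to reconstruct. As a rough illustration (taking the Alto’s 606 × 808 display, one bit per pixel, and the Intel 1103’s 1,024 bits per chip as the working figures):

```python
# Back-of-the-envelope memory budget for a 1-bit bitmap display (illustrative).
pixels = 606 * 808              # approximate Alto display resolution (~500,000 pixels)
bits_per_pixel = 1              # black-and-white bitmap: one bit per pixel
bits_per_chip = 1024            # Intel 1103: roughly 1,000 bits per chip
chips = -(-pixels * bits_per_pixel // bits_per_chip)  # ceiling division
print(pixels, chips)            # nearly half a million bits, hundreds of chips
```

Hundreds of memory chips dedicated to nothing but the screen image explains why, before cheap semiconductor memory, a full bitmap display looked extravagant.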
Introduction of the Painting Program: Alan Kay invented the first painting program in 1972, programmed by Steve Purcell, as a response to the limitations of crisp computer graphics. The painting program allowed for messy and expressive digital artwork, enabling users to sketch and render.
Overlapping Windows and Effective Screen Space: The combination of a bitmap display and multiple windows, inspired by the Flex machine, led to the concept of overlapping windows. Overlapping windows provided more effective use of screen space, allowing users to view and manipulate multiple applications simultaneously.
Comparison with the Macintosh: The Macintosh represented a significant amount of additional work and refinement compared to the Alto. The Macintosh’s screen design was unique and offered more tools and software than the Alto. The Alto introduced various features that became standard in modern computing, including painting, crisp computer graphics, desktop publishing, and WYSIWYG editors.
01:28:15 The Rise and Fall of Xerox PARC: A Pioneering Chapter in Computing
Development of Alto: In the early-mid 1970s, Xerox PARC developed the Alto, a personal computer with advanced features like a graphical user interface, overlapping windows, and a mouse. Despite the excitement surrounding the Alto’s capabilities, Xerox faced challenges in commercializing it due to its high production costs compared to word processors.
Xerox’s Decision: In 1976, Xerox made the fateful decision to turn down the Alto project, effectively rejecting the opportunity to market a computer that had been in operation for three years. This decision stemmed from Xerox’s identity as a copier company rather than a computer company, leading to a misalignment between the company’s core business and the potential of the Alto.
The Alto’s Legacy: Although Xerox did not market the Alto as a commercial product, it had a profound impact on the development of personal computing. The Alto’s innovative features, such as the graphical user interface and mouse, became standard in subsequent personal computers, including the Apple Macintosh and Microsoft Windows.
Xerox PARC as a Cultural Phenomenon: Xerox PARC is often viewed as a golden age in computing due to the groundbreaking research and development that took place there. The presence of talented researchers and a supportive environment fostered creativity and innovation, leading to significant advancements in the field.
Influence of Stewart Brand: Stewart Brand, known for his work on the Whole Earth Catalog, wrote an article about Xerox PARC that highlighted its unique culture and innovative work. The article, published in Rolling Stone magazine, drew attention to Xerox PARC’s achievements but also faced backlash from Xerox executives who disliked its portrayal of the company.
Xerox’s Support for Research: Despite the decision not to commercialize the Alto, Xerox provided long-term support for research at PARC for a decade. This support allowed researchers to continue their groundbreaking work, even in the absence of immediate commercial prospects.
Rise of Hobbyist Machines: Concurrently with Xerox’s research, the development of microchips led to the emergence of hobbyist machines, making personal computers more accessible to enthusiasts. The availability of these machines contributed to a growing interest in personal computing and provided an alternative path for individuals to engage with technology.
01:38:59 Early Personal Computing: A Perspective from PARC
PARC’s Opinions on 8-Bit Machines: There were varying opinions among PARC researchers regarding the potential of 8-bit microcomputers. Alan Kay believed that 16-bit machines were necessary for the advanced software and capabilities they envisioned. Larry Tesler was more optimistic about the possibilities of 8-bit machines.
Kay’s Regret Over Early Microcomputer Development: Kay expressed regret over the early proliferation of microcomputers due to the negative impact on computing standards. He believed that delaying the introduction of microcomputers until 1984 or later would have allowed for better standards to be established. Standards like MS-DOS, which Kay considered unfortunate, would not have gained such widespread adoption.
VisiCalc: A Notable Exception: Despite Kay’s criticism of early microcomputers, he acknowledged the significance of VisiCalc. VisiCalc was an exceptional application that demonstrated the potential of microcomputers for practical use. The success of VisiCalc surprised and humbled the PARC researchers, who had overlooked its potential.
Comparison to Early 1960s Computing: Kay drew parallels between the capabilities of early 8-bit microcomputers and the limited computing power of the early 1960s. Both eras were characterized by weak machines and rudimentary operating systems. Many of the operating systems from that era continued to persist, despite their limitations.
01:41:01 The Journey of User Interfaces: From Xerox PARC to the Macintosh
Alan Kay’s Reflections: Alan Kay shared his thoughts on Xerox being an ideal starting point for PARC’s work, given its lack of history in the computer industry. He highlighted Steve Jobs’ remarkable ability to embrace radical changes, exemplified by Apple’s willingness to abandon 8-bit computing in favor of a new approach. Kay recalled a memorable demo he gave to Steve Jobs and others from Apple, where Jobs demonstrated an exceptional understanding of PARC’s concepts.
The Impact of PARC’s Work: Kay emphasized the significance of PARC’s work in demonstrating the potential for a different approach to human-computer interaction. He described the Lisa as an aesthetically pleasing design that showcased the power of the 16-bit 68000 processor, while acknowledging the Macintosh’s success as a more practical compromise. Kay acknowledged the vindication of PARC’s ideas through the subsequent history of computing, but he cautioned against considering it a complete victory.
The Limitations of Graphical User Interfaces: Kay expressed skepticism about the ultimate value of graphical user interfaces, arguing that they may not represent the best long-term solution for interacting with computers. He observed that many people still struggle with basic interactions with computers, even with the widespread adoption of GUIs. Kay cautioned against the tendency to become overly attached to specific technologies, leading to their prolonged use beyond their actual usefulness.
The Macintosh and MS-DOS: Kay highlighted the pivotal moment when Apple bet its future on the Macintosh, despite the dominance of MS-DOS at the time. He acknowledged the intuitive appeal of the Macintosh’s graphical interface, but expressed uncertainty about the reasons for its broader acceptance over MS-DOS.
The Importance of Non-Language-Based Ways of Knowing: Alan Kay emphasizes the significance of kinesthetic and visual ways of knowing the world, which are more intuitive and analogical. These non-language-based ways of knowing provide a sense of grounding and connectedness with the objects being manipulated.
The Role of the Mouse and the Screen in the Macintosh: Kay credits the mouse as a key factor in the success of the Macintosh, as it allows users to interact with objects on the screen in a tangible and intuitive way. The screen serves as a theatrical representation of a complex computer system, simplifying the user’s interaction with the underlying technology.
The Symbolic Underpinning and the Image of the Macintosh: Kay acknowledges the challenges in developing a symbolic underpinning that enables users to express themselves within the Macintosh interface. The iconic image of a two-year-old child confidently using the Macintosh demonstrates the accessibility and intuitiveness of the interface.
The Goal of Extending the Macintosh Interface: Kay envisions expanding the Macintosh interface beyond a specialized tool or car to a medium that extends across childhood and into old age. This goal involves making the interface more accessible and natural for users of all ages and backgrounds.
The Distinction Between Software and Hardware: Kay views hardware as software that has been crystallized early. He believes that the fundamental difference between software and hardware lies in the timing of their creation, rather than any inherent distinctions in their nature.
Early Computing Devices: Alan Kay describes the simplest form of computers as having only a few states and functioning primarily as memory. He compares these early devices to clocks, with all operations occurring in memory. The presence of extensive hardware in today’s computers is attributed to the need for rapid computation of specific functions.
Computational Emphasis in Early Computing: Kay highlights the historical focus on computational functions in early computing. The inclusion of specialized logic circuits for arithmetic and other computational tasks was a natural consequence of this focus.
Alternative Computer Designs: Kay acknowledges the existence of computer designs and architectures that approach memory and processing differently. These designs may employ content-addressed storage and involve numerous processors simultaneously accessing memory. Arithmetic operations may not be a primary focus in such architectures.
Logic Elements and Trade-offs: Kay emphasizes that arithmetic operations are merely a manifestation of specific logic arrangements. He discusses the trade-off between the variety of markings a memory can store and the amount of logic required for computation.
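The contrast between conventional and content-addressed memory can be made concrete. Below is a minimal Python sketch (all names are invented for illustration, not taken from the interview) contrasting location-addressed memory, where the "address" is a numeric position, with content-addressed storage, where the key is derived from the data itself, so any processor holding the content can locate the record:

```python
import hashlib

# Location-addressed: the "address" is simply a position in memory.
ram = ["load", "add", "store"]
assert ram[1] == "add"

# Content-addressed: the key is computed from the data itself.
store = {}

def put(data: bytes) -> str:
    """Store a record under a key derived from its content."""
    key = hashlib.sha256(data).hexdigest()
    store[key] = data
    return key

key = put(b"some record")
assert store[key] == b"some record"
```

The hash-keyed dictionary here is only a software caricature of content-addressable hardware, but it shows the essential inversion: retrieval starts from what the data contains rather than where it sits.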
01:52:34 Conceptualizing Software Machines: From Basic Functions to Virtual Machines
Abstraction and Layers: The hardware of an average computer can perform only a small repertoire of basic functions, on the order of 100 to 300. The majority of a computer’s components are memory for storing information. Programmers use operating systems and programming languages to create virtual machines that simulate more user-friendly environments. The user interface is the final layer that further enhances the user experience.
Virtual Machines: Virtual machines are simulated environments that are more hospitable than the underlying hardware. They allow programmers to work in a more manageable and familiar space. Higher-level virtual machines can be simulated on top of lower-level ones for increased convenience.
Avoiding Complexity: Programmers prefer to work in higher-level environments to avoid the complexities of machine code. Abstraction layers enable programmers to focus on problem-solving without worrying about underlying details. Cooking and baking analogies are used to illustrate how abstraction simplifies tasks by subsuming lower-level details.
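The layering described above can be sketched in miniature. In the following Python example (entirely illustrative; the opcodes and helper names are invented), a tiny stack machine stands in for the "hardware," and a friendlier arithmetic-expression layer is simulated on top of it:

```python
def run(program):
    """Interpret a list of (op, arg) pairs on a tiny stack machine."""
    stack = []
    for op, arg in program:
        if op == "push":
            stack.append(arg)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack[-1]

def compile_expr(expr):
    """Translate a nested tuple like ('add', 2, 3) into stack-machine code."""
    if isinstance(expr, (int, float)):
        return [("push", expr)]
    op, left, right = expr
    return compile_expr(left) + compile_expr(right) + [(op, None)]

# The user works in the friendlier expression layer; the stack
# machine underneath never changes.
result = run(compile_expr(("add", 2, ("mul", 3, 4))))  # 2 + (3 * 4) = 14
```

The expression layer is a virtual machine in exactly the sense Kay describes: a more hospitable environment simulated on top of a less hospitable one, and further layers could be stacked on top of it in the same way.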
Speed and Simulation: Simulation becomes practical and enjoyable when computers can execute instructions quickly. In the past, simple simulations like ballistic trajectory calculations required teams of people using desk calculators. Alan Turing proposed a simple machine capable of simulating any computation, laying the foundation for modern computers.
The Universal Machine: The digital computer is a universal machine capable of simulating any other machine given sufficient memory, speed, and external storage. The concept of universality stems from the basic properties of memory, symbol manipulation, and testing. Ada Lovelace (Augusta Ada Byron), often considered history’s first programmer, recognized the generality of representation in the Analytical Engine, extending beyond numeric calculation to general symbolization.
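Turing's construction can be illustrated with a toy simulator. This Python sketch (a simplified illustration, not any historical machine) drives an unbounded tape from a table of (state, symbol) rules; the example machine simply flips every bit of its input:

```python
from collections import defaultdict

def turing(rules, tape, state="start", halt="halt", steps=1000):
    """Run a rule table of (state, symbol) -> (write, move, next_state)."""
    # The tape is unbounded; unwritten cells read as the blank "_".
    cells = defaultdict(lambda: "_", enumerate(tape))
    head = 0
    for _ in range(steps):
        if state == halt:
            break
        write, move, state = rules[(state, cells[head])]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# A machine that flips every bit, then halts at the first blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(turing(flip, "1011"))  # prints "0100"
```

The striking point is that nothing beyond this — a rule table, a tape, and the ability to read, write, and move — is needed in principle to simulate any computation, which is exactly the universality the summary refers to.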
The Turing Legacy and Mathematical Influences: Early computer pioneers, many with mathematical backgrounds, were aware of various formulations of universal machines, including Turing’s work, Gödel’s theorem, and Emil Post’s production systems. These mathematical concepts translated well to computer machinery.
The Practicality of Universal Machines: The belief in the computer’s potential for growth in size and speed was crucial for the practical realization of Turing’s ideas. In the 1960s and 1970s, building simple computers to simulate more complex hardware became common practice, exemplified by the Alto at Xerox PARC.
Moldability and Romantic Appeal: The moldability of computers allows them to take on different personalities and virtual machines at the lowest level. This tractability and romanticism are unique to computers compared to other machines. The romantic appeal of computers is akin to the appreciation for musical instruments and other marvelous inventions, but with the added capacity to build private universes.
The Importance of Human Comprehension: Despite the computer’s ability to create private universes, it ultimately relies on human comprehension for meaning. Similar to a piece of music, the output of a computer must be understood by a human sensorium.
02:01:54 Programming Paradigms and Virtual Machines
Programming Paradigms: In traditional programming, memory holds inert ingredients, programs are recipes, and the central processor is the cook following the recipe. In the puppet-theater analogy, the puppets are inert objects manipulated by an energetic puppet master (the central processor). Object-oriented programming inverts this: the puppets pull their own strings, each with its own behavior and self-containment, which makes programs simpler and easier to write.
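The puppet analogy can be sketched directly. In this illustrative Python fragment (class and function names are invented), the first style has a central "puppet master" that knows every kind of puppet and manipulates inert records from outside, while the object-oriented style gives each puppet its own behavior and merely sends it a message:

```python
# Traditional style: inert data, one central routine that knows
# every behavior and manipulates the data from outside.
def animate(puppet):
    if puppet["kind"] == "bird":
        puppet["y"] += 1
    elif puppet["kind"] == "fish":
        puppet["y"] -= 1

bird_record = {"kind": "bird", "y": 0}
animate(bird_record)  # the "puppet master" pulls the strings

# Object-oriented style: each puppet pulls its own strings by
# carrying its behavior with it; the loop just sends a message.
class Bird:
    def __init__(self):
        self.y = 0
    def animate(self):
        self.y += 1

class Fish:
    def __init__(self):
        self.y = 0
    def animate(self):
        self.y -= 1

stage = [Bird(), Fish()]
for puppet in stage:
    puppet.animate()  # same message, different self-contained behavior
```

Note that adding a new kind of puppet in the first style means editing the central routine, whereas in the second it means adding one self-contained class — the simplicity Kay attributes to objects.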
Agent-Oriented Programming: A higher level of programming where elements are more like biological cells, self-contained and contributing to the larger system. Moving away from sequential programming towards more parallel and autonomous processes.
Computational Limits: As we move to higher levels of virtuality, the computer has to work harder to maintain performance. Analogy of a go-kart climbing a hill: Limited horsepower may not be sufficient for steeper inclines.
Measuring Computational Complexity: Example of opening a file on a Mac desktop: Counting the number of instructions executed provides an estimate of computational complexity. A slow response time indicates a higher number of instructions executed.
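Kay's instruction-counting idea can be approximated in software. The sketch below is only a rough stand-in — it counts Python trace events rather than real machine instructions — but it shows the principle that a heavier operation registers a proportionally larger count:

```python
import sys

def count_ops(func, *args):
    """Count traced execution events while func runs, as a crude
    proxy for the number of instructions executed."""
    counter = 0
    def tracer(frame, event, arg):
        nonlocal counter
        counter += 1
        return tracer  # keep tracing line-by-line inside calls
    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return counter

def linear(n):
    total = 0
    for i in range(n):
        total += i
    return total

# More work should register as a larger count of traced steps.
small, large = count_ops(linear, 10), count_ops(linear, 1000)
```

Counting steps this way, rather than relying on wall-clock time alone, separates work that is intrinsic to the task from work imposed by how the system was programmed — the distinction the next point draws.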
Complexity in Computing Systems: Complexity often arises from the way the system is programmed rather than being intrinsic.
Tools and Agents: Tools and agents are extensions of human capabilities, both physical and mental. Language and mathematics are examples of mental tools. Tools extend gestures and amplify human abilities.
02:07:15 Tools, Agents, and the Intimate Way of Relating to Computers
Tools vs Agents: Tools: Things manipulated by humans to extend their capabilities, such as language and physical tools. Agents: Entities that watch and manage humans, taking on their goals and acting on their behalf.
Agents in Computing: Computers can host agents that don’t need to be as smart as humans, just capable of flexible competence and taking on specific goals. Agents can work 24 hours a day, monitoring and searching for information relevant to a user’s goals.
Driving Forces for Agents: The proliferation of inexpensive integrated circuits made computers widely available, creating the need for easier-to-use user interfaces. Pervasive networking, in turn, creates a need for agents to navigate and manage the vast amount of information and resources available.
Example of an Agent: A newsgathering agent that works overnight to find and gather relevant news articles, photographs, and other information based on a user’s interests and preferences.
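Such an agent can be caricatured in a few lines. In this hypothetical Python sketch (the interests, feed items, and function names are all invented), a standing profile of the user's interests is matched against an incoming stream overnight and the finds are ranked for the morning:

```python
# A standing profile of the user's interests (hypothetical).
interests = {"education", "interface", "smalltalk"}

# A stream of items arriving while the user is away (hypothetical).
incoming = [
    {"title": "New interface research at PARC", "tags": {"interface", "gui"}},
    {"title": "Stock report", "tags": {"finance"}},
    {"title": "Smalltalk in education", "tags": {"smalltalk", "education"}},
]

def overnight_run(stream, profile):
    """Rank items by how many of the user's interests they touch."""
    picks = [(len(item["tags"] & profile), item["title"]) for item in stream]
    return [title for score, title in sorted(picks, reverse=True) if score > 0]

morning_briefing = overnight_run(incoming, interests)
# The two-interest match outranks the single-match story; the
# irrelevant item is dropped entirely.
```

A real agent would add learning, scheduling, and network retrieval, but the core loop — flexible competence in service of a user's standing goals — is already visible in this toy form.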
The Next Revolution in Computing: The intimate way of relating humans to computers: Instead of the institutional mainframe or the desktop model, intimate computing involves personal agents that assist users in various aspects of their lives.
02:12:57 The Dynabook Concept: A Pervasive, Seamless Computing Experience
The Dynabook Concept: The Dynabook is a conceptual device that embodies a vision for the future of computing. It is imagined as a personal, portable device that is continuously connected to a worldwide informational network. The Dynabook serves as a holy grail that inspires and guides ongoing efforts in the field of computing.
Physical Renderings of the Dynabook: Three physical renderings of the Dynabook were envisioned: a notebook form factor, a wearable device with a head-mounted display, and a sensitive wristwatch interface.
Service and Intimacy with the Dynabook: The Dynabook emphasizes the relationship between the user and the device. It aims to foster an intimate, casual, and mundane connection with the user. The Dynabook allows users to aspire to great heights, similar to learning the English language.
Low Threshold, No Ceiling: The Dynabook strives to provide a seamless and accessible user experience. It aims to eliminate barriers and allow users to explore and create without limitations.
Apple’s Educational Focus and Current Work: Apple has a tradition of supporting education and exploring the future of schooling. The Open School in Los Angeles is a platform for Apple’s work in this area. One project focuses on helping children write as fluently on computers as they can read.
Artificial Intelligence Tools for Children: Efforts are underway to develop tools that enable children to engage with artificial intelligence. These tools aim to help children understand AI and learn how to program it. The goal is to make AI programming accessible and engaging for younger generations.
02:16:22 Future-Forward Education: Learning with Technology
Key Insights: Alan Kay’s educational philosophy emphasizes children’s active involvement in creating ecologies and studying real animals and plants. He believes this approach fosters a deeper understanding of the natural world and promotes continuity between children’s play and adult activities.
Kay criticizes the disconnection between children’s games and the content-rich activities of adults in the 20th century. He argues that children today are often disenfranchised from meaningful content, leading to a lack of engagement and motivation in learning.
Kay advocates for parental involvement as a crucial factor in fostering healthy schools. Children’s values and interests, largely influenced by their parents, shape how they spend their time and approach learning; parental involvement helps create a supportive environment where technology can amplify children’s natural curiosity and drive for exploration.
Kay emphasizes that technology alone cannot solve educational problems. He draws a parallel between the introduction of pianos in classrooms without proper instruction and the potential misuse of technology, stressing the need for a healthy value system and parental involvement to ensure that technology enhances learning rather than hindering it.
Kay acknowledges the ongoing improvements in hardware and the potential of artificial intelligence projects like the Cyc project. He praises the project’s focus on generating new ideas for representing knowledge and the ingenuity of its researchers, and believes that even if the project does not fully achieve its design goals, its innovative approaches will yield valuable insights and advancements.
Kay highlights the importance of social dynamics and control in education: the use of technology in schools is shaped by the complex interactions between teachers and students and the underlying structures of control within the educational system.
Common Sense as a Fabric for Knowledge: Common sense provides a continuous fabric for understanding and reasoning, allowing us to navigate between areas of expertise. Expert systems often lack this fabric, leading to abrupt failures when venturing outside their specific domain of knowledge.
Art and Common Sense: Art is not the truth, but it tells the truth through a lie. Common sense is similar, providing a coherent framework for understanding the world, even if it may not be entirely accurate.
Expanding Common Sense with Technology: Computers can potentially expand our common sense by providing sensory contact with things beyond our normal perception. This can include things much smaller or larger than our scale of experience.
Science and Common Sense: Science often contradicts common sense, as it deals with phenomena outside our sensory domain. For example, science tells us that the sun doesn’t rise but rather that the Earth is turning. Despite these contradictions, we still rely on common sense to navigate the world, as it provides a comprehensive and universal way of understanding.
Legacy of Artificial Intelligence: Artificial intelligence (AI) was once defined as all that we don’t know how to do yet. AI has been useful in many areas, including natural language processing, image recognition, and robotics. Some of the most creative and eccentric people in AI have made significant contributions to the field.
02:27:21 Artificial Intelligence: The Pursuit of Human-Like Intelligence
Prediction in the 20th Century: Extrapolation has been the least effective method of prediction in the 20th century.
Virtual Reality: Many people will enjoy virtual reality, especially those who don’t enjoy the current reality. Virtual reality can offer immersive experiences beyond the normal senses, such as exploring a cell or experiencing dramatic situations. Certain types of content, like action-packed movies, are easy to create in virtual reality.
Artificial Intelligence: Many AI problems from the 1950s and 1960s are now taught in computer engineering courses. Strong human-like intelligence in AI is still a distant goal. There is motivation to understand human intelligence and explore alternative ways to achieve it. There may be a higher level of abstraction than the neuronal level at which artificial intelligence can be achieved. Nature’s biochemical processes are inefficient compared to human-designed chemical processes, suggesting that a neuronal level simulation may not be necessary for artificial intelligence.
Thought Experiments in Virtual Reality: Virtual reality enables real thought experiments, including classic experiments like those of Einstein and Bohr. However, simulations are not always accurate representations of real life and can lead to incorrect conclusions. Simulations can also be used to explore non-physical situations and help or mislead people.
Extrapolative Predictions: Extrapolation from current trends is not a reliable method for predicting the future of technology.
Amplifying Human Endeavor: Kay argues that predictions about the computer’s future should focus on its potential to amplify human capabilities. The computer’s power lies in its ability to serve as a medium that enhances human endeavors.
Insights from Human Psychology: Kay draws insights from psychological models, such as Freud’s and Bruner’s, to inform the design of user interfaces.
Misconceptions about the Computer’s Capabilities: Kay challenges the notion that the computer’s utility is limited by its ability to perform specific tasks. He emphasizes the emergence of new applications and possibilities as technology advances.
The Future of the Computer: Kay envisions a future where the computer becomes an unremarkable presence in our lives, akin to essential technologies like phones and wristwatches.
The Computer as an Embodiment of Previous Media: Kay believes that the computer has the potential to encompass and enhance various forms of media, transforming how we interact with information and communicate.
Caution in Simulating One Thing with Another: Kay warns that simulating one medium with another can lead to trade-offs and the loss of certain qualities inherent in the original medium.
Computer Simulation and Human Senses: Simulating experiences in computers involves trade-offs between accuracy and aesthetics. Using all human senses in an experience cannot be completely replicated through simulation.
Comparing Computers to the Printed Book: Computers are currently in an unremarkable stage similar to the early days of the printed book. The potential of computers to revolutionize civilization is comparable to the impact of the printed book.
Impact of Computers on Individuals: The transformative effects of computers may not reach everyone, just as many people have not fully experienced the ideas of the 20th century. Individual experiences with computers vary greatly, and not everyone may benefit from their potential.
Barriers to Widespread Literacy: Despite advancements in literacy and the availability of resources like libraries and books, a significant portion of the population has not been carried along with these developments.
Complexity of Reading and Writing: Reading and writing require a higher level of learning compared to visual media like television, which can be understood with minimal effort. This creates a barrier for many individuals.
Equivalence Myth: The misconception that books and television can convey the same messages equally well is contradicted by evidence. Television excels at engaging emotions, introducing individuals, and piquing interest, but it lacks the depth and analytical capabilities of written media.
Engagement and Critical Thinking: Television’s strength in engagement becomes its weakness when it comes to critical thinking. Its immersive nature hinders individuals from stepping back and considering alternative perspectives or exploring deeper connections.
Amplitude Problem: Different media require varying levels of concentration. Written media demands more focus, while certain forms of television and games may not. This difference in amplitude can impact the suitability of these media for different individuals and purposes.
02:42:07 Challenges of Technological Innovations in Society
Reading Difficulty and Concentration: The amount of concentration required to read well varies among individuals, with some reading a book a day and others struggling. Most people never learn to read fluently, making it difficult to process large amounts of reading material. This is similar to the concentration required for activities like tennis and music, which becomes effortless with practice.
Ideas from Laboratories to Commercial Use: The time it takes for ideas to move from laboratories to the commercial world is surprisingly long. This is due to various types of inertia, both positive and negative. Even revolutionary ideas can take decades to emerge in the commercial world.
Cutting Down the Time for Ideas to Commercialize: There is currently no clear way to reduce the time it takes for ideas to reach the market. It requires people to grow up interested in multiple perspectives and be open to ideas outside their current worldview. Monotheistic religious backgrounds and single-minded approaches to ideas hinder the adoption of new concepts.
Japanese Flexibility and the Future: The Japanese culture’s flexibility in accommodating multiple religions and ideas simultaneously is an advantage. Civilizations that embrace diversity in ideas and perspectives are more likely to succeed in the future.
Updated Article: The Evolution of Computing: From Early Computers to the Future of Human-Computer Interaction
Abstract:
This article traces the transformative journey of computing, ranging from massive early machines to the evolving human-computer interaction of today. It explores key stages, including batch processing, early visions of computing, the rise of personal computing, the conceptualization of computers as meta-media, and the significance of Xerox PARC’s innovations. The discussion also delves into the complexity of programming paradigms, the influence of tools and agents on human capabilities, and the promising future of intimate computing with agents. The article emphasizes the profound impact of computing on human creativity, perception, and adaptability, suggesting that its future holds even more transformative potential.
1. The Dawn of Computing Era
The journey of computing began with massive, costly machines primarily employed for computational tasks. Their size and purpose-specific design limited broader applications. This period was characterized by a factory-like approach known as batch processing, where computing tasks were processed in limited runs, often leading to delays in results.
2. Human Perception and Early Visions
Initially, these large machines were shrouded in an aura of mythology and awe. Gradually, a shift occurred, viewing computers as potential partners. Early visions of computing diverged into two schools of thought: intelligence amplification and networked information utilities. Innovators like Ivan Sutherland and Doug Engelbart played pivotal roles in this era, with Sutherland’s Sketchpad laying the groundwork for modern computer graphics and Engelbart’s system, demonstrated in 1968, pioneering features like the mouse and hypertext.
3. Personal Computing and Its Challenges
Attempts at personal computing in these early stages faced various limitations. Systems like the Flex Machine, designed for biomedical research, were restricted in storage and programming capabilities. Personal computers of the time were compared to the Model T, and while they brought new dimensions to computing, they were not yet user-friendly, especially for children.
4. Breakthroughs at Xerox PARC
A significant leap in computing came with the founding of Xerox PARC, which created a miniaturized ARPA community. This center was instrumental in developing groundbreaking technologies like the graphical user interface (GUI), Ethernet, and the concept of the Dynabook. PARC’s focus on studying human factors led to the design of user-friendly systems, significantly influencing future technological developments.
PARC’s Opinions on 8-Bit Machines:
Xerox PARC’s Alan Kay believed 16-bit machines were necessary for the advanced software and capabilities PARC envisioned. In contrast, Larry Tesler was more optimistic about the possibilities of 8-bit machines.
Kay’s Regret Over Early Microcomputer Development:
Kay expressed regret over the early proliferation of microcomputers due to the negative impact on computing standards. He believed delaying their introduction until 1984 or later would have allowed for better standards to be established.
VisiCalc: A Notable Exception:
Despite his criticism, Kay acknowledged the significance of VisiCalc, an exceptional application that demonstrated the potential of microcomputers for practical use.
Comparison to Early 1960s Computing:
Kay drew parallels between the capabilities of early 8-bit microcomputers and the limited computing power of the early 1960s. Both eras were characterized by weak machines and rudimentary operating systems.
5. The Computer as Meta-Media
Computers were envisioned as the first meta-media, capable of simulating all other media forms, leading to a cultural shift akin to the invention of writing or printing. This period also saw the conceptualization of computers as tools capable of creating “pocket universes” with different physical laws, thus extending the boundaries of human perception and creativity.
6. Evolution of User Interfaces
The development of user interfaces underwent a significant transformation at Xerox PARC. The Alto computer, a pioneering machine, introduced features like bitmap displays and overlapping windows. These advancements laid the foundation for future personal computing devices like the Macintosh and various software categories.
7. The Future Directions in Human-Computer Interaction
The future of human-computer interaction, as envisioned by pioneers like Alan Kay, includes a seamless connection to a global information network and a more intimate, casual relationship with technology. This vision extends to education, where technology is seen as a tool to amplify children’s interests and learning experiences.
8. The Role of Common Sense and Science in Computing
The relationship between common sense, science, and technology is intricate. While common sense provides a continuity and context, science often contradicts it by presenting concepts outside the sensory domain. The role of AI and virtual reality in this context is to expand our common sense and provide new ways of interacting with and understanding the world.
Educational Vision and the Role of Technology:
Alan Kay believes that children should be actively involved in creating ecologies and studying real animals and plants to develop a deeper understanding of the natural world. He criticizes the disconnect between children’s games and the content-rich activities of adults, which leads to a lack of engagement and motivation in learning. Kay emphasizes the importance of parental involvement in fostering healthy schools and creating a supportive environment where technology can amplify children’s natural curiosity and drive for exploration. He acknowledges the potential of artificial intelligence projects like the Cyc project in generating new ideas for representing knowledge and fostering innovation in education.
Common Sense and Artificial Intelligence:
Common sense provides a fabric for understanding and reasoning that allows us to navigate between areas of expertise, while expert systems often lack this fabric and face abrupt failures when venturing outside their specific domain. Art, similar to common sense, tells the truth through a lie by providing a coherent framework for understanding the world, even if it may not be entirely accurate. Computers can potentially expand our common sense by providing sensory contact with things beyond our normal perception, such as exploring a cell or experiencing dramatic situations. Science often contradicts common sense by dealing with phenomena outside our sensory domain, but it nevertheless provides a valuable framework for understanding the world. Artificial intelligence has been useful in many areas, and some of the most creative and eccentric people in AI have made significant contributions to the field.
Prediction, Virtual Reality, and Artificial Intelligence:
Extrapolation has been the least effective method of prediction in the 20th century. Virtual reality can offer immersive experiences beyond the normal senses, appealing to those who may not enjoy the current reality. Certain types of content, like action-packed movies, are easy to create in virtual reality. Many AI problems from the 1950s and 1960s are now taught in computer engineering courses, but strong human-like intelligence in AI remains a distant goal. There is motivation to understand human intelligence and explore alternative ways to achieve it, potentially at a higher level of abstraction than the neuronal level. Nature’s biochemical processes are inefficient compared to human-designed chemical processes, suggesting that a neuronal level simulation may not be necessary for artificial intelligence. Virtual reality enables real thought experiments, but simulations can also lead to incorrect conclusions or misleading interpretations.
9. The Sociocultural Impact of Computing
The article concludes by reflecting on the sociocultural implications of computing. The adoption of new technologies is often hindered by various cultural and social barriers. However, societies that embrace diverse perspectives and are flexible in their thinking have an advantage in innovation and adopting new technologies.
In essence, the evolution of computing is not just a technological journey but a reflection of human creativity, perception, and adaptability. The future of computing, intertwined with human experience, promises to continue shaping our world in profound ways.
Supplementary Update:
Reading Difficulty and Concentration
The amount of concentration required to read well varies significantly, with some individuals reading a book a day while others struggle to process large amounts of reading material. This phenomenon is akin to the varying degrees of concentration needed for activities like tennis and music, which become effortless with practice.
Ideas from Laboratories to Commercial Use
Surprisingly, the time it takes for ideas to move from laboratories to the commercial world can be lengthy. This is attributed to various types of inertia, both positive and negative, that impact the pace of adoption. Even groundbreaking ideas may take decades to emerge in the marketplace.
Cutting Down the Time for Ideas to Commercialize
Currently, there is no clear solution for accelerating the transition of ideas from research settings to commercial applications. To achieve this, individuals need to cultivate an interest in multiple perspectives and remain open to concepts beyond their current worldview. Monotheistic religious backgrounds and narrow-minded approaches to ideas can hinder the acceptance of innovative concepts.
Japanese Flexibility and the Future
The Japanese culture’s ability to accommodate diverse religions and ideas simultaneously offers a distinct advantage. In the future, civilizations that embrace diversity in ideas and perspectives will be better positioned to thrive.