Alan Kay (VPRI Co-founder) – Is it really “Complex”? Or did we just make it “Complicated”? (Jul 2014)
Chapters
00:00:06 Conceptualizing Complexity and Complication in Software Systems
Contextualizing the Issue: Alan Kay opens his talk by emphasizing the importance of context when addressing software complexity. He suggests starting with a metric, such as lines of code, while acknowledging its limitations, and conveys the sheer size of modern codebases by comparing them to physical books and to the Empire State Building.
Software Complexity and Complication: Kay introduces two terms: complexity and complication. Complexity is an intrinsic measure of the meaning of the code, while complication refers to additional complexities introduced during the development process. He argues that the goal should be to minimize complication and bring it closer to the intrinsic complexity of the software.
Personal Computing as an Example: Kay uses personal computing as an example because of its familiarity and relevance to the audience. He questions whether Microsoft’s Windows and Office, at roughly 120 million lines of code, really contain that much content, highlighting the gap between intrinsic complexity and accumulated complication.
Xerox PARC’s Experience: Kay draws upon his experience at Xerox PARC, where they developed the operating system, programming language, applications, and user interface with only 10,000 lines of code. He contrasts this with modern software development practices, where programmers optimize code for Intel processors instead of designing hardware around the software.
Misalignment between Hardware and Software: Kay criticizes the current approach of building hardware first and then optimizing software to run on it, calling it “completely backwards.” He advocates for defining the software system first and then building hardware specifically tailored to run it efficiently.
Field Programmable Gate Arrays and Microcode: Kay mentions the potential of field programmable gate arrays and microcode for customizing hardware to suit software needs. He expresses frustration at Intel and Motorola’s lack of interest in incorporating such features into their chips.
Science and Hairballs: Kay introduces the concept of science as a process of untangling hairballs, using a woolly mammoth’s hairball and the starry heavens as examples. He emphasizes the difficulty of directly grasping the truth from nature and the need to interpret phenomena to understand it.
00:12:13 Science, Systems, and the Limits of Understanding
Science: Science is not the phenomena or the story but the relationship between them. It’s not a new religion with truths to be learned but a process of understanding the relationship between phenomena and stories.
Maxwell’s Equations: Maxwell’s equations, describing light and radiation, are too unwieldy to fit on a T-shirt in their original form. Heaviside devised macro operators (curl, divergence, and gradient) that compress them into the familiar four vector equations. Even in that form the electric and magnetic fields are not stated symmetrically, an asymmetry Einstein later resolved in his special theory of relativity.
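For reference, this is the “T-shirt” form those operators make possible, written here in SI units:

```latex
\begin{aligned}
\nabla \cdot \mathbf{E} &= \rho/\varepsilon_0, &
\nabla \times \mathbf{E} &= -\,\partial \mathbf{B}/\partial t, \\
\nabla \cdot \mathbf{B} &= 0, &
\nabla \times \mathbf{B} &= \mu_0 \mathbf{J} + \mu_0 \varepsilon_0\, \partial \mathbf{E}/\partial t.
\end{aligned}
```

The asymmetry Kay alludes to is visible at a glance: the electric field has charge sources, while the magnetic field has none.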
Biology: Schrödinger’s wave equation plays a role analogous to Maxwell’s equations, underlying the chemistry from which biological phenomena emerge. Simulations based on it have revealed new insights into nature, confirmed by subsequent observations. Biological systems involve complex organizations and interactions of molecules, making individual details less relevant; the systems and architectural aspects are what matter for understanding how organisms develop and function. Errors during morphogenesis are often canceled out, resulting in robust and functional organisms.
Recommended Reading: Molecular Biology of the Cell, 3rd edition, is recommended for those interested in systems biology.
00:18:27 Connecting Biology, Math, and Engineering for Innovation
Mathematical and Biological Foundation: Complexity and complication problems require a combination of math and biology to simplify complexities. Biology is kludgy, offering opportunities for better design in engineering.
The Bridge as a Metaphor for Phenomena: Once built, a bridge is itself a phenomenon, regardless of how it was made. The merging of science and engineering has made possible feats like the two-mile bridge span in Japan.
The Role of Computer Phenomena: Computer hardware and software can themselves be viewed as phenomena. Lisp, developed by John McCarthy, is an interesting “T-shirt”: its half-page self-definition captures a whole language, and it offers far more possibilities than Fortran.
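To make the T-shirt claim concrete, here is a minimal sketch of a Lisp-style evaluator in Python, in the spirit of McCarthy’s metacircular definition. It is an illustration only (no macros, strings, or error handling), not any particular historical implementation:

```python
# A minimal Lisp-style evaluator: a sketch of why McCarthy's Lisp
# "fits on a T-shirt". The core language's evaluator is tiny.
import operator as op

def tokenize(src):
    return src.replace('(', ' ( ').replace(')', ' ) ').split()

def parse(tokens):
    tok = tokens.pop(0)
    if tok == '(':
        expr = []
        while tokens[0] != ')':
            expr.append(parse(tokens))
        tokens.pop(0)  # drop the closing ')'
        return expr
    try:
        return int(tok)
    except ValueError:
        return tok  # a symbol

GLOBAL_ENV = {'+': op.add, '-': op.sub, '*': op.mul, '<': op.lt}

def evaluate(x, env=GLOBAL_ENV):
    if isinstance(x, str):                 # symbol: look it up
        return env[x]
    if isinstance(x, int):                 # literal: self-evaluating
        return x
    head = x[0]
    if head == 'if':                       # (if test then else)
        _, test, then, alt = x
        return evaluate(then if evaluate(test, env) else alt, env)
    if head == 'lambda':                   # (lambda (params) body)
        _, params, body = x
        return lambda *args: evaluate(body, {**env, **dict(zip(params, args))})
    f = evaluate(head, env)                # application
    return f(*(evaluate(arg, env) for arg in x[1:]))

print(evaluate(parse(tokenize('((lambda (n) (* n n)) 7)'))))  # prints 49
```

The point survives the toy scale: a few dozen lines suffice to define evaluation for a language in which the evaluator itself could be written.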
Scaling Up and the Need for Molecular Biology: As we scale up from programming languages to personal computing and the internet, the need for something like the molecular biology of the cell arises. Real object-oriented languages, personal computing systems, and the internet exhibit math-like, bio-like, ecology-like, and politics-like characteristics.
Communication Challenges in the Intergalactic Network: Licklider envisioned the intergalactic network, aiming to reach every human on the planet. The biggest problem in such a network would be learning to communicate with aliens, both human and software.
Negotiation and Theater in Computing: As we scale up, negotiation becomes the appropriate level of coordination. User interface, the theater part, is crucial for making computing accessible.
Books for Further Reading: As candidates for computing’s equivalent of Molecular Biology of the Cell, Kay points to Christopher Alexander’s classic 1960s book on architecture, Notes on the Synthesis of Form, and to Fred Brooks’s The Design of Design, which draws on his long experience with software systems.
00:25:37 Inventive Effort for Higher-Level Languages and Architectures
Challenges in Computing: Alan Kay emphasizes the lack of a formal science of system design and the difficulty in scaling up design ideas. We tend to favor tactical approaches over strategic thinking in problem-solving.
Strategic Thinking in Architecture: Kay highlights the invention of the arch as an example of strategic thinking in architecture. Arches allow for more intricate and durable structures compared to simple stacking of bricks.
Tinkering and Funding: Kay criticizes the common funding model in computing, which often lacks funding for problem finding. He suggests setting a desirable future vision and involving experts to identify the actual problems.
Programming Languages and Complexity: Kay discusses the spectrum of programming languages from machine code to higher-level languages. As we move towards higher-level languages, the programming effort should decrease, but inventing these languages requires significant upfront investment.
Stuck in the Past: Kay points out that many programming languages and styles we use today, such as C++ and Java, are rooted in 1960s and 1970s designs like Simula. This stagnation hinders progress toward more efficient and effective programming.
Funding Challenges: Kay highlights the difficulty in obtaining funding for research on higher-level programming languages and architectures. Organizations like NSF are reluctant to fund such projects, and private companies often prioritize short-term profits.
00:32:45 Reevaluating Programming Crew Size in the Era of Advanced Technology
The Clipper Ship Era: The great sailing ships of that era, like the Peking, were massive vessels that could stay at sea for up to two years, completely self-sufficient. They had crews of about 100 highly trained individuals who could handle any situation, even in the dark or during a hurricane.
The Nuclear Submarine Era: Modern nuclear submarines are about the same size as clipper ships but have crews of about 130 people. This raises the question of whether such a large crew is necessary or efficient, especially in emergency situations.
The Airplane Era: In contrast to submarines, airliners like the Airbus A380 and the Boeing 747 can be flown by just two people. This is possible thanks to advanced automation and because airplanes do not demand the same level of manual labor as ships or submarines.
The Problem with Large Programming Crews: Alan Kay argues that the large programming crews common in software development today are a relic of the era when code was written in machine code and far more of it had to be written. Today, programming should be done at the level of specifications or requirements; the obstacle is that specifications, as currently written, are not runnable or debuggable.
The Need for Consistent and Debuggable Specifications: The first step in improving software development is to make specifications consistent and debuggable, which means making them runnable. A runnable specification can be executed on a supercomputer as a prototype of the system; if the prototype succeeds, it can be optimized and shipped to users.
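As a small illustration of what a runnable, debuggable specification might look like, here is a sketch in which a requirement is written as a checkable property and executed directly. All names here (spec_sorted, my_sort) are hypothetical examples, not anything from Kay’s systems:

```python
# A "runnable specification" sketch: the requirement is stated as a
# checkable property rather than prose, then exercised on random cases.
import random

def spec_sorted(inp, out):
    """The requirement itself: output is ordered and is a permutation of input."""
    return all(a <= b for a, b in zip(out, out[1:])) and sorted(inp) == sorted(out)

def my_sort(xs):
    # Candidate implementation under test (hypothetical stand-in).
    return sorted(xs)

for _ in range(1000):  # "run" the spec against random inputs
    case = [random.randint(-100, 100) for _ in range(random.randint(0, 20))]
    assert spec_sorted(case, my_sort(case)), f"spec violated for {case}"
print("specification held on 1000 random cases")
```

A failing case would surface as a concrete, debuggable counterexample rather than an ambiguity in a prose document.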
The Importance of Focusing on Goals and Requirements: Instead of focusing on the number of people needed to develop a software system, we should focus on the number of people needed to define the goals and requirements of the system. This is a more important and challenging task that requires a deep understanding of the problem domain.
Computing Should Serve Itself Better: Computing has been terrible at developing systems for itself over the last 30 years. We should be able to create better languages and development environments for ourselves, rather than relying on inadequate tools and weak expressiveness.
00:38:22 The Evolution of Interactive Computer Graphics and Personal Computing
Early Interactive Computer Graphics: Computer graphics and interactive systems were introduced 50 years ago with Ivan Sutherland’s Sketchpad, one of the earliest interactive systems, whose constraint-based, object-oriented approach to programming was revolutionary.
Key Features of Sketchpad: interactive drawing and manipulation of objects; a masters-and-instances concept that allows reusable components; programming by specifying constraints and relationships; and the ability to construct complex structures such as bridges and electric circuits without predefined knowledge of them.
Sketchpad’s Recreation and Modern Applications: a recreated Sketchpad bridge simulation is demonstrated on a laptop; Sketchpad’s principles are extended to modern simulations, including text and dynamic weight manipulation; and the three main constraints used in the simulation (gravity, spring constant, and pin) are explored.
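To give a feel for how few moving parts such a demo needs, here is a toy mass-spring sketch in Python using the same three constraint kinds. The constants and the explicit-integration solver are arbitrary choices for illustration, not a reconstruction of the actual demo:

```python
# A toy "bridge" in the spirit of the Sketchpad demo: three constraint
# kinds (gravity, springs, pins) applied on every simulation step.
points = [{'pos': [float(i), 0.0], 'vel': [0.0, 0.0], 'pinned': i in (0, 3)}
          for i in range(4)]                 # a 4-point deck, ends pinned
springs = [(0, 1), (1, 2), (2, 3)]           # neighbors joined by springs
GRAVITY, K, REST, DT = -9.8, 50.0, 1.0, 0.01

for step in range(1000):
    forces = [[0.0, GRAVITY] for _ in points]          # gravity constraint
    for i, j in springs:                               # spring constraint
        (xi, yi), (xj, yj) = points[i]['pos'], points[j]['pos']
        dx, dy = xj - xi, yj - yi
        dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
        f = K * (dist - REST)                          # Hooke's law
        fx, fy = f * dx / dist, f * dy / dist
        forces[i][0] += fx; forces[i][1] += fy
        forces[j][0] -= fx; forces[j][1] -= fy
    for p, (fx, fy) in zip(points, forces):
        if p['pinned']:                                # pin constraint: no motion
            continue
        p['vel'][0] += fx * DT; p['vel'][1] += fy * DT
        p['vel'][0] *= 0.99;    p['vel'][1] *= 0.99    # damping so it settles
        p['pos'][0] += p['vel'][0] * DT
        p['pos'][1] += p['vel'][1] * DT

# The unpinned middle points sag under gravity, held up by the springs.
print([tuple(round(c, 2) for c in p['pos']) for p in points])
```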
Philosophical Implications and Beyond PowerPoint: Kay critiques PowerPoint and argues for more innovative presentation tools, emphasizing the computer’s potential as a simulator that enables immersive, interactive experiences. He introduces “Frank,” a collection of experimental systems exploring the complexity of personal computing, whose presentation capabilities are integrated within the experimental systems themselves.
00:47:50 Developing New Languages for Personal Computing: Creating Efficient Graphics Systems
High-Level Overview: Alan Kay discusses the conceptual framework, design principles, and implementation details of a personal computing system that aims to simplify and enhance the process of understanding complex systems.
Key Points: The system leverages DSLs (Domain-Specific Languages) to express various aspects of computing, enabling modularity and code reduction. The graphics system is implemented with 435 lines of code, significantly smaller than traditional approaches, demonstrating the power of DSLs. A meta-compiler is employed to handle structured objects, facilitating parsing, transformations, and optimizations.
Implementation Details: Dan Amelang developed a novel formula for defining arbitrary shapes with pixels, eliminating the need for supersampling. A visualizer was created to provide an intuitive understanding of the graphics code, enhancing debugging and comprehension. Compositing routines are concisely implemented, expressing complex mathematical operations in a few lines of code. Pen stroking, texturing, gradients, and Gaussian filters are implemented with minimal code, showcasing the efficiency of the DSL approach.
The Meta-compiler: Alex Warth developed a meta-compiler capable of handling structured objects, enabling parsing, transformations, and optimizations. It generates efficient code for various platforms, including JavaScript, C, and machine code.
Conclusion: Alan Kay emphasizes the importance of suppressing the present and exploring alternative ways of thinking to envision a different future for computing. The personal computing system presented demonstrates the potential of DSLs in simplifying complex systems, leading to code reduction and improved understanding.
01:03:21 New Approaches to Programming: Making Code More Efficient and Creative
Using a Metalanguage to Create Languages: Alan Kay discusses the concept of using a metalanguage to create languages. The metalanguage used in this case is a language for making languages. By using a metalanguage, it is possible to create a language with only 76 lines of code, plus additional lines for optimization.
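To make the metalanguage idea concrete, here is a minimal parser-combinator sketch in Python: grammars become ordinary values built from a handful of functions, which is the spirit (though not the notation or API) of OMeta. The combinator names and the toy grammar are illustrative assumptions:

```python
# A tiny "language for making languages": PEG-style parser combinators.
def lit(ch):                       # match one literal character
    return lambda s, i: (ch, i + 1) if s[i:i + 1] == ch else None

def seq(*ps):                      # match parsers in sequence
    def run(s, i):
        out = []
        for p in ps:
            r = p(s, i)
            if r is None:
                return None
            v, i = r
            out.append(v)
        return out, i
    return run

def alt(*ps):                      # ordered choice, as in a PEG
    def run(s, i):
        for p in ps:
            r = p(s, i)
            if r is not None:
                return r
        return None
    return run

def digit(s, i):
    return (int(s[i]), i + 1) if s[i:i + 1].isdigit() else None

# grammar: expr <- digit (('+' | '*') digit)*  evaluated left to right
def expr(s, i):
    r = digit(s, i)
    if r is None:
        return None
    val, i = r
    while True:
        r = seq(alt(lit('+'), lit('*')), digit)(s, i)
        if r is None:
            return val, i
        (op_, rhs), i = r
        val = val + rhs if op_ == '+' else val * rhs

print(expr('2+3*4', 0))  # (20, 5): left-to-right, no precedence in this toy
```

Each new language rule is just another small function, which is why a metalanguage built this way can itself be defined in a page or two of code.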
TCP/IP Optimization: TCP/IP is a complex protocol that is typically implemented in thousands of lines of code. Alan Kay presents an implementation in only about 160 lines, achieved by viewing TCP/IP as a non-deterministic parser and expressing it in a higher-level form.
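As a small taste of the “protocol as parsing” view, here is a sketch that describes the fixed part of a TCP header declaratively and unpacks it in a few lines of Python. This illustrates the viewpoint only; it is not the STEPS project’s actual 160-line implementation:

```python
# Treating the protocol as parsing: the TCP header as a declarative layout.
import struct

TCP_HEADER = struct.Struct('!HHIIBBHHH')  # fixed 20-byte part, network byte order

def parse_tcp(segment: bytes) -> dict:
    (src, dst, seqno, ackno, offset_byte, flags,
     window, checksum, urgent) = TCP_HEADER.unpack(segment[:20])
    return {
        'src_port': src, 'dst_port': dst,
        'seq': seqno, 'ack': ackno,
        'data_offset': offset_byte >> 4,      # header length in 32-bit words
        'flags': {name for bit, name in
                  [(1, 'FIN'), (2, 'SYN'), (4, 'RST'),
                   (8, 'PSH'), (16, 'ACK'), (32, 'URG')] if flags & bit},
        'window': window,
    }

# a hand-built SYN segment: ports 1234 -> 80, sequence number 1000
syn = TCP_HEADER.pack(1234, 80, 1000, 0, 5 << 4, 0x02, 65535, 0, 0)
print(parse_tcp(syn))
```

Seen this way, most of a protocol stack is grammar plus a little state machine, which is what makes a radically smaller implementation plausible.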
Ant-Inspired Text Justifier and Editor: Kay describes an experiment where simple rules were applied to individual text characters, causing them to organize themselves into a text justifier and editor. This experiment demonstrates how simple rules can lead to complex behavior in a distributed system.
Document as Code: Kay introduces the idea of a document in the system that is also an essay containing code. He provides an example of code that disperses text and wraps it around, demonstrating how a different point of view can lead to more efficient and elegant solutions.
Systems as Intercommunicating Processes: Alan Kay proposes a conceptualization of computing systems as intercommunicating processes, with user interfaces as one such process.
Virtual Machines and Physical Machines: Virtual machines should map over physical machines in multiple ways, enabling adaptability, such as in mobile computing scenarios.
Recursive Internet Model: The system resembles a recursive version of the internet, with machine processors connected physically and virtual machines connected virtually through loose coupling.
Next-Generation Publish and Subscribe: Future systems will utilize next-generation publish and subscribe mechanisms, emphasizing semantics to handle differences between objects.
Negotiation Processes: Negotiation processes will be crucial to reconcile superficial differences between objects, ensuring compatibility and interoperability.
Loose Coupling and Dynamic Bindings: Loose coupling is employed in the system, allowing for dynamic bindings between objects, represented visually through lighting up when messages are sent.
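A minimal sketch of the loose-coupling idea: in a publish/subscribe arrangement the binding between sender and receiver is established dynamically, at send time. The Bus class and the topic names below are hypothetical illustrations, not Kay’s system:

```python
# Loose coupling via publish/subscribe: objects never call each other
# directly; they are bound through topics when a message is published.
from collections import defaultdict

class Bus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        for handler in self.subscribers[topic]:   # binding happens here,
            handler(message)                      # at run time, not compile time

bus = Bus()
bus.subscribe('temperature', lambda m: print('logger saw', m))
bus.subscribe('temperature', lambda m: print('alarm saw', m))
bus.publish('temperature', {'celsius': 41})       # both react; neither knows the sender
```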
Shifting from Solvers to Constraints and Meanings: The future of computing involves moving away from solvers and towards focusing on constraints, meanings, and relationships.
Automatic Libraries and Expert Systems: Automatic libraries and expert systems will assist in the design process, selecting appropriate solvers and organizing them into efficient solutions.
3D Printing Analogy: Kay compares the desired design process to 3D printing, where the focus is on creating the design and automating the production process.
n-Dimensional Printing in Computing: Computing involves n-dimensional printing, where relationships in dynamic n-space are printed, requiring more complex optimization techniques.
CAD-Like Development Systems: Kay envisions future development systems resembling CAD systems, enabling comprehensive design, testing, and simulation within a single environment.
Overall Points: Computing in the future will involve devices with significantly more power than current standard computing. Prototypes can be tested and designs considered for shipping on such powerful devices. Kay’s programming language, OMeta, is available for experimentation and includes a version that can run in a browser. Generating behavior, not just code, from specifications has been attempted in the past, including in the 1970s. Higher-level languages, including problem-oriented languages like ALGOL, PL/I, and COBOL, were developed to match specific problem domains.
Availability of OMeta: OMeta is accessible through a UCLA thesis from 2007 or 2008. A user-friendly version is available for experimentation, which can be downloaded and run in a browser. It includes approximately 20 languages, allowing users to explore how languages are made and create their own.
Discussion on Specification and Code Generation: Attempts have been made in the past to create specification languages that can be automatically compiled into high-level languages like C. The speaker corrects the notion that C is a high-level language, emphasizing its low-level nature. The evolution of higher-level languages has been driven by the goal of matching problem domains, as seen in the development of problem-oriented languages (POLs).
Highlighting the Sophistication of Sketchpad: Sketchpad demonstrates thinking far ahead of its time, surpassing much of today’s thinking as well. It represents the work of a single individual in a year, showcasing what a true PhD thesis can be, and it stands as an early example of developing higher-level languages that align with specific problem domains.
01:20:46 Innovations in Programming and Computing: From Dynabook to AI
Problems with Programming Languages: Alan Kay highlights the challenge of developing programming languages that minimize unnecessary code and allow for efficient real-time usage. The complexity of the task should not discourage efforts, but the value of a solution should be considered.
Nile’s Significance: Nile’s early development allowed for real-time use, encouraging its practical application. The transition from a demo to a usable tool involved substantial work, and Moore’s Law alone does not guarantee a solution.
Optimizing Compilers: Kay discusses the evolution of compiler optimization techniques. Modern optimizers can utilize large stretches of code and achieve performance comparable to statically typed languages, even in dynamically typed languages.
Articulation of Problems: Kay emphasizes that the problems with programming languages were identified and articulated 50 years ago. Despite this, computing has not progressed significantly, and we still rely on languages like C, resulting in immense codebases.
Maintenance and Flexibility: Kay points out that 85% of software costs occur after deployment due to maintenance and changing requirements. The inflexibility of current programming practices makes it challenging to adapt to these changes.
01:24:52 Apple’s Stifling of Creativity and Innovation
Apple’s Restrictions on Child Creativity: Apple forbids children from sharing their creations on the internet and downloading creations from other children, hindering personal computing and creativity.
iPad’s User Interface: Apple caters to extreme ends of the user spectrum, neglecting the needs of individuals who want to learn and use the device for more than just consuming media.
Lack of Stylus Support: The iPad lacks a dedicated place to store a capacitive pen, making it inconvenient for users who want to use the device for creative purposes.
Impact on Education: Apple’s restrictive policies and the iPad’s user interface hinder the device’s potential as a learning tool, replacing computers that offer more educational capabilities.
Convenience over Substance: Apple’s focus on convenience and simplicity has led to a device that lacks the depth and functionality necessary for learning and creativity.
The Esau Story: Alan Kay draws a parallel between Esau’s impulsive decision to trade his birthright for a bowl of soup and humanity’s tendency to sacrifice long-term benefits for immediate convenience.
Blame on Apple and Schools: Kay blames Apple for intentionally creating a device that caters to consumers’ convenience rather than their educational needs. He also criticizes schools for failing to educate themselves about the limitations of the iPad as a learning tool.
Kay’s Efforts to Improve the iPad: Kay dedicated time and effort to make systems like Squeak and Scratch compatible with the iPad, but faced obstacles due to Apple’s restrictive policies.
Apple’s Partial Concession: Steve Jobs eventually allowed interpreters to be installed on the iPad but did not permit sharing interpreted programs over the internet, leaving users limited.
Critique of Apple’s Consumer-Centric Approach: Kay criticizes Apple’s consumer-centric approach, which views consumers as mere sources of revenue rather than individuals capable of learning and creating.
01:28:35 Abstraction and Optimization in Computing Systems
First-Order vs. Second-Order Theories: There are two schools of thought regarding the development of operating systems and programming languages. The first-order theory suggests that one should avoid creating their own operating system or programming language due to the risk of getting lost in the details and forgetting the original purpose. The second-order theory argues that if one has the skills, they should create their own operating system and programming language to avoid dealing with vendors and external ideas.
The Importance of Implementation and Debugging: In computing, it is difficult to prove things mathematically. The most interesting aspects of computing often cannot be proven. Therefore, the best way to assess the quality of an idea is to implement it, debug it, and evaluate its performance.
Five Smart People Can Prove Suppositions Wrong: To address the concern that high-level abstraction may result in suboptimal code for embedded systems, Alan Kay suggests assigning five smart people to work on the problem. He believes that there is a 90% chance that they would disprove the supposition and create something that rivals hand-crafted code in terms of speed and debuggability.
Benchmarking Real Programs: Integer benchmarks are not always meaningful for evaluating the performance of a system. A more effective approach is to run real programs on the system and measure the actual ups and downs of the hardware. This provides a more accurate assessment of the system’s performance under realistic conditions.
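A minimal sketch of that advice in Python: time a real code path end to end, repeat it, and report the spread rather than a single synthetic number. The workload function here is a hypothetical stand-in for an actual application path:

```python
# Benchmark a real workload, not a synthetic integer loop: run it
# repeatedly and report the spread, not one flattering number.
import time
import statistics

def real_workload():
    # stand-in for an actual program path (parse + transform + format)
    data = [str(i) for i in range(50_000)]
    return sum(len(s) for s in sorted(data))

runs = []
for _ in range(10):
    t0 = time.perf_counter()
    real_workload()
    runs.append(time.perf_counter() - t0)

print(f'median {statistics.median(runs) * 1e3:.1f} ms, '
      f'min {min(runs) * 1e3:.1f} ms, max {max(runs) * 1e3:.1f} ms')
```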
01:32:35 Revolutionizing Computing with Field Programmable Gate Arrays
Hardware and Software Progression: Commercial hardware companies like Intel lack the incentive to innovate and improve their products, leading to stagnation in hardware development.
FPGA Technology: Field Programmable Gate Arrays (FPGAs) have undergone significant advancements in recent years, providing a new approach to hardware design. FPGAs offer a large number of gates, allowing for the implementation of complex functions. Optimization techniques have improved the efficiency of FPGA programming, enabling better utilization of chip resources.
BEE3 and BEEcube: Chuck Thacker, a former Xerox PARC researcher, developed the BEE3, a modern equivalent of the Xerox PARC machines built with FPGA technology. BEEcube, the company behind the BEE3, offers a powerful system with four Xilinx FPGA chips, 170 gigabytes of RAM, and a compact form factor.
Hardware as Crystallized Software: Alan Kay views hardware as software that is “crystallized early” due to its early implementation in the process. He emphasizes the importance of considering hardware design as an integral part of the overall system design process.
Hardware and Software Convergence: In the 1960s, computer science required proficiency in both hardware and software, recognizing the interconnectedness of these disciplines. The distinction between hardware and software has led to separate cultures and approaches to system design.
Nile System and High-Level Language Efficiency: The Nile system automatically adapts to the number of cores and hardware threads available, optimizing performance without manual intervention. Automatic SIMD generation and parallelization techniques enhance the efficiency of high-level programs. High-level languages can be used effectively for efficient system development, challenging the notion that low-level languages are necessary for performance.
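The following Python sketch shows only the scheduling half of that idea: discovering the core count at run time and distributing work accordingly. It assumes nothing about Nile’s actual compiler machinery; the names and workload are illustrative:

```python
# Adapting work distribution to whatever cores are present at run time,
# rather than hard-coding a parallelism level.
import os
from concurrent.futures import ProcessPoolExecutor

def render_span(span):
    # stand-in for a pixel-coverage kernel over a range of samples
    lo, hi = span
    return sum(i * i for i in range(lo, hi))

if __name__ == '__main__':
    n_workers = os.cpu_count() or 1   # discovered at run time
    spans = [(i * 100_000, (i + 1) * 100_000) for i in range(8 * n_workers)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        total = sum(pool.map(render_span, spans))
    print(f'{n_workers} workers, total={total}')
```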
Intel’s Multi-Core Approach: Intel’s multi-core strategy was a reactive measure in response to a chip meltdown incident. Cache line management in the Intel architecture can be challenging due to its complex design.
01:40:21 Understanding Hardware Challenges for Nile's Execution
Difficulties of Feeding Multiple Cores: Intel’s memory architecture makes it challenging to efficiently feed even one core, let alone multiple cores with double threading.
Challenges Faced by the Nile Development Team: The team faced significant difficulties in optimizing Nile’s performance on Intel hardware due to the lack of control over the hardware. The team spent a considerable amount of time and effort in efficiently distributing computations across Intel processors.
Initial Plan for a Proof of Concept: The team initially intended to use FPGA (Field-Programmable Gate Array) hardware for a proof of concept.
Practical Considerations and the Importance of Custom Hardware: For practical applications, it’s beneficial to have the option of designing and rolling out custom hardware. Developing simple ASICs (Application-Specific Integrated Circuits) can significantly improve performance, especially for tasks with high overhead in virtual machine execution.
Advantages of Byte Codes and Custom Hardware: Byte codes are more compact than regular instructions, and executing them as fast as regular instructions can provide significant performance gains. Custom hardware can accelerate the execution of byte codes, leading to improved overall performance.
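To see why byte codes are compact, consider this toy stack machine in Python: each operation is a single byte, operands are inline bytes, and decoding is trivial, which is the part Kay suggests custom hardware could execute directly. The instruction set is hypothetical:

```python
# A toy bytecode machine: one byte per opcode, inline one-byte operands.
PUSH, ADD, MUL, PRINT, HALT = range(5)

def run(code):
    stack, pc = [], 0
    while True:
        op = code[pc]; pc += 1
        if op == PUSH:
            stack.append(code[pc]); pc += 1   # one-byte operand follows
        elif op == ADD:
            b, a = stack.pop(), stack.pop(); stack.append(a + b)
        elif op == MUL:
            b, a = stack.pop(), stack.pop(); stack.append(a * b)
        elif op == PRINT:
            print(stack[-1])
        elif op == HALT:
            return

# (2 + 3) * 4 encoded in ten bytes of code
program = bytes([PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT, HALT])
run(program)  # prints 20
```

Hardware that decodes such codes natively skips the interpreter loop entirely, which is where the claimed speedup comes from.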
Conclusion: The development of Nile on Intel hardware highlighted the importance of having control over hardware and the potential benefits of custom hardware for improving performance.
Abstract
Bridging Complexity: The Evolution and Impact of Alan Kay’s Perspectives on Computing and Science, with Supplemental Insights
In a series of thought-provoking talks, Alan Kay, a renowned figure in computing and science, delved deep into the intricacies of code complexity, the evolution of programming languages, and the convergence of science and engineering. By drawing analogies from personal computing, biological systems, and historical innovations, Kay emphasized the crucial need for strategic thinking in the field of software engineering and the creation of more efficient, user-centric computing systems. He critiqued the current landscape, including large codebases and their maintenance, while advocating for a paradigm shift towards domain-specific languages (DSLs) and customized hardware solutions. This article synthesizes these insights, highlighting Kay’s profound impact on how we perceive and approach complex systems in the digital age.
Organizing Main Ideas
1. The Complexity and Complications in Code: Kay highlighted the gap between system complexity and added complications in software, citing code size and the challenges in managing large codebases as significant issues. In his work at Xerox PARC, he demonstrated how software can be developed with significantly fewer lines of code, thereby reducing complexity and enhancing understanding.
2. Science and Systems Understanding: Kay’s unique perspective on science as a relationship between phenomena and explanatory stories underlines the importance of a holistic approach to understanding complex systems, particularly in biology. By drawing parallels between the study of biology and the development of computing systems, Kay emphasizes the importance of understanding the system’s architecture and dynamics, rather than focusing solely on individual components.
3. Convergence of Science and Engineering: Emphasizing the artistic nature of this convergence, Kay draws parallels between biological evolution and engineering design, urging for a molecular biology-like approach in scaling systems. He envisions a future where science, engineering, and art converge to create elegant and efficient computing systems.
4. Shift from Tactical to Strategic Thinking: Kay critiqued the prevalent tactical approach in computing, advocating for strategic thinking as exemplified by the invention of the arch and the need for high-level programming languages. He highlights the arch as an example of strategic thinking in architecture, allowing for more intricate and durable structures compared to simple stacking of bricks.
5. The Role of Crew Size in Software Development: Questioning the necessity of large programming teams, Kay stresses the importance of clear, debuggable requirements in software development. He argues that the large programming crews common in software development today are a relic of the past, when code was written in machine code and there was much more code to be written. Today, programming should be done in terms of specifications or requirements, which are often not runnable or debuggable.
6. Innovative Experiments in Personal Computing: Through his experiments, Kay demonstrated efficient and novel approaches to personal computing, emphasizing the potential of DSLs and meta-compilers in simplifying complex code. His work on a highly efficient personal computing system, requiring minimal lines of code, demonstrates the potential for significant advancements in software design.
7. Reimagining Computing Systems: Kay criticizes the stagnation in computing systems development and urges for the creation of better languages and environments, leveraging his insights from the Sketchpad and Frank System. He highlights the limitations of current systems like PowerPoint and advocates for leveraging computers’ full capabilities as simulators.
8. The Future of Computing: Envisioning a future with loosely coupled systems, semantic compilers, and n-dimensional CAD-like development, Kay outlines the path for more dynamic and efficient computing. He envisions a future where computing involves more dynamic relationships and efficient design processes, made possible by technologies like semantic compilers and n-dimensional printing.
9. Alan Kay’s Insights on Programming Languages: Kay addresses the challenges and potential in developing usable programming languages, criticizing the persistence of outdated paradigms and advocating for innovative approaches. He discusses the spectrum of programming languages from machine code to higher-level languages, emphasizing that as we move towards higher-level languages, the programming effort should decrease. However, inventing these languages requires significant upfront investment.
10. Apple’s Role in Personal Computing: Kay’s critique of Apple’s restrictive policies highlights the need for more open, creative computing environments for learning and innovation. He criticizes Apple’s policies for stifling creativity and learning in computing, reflecting on the ethical implications of their approach.
11. High-Level Abstractions and System Performance: Discussing theories on system development, Kay emphasizes the significance of high-level language efficiency and the role of custom hardware in optimizing performance. He illustrates the need for innovation in both software and hardware design, discussing the efficiency of high-level languages and the potential of custom hardware.
12. Exploring the Design Principles and Implementation of a Personal Computing System with Alan Kay: Kay discusses the conceptual framework, design principles, and implementation details of a personal computing system that aims to simplify and enhance the process of understanding complex systems. The system leverages DSLs (Domain-Specific Languages) to express various aspects of computing, enabling modularity and code reduction. The graphics system is implemented with 435 lines of code, significantly smaller than traditional approaches, demonstrating the power of DSLs.
13. Language Creation and Optimization: Kay discusses the concept of using a metalanguage to create languages, demonstrating the creation of a language in only 76 lines of code. He also presents a TCP/IP implementation in only about 160 lines, achieved by viewing TCP/IP as a non-deterministic parser. Additionally, Kay describes an experiment where simple rules applied to individual text characters caused them to organize themselves into a text justifier and editor, showing how simple rules can lead to complex behavior in a distributed system.
14. Insights and Key Points from Alan Kay’s Discussion on Computing Systems and Design: Kay conceptualizes computing systems as intercommunicating processes, emphasizing adaptability and dynamic bindings. He proposes a recursive internet model with next-generation publish and subscribe mechanisms and negotiation processes to handle differences between objects. Kay advocates for a shift from solvers to constraints and meanings, assisted by automatic libraries and expert systems. He draws analogies to 3D printing and n-dimensional printing, envisioning future development systems resembling CAD systems for comprehensive design and testing.
15. Summary of Alan Kay’s Presentation on Computing and Programming Languages: Kay’s presentation highlights the potential of future computing devices with significantly increased power, enabling the testing of prototypes and the consideration of designs for shipping. He discusses the availability of his programming language, OMeta, for experimentation, including a user-friendly version that can be downloaded and run in a browser. Kay also discusses past attempts at creating specification languages that can be automatically compiled into high-level languages, emphasizing the sophistication of Sketchpad and the challenges of developing programming languages that minimize unnecessary code.
16. Alan Kay on Programming Language Evolution: Kay acknowledges the complexity of developing programming languages that minimize unnecessary code and allow for efficient real-time usage. He highlights the significance of NILE’s early development in enabling real-time use and the challenges of transitioning from a demo to a usable tool. Kay also discusses the evolution of compiler optimization techniques, emphasizing the potential of modern optimizers in achieving performance comparable to statically typed languages. However, he criticizes the lack of significant progress in computing, despite the identification of problems with programming languages decades ago, resulting in immense codebases and high maintenance costs.
17. Apple’s Restrictive Policies and the iPad’s User Interface: Kay critiques Apple’s restrictive policies, which prevent children from sharing their creations on the internet and downloading creations from other children, hindering personal computing and creativity. He also criticizes the iPad’s user interface, which caters to extreme ends of the user spectrum and neglects the needs of individuals who want to learn and use the device for more than just consuming media. Kay highlights the lack of stylus support, which makes it inconvenient for creative purposes, and the impact on education, as the iPad fails to offer the educational capabilities of computers. He draws parallels between Apple’s approach and Esau’s decision to trade his birthright for a bowl of soup, emphasizing the importance of long-term benefits over immediate convenience. Kay also criticizes Apple for intentionally creating a device that caters to convenience rather than educational needs and blames schools for failing to educate themselves about the iPad’s limitations as a learning tool. He mentions his efforts to make systems like Squeak and Scratch compatible with the iPad but faced obstacles due to Apple’s restrictive policies. Despite Steve Jobs’s partial concession to allow interpreters on the iPad, Kay criticizes Apple’s consumer-centric approach, which views consumers as sources of revenue rather than individuals capable of learning and creating.
Background and Additional Information
In concluding, this article reflects on the broader implications of Kay’s insights for the future of computing. His emphasis on strategic thinking and his calls for the development of better languages and environments have been instrumental in shaping the current landscape of computing. Kay’s vision for the future of computing, with its focus on dynamic relationships and efficient design processes, promises to usher in a new era of innovation and creativity in the field.