Moral Codes

Designing Alternatives to AI


About

Why the world needs less AI and better programming languages.


Decades ago, we believed that robots and computers would take over all the boring jobs and drudgery, leaving humans to a life of leisure. This hasn’t happened. Instead, humans are still doing boring jobs, and even worse, AI researchers have built technology that is creative, self-aware, and emotional—doing the tasks humans were supposed to enjoy. How did we get here? In Moral Codes, Alan Blackwell argues that there is a fundamental flaw in the research agenda of AI. What humanity needs, he argues, is better ways to tell computers what we want them to do, with new and better programming languages: More Open Representations, Access to Learning, and Control Over Digital Expression; in other words, MORAL CODE.

Blackwell draws on his deep experience as a programming language designer—a craft he has practiced since 1983—to unpack fundamental principles of interaction design and explain their technical relationship to ideas of creativity and fairness. Taking aim at software that constrains our conversations with strict word counts or infantilizes human interaction with likes and emojis, Blackwell shows how to design software that is better—not more efficient or more profitable, but better for society and better for all people. Covering recent research and the latest smart tools, Blackwell offers rich design principles for a better kind of software—and a better kind of world.

Author

Alan F. Blackwell is Professor of Interdisciplinary Design in the Department of Computer Science and Technology at the University of Cambridge. He is a Fellow of Darwin College, Cambridge; cofounder, with David Good, of the Crucible Network for Research in Interdisciplinary Design; and cofounder, with David Good and Lara Allen, of the University of Cambridge’s Global Challenges strategic research initiative.

Table of Contents

Acknowledgments
1   Are You Paying Attention?
2   Would You Like Me to Do the Rest? When AI Makes Code
3   Why Is Code Not like AI?
4   Intending and Attending: Chatting to the Stochastic Parrots
5   A Meaningful Conversation with the Internet
6   Making Meaningful Worlds: Being at Home in Code
7   Lessons from Smalltalk: Moral Code before Machine Learning
8   Explanation and Transparency: Beyond No-Code / Low-Code
9   Why Code Is More Important than Flat Design
10   The Craft of Coding
11   How Can Stochastic Parrots Help Us Code?
12   Codes for Creativity and Surprise
13   Making Code Less WEIRD
14   Re-Imagining AI to Invent More Moral Codes
15   Conclusion
Notes
Index