Reviews"Through clever provocations guided by a devotion to the uniqueness of human creativity, Blackwell calls for better programming languages that prioritize humanity over the dark valley of AI scenarios that merely mimic human performance." --Ben Shneiderman, Emeritus Professor, University of Maryland "There are few better guides to 'thinking about thinking' than Alan F. Blackwell. Brimming with insight into our increasingly computerized reality, Moral Codes is thoughtful, nuanced, and a pleasure to read." --Frank Pasquale, Professor of Law, Cornell Tech and Cornell Law School "This is an important book for the new age of AI--both philosophical and practical at the same time. Blackwell is a champion for designing technology that gives people agency and choice, celebrating what it means to be human." --Abigail Sellen, Vice President and Distinguished Scientist, Microsoft Research
Dewey Edition: 23
Table of Contents:
Acknowledgments
1. Are You Paying Attention?
2. Would You Like Me to Do the Rest? When AI Makes Code
3. Why Is Code Not like AI?
4. Intending and Attending: Chatting to the Stochastic Parrots
5. A Meaningful Conversation with the Internet
6. Making Meaningful Worlds: Being at Home in Code
7. Lessons from Smalltalk: Moral Code before Machine Learning
8. Explanation and Transparency: Beyond No-Code / Low-Code
9. Why Code Is More Important than Flat Design
10. The Craft of Coding
11. How Can Stochastic Parrots Help Us Code?
12. Codes for Creativity and Surprise
13. Making Code Less WEIRD
14. Re-Imagining AI to Invent More Moral Codes
15. Conclusion
Notes
Index
Synopsis: Why the world needs less AI and better programming languages. Decades ago, we believed that robots and computers would take over all the boring jobs and drudgery, leaving humans to a life of leisure. This hasn't happened. Instead, humans are still doing the boring jobs, and even worse, AI researchers have built technology that is creative, self-aware, and emotional--doing the very tasks humans were supposed to enjoy. How did we get here? In Moral Codes, Alan Blackwell argues that there is a fundamental flaw in the research agenda of AI. What humanity needs, he argues, are better ways to tell computers what we want them to do, with new and better programming languages: More Open Representations, Access to Learning, and Control Over Digital Expression--in other words, MORAL CODE. Blackwell draws on his deep experience as a programming language designer--work he has been doing since 1983--to unpack fundamental principles of interaction design and explain their technical relationship to ideas of creativity and fairness. Taking aim at software that constrains our conversations with strict word counts or infantilizes human interaction with likes and emojis, Blackwell shows how to design software that is better--not more efficient or more profitable, but better for society and better for all people. Covering recent research and the latest smart tools, Blackwell offers rich design principles for a better kind of software--and a better kind of world.