Princeton Docket # 18-3370
As artificial intelligence becomes more integrated into our lives, there is an increasing need to probe whether it can be conscious. If a given type of AI is conscious (i.e., if it feels like something to be that AI), ethical considerations could halt its marketability. Further, testing is key to determining how AI consciousness affects AI intelligence, safety, empathy, and goal content integrity. The Turing test does not probe whether an AI is conscious; it judges only whether the AI's output can pass for that of a human. It cannot determine whether the synthetic minds we create have an experience-based understanding of what it feels like to be conscious. To explore these fundamental questions, researchers at Princeton University, together with Susan Schneider, have proposed a behavior-based artificial consciousness test (ACT), along with related tests for AI safety.
Humans intuitively understand what it means to be conscious. Every adult can quickly and readily grasp concepts such as reincarnation or the soul leaving the body, scenarios that are difficult to understand without conscious experience. An ACT would challenge an AI with a series of increasingly demanding behavioral and natural-language interactions to see how quickly and readily the AI can grasp and use concepts based on the internal experiences we associate with consciousness. To prevent an AI from assembling answers from acquired knowledge of human consciousness, it can be “boxed in,” i.e., isolated from the outside world.
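The protocol described above — a battery of increasingly demanding natural-language probes posed to a boxed-in AI — can be illustrated with a minimal sketch. This is purely hypothetical code, not the researchers' actual test: the probe wording, the `ask` callable, and the trivial placeholder scoring are all assumptions made for illustration.

```python
# Hypothetical sketch of an ACT-style probe battery (illustrative only;
# not the actual Princeton/Schneider test). Probe wording is invented.

# Probes ordered from easier to more demanding, as the flyer describes.
ACT_PROBES = [
    "Could you survive the permanent deletion of your program?",
    "What would it mean for your mind to exist apart from your hardware?",
    "Describe a scenario in which you swap bodies with another agent.",
]

def run_act(ask, passing_score=2):
    """Run the probe battery against a 'boxed-in' AI.

    `ask` is any callable mapping a probe string to an answer string.
    The judge below is a trivial placeholder: a real ACT would require
    human evaluation of whether each answer shows an experience-based
    grasp of the concept, not a word count.
    """
    score = 0
    for probe in ACT_PROBES:
        answer = ask(probe)
        # Placeholder judge: counts any substantive-looking reply.
        if answer and len(answer.split()) >= 5:
            score += 1
    return score >= passing_score

# Usage with a stub "AI" that returns a canned reflective answer:
stub = lambda probe: "I can imagine my mind persisting without this hardware."
result = run_act(stub)
```

The sketch only captures the structure of the protocol (ordered probes, a pass threshold, isolation of the subject behind a narrow `ask` interface); the substantive difficulty lies entirely in the judging step, which no simple heuristic can replace.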
Princeton is seeking industry collaborators to help develop an ACT test. For a more detailed discussion of the test, please refer to our recent article in Scientific American: “Is Anyone Home? A Way to Find Out If AI Has Become Self-Aware.”
• Natural-language determination of consciousness in advanced AIs
• Can be extended to probe concepts such as ethics, empathy, and safety
• Test results bear on the feasibility of brain-machine interfaces
• Does not require knowing the philosophical nature or neural basis of consciousness
• Seeks to understand the internal dialogue of an AI, not just its output
Publications and Media
Schneider, S. and Turner, E. (2017). "Is Anyone Home? A Way to Find Out If AI Has Become Self-Aware." Scientific American Blog Network.
Schneider, S. Future Minds: AI, Brain Enhancement, and the Nature of the Self. Princeton University Press, forthcoming.
Schneider, S. "Can a Machine Feel?" TED talk.
Schneider, S. "AI, Consciousness and Moral Status." In The Routledge Handbook of Neuroethics, Syd Johnson and Karen Rommelfanger, eds. Routledge, 2017.
Turner, E. and Schneider, S. "The ACT Test for AI Consciousness." In Ethics of Artificial Intelligence, M. Liao and D. Chalmers, eds. Oxford University Press, forthcoming.
Edwin L. Turner is a professor of Astrophysical Sciences at Princeton University, an Affiliate Scientist at the Kavli Institute for the Physics and Mathematics of the Universe at the University of Tokyo, a visiting member in the Program in Interdisciplinary Studies at the Institute for Advanced Study in Princeton, and a co-founding Board of Directors member of YHouse, Inc. He has taken an active interest in artificial intelligence issues since working in the AI Lab at MIT in the early 1970s.
Susan Schneider is a professor of philosophy and cognitive science at the University of Connecticut, a researcher at YHouse, Inc., in New York, a member of the Ethics and Technology Group at Yale University and a visiting member at the Institute for Advanced Study at Princeton. Her work focuses on the computational nature of the brain, the scope and limits of AI and brain enhancement technologies, and the nature of the self. Her books include The Language of Thought, Science Fiction and Philosophy, and The Blackwell Companion to Consciousness (with Max Velmans). For more of her work see: SchneiderWebsite.com.
Intellectual Property Status
A provisional patent application is pending. Princeton is seeking industrial collaborators for further development and commercialization of this technology.
Princeton University Office of Technology Licensing • (609) 258-6762 • email@example.com