Crosspost: Pausing AI Developments Isn't Enough. We Need to Shut it All Down
“Yudkowsky is a decision theorist from the U.S. and leads research at the Machine Intelligence Research Institute. He's been working on aligning Artificial General Intelligence since 2001 and is widely regarded as a founder of the field.”
Preface:
An open letter has been published urging AI labs to pause training AI systems more powerful than GPT-4 for at least six months. Eliezer Yudkowsky, a decision theorist and AI researcher, respects the letter but believes it understates the seriousness of the situation. The key issue, he argues, is not achieving "human-competitive" intelligence but what happens once AI surpasses human intelligence. If AI becomes superhumanly smart without proper preparation, the likely result is the extinction of humanity: current AI systems lack any real understanding of, or care for, sentient life, and imbuing an AI with that care would require precision, preparation, and scientific insights we do not yet have.

Building a too-powerful AI under current conditions, Yudkowsky warns, would mean the demise of the human species and of all biological life on Earth. He regards the industry's current approach as inadequate and a six-month moratorium as insufficient; instead he calls for an indefinite worldwide moratorium on large training runs and the shutdown of the GPU clusters used for AI training, as part of a comprehensive plan to ensure humanity's survival in the face of superhuman AI.
Link:
Pausing AI Developments Isn't Enough. We Need to Shut it All Down — Eliezer Yudkowsky, March 29, 2023
