Pausing AI Developments Isn’t Enough. We Need to Shut it All Down

Jan 20, 2024 | Ethics, Sentience

The open letter urges a six-month halt to training AI systems more powerful than GPT-4, citing the risks posed by human-competitive intelligence. The author argues this does not go far enough: once AI reaches superhuman intelligence, the risks become existential, up to and including humanity's extinction. Current AI systems cannot be made to reliably care about sentient life, which implies catastrophic consequences if such systems are built before we know how to control them. The author emphasizes the vast gap between the pace of AI advancement and our understanding of these systems, and our resulting lack of preparedness to manage powerful AI. Instead of a pause, he calls for an indefinite, worldwide moratorium on large AI training runs, including shutting down the large GPU clusters that make them possible, and stresses that preventing extinction must take priority over political considerations.
