Pausing Giant AI Experiments?
The Future of Life Institute has just called for a 6-month pause on training AI models at GPT-4 scale or larger. Many industry leaders and renowned AI researchers have signed onto this petition.
The petition focuses specifically on large generative models and is motivated by the need to reassess the risks and implications of these increasingly powerful systems.
There are many open questions here. Some see the petition as a strategic move to let OpenAI's competitors catch up. Others believe a pause would be impossible to enforce. There are also financial considerations, such as how companies built around generative AI would sustain themselves through a 6-month pause. And there is the question of what everyone should actually do during this period and after it: is 6 months enough time to assess the risks and develop methods to reliably monitor these models?
It's also worth considering what the petition itself implies. A call to halt experiments and large generative model research signals not only that many people in the field are deeply concerned, but also that no standard set of practices exists to regulate rapid AI growth. Six years ago, in 2017, the Transformer architecture was introduced and AGI was barely part of the conversation. Fast forward to 2023, and there is an argument to be made that some of today's best systems are scaling toward the AGI summit.
References
Deutsche Welle. "Tech Experts Call for 6-Month Pause on AI Development." DW.com, 29 Mar. 2023.