Sam Altman, speaking at the World Governments Summit in Dubai, reiterated his call for a body like the International Atomic Energy Agency to be created to oversee AI.
In response to an open letter penned by Elon Musk and other technologists demanding a six-month moratorium on artificial intelligence research, AI researcher and author Eliezer Yudkowsky said that the moratorium was insufficient to deal with the potential threat of a superhumanly intelligent AI. Yudkowsky argues that the main issue is not “human-competitive” intelligence, but what happens once AI surpasses human capabilities. He argued there is a likelihood that creating superhumanly smart AI, without precision and preparation, could result in the death of everyone on Earth. (Economy News, Times Now)