AI Could Achieve Human-Like Intelligence by 2030 and Destroy Mankind, Google Predicts

The worldwide debate about Artificial Intelligence (AI) continues to centre on how beneficial and how harmful it can be for humans. Now another form of AI is under discussion: human-level artificial intelligence, known as Artificial General Intelligence (AGI). A startling claim has emerged about AGI. New research suggests that AGI could be operating among us by 2030, that it could become smarter than humans, and that it could even destroy humanity entirely.

This shocking claim comes from a research paper released by Google DeepMind. The study states that, given the potentially widespread impact of AGI, it could pose a threat of serious harm. This includes existential risk, that is, the risk of the permanent extinction of humanity.

It is worth noting that the paper, co-authored by DeepMind co-founder Shane Legg, does not explain how AGI could cause mankind's extinction. Instead of spelling out the mechanism, it draws attention to measures Google and other AI companies should take to reduce the risks posed by AGI.

The study divides the risks of advanced AI into four major categories: misuse, misalignment, mistakes, and structural risks. It also highlights DeepMind's risk-mitigation strategy, which focuses on preventing misuse, that is, cases where people use AI to harm others. In February, DeepMind CEO Demis Hassabis said that AGI, meaning AI that is as smart as humans or even smarter, will begin to emerge in the next five to ten years. He also advocated for a UN-like umbrella organisation to oversee the development of AGI.
