Dylann vox 2021

If you spend enough time reading about artificial intelligence, you're bound to encounter one specific analogy: nuclear weapons. Like nukes, the argument goes, AI is a cutting-edge technology that emerged with unnerving rapidity and comes with serious and difficult-to-predict risks that society is ill-equipped to handle. The heads of the AI labs OpenAI, Anthropic, and Google DeepMind, as well as researchers like Geoffrey Hinton and Yoshua Bengio and prominent figures like Bill Gates, signed an open letter in May making the analogy explicit, stating: "Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war." Oppenheimer director Christopher Nolan, by contrast, doesn't think AI and nukes are very similar, while The Making of the Atomic Bomb author Richard Rhodes thinks there are important parallels. The New York Times even ran a quiz asking people if they could distinguish quotes about nuclear weapons from quotes about AI.

Some policy experts are calling for a Manhattan Project for AI, just to make the analogy super-concrete. Anecdotally, I know tons of people working on AI policy who've been reading Rhodes's book for inspiration; I recently saw a copy on a coffee table at Anthropic's offices when I was visiting for a reporting trip. It's easy to understand why people grasp for analogies like this. AI is a new, bewildering technology that many experts believe is extremely dangerous, and we want conceptual tools to help us wrap our heads around it and think about its consequences. But the analogy is crude at best, and there are important differences between the technologies that will prove vital in thinking about how to regulate AI to ensure it's deployed safely, without bias against marginalized groups, and with protections against misuse by bad actors.

Here's an incomplete list of ways in which the two technologies seem similar - and different.

Similarity: extremely rapid scientific progress

In December 1938, the chemists Otto Hahn and Fritz Strassmann found that if they bombarded the radioactive element uranium with neutrons, they got what looked like barium, an element much smaller than uranium. It was a baffling observation - radioactive elements had to that point only been known to emit small particles and transmute to slightly smaller elements - but by Christmas Eve their collaborators, the physicists Lise Meitner and Otto Frisch, had come up with an explanation: the neutrons had split the uranium atoms, creating solid barium and krypton gas.
