upcarta

Perry E. Metzger

community-curated profile

Mad Scientist, Bon Vivant, and Raconteur.

Tweet · Apr 25, 2023
Eliezer and his acolytes believe it’s inevitable AIs will go “foom” without warning, meaning, one day you build an AGI and hours or days later the thing has recursively self improved into godlike intelligence and then eats the world. Is this realistic…
by Perry E. Metzger
Recommended by 1 person · 1 mention