Perry E. Metzger
Mad Scientist, Bon Vivant, and Raconteur.
Tweet
Apr 25, 2023
Eliezer and his acolytes believe it’s inevitable AIs will go “foom” without warning, meaning, one day you build an AGI and hours or days later the thing has recursively self improved into godlike intelligence and then eats the world. Is this realistic…
by Perry E. Metzger