upcarta
Mentions
Andrej Karpathy @karpathy · Nov 17, 2022
  • From Twitter

Good post. A lot of interest atm in wiring up LLMs to a wider compute infrastructure via text I/O (e.g. calculator, python interpreter, google search, scratchpads, databases, ...). The LLM becomes the "cognitive engine" orchestrating resources, its thought stack trace in raw text
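The pattern the tweet describes — an LLM orchestrating external tools through plain text I/O — can be sketched as a simple loop: the model emits text, the harness scans it for a tool-call pattern, runs the tool, and appends the result back into the prompt. The `CALC[...]` syntax and the stub model below are illustrative assumptions, not any real API.

```python
import re

def calculator(expression: str) -> str:
    # Evaluate a basic arithmetic expression (illustrative; not hardened).
    return str(eval(expression, {"__builtins__": {}}, {}))

# Registry of text-invocable tools; a real system might add search, a
# Python interpreter, database queries, scratchpads, etc.
TOOLS = {"CALC": calculator}
TOOL_PATTERN = re.compile(r"(\w+)\[(.*?)\]")

def stub_model(prompt: str) -> str:
    # Stand-in for an LLM: once a tool result appears in the prompt,
    # it answers; otherwise it requests the calculator via text.
    if "RESULT:" in prompt:
        return "The answer is " + prompt.rsplit("RESULT:", 1)[1].strip()
    return "I need to compute this: CALC[17 * 23]"

def run(prompt: str, max_steps: int = 5) -> str:
    output = ""
    for _ in range(max_steps):
        output = stub_model(prompt)
        match = TOOL_PATTERN.search(output)
        if match and match.group(1) in TOOLS:
            result = TOOLS[match.group(1)](match.group(2))
            # The tool's answer re-enters the "thought stack trace" as text.
            prompt += f"\n{output}\nRESULT: {result}"
        else:
            return output
    return output

print(run("What is 17 * 23?"))
```

Everything crossing the model boundary here is raw text, which is what makes the LLM swappable for any engine that reads and writes strings.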

Article Nov 16, 2022
The Near Future of AI is Action-Driven
by John McDonnell
Recommended by 1 person
1 mention