upcarta
Emmanuel Ameisen @mlpowered · Dec 26, 2022
  • From Twitter

In a great YT video, @karpathy challenges viewers to find a vectorized way to backprop through a complex embedding lookup. Tada! mapping = [link] num_classes=27).float(); dC = torch.tensordot(mapping, demb, dims=([0,1], [0,1]))
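The tweet's snippet is truncated, but the idea can be sketched end to end. Below is a hedged reconstruction of the vectorized gradient for an embedding lookup emb = C[Xb], checked against a plain accumulation loop. The names (C, Xb, demb) follow the makemore exercise; the shapes here are illustrative assumptions, not the video's exact values.

```python
import torch
import torch.nn.functional as F

# Assumed shapes for illustration: C is the embedding table
# (vocab_size x emb_dim), Xb a batch of integer indices
# (batch x block_size), demb the upstream gradient w.r.t. emb = C[Xb].
vocab_size, emb_dim = 27, 10
batch, block_size = 32, 3
C = torch.randn(vocab_size, emb_dim)
Xb = torch.randint(0, vocab_size, (batch, block_size))
demb = torch.randn(batch, block_size, emb_dim)

# Vectorized backprop: one-hot the indices, then contract the batch
# and block dimensions of the one-hot tensor against demb.
mapping = F.one_hot(Xb, num_classes=vocab_size).float()      # (batch, block, vocab)
dC = torch.tensordot(mapping, demb, dims=([0, 1], [0, 1]))   # (vocab, emb_dim)

# Reference: scatter-add the gradients with explicit loops.
dC_loop = torch.zeros_like(C)
for i in range(batch):
    for j in range(block_size):
        dC_loop[Xb[i, j]] += demb[i, j]

print(torch.allclose(dC, dC_loop, atol=1e-5))
```

The contraction works because each row of `mapping` selects exactly the embedding-table row that the lookup read, so summing over the batch and block dimensions accumulates every occurrence of an index into its row of dC.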

Video Oct 11, 2022
Building makemore Part 4: Becoming a Backprop Ninja
by Andrej Karpathy