upcarta

Controllable Neural Text Generation

  • Article
  • Jan 2, 2021
  • #ArtificialIntelligence #ComputerProgramming
Lilian Weng
@lilianweng
(Author)
lilianweng.github.io
1 Recommender
1 Mention
[Updated on 2021-02-01: Updated to version 2.0 with several works added and many typos fixed.] [Updated on 2021-05-26: Added P-tuning and Prompt Tuning in the “prompt design” section.] [Updated on 2021-09-19: Added “unlikelihood training”.]
There is a gigantic amount of free text on th...

Mentions
Pan Lu @lupantech · Mar 3, 2023
  • Post
  • From Twitter
An excellent blog on Controllable Neural Text Generation from @lilianweng! It's important to consider ways to reduce the hallucinations of LLMs and better reflect human intentions, especially given their current success and limitations. #ChatGPT #LLM