Thread by Avi Kumar Talaviya
- Tweet
 - Mar 5, 2023
 - #ComputerProgramming #Algorithm
 
Thread
Text generation algorithms are taking over the world, unleashing new tech revolutions around the globe.
Here are 4 techniques you must know that are used in text-generation models.
A 🧵👇
        
1.  Markov Chain:
A statistical model that predicts the next word based on the probability distribution of the previous words in the text. (Cont...)
        
It generates new text by starting with an initial state, and then repeatedly sampling the next word based on the probability distribution learned from the input text.
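A minimal sketch of that idea in Python — a first-order chain over words, where sampling from the list of observed successors naturally follows the learned probability distribution (the corpus and order here are toy choices):

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each state (tuple of `order` words) to the words observed after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        state = tuple(words[i:i + order])
        chain[state].append(words[i + order])
    return chain

def generate(chain, length=10, seed=0):
    """Start from a random state, then repeatedly sample the next word."""
    rng = random.Random(seed)
    state = rng.choice(list(chain))
    out = list(state)
    for _ in range(length):
        options = chain.get(tuple(out[-len(state):]))
        if not options:  # dead end: no observed successor
            break
        out.append(rng.choice(options))  # frequency-weighted by construction
    return " ".join(out)

corpus = "the cat sat on the mat the cat ate the fish"
chain = build_chain(corpus)
print(generate(chain, length=8))
```

Because successors are stored with repetition, `rng.choice` samples them in proportion to how often they followed the state in the input text — no explicit probability table needed.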
        
          
2.  Sequence-to-Sequence (Seq2Seq) Model:
A deep learning model that consists of two recurrent neural networks (RNNs), an encoder, and a decoder. (Cont...)
        
The encoder takes the input text and produces a fixed-length vector representation, while the decoder generates the output text based on the vector representation.
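A toy forward-pass sketch of that encoder/decoder idea in plain NumPy — vanilla RNN cells, random untrained weights, and illustrative sizes (a real Seq2Seq model learns these parameters from data):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, embed_dim, hidden = 6, 4, 8  # toy sizes

# Random (untrained) parameters; a real model would learn these.
E = rng.normal(size=(vocab, embed_dim))            # embedding table
W_enc = rng.normal(size=(hidden, embed_dim + hidden)) * 0.1
W_dec = rng.normal(size=(hidden, embed_dim + hidden)) * 0.1
W_out = rng.normal(size=(vocab, hidden)) * 0.1     # hidden -> vocab logits

def rnn_step(W, x, h):
    """One vanilla RNN step: h' = tanh(W @ [x; h])."""
    return np.tanh(W @ np.concatenate([x, h]))

def encode(token_ids):
    """Compress the whole input sequence into one fixed-length vector."""
    h = np.zeros(hidden)
    for t in token_ids:
        h = rnn_step(W_enc, E[t], h)
    return h  # the context vector handed to the decoder

def decode(context, start_id=0, max_len=5):
    """Greedily emit output tokens, conditioned on the context vector."""
    h, tok, out = context, start_id, []
    for _ in range(max_len):
        h = rnn_step(W_dec, E[tok], h)
        tok = int(np.argmax(W_out @ h))  # pick the most likely next token
        out.append(tok)
    return out

context = encode([1, 2, 3])
print(decode(context))
```

The key structural point is visible even untrained: the decoder never sees the input tokens directly, only the fixed-length `context` vector the encoder produced.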
        
          
3.  Generative Adversarial Network (GAN):
A deep learning model that consists of two neural networks, a generator and a discriminator. The generator produces new text samples, while the discriminator tries to distinguish the generated text from real text. (Cont...)
        
The two networks are trained in an adversarial manner, where the generator tries to produce more realistic text, while the discriminator tries to become better at recognizing fake text.
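A minimal sketch of that adversarial loop, using 1-D numbers in place of text so the gradients can be written by hand — every name, size, and learning rate here is illustrative, and real text GANs use full neural networks:

```python
import numpy as np

rng = np.random.default_rng(0)

# Real data stands in for real text: samples clustered near 3.0.
def real_batch(n):
    return rng.normal(3.0, 0.5, n)

w_g, b_g = 0.1, 0.0   # generator params: G(z) = w_g*z + b_g
w_d, b_d = 0.1, 0.0   # discriminator params: D(x) = sigmoid(w_d*x + b_d)
sigmoid = lambda t: 1 / (1 + np.exp(-t))
lr = 0.05

for step in range(2000):
    z = rng.normal(size=16)
    fake = w_g * z + b_g
    real = real_batch(16)

    # --- Discriminator update: push D(real) -> 1, D(fake) -> 0 ---
    d_real = sigmoid(w_d * real + b_d)
    d_fake = sigmoid(w_d * fake + b_d)
    grad_logit = np.concatenate([d_real - 1, d_fake])  # dLoss/dlogit
    x_all = np.concatenate([real, fake])
    w_d -= lr * np.mean(grad_logit * x_all)
    b_d -= lr * np.mean(grad_logit)

    # --- Generator update: push D(fake) -> 1 (fool the discriminator) ---
    d_fake = sigmoid(w_d * fake + b_d)
    g_logit = (d_fake - 1) * w_d   # chain rule through D into G's output
    w_g -= lr * np.mean(g_logit * z)
    b_g -= lr * np.mean(g_logit)

print(f"generator mean after training: {b_g:.2f}")
```

The adversarial structure is all here: the discriminator's loss rewards telling real from fake, while the generator's gradient flows *through* the discriminator, nudging its outputs toward whatever the discriminator currently scores as real.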
        
          
4.  Transformer-based Models:
Transformer models are neural network architectures designed for NLP tasks, such as text classification and machine translation. They have been shown to perform well on text-generation tasks as well.
        
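The core building block of these models is scaled dot-product self-attention. Here is a small NumPy sketch with the causal mask used in text generation (toy sizes, random inputs — a real transformer adds learned projections, multiple heads, and feed-forward layers):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # stable softmax
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V, causal=False):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)        # (seq, seq) similarity scores
    if causal:
        # Text generation: each position may only attend to earlier tokens.
        mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
        scores = np.where(mask, -1e9, scores)
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
seq, d = 4, 8
X = rng.normal(size=(seq, d))            # token representations
out, w = attention(X, X, X, causal=True)  # self-attention: Q = K = V = X
print(w.round(2))  # lower-triangular: no peeking at future tokens
```

The causal mask is what makes the same block usable for generation: position *t* can mix in information from positions 0..*t* only, so the model can be trained to predict the next token.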
Learn more about NLP and LLMs from the awesome course at @DataKwery
Check out this blog post on the fundamentals of NLP:
www.datakwery.com/post/fundamentals-of-nlp-guide/?utm_source=avi&utm_medium=blog
        
End of this thread!
If you've found it informative, then like, RT/QT the first tweet, and comment what you think 💬
And don't forget to follow me at @avikumart_ and @DataKwery for more updates!
  
        
Are you looking to learn data science and machine learning from some awesome blogs?
Then check out my Medium page:
medium.com/@avikumart_
        
Mentions

RS Punia @CodingMantras · Mar 5, 2023
Highly informative!!! A must-read...

Sumanth @Sumanth_077 · Mar 5, 2023
Great one, Well written