Sequence to sequence models

Wed, 01 August 2018

[ deep_learning  ]
  • Basic models
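A minimal encoder-decoder sketch of the basic sequence-to-sequence model, written here in Keras with made-up vocabulary and layer sizes (not the course's reference code):

```python
# Minimal encoder-decoder (sequence to sequence) sketch; all sizes are placeholders.
from tensorflow.keras.layers import Input, LSTM, Dense, Embedding
from tensorflow.keras.models import Model

src_vocab, tgt_vocab, hidden = 10000, 10000, 256

# Encoder: reads the source sentence x and summarizes it in its final state.
enc_in = Input(shape=(None,))
enc_emb = Embedding(src_vocab, hidden)(enc_in)
_, h, c = LSTM(hidden, return_state=True)(enc_emb)

# Decoder: generates the output sentence y one token at a time,
# conditioned on the encoder's final state (teacher forcing at training time).
dec_in = Input(shape=(None,))
dec_emb = Embedding(tgt_vocab, hidden)(dec_in)
dec_out = LSTM(hidden, return_sequences=True)(dec_emb, initial_state=[h, c])
probs = Dense(tgt_vocab, activation="softmax")(dec_out)

model = Model([enc_in, dec_in], probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```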

  • Beam search

Step 1: Run the encoder on the input x and pick the B most likely candidates for the first word y^<1> from P(y^<1> | x).

Step 2: For each of the B candidates, evaluate P(y^<2> | x, y^<1>) and keep only the B word pairs with the highest P(y^<1>, y^<2> | x).

Step 3: Repeat for the third word and onward, always keeping only the top B partial sentences, until each beam ends in <EOS>.
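
A minimal sketch of this loop, assuming a hypothetical log_prob_next(x, prefix) helper that returns log P(next word | x, prefix) for every word in the vocabulary:

```python
def beam_search(x, log_prob_next, B=3, max_len=30, eos="<EOS>"):
    """Keep the B most likely partial sentences at every step.

    log_prob_next(x, prefix) is a hypothetical helper returning a dict
    {word: log P(word | x, prefix)} from the decoder."""
    beams = [([], 0.0)]                          # (prefix, summed log-probability)
    for _ in range(max_len):
        candidates = []
        for prefix, score in beams:
            if prefix and prefix[-1] == eos:     # sentence already finished
                candidates.append((prefix, score))
                continue
            for word, lp in log_prob_next(x, prefix).items():
                candidates.append((prefix + [word], score + lp))
        # Of all candidates (up to B * |vocabulary|), keep only the top B.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:B]
    return beams
```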

  • Refinements to beam search

Length normalization
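
Plain beam search maximizes a product of probabilities, which favours short outputs because every extra factor is less than 1; length normalization instead scores candidates by their summed log-probability divided by T_y^alpha (alpha around 0.7 in the lecture). A small sketch:

```python
def normalized_score(log_probs, alpha=0.7):
    """(1 / T_y**alpha) * sum_t log P(y<t> | x, y<1>, ..., y<t-1>)."""
    T_y = len(log_probs)
    return sum(log_probs) / (T_y ** alpha)

# Pick the candidate with the best length-normalized score rather than
# the best raw summed log-probability.
candidates = {"short": [-1.0, -1.2], "longer": [-0.9, -1.0, -1.1, -1.0]}
best = max(candidates, key=lambda k: normalized_score(candidates[k]))
```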

  • Error analysis on beam search


Human: Jane visits Africa in September. (y*)
Algorithm: Jane visited Africa last September. (ŷ)
Case 1: P(y* | x) > P(ŷ | x)
Beam search chose ŷ, but y* attains a higher P(y | x). Conclusion: beam search is at fault.
Case 2: P(y* | x) <= P(ŷ | x)
y* is a better translation than ŷ, but the RNN predicted P(y* | x) < P(ŷ | x).
Conclusion: the RNN model is at fault.

Going through this process for each error in the dev set, figure out what fraction of errors are “due to” beam search vs. the RNN model (tally each case as B or R, e.g. B R B R B B B B R R R, and see which component is blamed most often).
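
A sketch of the tallying step, assuming the two probabilities have already been computed for every dev-set error:

```python
def attribute_errors(cases):
    """cases: list of (P(y*|x), P(y_hat|x)) pairs, one per dev-set error.
    Returns the fraction of errors ascribed to beam search (B) vs. the RNN (R)."""
    labels = ["B" if p_star > p_hat else "R" for p_star, p_hat in cases]
    return {"beam_search": labels.count("B") / len(labels),
            "rnn_model": labels.count("R") / len(labels)}

# attribute_errors([(0.12, 0.10), (0.03, 0.07), (0.20, 0.15)])
# -> {'beam_search': 0.67, 'rnn_model': 0.33}  (roughly)
```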

  • Bleu score
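
The notes leave this heading empty; as a reminder, BLEU combines modified (clipped) n-gram precisions with a brevity penalty. A simplified single-reference sketch, not the full multi-reference definition:

```python
import math
from collections import Counter

def bleu(candidate, reference, max_n=4):
    """Simplified BLEU: geometric mean of clipped n-gram precisions * brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        cand = Counter(zip(*[candidate[i:] for i in range(n)]))
        ref = Counter(zip(*[reference[i:] for i in range(n)]))
        clipped = sum(min(count, ref[gram]) for gram, count in cand.items())
        total = max(sum(cand.values()), 1)
        precisions.append(max(clipped, 1e-9) / total)         # avoid log(0)
    c, r = max(len(candidate), 1), len(reference)
    bp = 1.0 if c > r else math.exp(1 - r / c)                # brevity penalty
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

# bleu("the cat is on the mat".split(), "the cat sat on the mat".split())
```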



  • Attention model intuition

  • Attention model


Attention model applied to converting human-readable dates into machine-readable dates, in a simplified version (dropping y^<t-1> as an input to the post-attention LSTM):

Visualize the attention weights α^<t,t'>.
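
A sketch of one attention step for the simplified model above; the tiny linear alignment network (W, b) is a made-up placeholder for however e^<t,t'> is computed:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

def one_step_attention(a, s_prev, W, b):
    """a: (Tx, n_a) encoder activations; s_prev: (n_s,) previous decoder state.
    Returns alpha<t,t'> over all input positions t' and the context c<t>."""
    Tx = a.shape[0]
    # Alignment scores e<t,t'> from [s_prev; a<t'>] (single linear layer for brevity).
    e = np.array([W @ np.concatenate([s_prev, a[tp]]) + b for tp in range(Tx)])
    alphas = softmax(e)                              # weights sum to 1 over t'
    context = (alphas[:, None] * a).sum(axis=0)      # c<t> = sum_t' alpha<t,t'> * a<t'>
    return alphas, context

# Stacking the alphas for every output step t into a (Ty, Tx) matrix and
# plotting it as a heat map gives the attention visualization above.
```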

  • Speech recognition

x (audio clip) → y (transcript), e.g. “the quick brown fox”

  • Trigger word detection

Trigger word detection algorithm

Trigger word model
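
The notes end at this heading; a hedged Keras sketch in the spirit of the course model (1-D convolution over spectrogram frames, then GRU layers, with a per-time-step sigmoid that fires just after the trigger word). The input shape and layer sizes below are placeholders:

```python
from tensorflow.keras.layers import (Input, Conv1D, BatchNormalization,
                                     Activation, Dropout, GRU,
                                     TimeDistributed, Dense)
from tensorflow.keras.models import Model

Tx, n_freq = 5511, 101          # placeholder: spectrogram time steps and frequency bins

x_in = Input(shape=(Tx, n_freq))
x = Conv1D(196, kernel_size=15, strides=4)(x_in)     # shrink the time dimension
x = BatchNormalization()(x)
x = Activation("relu")(x)
x = Dropout(0.8)(x)

x = GRU(128, return_sequences=True)(x)               # keep one output per time step
x = Dropout(0.8)(x)
x = BatchNormalization()(x)

x = GRU(128, return_sequences=True)(x)
x = Dropout(0.8)(x)
x = BatchNormalization()(x)

y_out = TimeDistributed(Dense(1, activation="sigmoid"))(x)  # trigger probability per step

model = Model(inputs=x_in, outputs=y_out)
model.compile(optimizer="adam", loss="binary_crossentropy")
```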