
Long Short-Term Memory for NLP (NLP Zero to Hero – Part 5)



Welcome to episode 5 of our Natural Language Processing with TensorFlow series. In this video we look at how to manage the understanding of context in language across longer sentences, where a word early in the sentence can determine the meaning and semantics of the end of the sentence. We’ll use something called an LSTM (Long Short-Term Memory) network to achieve this.
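The video itself walks through Keras code, but the core idea, a cell state that carries information from early tokens all the way to the end of the sentence, can be sketched as a single LSTM step in plain NumPy. Everything below (names, shapes, random weights) is illustrative, not the video's code:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b hold the stacked weights for the
    forget (f), input (i), candidate (g), and output (o) gates."""
    z = W @ x + U @ h_prev + b          # pre-activations, shape (4*hidden,)
    hidden = h_prev.shape[0]
    f = sigmoid(z[0*hidden:1*hidden])   # forget gate: what to keep from c_prev
    i = sigmoid(z[1*hidden:2*hidden])   # input gate: what new info to write
    g = np.tanh(z[2*hidden:3*hidden])   # candidate cell values
    o = sigmoid(z[3*hidden:4*hidden])   # output gate
    c = f * c_prev + i * g              # cell state carries long-range context
    h = o * np.tanh(c)                  # hidden state exposed at this step
    return h, c

# Toy run: a "sentence" of 5 embedded tokens, hidden size 3
rng = np.random.default_rng(0)
embed_dim, hidden = 4, 3
W = rng.normal(scale=0.1, size=(4 * hidden, embed_dim))
U = rng.normal(scale=0.1, size=(4 * hidden, hidden))
b = np.zeros(4 * hidden)

h = np.zeros(hidden)
c = np.zeros(hidden)
for token_vec in rng.normal(size=(5, embed_dim)):
    h, c = lstm_step(token_vec, h, c, W, U, b)
print(h.shape)  # (3,)
```

The key line is `c = f * c_prev + i * g`: because the cell state is updated additively and the forget gate can stay near 1, a signal from the first word can survive many steps, which is exactly what lets the start of a sentence shape the meaning of its end.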

NLP Zero to Hero playlist → https://goo.gle/nlp-z2h
Subscribe to the TensorFlow channel → https://goo.gle/TensorFlow
