Learn Text Classification With Python and Keras (Summary)
You have learned how to approach text classification with Keras, moving from a bag-of-words model with logistic regression to increasingly advanced methods, culminating in convolutional neural networks.
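As a quick refresher, here is a minimal sketch of that baseline approach, using a handful of toy sentences in place of the labeled datasets from the course:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Toy data standing in for the labeled sentences used in the course
sentences = ["I loved this place", "The food was terrible",
             "Great service", "Never coming back"]
labels = [1, 0, 1, 0]

sentences_train, sentences_test, y_train, y_test = train_test_split(
    sentences, labels, test_size=0.25, random_state=1000)

# Bag-of-words: turn each sentence into a vector of word counts
vectorizer = CountVectorizer()
vectorizer.fit(sentences_train)
X_train = vectorizer.transform(sentences_train)
X_test = vectorizer.transform(sentences_test)

# Logistic regression as the baseline classifier
classifier = LogisticRegression()
classifier.fit(X_train, y_train)
print("Accuracy:", classifier.score(X_test, y_test))
```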
You should now be familiar with word embeddings, why they are useful, and how to use pretrained word embeddings in your training. You have also learned how to work with neural networks and how to use hyperparameter optimization to squeeze more performance out of your model.
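For example, plugging pretrained GloVe vectors into a Keras `Embedding` layer could be sketched roughly like this; the file path, vocabulary size, and dimensions below are placeholders, and `word_index` would come from your own `Tokenizer`:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Embedding, GlobalMaxPooling1D, Dense

# Placeholder values; in practice these come from your Tokenizer and the GloVe file
vocab_size = 5000      # len(tokenizer.word_index) + 1
embedding_dim = 50
maxlen = 100
word_index = {}        # tokenizer.word_index in your own code

# Build an embedding matrix from a downloaded GloVe file (hypothetical path)
embedding_matrix = np.zeros((vocab_size, embedding_dim))
with open("data/glove.6B.50d.txt", encoding="utf-8") as f:
    for line in f:
        word, *vector = line.split()
        if word in word_index and word_index[word] < vocab_size:
            embedding_matrix[word_index[word]] = np.array(vector, dtype=np.float32)

model = Sequential()
model.add(Embedding(vocab_size, embedding_dim,
                    weights=[embedding_matrix],
                    input_length=maxlen,
                    trainable=False))  # keep the pretrained vectors fixed
model.add(GlobalMaxPooling1D())
model.add(Dense(10, activation="relu"))
model.add(Dense(1, activation="sigmoid"))
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```

Setting `trainable=True` instead would fine-tune the pretrained vectors on your own data, which can give the model an additional boost.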
One big topic that we have not covered here, left for another time, is recurrent neural networks, specifically LSTM (long short-term memory) and GRU (gated recurrent unit) networks. These are other powerful and popular tools for working with sequential data like text or time series. Another interesting development is neural networks that employ attention, which are under active research and seem to be a promising next step, since LSTMs tend to be computationally heavy.
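We did not build such a model in this course, but a minimal LSTM classifier in Keras could be sketched as follows; the layer sizes are arbitrary placeholders, and you could swap `LSTM` for `GRU`:

```python
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense

# Placeholder sizes for a small binary text classifier
vocab_size = 5000
embedding_dim = 50
maxlen = 100

model = Sequential()
model.add(Embedding(vocab_size, embedding_dim, input_length=maxlen))
model.add(LSTM(64))                        # or GRU(64) for a gated recurrent unit
model.add(Dense(1, activation="sigmoid"))
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```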
You can apply this knowledge and the models you have trained to an advanced project, such as performing sentiment analysis on a continuous stream of Twitter data with Kibana and Elasticsearch, as in this tutorial. You could also combine sentiment analysis or text classification with speech recognition, as in this handy tutorial using the SpeechRecognition library in Python.
Further Reading
If you want to delve deeper into the various topics from this article, you can take a look at these links:
- AI researchers allege that machine learning is alchemy
- When Will AI Exceed Human Performance? Evidence from AI Experts
- Keras Code Examples
- Deep Learning, NLP, and Representations
- Word2Vec Paper
- GloVe Paper
Congratulations, you made it to the end of the course! What’s your #1 takeaway or favorite thing you learned? How are you going to put your newfound skills to use? Leave a comment in the discussion section and let us know.
00:00 This course was about natural language processing. You saw several techniques for sentiment analysis. First, you learned about the bag-of-words model and used scikit-learn to create a baseline model.
00:13 Next, you learned about logistic regression and eventually used Keras to train a model with a neural network. Finally, you saw the more advanced convolutional neural network and used hyperparameter optimization to find the best values for the hyperparameters.
00:30 But this is not the end of the story. You could look at another type of neural network called the recurrent neural network. Take a look at the long short-term memory or gated recurrent unit if you are interested. Also, you have not reached the limits of natural language processing.
00:48 The most popular application is sentiment analysis, but you could also detect spam emails, classify documents, and tackle other exciting tasks. If you’re interested, there are several good resources to help you dive deeper into this topic. First, you can experiment with the examples on the official Keras site, keras.io.
01:09 To learn more about GloVe, you can read the paper on the Stanford University site. More about Word2Vec, the pretrained embeddings from Google, is also available. Finally, don’t forget that the demo Notebook is available here.
01:25 There’s also more related content on Real Python. You could extend the concepts of streaming Twitter data for sentiment analysis using Elasticsearch and Kibana, or you can implement speech recognition.
01:38 And you can always join the Slack community to chat with others.
01:43 Thanks for watching this course. I hope you found it of use.