Building a Neural Network & Making Predictions (Summary)
Congratulations! You built a neural network from scratch using NumPy. With this knowledge, you’re ready to dive deeper into the world of artificial intelligence in Python.
In this course, you learned:
- What deep learning is and what differentiates it from machine learning
- How to represent vectors with NumPy
- What activation functions are and why they’re used inside a neural network
- What the backpropagation algorithm is and how it works
- How to train a neural network and make predictions
The process of training a neural network mainly consists of applying operations to vectors. Today, you did it from scratch using only NumPy as a dependency. This isn't recommended in a production setting because the process is inefficient and error-prone. That's one of the reasons why deep learning frameworks like Keras, PyTorch, and TensorFlow are so popular.
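To make that concrete, here's a minimal sketch of the kind of vector operations a single forward pass boils down to, using only NumPy. The specific input, weights, and bias values are illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Illustrative values: a two-element input vector, weights, and a bias
input_vector = np.array([1.66, 1.56])
weights = np.array([1.45, -0.66])
bias = np.array([0.0])

# A forward pass is just vector operations: a dot product, an addition,
# and an activation function
layer_1 = np.dot(input_vector, weights) + bias
prediction = sigmoid(layer_1)
print(f"The prediction is: {prediction}")
```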
For additional information on topics covered in this course, check out these resources:
- Look Ma, No For-Loops: Array Programming With NumPy
- Linear Regression in Python
- Practical Text Classification With Python and Keras
- Pure Python vs NumPy vs TensorFlow Performance Comparison
- PyTorch vs TensorFlow for Your Python Deep Learning Project
Congratulations, you made it to the end of the course! What’s your #1 takeaway or favorite thing you learned? How are you going to put your newfound skills to use? Leave a comment in the discussion section and let us know.
00:00 In this course, you learned the foundations of deep learning and neural networks. You also built a simple neural network from scratch in Python and used it to train a model. Along the way, you learned about the differences between deep learning and machine learning, and you saw how to represent data with NumPy.
00:20 You learned about activation functions and their task in a neural network. You implemented the backpropagation algorithm that updates the weights and bias in the network to increase the accuracy of predictions as the model is trained.
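As a reminder of what that update looks like in code, here's a minimal sketch of a single gradient-descent step for a one-neuron network with a sigmoid activation and squared-error loss, as used in the course. Function and variable names are illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_deriv(x):
    return sigmoid(x) * (1 - sigmoid(x))

def update_parameters(weights, bias, input_vector, target, learning_rate=0.1):
    # Forward pass
    layer_1 = np.dot(input_vector, weights) + bias
    prediction = sigmoid(layer_1)

    # Backpropagation via the chain rule:
    # d(error)/d(weights) = d(error)/d(prediction)
    #   * d(prediction)/d(layer_1) * d(layer_1)/d(weights)
    derror_dprediction = 2 * (prediction - target)
    dprediction_dlayer1 = sigmoid_deriv(layer_1)
    derror_dbias = derror_dprediction * dprediction_dlayer1
    derror_dweights = derror_dbias * input_vector

    # Gradient-descent step: move the parameters against the gradient
    weights = weights - learning_rate * derror_dweights
    bias = bias - learning_rate * derror_dbias
    return weights, bias
```

Calling `update_parameters()` repeatedly over the training data is what gradually reduces the prediction error.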
00:34 And you learned how to train the network and use the trained model to make predictions on new data. Now, it’s theoretically possible to implement a production neural network using NumPy, but it’s not practical. For production neural networks, you can rely upon frameworks such as TensorFlow, Keras, and PyTorch. These take care of a lot of the boilerplate code.
00:59 You only have to worry about the part that relates to your unique data: the design of the layers. For more information, you can check out these resources on Real Python. Note that I've also created a course based on the resource Practical Text Classification With Python and Keras.
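For comparison, here's a rough sketch of how a similar single-output network might be declared in Keras, where the framework handles the gradients, weight updates, and training loop for you. The layer design and settings here are illustrative, not taken from the course:

```python
from tensorflow import keras

# Illustrative layer design: one dense output neuron with a sigmoid activation
model = keras.Sequential([
    keras.Input(shape=(2,)),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="sgd", loss="mean_squared_error")
# model.fit(input_vectors, targets, epochs=100) would then train it
```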
01:19 Thank you for watching this course. I hope you found this helpful. If you would like to continue the conversation, please leave a comment, or you might like to join the discussion forums on realpython.com.
Rafael on Sept. 20, 2022
Good explanation, but why is there no final example where one vector is given to the trained neural net, with an explanation like:
- This vector was given because…
- We expect the prediction to be…
- The trained model predicted x because…
In short, run the model on a random input and interpret the results.
marcinszydlowski1984 on Oct. 26, 2022
Well explained.
bennikambs on June 12, 2024
Hey,
Nice overview course that demystifies some of the deep learning scarecrows.
Two things:
1. Every tutorial that one might take after this one will be so much more advanced. Something intermediate that builds on top of this one, but isn't yet at the level of TensorFlow, Keras, or PyTorch deep learning, would help flatten the steep learning curve.
2. Something minor: yes, the dot product measures similarity, and I know similarities are important for deep learning. However, I think the example in this course misses the point a bit. To my understanding, it's the similarity of two different input vectors that matters, not the similarity of one input vector to the weights.
Thanks!
Santosh on Dec. 27, 2021
Fantastic. Brilliantly explained.