Exploring Neural Networks
00:12 Each layer manipulates the training data and forwards its output to the next layer. At the end, the output is a prediction. In this way, you can think of each layer as a step in the feature engineering process.
00:38 The network then compares the prediction to the observed values in the training data. Based on how good the prediction was, the values in the layers, called weights, are adjusted to make better predictions in the future. This process is repeated over and over for each example in the training data. Implementing a neural network, which you’ll start in the next lesson, requires doing lots of operations with vectors.
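The predict-compare-adjust cycle described above can be sketched for a single neuron with NumPy. The data, learning rate, and number of passes here are hypothetical, chosen only to show the loop converging:

```python
import numpy as np

# Hypothetical toy data: one training example per row, one weight per feature.
# The targets follow y = 1*x1 + 2*x2, so training should recover those weights.
x = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 3.0]])
y = np.array([5.0, 4.0, 9.0])  # observed values

weights = np.zeros(2)
learning_rate = 0.01

# Repeat the cycle: predict, compare to the observed value, adjust the weights.
for _ in range(1000):
    for features, target in zip(x, y):
        prediction = np.dot(features, weights)
        error = prediction - target
        # Nudge each weight in the direction that reduces the error.
        weights -= learning_rate * error * features

print(weights)  # approaches [1.0, 2.0]
```

Each pass shrinks the gap between prediction and observation, which is exactly the adjustment process the lesson describes.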
A vector is basically a one-dimensional array, or an ndarray in NumPy. Specifically, vectors support the dot product, which is used to determine the similarity of two vectors, and that similarity is used to make predictions.
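A quick sketch of that idea with NumPy's `np.dot`, using made-up vectors: vectors pointing in the same direction have a large dot product, while orthogonal vectors have a dot product of zero.

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 4.0, 6.0])   # same direction as a
c = np.array([-2.0, 1.0, 0.0])  # orthogonal to a

print(np.dot(a, b))  # 28.0 — high similarity
print(np.dot(a, c))  # 0.0 — no similarity
```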
01:32 Linear regression can express the dependent variable as a weighted sum of the independent variables. For example, if you have a dataset with the price of houses, their age, their location, and other variables, you can use linear regression to predict the price.
01:49 The independent variables would be, for example, the age and location, and the dependent variable would be the price. The weighted sum, which is the predicted price, is the sum of the products of the independent variables and their weights, as seen here.
02:04 There’s also a bias term that sets the result when the independent variables are all equal to zero. As the model is trained, the weights and bias are adjusted so that the similarity between the prediction and the expected outcome, the observed response, is within an acceptable threshold.
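Putting the house-price example together, a prediction is the dot product of the independent variables and the weights, plus the bias. The weight and bias values below are hypothetical, picked only to illustrate the formula:

```python
import numpy as np

# Hypothetical learned parameters for two independent variables: age and a
# location score. Price drops with age and rises with the location score.
weights = np.array([-1000.0, 500.0])
bias = 250000.0  # baseline price when both independent variables are zero

house = np.array([10.0, 8.0])  # age = 10 years, location score = 8

# Prediction = weighted sum of the independent variables + bias.
price = np.dot(house, weights) + bias
print(price)  # 244000.0
```

The bias plays the same role here as in the lesson: with both variables at zero, the prediction is simply the bias value.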