
# Exploring Neural Networks

**00:00**
As mentioned previously, neural networks are used in deep learning. The structure of a neural network arranges the values that it uses to train the model into layers.

**00:12**
Each layer will manipulate the training data and forward the output to the next layer. And at the end, the output is a prediction. In this way, you can consider each layer to be a step in the feature engineering process.

**00:28**
To train a neural network, you first take an example from the training dataset. The layers of the neural network manipulate the example and make a prediction.

**00:38**
Then the network compares the prediction to the observed values in the training data. Based on how close the prediction is, the values in the layers, called weights, are adjusted to make better predictions in the future. This process is repeated over and over for each example in the training data. Implementing a neural network, which you’ll start in the next lesson, requires doing lots of operations with vectors.
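The loop described above can be sketched in NumPy. This is a hypothetical toy example, not the lesson's implementation: the dataset, initial weights, and learning rate are all made up for illustration.

```python
import numpy as np

# Illustrative values only: a single layer of three weights and a
# learning rate chosen arbitrarily for this sketch.
rng = np.random.default_rng(seed=42)
weights = rng.normal(size=3)
learning_rate = 0.1

# Tiny made-up training set: (example vector, observed value) pairs.
training_data = [
    (np.array([1.0, 0.0, 1.0]), 1.0),
    (np.array([0.0, 1.0, 0.0]), 0.0),
]

for example, target in training_data:
    prediction = np.dot(example, weights)       # the layer's output
    error = prediction - target                 # compare to the observed value
    weights -= learning_rate * error * example  # adjust the weights
```

In a real network this loop runs over the whole dataset many times, and the update rule comes from a proper gradient computation rather than this simplified step.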

**01:04**
A vector is basically a multi-dimensional array, or an `ndarray` in NumPy. Specifically, vectors support the dot product, which is used to determine the similarity of two vectors, and that similarity is used to make predictions.
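As a quick sketch of that idea, here is the dot product in NumPy with made-up vectors: aligned vectors produce a large value, while orthogonal vectors produce zero.

```python
import numpy as np

# Illustrative vectors, chosen so the results are easy to check by hand.
a = np.array([1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 3.0])
c = np.array([-2.0, 1.0, 0.0])

similar = np.dot(a, b)     # 1*1 + 2*2 + 3*3 = 14.0 (vectors point the same way)
orthogonal = np.dot(a, c)  # 1*-2 + 2*1 + 3*0 = 0.0 (vectors are orthogonal)
```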

**01:20**
Another important technique in neural networks is linear regression. It is used to model the relationship between a dependent variable and one or more independent variables.

**01:32**
This can express the dependent variable as a weighted sum of the independent variables. For example, if you have a dataset with the price of houses, their age, their location, and other variables, you can use linear regression to predict the price.

**01:49**
The independent variables would be, for example, the age and location, and the dependent variable is the price. The weighted sum, which is the price, is the sum of the products of the independent variables and the weights, as seen here.

**02:04**
There’s also a bias term that sets the result when the independent variables are equal to zero. As the model is trained, the weights and bias are adjusted so that the prediction and the expected outcome, the observed value, agree within an acceptable threshold.
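The house-price example above can be written as a weighted sum plus a bias. The variables, weights, and bias below are invented for illustration, not values from an actual trained model.

```python
import numpy as np

# Hypothetical independent variables: age in years and a location score.
x = np.array([10.0, 0.8])

# Made-up weights and bias for the sketch.
weights = np.array([-1000.0, 50000.0])
bias = 120000.0

# The dependent variable (price) is the weighted sum plus the bias.
price = np.dot(x, weights) + bias  # -10000 + 40000 + 120000 = 150000.0

# When all independent variables are zero, the bias alone sets the result.
baseline = np.dot(np.zeros(2), weights) + bias
```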

**02:22**
In the next lesson, you’ll start to implement some of the concepts in code.
