Multiple Polynomial Regression: Code
Here’s the data that you’ll need to implement Multiple Polynomial Regression in Python:
import numpy as np

# Two-feature inputs and the corresponding responses
x = [[0, 1], [5, 1], [15, 2], [25, 5], [35, 11], [45, 15], [55, 34], [60, 35]]
y = [4, 5, 20, 14, 32, 22, 38, 43]
x = np.array(x)
y = np.array(y)
00:00 Let’s build a quadratic model for a regression problem where the input has two components. So this is the type of model that we took a look at in the previous lesson.
00:09 Let me go back and show you. All right. So here it is. What we’re going to do is we’ve got multiple features—so in this case two—and we want to fit a quadratic model.
00:21 So the transformer that we'll instantiate will generate the values of x₁², the mixed term, and then the quadratic term for x₂. All right. So let's go back and do this.
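Just as a reminder, the model we're fitting has this form, with the same b₀ through b₅ coefficients that we'll print out later in this lesson:

ŷ = b₀ + b₁x₁ + b₂x₂ + b₃x₁² + b₄x₁x₂ + b₅x₂²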
00:34 I'm going to copy and paste some data that you can find in the notes to the video lesson. All right. So once you've copy-pasted, go ahead and run that. And now let's instantiate a PolynomialFeatures object.

00:49 So this will be our transformer, but you know what we'll do? When we get back this object that PolynomialFeatures will return, we'll just call the .fit_transform() method right on it.
01:01 And then that way, we'll generate our transformed matrix of input features that has all the terms up to and including the quadratic ones.
01:10 So go ahead and continue writing PolynomialFeatures,

01:14 and this'll be degree=2, and we don't want to include the bias. So let's set that to False and go ahead and get the transformed data right away, all at once, from the input x.
01:28 And go ahead and print that so you can see what this will return.
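If you're typing along, here's a minimal sketch of what that transformation step can look like (the name x_ for the transformed matrix is just a choice):

from sklearn.preprocessing import PolynomialFeatures

# Build the transformer and transform x in one step.
# Columns of the result: x₁, x₂, x₁², x₁x₂, x₂² (no bias column).
x_ = PolynomialFeatures(degree=2, include_bias=False).fit_transform(x)
print(x_)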
01:35 All right, so we get some scientific notation, but that’s okay. So the first row, well, that corresponds up here to this first observation of the input. The input has two components. For the first observation, the first component is zero and the second one is one.
01:52 And so those values are here: 0.000e+00 and 1.000e+00. And then this term right here (0.000) is x₁². This term right here (0.000) is x₁x₂. That's the mixed term.
02:04 And zero times one is zero. And then that last component here (1.000) is the x₂ term squared, and so we get that. So let's go on to the next one.
02:17 The second observation has first component input 5, and then the second one is 1. So there it is, 5, 1. This (2.500e+01) is five squared. Five squared is twenty-five.
02:29 And then this (5.000e+00) is the mixed term, which is five times one, and then we get five. And then one squared is one. So that's exactly the matrix of transformed inputs. Of course, without the bias terms, which are all one, because we passed in a False value to the include_bias keyword.
02:46 And so that’s it. That’s the only extra step that you need to do when you’re implementing a polynomial regression scheme. So let’s continue and build the model.
02:55 So build the model. Call our LinearRegression class, and we'll fit the model right away using the transformed data. And go ahead and run that.
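A sketch of that step, assuming the transformed inputs are stored in x_ as above:

from sklearn.linear_model import LinearRegression

# Fit an ordinary linear model to the polynomial features.
model = LinearRegression().fit(x_, y)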
03:07 Let’s take a look at the R² value.
03:17 So that’s a pretty good R² value. So it’s telling us that the input data along with the response is pretty well modeled using a quadratic regression model.
03:27 Let's go ahead and print out the intercept (b₀),
03:37 and we’ll also print out the coefficients.
03:47 And so if you recall from the model, b₁ and b₂, those are the coefficients that are multiplying x₁ and x₂, respectively. b₃ is the one that's multiplying x₁².
03:58 b₄ is multiplying the mixed term, x₁x₂, and b₅ is multiplying the x₂² term.
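If it helps to tie the printed array back to the model, here's an illustrative unpacking (the names b1 through b5 are hypothetical, not from the lesson):

b1, b2, b3, b4, b5 = model.coef_
# ŷ = model.intercept_ + b1*x₁ + b2*x₂ + b3*x₁² + b4*x₁x₂ + b5*x₂²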
04:06 And so that's it. In order to implement polynomial regression, the extra step that you need to do is to create this PolynomialFeatures object and generate the transformed inputs, so that you get your quadratic or cubic terms, depending on the degree that you'd like.
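For instance, if you wanted cubic terms instead, only the degree argument changes; a sketch reusing the same input x:

# degree=3 also generates x₁³, x₁²x₂, x₁x₂², and x₂³.
x_cubic = PolynomialFeatures(degree=3, include_bias=False).fit_transform(x)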
04:24 And then you continue on and use the LinearRegression class to solve the associated multiple linear regression problem. All right, let's wrap things up in the next lesson.