ElasticNet Regression Fundamentals and Modeling in Python
In this blog post, I will first explain the basics of ElasticNet Regression. Then we’ll build the model on a dataset with Python. Finally, we’ll evaluate the model by calculating the root mean squared error (RMSE). Let’s get started step by step.
What Is ElasticNet Regression?
The main purpose of ElasticNet Regression is to find the coefficients that minimize the sum of squared errors while applying a penalty to those coefficients. ElasticNet combines the L1 and L2 (Lasso and Ridge) approaches, and as a result it performs a more effective regularization. Another source defines it as follows:
Elastic Net first emerged as a result of critique on Lasso, whose variable selection can be too dependent on data and thus unstable. The solution is to combine the penalties of Ridge regression and Lasso to get the best of both worlds.
Features of ElasticNet Regression
- It combines the L1 and L2 approaches.
- It performs a more efficient regularization process.
- It has two parameters to be set, λ and α.
The elastic net method addresses lasso’s limitations. With high-dimensional data, lasso can select at most as many variables as there are observations before it saturates, whereas elastic net allows more variables to be included. Likewise, when variables form highly correlated groups, lasso tends to choose one variable from each group and ignore the rest entirely, while elastic net tends to keep such groups together.
ElasticNet Regression Model
Elastic Net aims at minimizing the following loss function:
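One standard way to write this objective, with λ controlling the overall penalty strength and α ∈ [0, 1] mixing the L1 and L2 penalties (α = 1 recovers Lasso, α = 0 recovers Ridge), is:

$$\min_{\beta}\; \frac{1}{2n}\sum_{i=1}^{n}\left(y_i - x_i^{\top}\beta\right)^2 + \lambda\left(\alpha\,\lVert\beta\rVert_1 + \frac{1-\alpha}{2}\,\lVert\beta\rVert_2^2\right)$$

scikit-learn uses this same form, but calls λ `alpha` and α `l1_ratio`.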
The terms used in the mathematical model are the same as in Ridge and Lasso Regression, so I won’t repeat them here. You can review the explanations of the mathematical model in the articles I share in the Resources section below.
Modeling with Python
Now let’s build an ElasticNet Regression model on a sample data set, and then calculate the square root of the model’s Mean Squared Error. This will give us the model error.
First of all, we import the libraries necessary for modeling as usual.
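A minimal set of imports for this walkthrough (the post doesn’t show its imports, so this is my assumption of what is needed):

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import ElasticNet, ElasticNetCV
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score
```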
Then we read the data and perform some data preparation operations.
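The post doesn’t show the data itself; purely as an illustration, here is a sketch assuming a hypothetical Hitters-style baseball salary file named Hitters.csv, with missing rows dropped and categorical columns one-hot encoded:

```python
# Hypothetical data preparation (the actual dataset is not shown in the post):
# drop rows with missing values, one-hot encode categoricals, separate X and y
df = pd.read_csv("Hitters.csv").dropna()
dummies = pd.get_dummies(df[["League", "Division", "NewLeague"]], drop_first=True)
y = df["Salary"]
X_numeric = df.drop(["Salary", "League", "Division", "NewLeague"], axis=1).astype("float64")
X = pd.concat([X_numeric, dummies], axis=1)
```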
With ElasticNet regression, we set up the model on the train set.
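A sketch of the split and fit, assuming a 75/25 train/test split (the exact split isn’t given in the post) and scikit-learn’s default ElasticNet parameters:

```python
# Hold out a test set and fit ElasticNet with default parameters
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)
enet_model = ElasticNet().fit(X_train, y_train)
```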
I won’t go into conceptual details here, such as what fit does or what a train set is.
Based on the variables in our data set, we find the variable coefficients of the ElasticNet model as follows.
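Continuing the sketch above, the fitted coefficients are exposed on the model object:

```python
# One coefficient per independent variable
enet_model.coef_
```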
We found the constant (intercept) of the ElasticNet regression model to be -6.46, as follows.
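In scikit-learn the constant is the intercept_ attribute of the fitted model:

```python
# Constant (intercept) of the fitted model
enet_model.intercept_
```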
Prediction
Now let’s make predictions with the model under default settings, without specifying any parameters. We can see the first 10 observations of the model’s predictions for the train set as follows.
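Using the model fitted in the sketch above:

```python
# First 10 predictions on the train set
enet_model.predict(X_train)[:10]
```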
Likewise, we can see the first 10 observations of the model’s predictions for the test set as follows.
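And for the test set:

```python
# First 10 predictions on the test set
enet_model.predict(X_test)[:10]
```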
Then we saved the values predicted over the test set in an array named y_pred, and we found the RMSE value to be 357.16 as a result of the following calculation.
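A sketch of that calculation with scikit-learn’s mean_squared_error:

```python
# Save the test-set predictions and compute the root mean squared error
y_pred = enet_model.predict(X_test)
np.sqrt(mean_squared_error(y_test, y_pred))
```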
As a result, we found the R-squared score to be 0.41. The R-squared score is the percentage of the change in the dependent variable that is explained by the independent variables.
In other words, we can say that the independent variables in the ElasticNet Regression model explain 41.07% of the change in the dependent variable for this data set.
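The score can be computed with scikit-learn’s r2_score on the test-set predictions:

```python
# R-squared on the test set
r2_score(y_test, y_pred)
```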
What is R-Squared?
R-squared (R2) is a statistical measure that represents the proportion of the variance for a dependent variable that’s explained by an independent variable or variables in a regression model. Whereas correlation explains the strength of the relationship between an independent and dependent variable, R-squared explains to what extent the variance of one variable explains the variance of the second variable. So, if the R2 of a model is 0.50, then approximately half of the observed variation can be explained by the model’s inputs.
Model Tuning
In this section, we will use the ElasticNetCV method to find the optimum regularization strength (the λ value, which scikit-learn calls alpha).
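A sketch using 10-fold cross-validation (the fold count is my assumption):

```python
# Search for the optimum regularization strength by cross-validation
enet_cv_model = ElasticNetCV(cv=10).fit(X_train, y_train)
```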
Accordingly, we find the alpha value to be 5230.76.
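The chosen value is stored on the fitted object:

```python
# Optimum regularization strength chosen by cross-validation
enet_cv_model.alpha_
```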
Afterward, we can find the constant of the model established with ElasticNetCV as follows.
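As with the plain ElasticNet model, the constant is the intercept_ attribute:

```python
# Constant (intercept) of the cross-validated model
enet_cv_model.intercept_
```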
We can find the coefficients of the variables of the model established with ElasticNetCV as follows.
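And the coefficients:

```python
# Coefficients of the cross-validated model
enet_cv_model.coef_
```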
Then we rebuild a tuned ElasticNet model with this optimum alpha value, save the values predicted from the test set into y_pred, and find the RMSE value to be 394.15.
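A sketch of the tuned model and its error, reusing the alpha found above:

```python
# Refit ElasticNet with the optimum alpha and evaluate on the test set
enet_tuned = ElasticNet(alpha=enet_cv_model.alpha_).fit(X_train, y_train)
y_pred = enet_tuned.predict(X_test)
np.sqrt(mean_squared_error(y_test, y_pred))
```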
Finally
In this blog post, we first examined what ElasticNet Regression is. Then we talked about the features and basics of ElasticNet Regression and examined the algorithm’s mathematical model. Next, we set up the model under default conditions and calculated the error value. In the Model Tuning part, we found the optimum alpha value with ElasticNetCV, rebuilt the model with this alpha value, and computed the corrected error value.
Resources
- https://www.datacamp.com/community/tutorials/tutorial-ridge-lasso-elastic-net
- https://towardsdatascience.com/ridge-lasso-and-elasticnet-regression-b1f9c00ea3a3
- https://hackernoon.com/an-introduction-to-ridge-lasso-and-elastic-net-regression-cca60b4b934f
- https://www.geeksforgeeks.org/elastic-net-regression-in-r-programming/
- https://corporatefinanceinstitute.com/resources/knowledge/other/elastic-net/
- https://towardsdatascience.com/from-linear-regression-to-ridge-regression-the-lasso-and-the-elastic-net-4eaecaf5f7e6