Jaafar, Zuharah and Ismail, Norazlina (2022) Penalized regression method in high dimensional data. Journal of Theoretical and Applied Information Technology, 100 (8). pp. 2470-2479. ISSN 1992-8645
Full text not available from this repository.
Official URL: http://www.jatit.org/volumes/Vol100No8/9Vol100No8....
Abstract
In high-dimensional multivariate data sets, where the number of variables exceeds the number of samples, the standard linear model (ordinary least squares) performs poorly. In such situations, a better option is penalized regression, which fits a linear regression model that is penalized for having too many variables by adding a constraint to the estimation equation. These procedures are also known as shrinkage or regularization methods. The penalty has the effect of shrinking the coefficient values towards zero, which allows the coefficients of the less important variables to be near to or exactly zero. By reducing the number of nonzero coefficients and retaining only those that matter, penalized regression models improve prediction on new data compared with traditional methods. We demonstrate that the proposed regularizer is capable of achieving competitive results as well as exceedingly compact models. Extensive tests are carried out on a number of benchmark datasets to demonstrate the effectiveness of the method.
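The shrinkage behaviour described in the abstract can be illustrated with a minimal sketch, assuming scikit-learn's `Lasso` as the penalized estimator and simulated data (the specific datasets, estimators, and tuning used in the paper are not given here):

```python
import numpy as np
from sklearn.linear_model import Lasso

# Simulated high-dimensional setting: more variables (p) than samples (n),
# where ordinary least squares is ill-posed.
rng = np.random.default_rng(0)
n, p = 50, 200
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = [3.0, -2.0, 1.5, 1.0, -1.0]  # only 5 truly relevant variables
y = X @ beta + rng.normal(scale=0.5, size=n)

# The L1 penalty shrinks coefficients towards zero; unimportant ones
# are driven to exactly zero, yielding a sparse (compact) model.
lasso = Lasso(alpha=0.1).fit(X, y)
n_selected = int(np.sum(lasso.coef_ != 0))
print(f"nonzero coefficients: {n_selected} of {p}")
```

A larger `alpha` (the penalty strength) shrinks more aggressively and selects fewer variables; ridge regression (`sklearn.linear_model.Ridge`) instead uses an L2 penalty, which shrinks coefficients without setting them exactly to zero.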
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | lasso, ridge, adaptive lasso |
| Subjects: | Q Science > QA Mathematics |
| Divisions: | Science |
| ID Code: | 98757 |
| Deposited By: | Narimah Nawil |
| Deposited On: | 02 Feb 2023 08:25 |
| Last Modified: | 02 Feb 2023 08:25 |