Browse by Author "Arashi, Mohammad"
Showing 1 - 5 of 5
Item: LAD, LASSO and Related Strategies in Regression Models (Springer International Publishing Ag, 2020) Yuzbasi, Bahadir; Ahmed, Syed Ejaz; Arashi, Mohammad; Norouzirad, Mina

In the context of linear regression models, it is well known that the ordinary least squares estimator is very sensitive to outliers, whereas least absolute deviations (LAD) is an alternative method for estimating the unknown regression coefficients. Selecting significant variables is very important; however, by choosing these variables some information may be sacrificed. To prevent this, our proposal shrinks the full-model estimates toward the candidate sub-model, resulting in estimators with improved risk. In this article, we consider shrinkage estimators in a sparse linear regression model and study their relative asymptotic properties. Advantages of the proposed estimators over the usual LAD estimator are demonstrated through a Monte Carlo simulation as well as a real data example.

Item: Penalized regression via the restricted bridge estimator (Springer, 2021) Yuzbasi, Bahadir; Arashi, Mohammad; Akdeniz, Fikri

This article is concerned with bridge regression, a special family of penalized regression with penalty function sum_{j=1}^{p} |beta_j|^q with q > 0, in a linear model with linear restrictions. The proposed restricted bridge (RBRIDGE) estimator simultaneously estimates parameters and selects important variables when prior information about the parameters is available, in either the low-dimensional or the high-dimensional case. Using a local quadratic approximation, we approximate the penalty term around a vector of local initial values. The RBRIDGE estimator enjoys a closed-form expression that can be solved for any q > 0. Special cases of our proposal are the restricted LASSO (q = 1), restricted RIDGE (q = 2), and restricted Elastic Net (1 < q < 2) estimators.
We provide some theoretical properties of the RBRIDGE estimator for the low-dimensional case, whereas the computational aspects are given for both the low- and high-dimensional cases. An extensive Monte Carlo simulation study is conducted based on different pieces of prior information. The performance of the RBRIDGE estimator is compared with some competitive penalty estimators as well as the ORACLE. We also consider four real-data examples for the sake of comparison. The numerical results show that the suggested RBRIDGE estimator performs outstandingly well when the prior information is exact or nearly exact.

Item: Rank-based Liu regression (Springer Heidelberg, 2018) Arashi, Mohammad; Norouzirad, Mina; Ahmed, S. Ejaz; Yuzbasi, Bahadir

Due to the complicated mathematical and nonlinear nature of the ridge regression estimator, the Liu (Linear-Unified) estimator has received much attention as a useful method to overcome the weakness of the least squares estimator in the presence of multicollinearity. In situations where the errors in the linear model are far from normal or the data contain some outliers, the construction of the Liu estimator can be revisited using a rank-based score test, in line with robust regression. In this paper, we define the Liu-type rank-based and restricted Liu-type rank-based estimators when a sub-space restriction on the parameter of interest holds. Accordingly, some improved estimators are defined and their asymptotic distributional properties are investigated. Conditions for the superiority of the proposed estimators in terms of the biasing parameter are given. Some numerical computations support the findings of the paper.

Item: Shrinkage Estimation Strategies in Generalised Ridge Regression Models: Low/High-Dimension Regime (Wiley, 2020) Yuzbasi, Bahadir; Arashi, Mohammad; Ahmed, S. Ejaz

In this study, we suggest pretest and shrinkage methods based on generalised ridge regression estimation that are suitable for both multicollinear and high-dimensional problems.
We review and develop theoretical results for some of the shrinkage estimators. The relative performance of the shrinkage estimators with respect to some penalty methods is compared and assessed by both simulation and real-data analysis. We show that the suggested methods can be regarded as strong competitors to regularisation techniques in terms of mean squared error of estimation and prediction error. A thorough comparison of pretest and shrinkage estimators based on the maximum likelihood method with the penalty methods has been made in earlier work; in this paper, we extend that comparison using the least squares method for the generalised ridge regression.

Item: SLASSO: a scaled LASSO for multicollinear situations (Taylor & Francis Ltd, 2021) Arashi, Mohammad; Asar, Yasin; Yuzbasi, Bahadir

We propose a re-scaled LASSO, obtained by pre-multiplying the LASSO with a matrix term, namely the scaled LASSO (SLASSO), for multicollinear situations. Our numerical study has shown that the SLASSO is comparable with other sparse modeling techniques and often outperforms the LASSO and elastic net. Our findings open new perspectives on using the LASSO for sparse modeling and variable selection. We conclude our study by pointing out that the same efficient algorithm for solving the LASSO can solve the SLASSO, and we suggest following the same construction technique for other penalized estimators.
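The local quadratic approximation (LQA) mentioned in the restricted bridge abstract turns the non-quadratic penalty sum |beta_j|^q into an iteratively reweighted ridge-type solve. The sketch below is illustrative only: it implements plain (unrestricted) bridge regression via LQA, omits the linear restrictions that define RBRIDGE proper, and the function name, defaults, and data are hypothetical rather than the authors' implementation.

```python
import numpy as np

def bridge_lqa(X, y, lam=1.0, q=1.0, n_iter=50, eps=1e-8):
    """Bridge-penalized regression via local quadratic approximation (LQA):
    minimize 0.5*||y - X @ b||^2 + lam * sum(|b_j|^q), q > 0, by repeatedly
    solving a ridge-type linear system with coefficient-dependent weights.
    (Illustrative sketch; the RBRIDGE estimator additionally imposes
    linear restrictions on the coefficients, which are omitted here.)
    """
    n, p = X.shape
    # Lightly ridged start so the initial weights are finite.
    beta = np.linalg.solve(X.T @ X + 1e-6 * np.eye(p), X.T @ y)
    for _ in range(n_iter):
        # LQA replaces the bridge penalty by a quadratic with these weights.
        D = np.diag(lam * q * (np.abs(beta) + eps) ** (q - 2))
        beta = np.linalg.solve(X.T @ X + D, X.T @ y)
    return beta

# Hypothetical sparse example: only coefficients 0 and 3 are nonzero.
rng = np.random.default_rng(1)
n, p = 200, 5
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, 0.0, 1.0, 0.0])
y = X @ beta_true + 0.5 * rng.normal(size=n)

beta_hat = bridge_lqa(X, y, lam=5.0, q=1.0)  # q = 1: LASSO-like behaviour
print(np.round(beta_hat, 3))
```

With q = 1 the reweighting drives the null coefficients essentially to zero, mimicking LASSO selection; q = 2 reduces to an ordinary ridge solve in one step.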
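The rank-based Liu regression item builds on the classical Liu (Linear-Unified) estimator, which shrinks the OLS fit through a ridge-type matrix. A minimal sketch of the classical (non-rank-based) estimator on simulated multicollinear data, assuming the standard form (X'X + I)^{-1}(X'X + d I) beta_OLS with biasing parameter 0 < d < 1 (the data and variable names here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated multicollinear design: x2 is nearly a copy of x1.
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)
X = np.column_stack([x1, x2])
y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)

XtX = X.T @ X
Xty = X.T @ y
p = X.shape[1]

beta_ols = np.linalg.solve(XtX, Xty)

def liu(d):
    """Classical Liu (Linear-Unified) estimator:
    (X'X + I)^{-1} (X'X + d I) beta_OLS, with biasing parameter 0 < d < 1."""
    return np.linalg.solve(XtX + np.eye(p), (XtX + d * np.eye(p)) @ beta_ols)

# The Liu estimator shrinks the unstable OLS coefficients; d = 1 recovers OLS.
print("OLS :", beta_ols)
print("Liu :", liu(0.5))
```

Because the shrinkage matrix has eigenvalues in (0, 1) for d < 1, the Liu estimate always has smaller norm than the OLS estimate, which is what stabilizes it under multicollinearity.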