
Browse by Author "Ahmed, S. Ejaz"

Now showing 1 - 11 of 11
  • Big data analytics: integrating penalty strategies
    (Taylor & Francis Ltd, 2016) Ahmed, S. Ejaz; Yuzbasi, Bahadir
    We present efficient estimation and prediction strategies for the classical multiple regression model when the dimensions of the parameters are larger than the number of observations. These strategies are motivated by penalty estimation and Stein-type estimation procedures. More specifically, we consider the estimation of regression parameters in sparse linear models when some of the predictors may have a very weak influence on the response of interest. In a high-dimensional situation, a number of variable selection techniques exist. However, they yield different subset models and may have different numbers of predictors. Generally speaking, the least absolute shrinkage and selection operator (Lasso) approach produces an over-fitted model compared with its competitors, namely the smoothly clipped absolute deviation (SCAD) method and the adaptive Lasso (aLasso). Thus, prediction based only on a submodel selected by such methods will be subject to selection bias. In order to minimize the inherited bias, we suggest combining two models to improve the estimation and prediction performance. In the context of two competing models, where one model includes more predictors than the other based on relatively aggressive variable selection strategies, we investigate the relative performance of Stein-type shrinkage and penalty estimators. The shrinkage estimator significantly improves the prediction performance of submodels selected by existing Lasso-type variable selection methods. A Monte Carlo simulation study is carried out using the relative mean squared error (RMSE) criterion to appraise the performance of the listed estimators. The proposed strategy is applied to the analysis of several real high-dimensional data sets.
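The Stein-type idea in the abstract above, pulling a full-model estimator toward a submodel chosen by variable selection, can be sketched in a few lines. This is a minimal low-dimensional NumPy illustration, not the paper's high-dimensional implementation; the design, the selected submodel, and the distance statistic are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse linear model: only the first 3 of 10 predictors matter
# (an illustrative low-dimensional stand-in for the paper's setting).
n, p, k = 100, 10, 3
beta = np.zeros(p)
beta[:k] = [2.0, -1.5, 1.0]
X = rng.standard_normal((n, p))
y = X @ beta + rng.standard_normal(n)

# Full-model least squares estimator.
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)

# Submodel estimator: coefficients suspected to be null are set to zero,
# as if a selection step had kept only the first k predictors.
beta_sub = np.zeros(p)
beta_sub[:k], *_ = np.linalg.lstsq(X[:, :k], y, rcond=None)

# Stein-type shrinkage: move the full estimator toward the submodel by an
# amount driven by a distance statistic T_n on the suspected-null part.
p2 = p - k
resid = y - X @ beta_full
sigma2 = resid @ resid / (n - p)
d = beta_full - beta_sub
T_n = d @ X.T @ X @ d / sigma2
beta_shrink = beta_sub + (1.0 - (p2 - 2) / T_n) * d

sse = lambda b: float(np.sum((b - beta) ** 2))
print("full:", sse(beta_full), "shrinkage:", sse(beta_shrink))
```

A small T_n pulls the estimate strongly toward the submodel; a positive-part variant would additionally truncate the factor 1 - (p2 - 2) / T_n at zero to prevent over-shrinkage.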
  • High-Dimensional Regression Under Correlated Design: An Extensive Simulation Study
    (Springer International Publishing Ag, 2019) Ahmed, S. Ejaz; Kim, Hwanwoo; Yildirim, Gokhan; Yuzbasi, Bahadir
    Regression problems where the number of predictors, p, exceeds the number of responses, n, have become increasingly important in many diverse fields in the last couple of decades. In the classical case of small p and large n, the least squares estimator is a practical and effective tool for estimating the model parameters. However, in this so-called Big Data era, models have the characteristic that p is much larger than n. Statisticians have developed a number of regression techniques for dealing with such problems, such as the Lasso by Tibshirani (J R Stat Soc Ser B Stat Methodol 58:267-288, 1996), the SCAD by Fan and Li (J Am Stat Assoc 96(456):1348-1360, 2001), the LARS algorithm by Efron et al. (Ann Stat 32(2):407-499, 2004), the MCP estimator by Zhang (Ann Stat 38:894-942, 2010), and a tuning-free regression algorithm by Chatterjee (High dimensional regression and matrix estimation without tuning parameters, 2015, https://arxiv.org/abs/1510.07294). In this paper, we investigate the relative performances of some of these methods for parameter estimation and variable selection through analyzing real and synthetic data sets. Through an extensive Monte Carlo simulation study, we also compare the relative performance of these methods under a correlated design matrix.
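As a hedged sketch of the kind of comparison this abstract describes, the Lasso can be run on an AR(1)-correlated design with plain coordinate descent. This is not any of the cited implementations; the AR(1) covariance, signal positions, and tuning value are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the building block of Lasso updates."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Minimal coordinate-descent Lasso: min (1/2n)||y - Xb||^2 + lam ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    r = y.copy()                       # current residual y - Xb
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]        # remove coordinate j's contribution
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]
    return b

# AR(1)-correlated design, mimicking a correlated-design simulation setting.
rng = np.random.default_rng(1)
n, p, rho = 200, 50, 0.7
cov = rho ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
X = rng.multivariate_normal(np.zeros(p), cov, size=n)
beta = np.zeros(p)
beta[[0, 10, 20]] = [3.0, -2.0, 1.5]
y = X @ beta + rng.standard_normal(n)

b_hat = lasso_cd(X, y, lam=0.1)
print("nonzero coefficients:", np.flatnonzero(np.abs(b_hat) > 1e-6).size)
```

Raising rho strengthens the correlation between neighbouring columns, which is exactly the regime where the compared selection methods start to disagree.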
  • Improved Penalty Strategies in Linear Regression Models
    (Inst Nacional Estatistica-Ine, 2017) Yuzbasi, Bahadir; Ahmed, S. Ejaz; Gungor, Mehmet
    We suggest pretest and shrinkage ridge estimation strategies for linear regression models. We investigate the asymptotic properties of suggested estimators. Further, a Monte Carlo simulation study is conducted to assess the relative performance of the listed estimators. Also, we numerically compare their performance with Lasso, adaptive Lasso and SCAD strategies. Finally, a real data example is presented to illustrate the usefulness of the suggested methods.
  • L1 Correlation-Based Penalty in High-Dimensional Quantile Regression
    (Ieee, 2018) Yuzbasi, Bahadir; Ahmed, S. Ejaz; Asar, Yasin
    In this study, we propose a new method, called L1-norm correlation-based estimation, for quantile regression in high-dimensional sparse models, where the number of explanatory variables is large and may exceed the number of observations, but only a small subset of the predictive variables is important in explaining the dependent variable. The importance of the new method is that it addresses both the grouping effect and variable selection. Monte Carlo simulations confirm that the new method compares well with other existing regularization methods.
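The check loss that defines quantile regression, and one plausible form of an L1 correlation-based penalty, can be written down directly. The penalty below is an assumption (a Tutz and Ulbricht style correlation penalty with squared terms replaced by absolute values), since the abstract does not spell out the exact formula.

```python
import numpy as np

def check_loss(u, tau):
    """Quantile-regression check function rho_tau(u) = u * (tau - 1{u < 0})."""
    u = np.asarray(u, dtype=float)
    return u * (tau - (u < 0))

def l1_corr_penalty(b, R):
    """Assumed L1 correlation-based penalty: for each pair (j, k), strongly
    positively correlated predictors are pushed toward equal coefficients,
    and strongly negatively correlated ones toward opposite coefficients."""
    p = len(b)
    total = 0.0
    for j in range(p):
        for k in range(j + 1, p):
            r = R[j, k]
            total += abs(b[j] - b[k]) / (1.0 - r) + abs(b[j] + b[k]) / (1.0 + r)
    return total

# Median regression (tau = 0.5) penalizes residuals symmetrically.
print(check_loss([2.0, -2.0], 0.5))  # -> [1. 1.]
```

For tau other than 0.5 the loss becomes asymmetric, which is what targets a conditional quantile rather than the conditional mean.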
  • Liu-type shrinkage estimations in linear models
    (Taylor & Francis Ltd, 2022) Yuzbasi, Bahadir; Asar, Yasin; Ahmed, S. Ejaz
    In this study, we present the preliminary test, Stein-type and positive part Stein-type Liu estimators in the linear models when the parameter vector beta is partitioned into two parts, namely, the main effects beta(1) and the nuisance effects beta(2) such that beta = (beta(1), beta(2)). We consider the case that an a priori known or suspected set of the explanatory variables does not contribute to predicting the response, so that a sub-model may be enough for this purpose. Thus, the main interest is to estimate beta(1) when beta(2) is close to zero. Therefore, we investigate the performance of the suggested estimators asymptotically and via a Monte Carlo simulation study. Moreover, we present a real data example to evaluate the relative efficiency of the suggested estimators, where we demonstrate the superiority of the proposed estimators.
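The Liu estimator at the core of this item has a closed form; a minimal NumPy sketch follows. The collinear toy design and the choice d = 0.5 are illustrative assumptions, not from the paper.

```python
import numpy as np

def liu_estimator(X, y, d):
    """Liu (Linear-Unified) estimator:
    beta_d = (X'X + I)^{-1} (X'y + d * beta_OLS), with 0 <= d <= 1."""
    p = X.shape[1]
    G = X.T @ X
    b_ols = np.linalg.solve(G, X.T @ y)
    return np.linalg.solve(G + np.eye(p), X.T @ y + d * b_ols)

# Multicollinear toy design: two nearly identical columns.
rng = np.random.default_rng(2)
n = 50
x1 = rng.standard_normal(n)
X = np.column_stack([x1,
                     x1 + 0.01 * rng.standard_normal(n),
                     rng.standard_normal(n)])
y = X @ np.array([1.0, 1.0, -0.5]) + rng.standard_normal(n)

b_liu = liu_estimator(X, y, d=0.5)
```

Setting d = 1 recovers ordinary least squares exactly, while smaller d shrinks more strongly; that single dial is what the pretest and shrinkage variants in the paper operate on.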
  • Rank-based Liu regression
    (Springer Heidelberg, 2018) Arashi, Mohammad; Norouzirad, Mina; Ahmed, S. Ejaz; Yuzbasi, Bahadir
    Due to the complicated mathematical and nonlinear nature of the ridge regression estimator, the Liu (Linear-Unified) estimator has received much attention as a useful method for overcoming the weakness of the least squares estimator in the presence of multicollinearity. In situations where the errors of the linear model are far from normal or the data contain outliers, the construction of the Liu estimator can be revisited using a rank-based score test, in line with robust regression. In this paper, we define the Liu-type rank-based and restricted Liu-type rank-based estimators when a sub-space restriction on the parameter of interest holds. Accordingly, some improved estimators are defined and their asymptotic distributional properties are investigated. The conditions of superiority of the proposed estimators for the biasing parameter are given. Some numerical computations support the findings of the paper.
  • Ridge Type Shrinkage Estimation of Seemingly Unrelated Regressions And Analytics of Economic and Financial Data from Fragile Five Countries
    (Mdpi, 2020) Yuzbasi, Bahadir; Ahmed, S. Ejaz
    In this paper, we suggest improved estimation strategies based on preliminary test and shrinkage principles in a seemingly unrelated regression model when explanatory variables are affected by multicollinearity. To that end, we split the regression coefficient vector of each equation into two parts: one includes the coefficient vector for the main effects, and the other is a vector for nuisance effects, which could be close to zero. Therefore, two competing models per equation of the system regression model are obtained: one includes all the regression coefficients (full model); the other (sub-model) includes only the coefficients of the main effects, based on the auxiliary information. The preliminary test estimation improves the estimation procedure if there is evidence that the vector of nuisance parameters does not provide a useful contribution to the model. The shrinkage estimation method shrinks the full model estimator in the direction of the sub-model estimator. We conduct a Monte Carlo simulation study in order to examine the relative performance of the suggested estimation strategies. More importantly, we apply our methodology based on the preliminary test and the shrinkage estimations to analyse economic data by investigating the relationship between foreign direct investment and several economic variables in the Fragile Five countries between 1983 and 2018.
  • Ridge-type pretest and shrinkage estimations in partially linear models
    (Springer, 2020) Yuzbasi, Bahadir; Ahmed, S. Ejaz; Aydin, Dursun
    In this paper, we suggest pretest and shrinkage ridge regression estimators for a partially linear regression model, and compare their performance with some penalty estimators. We investigate the asymptotic properties of proposed estimators. We also consider a Monte Carlo simulation comparison, and a real data example is presented to illustrate the usefulness of the suggested methods.
  • Shrinkage and penalized estimation in semi-parametric models with multicollinear data
    (Taylor & Francis Ltd, 2016) Yuzbasi, Bahadir; Ahmed, S. Ejaz
    In this paper, we consider estimation techniques based on ridge regression when the matrix appears to be ill-conditioned in the partially linear model using kernel smoothing. Furthermore, we consider that the coefficients can be partitioned as beta = (beta(1), beta(2)), where beta(1) is the coefficient vector for the main effects and beta(2) is the vector for nuisance effects. We are essentially interested in the estimation of beta(1) when beta(2) is close to zero. We suggest ridge pretest, ridge shrinkage and ridge positive shrinkage estimators for the above semi-parametric model, and compare their performance with some penalty estimators. In particular, the suitability of estimating the nonparametric component based on the kernel smoothing basis function is also explored. A Monte Carlo simulation study is used to compare the relative efficiency of the proposed estimators, and a real data example is presented to illustrate the usefulness of the suggested methods. Moreover, the asymptotic properties of the proposed estimators are obtained.
  • Shrinkage Estimation Strategies in Generalised Ridge Regression Models: Low/High-Dimension Regime
    (Wiley, 2020) Yuzbasi, Bahadir; Arashi, Mohammad; Ahmed, S. Ejaz
    In this study, we suggest pretest and shrinkage methods based on the generalised ridge regression estimation that is suitable for both multicollinear and high-dimensional problems. We review and develop theoretical results for some of the shrinkage estimators. The relative performance of the shrinkage estimators to some penalty methods is compared and assessed by both simulation and real-data analysis. We show that the suggested methods can be accounted as good competitors to regularisation techniques, by means of the mean squared error of estimation and the prediction error. A thorough comparison of pretest and shrinkage estimators based on the maximum likelihood method with the penalty methods has been given previously; in this paper, we extend that comparison using the least squares method for the generalised ridge regression.
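The generalised ridge estimator underlying the abstract above also has a one-line closed form; the sketch below is a minimal NumPy illustration, with the design and penalty vector being assumptions.

```python
import numpy as np

def generalised_ridge(X, y, k):
    """Generalised ridge estimator (X'X + K)^{-1} X'y with K = diag(k),
    one penalty per coefficient; a constant k recovers ordinary ridge."""
    return np.linalg.solve(X.T @ X + np.diag(k), X.T @ y)

rng = np.random.default_rng(3)
X = rng.standard_normal((40, 5))
y = X @ np.array([2.0, 0.0, -1.0, 0.0, 0.5]) + rng.standard_normal(40)

# Heavier penalties on the coefficients suspected to be null.
b_gr = generalised_ridge(X, y, k=np.array([0.1, 5.0, 0.1, 5.0, 0.1]))
```

Allowing k to vary per coefficient is what lets pretest and shrinkage machinery act differently on the main-effect and nuisance parts of beta.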
  • Shrinkage Ridge Regression Estimators in High-Dimensional Linear Models
    (Springer-Verlag Berlin, 2015) Yuzbasi, Bahadir; Ahmed, S. Ejaz
    In this paper, we suggest shrinkage ridge regression estimators for a multiple linear regression model and compare their performance with some penalty estimators, namely the Lasso, adaptive Lasso and SCAD. Monte Carlo studies were conducted to compare the estimators, and a real data example is presented to illustrate the usefulness of the suggested methods.

| İnönü University | Library | Guide | OAI-PMH |

This site is protected by a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.


İnönü University, Battalgazi, Malatya, TÜRKİYE
If you notice any errors in the content, please let us know.

DSpace 7.6.1, Powered by İdeal DSpace

DSpace software copyright © 2002-2025 LYRASIS
