
Table 2 Performance of different GLM methods for Poisson regression over 100 simulations. Values in parentheses are standard deviations. ANSF: average number of selected features; rMSE: average square root of mean squared error; \(\lvert\hat{\beta} - \beta\rvert = \sum_{i} \lvert\hat{\beta}_{i} - \beta_{i}\rvert\): average absolute bias between true and estimated parameters

From: Sparse generalized linear model with L0 approximation for feature selection and prediction with big omics data

| Scenario | PMS | glmnet \(L_1\) | SparseReg SCAD | SparseReg MC+ | L0ADRIDGE |
| --- | --- | --- | --- | --- | --- |
| \(N=100\), \(P=100\) | rMSE | 1.10(±.091) | 1.090(±.092) | **1.087(±.091)** | 1.937(±.222) |
|  | \(\lvert\hat{\beta} - \beta\rvert\) | 1.755(±.274) | 1.754(±.275) | 1.737(±.273) | **0.222(±.116)** |
|  | ANSF | 43.03(±3.52) | 43.07(±3.57) | 42.06(±3.51) | **3.99(±.100)** |
|  | PTM | 0% | 0% | 0% | **99%** |
|  | FDR | 90.6% | 90.6% | 90.6% | **0%** |
| \(N=100\), \(P=10^3\) | rMSE | 0.503(±.017) | 0.502(±.017) | **0.501(±.018)** | 2.108(±.359) |
|  | \(\lvert\hat{\beta} - \beta\rvert\) | 2.671(±.421) | 2.673(±.425) | 2.821(±2.012) | **0.424(±.350)** |
|  | ANSF | 75.47(±5.61) | 75.82(±5.71) | 75.14(±8.69) | **3.610(±.601)** |
|  | PTM | 0% | 0% | 0% | **64%** |
|  | FDR | 94.7% | 94.7% | 94.6% | **2.4%** |
| \(N=500\), \(P=10^4\) | rMSE | **0.271(±.004)** | 0.272(±.012) | 0.275(±.025) | 1.916(±.081) |
|  | \(\lvert\hat{\beta} - \beta\rvert\) | 5.845(±.280) | 6.185(±2.359) | 5.807(±.273) | **0.086(±.033)** |
|  | ANSF | 465.6(±14.1) | 475.1(±15.5) | 463.6(±13.9) | **4.000(±.000)** |
|  | PTM | 0% | 0% | 0% | **100%** |
|  | FDR | 99.1% | 99.2% | 99.1% | **0%** |
1. PMS: performance measures. PTM: percentage of true models. FDR: false discovery rate. Values in boldface indicate the best performance
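The two error metrics defined in the caption can be made concrete with a short sketch. This is an illustrative reimplementation under stated assumptions, not the paper's code: `rmse` is the square root of the mean squared prediction error for one replicate, and `abs_bias` is \(\sum_i \lvert\hat{\beta}_i - \beta_i\rvert\); the table reports these averaged over 100 simulations. All variable names here are hypothetical.

```python
import numpy as np

def rmse(y_true, y_pred):
    # Square root of the mean squared prediction error (one replicate).
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def abs_bias(beta_hat, beta_true):
    # Summed absolute bias: sum_i |beta_hat_i - beta_i|.
    return np.sum(np.abs(beta_hat - beta_true))

# Toy illustration (made-up numbers, not from the paper):
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.5, 2.0, 2.5])
print(rmse(y_true, y_pred))        # ≈ 0.408

beta_true = np.array([1.0, 0.0, -1.0])
beta_hat = np.array([0.9, 0.1, -1.2])
print(abs_bias(beta_hat, beta_true))  # ≈ 0.4
```

In the simulation study these per-replicate values would be averaged across the 100 runs to obtain the means and standard deviations shown in the table.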