Description
Approximates a group sparse solution of an underdetermined linear system. The package implements the proximal gradient algorithm to solve a lower-order regularization model of group sparse learning. For details, see: Y. Hu, C. Li, K. Meng, J. Qin and X. Yang. Group sparse optimization via l_{p,q} regularization. Journal of Machine Learning Research, 2017.
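The model behind the package is least squares with an L_{2,1/2} group penalty, solved by alternating a gradient step with a group-wise proximal (half-thresholding) step. The sketch below is an illustrative reconstruction of that scheme, not the package's actual `GSparO()` interface: the function names `gsparo_sketch` and `half_threshold` are made up here, and the closed-form half-thresholding operator is my rendering of the formula published by Xu et al. (2012).

```r
# Illustrative sketch (hypothetical names, not the package API):
# proximal gradient for  min_x  0.5*||A x - b||^2 + lambda * sum_g ||x_g||^(1/2)

half_threshold <- function(t, lam) {
  # Closed-form prox of lam*|t|^(1/2), following Xu et al. (2012)
  if (abs(t) <= (54^(1/3) / 4) * lam^(2/3)) return(0)  # below threshold -> 0
  phi <- acos((lam / 8) * (abs(t) / 3)^(-3/2))
  (2/3) * t * (1 + cos(2 * pi / 3 - (2/3) * phi))
}

gsparo_sketch <- function(A, b, group, lambda = 0.1, max_iter = 500, tol = 1e-6) {
  n <- ncol(A)
  x <- numeric(n)
  # Step size from the Lipschitz constant of the least squares gradient
  L <- max(eigen(crossprod(A), symmetric = TRUE, only.values = TRUE)$values)
  step <- 1 / L
  for (k in seq_len(max_iter)) {
    x_old <- x
    z <- x - step * crossprod(A, A %*% x - b)      # gradient step
    for (g in unique(group)) {                     # group-wise proximal step
      idx <- which(group == g)
      nz  <- sqrt(sum(z[idx]^2))                   # group norm
      s   <- half_threshold(nz, step * lambda)     # shrink the norm
      x[idx] <- if (nz > 0) (s / nz) * z[idx] else 0
    }
    if (sqrt(sum((x - x_old)^2)) < tol) break
  }
  x
}
```

With every variable in its own group (singleton groups), the group prox degenerates to elementwise half thresholding, matching the reduction to the iterative half thresholding algorithm mentioned in the package documentation.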
Downloads
Last 30 days: 149 (rank 21357th)
Last 90 days: 149
Last year: 149
CRAN Check Status
All 14 flavors: NOTE
| Flavor | Status |
|---|---|
| r-devel-linux-x86_64-debian-clang | NOTE |
| r-devel-linux-x86_64-debian-gcc | NOTE |
| r-devel-linux-x86_64-fedora-clang | NOTE |
| r-devel-linux-x86_64-fedora-gcc | NOTE |
| r-devel-macos-arm64 | NOTE |
| r-devel-windows-x86_64 | NOTE |
| r-oldrel-macos-arm64 | NOTE |
| r-oldrel-macos-x86_64 | NOTE |
| r-oldrel-windows-x86_64 | NOTE |
| r-patched-linux-x86_64 | NOTE |
| r-release-linux-x86_64 | NOTE |
| r-release-macos-arm64 | NOTE |
| r-release-macos-x86_64 | NOTE |
| r-release-windows-x86_64 | NOTE |
Check details (17 non-OK)
NOTE
r-devel-linux-x86_64-debian-clang
Rd files
checkRd: (-1) GSparO.Rd:23: Lost braces; missing escapes or markup?
23 | Group sparse optimization (GSparO) for least squares regression by using the proximal gradient algorithm to solve the L_{2,1/2} regularization model.
| ^
checkRd: (-1) GSparO.Rd:26: Lost braces; missing escapes or markup?
26 | GSparO is group sparse optimization for least squares regression described in [Hu et al(2017)], in which the proximal gradient algorithm is implemented to solve the L_{2,1/2} regularization model. GSparO is an iterative algorithm consisting of a gradient step for the least squares regression and a proximal steps for the L_{2,1/2} penalty, which is analytically formulated in this function. Also, GSparO can solve sparse variable selection problem in absence of group structure. In particular, setting group in GSparO be a vector of ones, GSparO is reduced to the iterative half thresholding algorithm introduced in [Xu et al (2012)].
| ^
checkRd: (-1) GSparO.Rd:26: Lost braces; missing escapes or markup?
26 | GSparO is group sparse optimization for least squares regression described in [Hu et al(2017)], in which the proximal gradient algorithm is implemented to solve the L_{2,1/2} regularization model. GSparO is an iterative algorithm consisting of a gradient step for the least squares regression and a proximal steps for the L_{2,1/2} penalty, which is analytically formulated in this function. Also, GSparO can solve sparse variable selection problem in absence of group structure. In particular, setting group in GSparO be a vector of ones, GSparO is reduced to the iterative half thresholding algorithm introduced in [Xu et al (2012)].
| ^
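The Details text quoted above notes that the L_{2,1/2} proximal step has an analytic formula, and that GSparO reduces to the iterative half thresholding algorithm of Xu et al. (2012) when every group is a singleton. A minimal, self-contained sketch of that scalar operator (my reconstruction of the published closed form, with the hypothetical name `half_thresh`; not code from the package):

```r
# Scalar half-thresholding: closed-form prox of lam*|t|^(1/2) (Xu et al., 2012)
half_thresh <- function(t, lam) {
  if (abs(t) <= (54^(1/3) / 4) * lam^(2/3)) return(0)   # below threshold -> 0
  phi <- acos((lam / 8) * (abs(t) / 3)^(-3/2))
  (2/3) * t * (1 + cos(2 * pi / 3 - (2/3) * phi))
}

half_thresh(0.1, 0.5)   # small input is thresholded to exactly 0
```

For inputs far above the threshold the operator is nearly the identity; near the threshold the output jumps from 0 to (2/3)*t, the characteristic discontinuity of half thresholding.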
NOTE
The same Rd-files NOTE is reported verbatim on the remaining 13 flavors: r-devel-linux-x86_64-debian-gcc, r-devel-linux-x86_64-fedora-clang, r-devel-linux-x86_64-fedora-gcc, r-devel-macos-arm64, r-devel-windows-x86_64, r-oldrel-macos-arm64, r-oldrel-macos-x86_64, r-oldrel-windows-x86_64, r-patched-linux-x86_64, r-release-linux-x86_64, r-release-macos-arm64, r-release-macos-x86_64, r-release-windows-x86_64
NOTE
r-oldrel-macos-arm64, r-oldrel-macos-x86_64, r-oldrel-windows-x86_64
LazyData
'LazyData' is specified without a 'data' directory
Check History
NOTE (Mar 9, 2026): 0 OK · 14 NOTE · 0 WARNING · 0 ERROR · 0 FAILURE
Version History
1.0 (new), Mar 10, 2026