
fregre.pls.cv

Functional penalized PLS regression with scalar response using selection of number of PLS components


Description

Functional regression with scalar response using selection of the number of penalized partial least squares (PPLS) components. The algorithm selects the PPLS components that best estimate the response; the selection is performed by cross-validation (CV) or Model Selection Criteria (MSC). The functional regression is then fitted using the selected PPLS components.

Usage

fregre.pls.cv(
  fdataobj,
  y,
  kmax = 8,
  lambda = 0,
  P = c(0, 0, 1),
  criteria = "SIC",
  ...
)

Arguments

fdataobj

fdata class object.

y

Scalar response with length n.

kmax

Maximum number of PLS components to consider in the model.

lambda

Vector with the amounts of penalization. The default value, 0, applies no penalization. If lambda=TRUE, the algorithm computes a sequence of lambda values.

P

The vector of coefficients to define the penalty matrix object. For example, if P=c(0,0,1), penalized regression is computed penalizing the second derivative (curvature).

criteria

Type of cross-validation (CV) or Model Selection Criteria (MSC) applied. Possible values are "CV", "AIC", "AICc", "SIC", "SICc", "HQIC".

...

Further arguments passed to fregre.pls.
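The penalty defined by P can be illustrated with a small base-R sketch. Note that penalty_matrix below is a hypothetical helper, not part of fda.usc (which builds the penalty matrix internally): for P = c(0, 0, 1), the penalty is t(D) %*% D with D the second-order difference operator, so curvature is penalized.

```r
# Hypothetical helper (not part of fda.usc): penalty matrix that penalizes
# the d-th derivative on an equally spaced grid of m points, via finite
# differences. P = c(0, 0, 1) in fregre.pls.cv corresponds to d = 2.
penalty_matrix <- function(m, d = 2) {
  D <- diag(m)
  for (i in seq_len(d)) D <- diff(D)  # D becomes the d-th order difference operator
  crossprod(D)                        # t(D) %*% D, an m x m matrix
}

# A linear function has zero curvature, so its second-derivative penalty is zero:
x_lin <- 1:6
drop(t(x_lin) %*% penalty_matrix(6) %*% x_lin)  # -> 0
```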

Details

The algorithm selects the best PLS components pls.opt from the first kmax PLS components and (optionally) the best penalization parameter lambda.opt from a sequence of non-negative values lambda.

  • The method selects the PLS components with minimum MSC value by stepwise regression, using fregre.pls at each step.

  • This step is repeated for each value of lambda.

  • The method selects the PLS components (pls.opt=pls.order[1:k.min]) and (optionally) the lambda parameter with minimum MSC value.

Finally, the functional PLS regression between the functional explanatory variable X(t) and the scalar response Y is fitted using the best selection of PLS components pls.opt and penalization parameter lambda.opt. The selection is done by cross-validation (CV) or Model Selection Criteria (MSC).

  • Predictive Cross-Validation: PCV(k_n) = (1/n) ∑_{i=1}^{n} (y_i - \hat{y}_{-i})^2, criteria="CV"

  • Model Selection Criteria: MSC(k_n) = log[(1/n) ∑_{i=1}^{n} (y_i - \hat{y}_i)^2] + p_n k_n/n
    p_n = log(n)/n, criteria="SIC" (by default)
    p_n = log(n)/(n-k_n-2), criteria="SICc"
    p_n = 2, criteria="AIC"
    p_n = 2n/(n-k_n-2), criteria="AICc"
    p_n = 2 log(log(n))/n, criteria="HQIC"
    where criteria is an argument that controls the type of validation used in the selection of the smoothing parameter kmax=k_n and the penalization parameter lambda.
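The criteria above can be computed directly in base R. The sketch below (msc_value and pcv are hypothetical helper names, not fda.usc functions) evaluates the MSC formula for each choice of p_n, and computes PCV by explicit leave-one-out refitting:

```r
# Model Selection Criterion MSC(k_n) = log(mean(residuals^2)) + p_n * k_n / n,
# with p_n chosen according to 'criteria' as in the formulas above.
msc_value <- function(resid, k_n,
                      criteria = c("SIC", "SICc", "AIC", "AICc", "HQIC")) {
  criteria <- match.arg(criteria)
  n <- length(resid)
  p_n <- switch(criteria,
    SIC  = log(n) / n,
    SICc = log(n) / (n - k_n - 2),
    AIC  = 2,
    AICc = 2 * n / (n - k_n - 2),
    HQIC = 2 * log(log(n)) / n
  )
  log(mean(resid^2)) + p_n * k_n / n
}

# Predictive Cross-Validation PCV(k_n) by explicit leave-one-out refitting:
# 'fit_fun(x, y)' must return a prediction function of new x values.
pcv <- function(x, y, fit_fun) {
  loo <- vapply(seq_along(y), function(i) {
    f <- fit_fun(x[-i], y[-i])
    f(x[i])
  }, numeric(1))
  mean((y - loo)^2)
}
```

For each candidate number of components k from 1 to kmax, the model with the smallest msc_value (or pcv) plays the role of k.min in the selection described above.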

Value

Return:

  • fregre.pls Fitted regression object using the best (pls.opt) components.

  • pls.opt Indices of the selected PLS components.

  • MSC.min Minimum Model Selection Criteria (MSC) value for the pls.opt components.

  • MSC Minimum Model Selection Criteria (MSC) value for the kmax components.

Note

criteria="CV" is not recommended: it is time-consuming.

Author(s)

Manuel Febrero-Bande, Manuel Oviedo de la Fuente manuel.oviedo@usc.es

References

Preda C. and Saporta G. PLS regression on a stochastic process. Comput. Statist. Data Anal. 48 (2005): 149-158.

Febrero-Bande, M., Oviedo de la Fuente, M. (2012). Statistical Computing in Functional Data Analysis: The R Package fda.usc. Journal of Statistical Software, 51(4), 1-28. http://www.jstatsoft.org/v51/i04/

See Also

See also as: fregre.pc.

Examples

## Not run: 
data(tecator)
x <- tecator$absorp.fdata[1:129]
y <- tecator$y$Fat[1:129]
# no penalization
pls1 <- fregre.pls.cv(x, y, 8)
# 2nd derivative penalization
pls2 <- fregre.pls.cv(x, y, 8, lambda = 0:5, P = c(0, 0, 1))

## End(Not run)

fda.usc

Functional Data Analysis and Utilities for Statistical Computing

v2.0.2
GPL-2
Authors
Manuel Febrero Bande [aut], Manuel Oviedo de la Fuente [aut, cre], Pedro Galeano [ctb], Alicia Nieto [ctb], Eduardo Garcia-Portugues [ctb]
Initial release
2020-02-17
