Functional Regression with scalar response using Principal Components Analysis
Computes functional (ridge or penalized) regression between functional
explanatory variable X(t) and scalar response Y using Principal
Components Analysis.
Y = <X, β> + ε
where <., .> denotes the inner product on L_2 and ε are random errors with mean zero, finite variance σ^2 and E[X(t) ε] = 0.
fregre.pc(fdataobj, y, l = NULL, lambda = 0, P = c(0, 0, 1), weights = rep(1, len = n), ...)
fdataobj
Functional explanatory variable, an object of class fdata.
y
Scalar response with length n.
l
Index of the principal components to include in the model.
lambda
Amount of penalization. Default value is 0, i.e. no penalization is used.
P
Coefficients defining the penalty matrix (built via P.penalty). The default, P = c(0, 0, 1), penalizes the second derivative (curvature); see the sketch below.
weights
Weights for the observations (by default, all equal to 1).
...
Further arguments passed to or from other methods.
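A minimal sketch of the penalty that the default P implies (assuming the tecator data shipped with fda.usc, and that the observation grid is taken from the argvals component of the fdata object):

library(fda.usc)
data(tecator)
tt <- tecator$absorp.fdata$argvals     # observation grid of X(t)
Pmat <- P.penalty(tt, P = c(0, 0, 1))  # second-derivative (curvature) penalty matrix
dim(Pmat)                              # square matrix over the grid points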
The function computes the orthonormal basis of functional principal components ν_1, ..., ν_k, ... and uses it to represent the functional data as X(t) = ∑_{k=1}^{∞} γ_k ν_k and the functional parameter as β(t) = ∑_{k=1}^{∞} β_k ν_k, where γ_k = <X(t), ν_k> and β_k = <β, ν_k>.
The response can be fitted by:
No penalization, λ = 0 (see the sketch below):
y.est = ν (ν'ν)^{-1} ν' y
Ridge regression, λ > 0 and P = 1:
y.est = ν (ν'ν + λ I)^{-1} ν' y
Penalized regression, λ > 0 and P != 0. For example, P = c(0, 0, 1) penalizes the second derivative (curvature) via P = P.penalty(fdataobj["argvals"], P):
y.est = ν (ν'ν + λ ν'Pν)^{-1} ν' y
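As an illustrative check of the λ = 0 case (not part of the package examples; it assumes the tecator data and that fdata2pc stores the scores γ_k in its x component), the unpenalized fit should coincide with an ordinary lm on the selected principal component scores:

library(fda.usc)
data(tecator)
x <- tecator$absorp.fdata
y <- tecator$y$Fat
pc <- fdata2pc(x, ncomp = 3)           # functional PCA of X(t)
scores <- pc$x[, 1:3, drop = FALSE]    # gamma_k = <X(t), nu_k>
fit.lm <- lm(y ~ scores)               # ordinary least squares on the scores
fit.pc <- fregre.pc(x, y, l = 1:3)     # lambda = 0: no penalization
# Fitted values should agree up to numerical error:
all.equal(as.numeric(fitted(fit.lm)), as.numeric(fit.pc$fitted.values))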
Return:
call
The matched call to the fregre.pc function.
coefficients
A named vector of coefficients.
residuals
y minus the fitted values.
fitted.values
Estimated scalar response.
beta.est
Estimated beta coefficient, an object of class fdata.
df
The residual degrees of freedom. In ridge regression, df(rn)
is the effective degrees of freedom.
r2
Coefficient of determination.
sr2
Residual variance.
Vp
Estimated covariance matrix for the parameters.
H
Hat matrix.
l
Index of principal components selected.
lambda
Amount of shrinkage.
P
Penalty matrix.
fdata.comp
Fitted object returned by the fdata2pc function.
lm
Fitted lm object.
fdataobj
Functional explanatory data.
y
Scalar response.
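A minimal sketch of inspecting some of the returned components listed above (assuming a fit on the tecator data, as in the examples below):

library(fda.usc)
data(tecator)
res <- fregre.pc(tecator$absorp.fdata[1:129, ], tecator$y$Fat[1:129])
res$coefficients                            # named vector of coefficients
res$l                                       # indices of the selected components
c(df = res$df, r2 = res$r2, sr2 = res$sr2)  # fit summaries
plot(res$beta.est)                          # estimated beta(t), an fdata object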
Manuel Febrero-Bande, Manuel Oviedo de la Fuente manuel.oviedo@usc.es
Cai TT, Hall P. 2006. Prediction in functional linear regression. Annals of Statistics 34: 2159-2179.
Cardot H, Ferraty F, Sarda P. 1999. Functional linear model. Statistics and Probability Letters 45: 11-22.
Hall P, Hosseini-Nasab M. 2006. On properties of functional principal components analysis. Journal of the Royal Statistical Society B 68: 109-126.
Febrero-Bande, M., Oviedo de la Fuente, M. (2012). Statistical Computing in Functional Data Analysis: The R Package fda.usc. Journal of Statistical Software, 51(4), 1-28. http://www.jstatsoft.org/v51/i04/
N. Kraemer, A.-L. Boulesteix, and G. Tutz (2008). Penalized Partial Least Squares with Applications to B-Spline Transformations and Functional Data. Chemometrics and Intelligent Laboratory Systems, 94, 60-69. http://dx.doi.org/10.1016/j.chemolab.2008.06.009
See also: fregre.pc.cv, summary.fregre.fd and predict.fregre.fd.
Alternative methods: fregre.basis and fregre.np.
## Not run: 
data(tecator)
absorp <- tecator$absorp.fdata
ind <- 1:129
x <- absorp[ind, ]
y <- tecator$y$Fat[ind]
res <- fregre.pc(x, y)
summary(res)
res2 <- fregre.pc(x, y, l = c(1, 3, 4))
summary(res2)
# Functional Ridge Regression
res3 <- fregre.pc(x, y, l = c(1, 3, 4), lambda = 1, P = 1)
summary(res3)
# Functional Regression with 2nd derivative penalization
res4 <- fregre.pc(x, y, l = c(1, 3, 4), lambda = 1, P = c(0, 0, 1))
summary(res4)
betas <- c(res$beta.est, res2$beta.est, res3$beta.est, res4$beta.est)
plot(betas)
## End(Not run)
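A further illustrative sketch, not part of the package examples: out-of-sample prediction via predict.fregre.fd, assuming the remaining tecator curves (indices 130 onward) are used as new data.

## Not run: 
library(fda.usc)
data(tecator)
absorp <- tecator$absorp.fdata
ind <- 1:129
res <- fregre.pc(absorp[ind, ], tecator$y$Fat[ind])
newx <- absorp[-ind, ]          # held-out curves as new functional data
pred <- predict(res, newx)      # dispatches to predict.fregre.fd
plot(tecator$y$Fat[-ind], pred, xlab = "Observed Fat", ylab = "Predicted Fat")
## End(Not run)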