### Abstract

In many applications of functional data analysis, summarising functional variation based on fits, without taking account of the estimation process, runs the risk of attributing the estimation variation to the functional variation, thereby overstating the latter. For example, the first eigenvalue of a sample covariance matrix computed from estimated functions may be biased upwards. We display a set of estimated neuronal Poisson-process intensity functions where this bias is substantial, and we discuss two methods for accounting for estimation variation. One method uses a random-coefficient model, which requires all functions to be fitted with the same basis functions. An alternative method removes the same-basis restriction by means of a hierarchical Gaussian process model. In a small simulation study the hierarchical Gaussian process model outperformed the random-coefficient model and greatly reduced the bias in the estimated first eigenvalue that would result from ignoring estimation variability. For the neuronal data the hierarchical Gaussian process estimate of the first eigenvalue was much smaller than the naive estimate that ignored variability due to function estimation. The neuronal setting also illustrates the benefit of incorporating alignment parameters into the hierarchical scheme.
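The upward bias described in the abstract is easy to demonstrate numerically. The sketch below is a hypothetical illustration (not code from the paper): it simulates true curves with a single mode of functional variation, adds independent noise to each curve to stand in for function-estimation error, and compares the leading eigenvalue of the sample covariance matrix computed from the true curves versus the "estimated" ones.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)   # common evaluation grid
n = 1000                    # number of functions

# True functional variation: a random amplitude on a single sine mode.
amplitudes = rng.normal(0.0, 1.0, size=(n, 1))
true_curves = amplitudes * np.sin(2 * np.pi * t)

# "Estimated" curves: truth plus independent estimation noise,
# standing in for the error of fitting each function from data.
est_curves = true_curves + rng.normal(0.0, 1.0, size=(n, t.size))

def first_eigenvalue(curves):
    """Leading eigenvalue of the sample covariance matrix across curves."""
    cov = np.cov(curves, rowvar=False)   # (grid x grid) sample covariance
    return np.linalg.eigvalsh(cov)[-1]   # eigvalsh returns ascending order

lam_true = first_eigenvalue(true_curves)
lam_est = first_eigenvalue(est_curves)
print(lam_true, lam_est)   # the second value exceeds the first
```

Treating fitted curves as if they were the true functions folds the estimation variance into the covariance estimate, inflating its leading eigenvalue; this is precisely the bias that the random-coefficient and hierarchical Gaussian process models in the paper are designed to remove.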

Original language | English (US)
---|---
Pages (from-to) | 419-434
Number of pages | 16
Journal | Biometrika
Volume | 92
Issue number | 2
DOIs | https://doi.org/10.1093/biomet/92.2.419
ISSN | 0006-3444
State | Published - Jun 2005
Externally published | Yes

### Keywords

- Bayesian adaptive regression spline
- Bayesian functional data analysis
- Curve fitting
- Free-knot spline
- Functional data analysis
- Hierarchical Gaussian process
- Neuron spike train
- Nonparametric regression
- Reversible-jump Markov chain Monte Carlo
- Smoothing

### ASJC Scopus subject areas

- Agricultural and Biological Sciences (all)
- Agricultural and Biological Sciences (miscellaneous)
- Statistics and Probability
- Mathematics (all)
- Applied Mathematics

### Cite this

Behseta, S., Kass, R. E., & Wallstrom, G. L. (2005). Hierarchical models for assessing variability among functions. *Biometrika*, *92*(2), 419-434. https://doi.org/10.1093/biomet/92.2.419

Research output: Contribution to journal › Article
