Hierarchical models for assessing variability among functions

Sam Behseta, Robert E. Kass, Garrick L. Wallstrom

Research output: Contribution to journal › Article

43 Citations (Scopus)

Abstract

In many applications of functional data analysis, summarising functional variation based on fits, without taking account of the estimation process, runs the risk of attributing the estimation variation to the functional variation, thereby overstating the latter. For example, the first eigenvalue of a sample covariance matrix computed from estimated functions may be biased upwards. We display a set of estimated neuronal Poisson-process intensity functions where this bias is substantial, and we discuss two methods for accounting for estimation variation. One method uses a random-coefficient model, which requires all functions to be fitted with the same basis functions. An alternative method removes the same-basis restriction by means of a hierarchical Gaussian process model. In a small simulation study the hierarchical Gaussian process model outperformed the random-coefficient model and greatly reduced the bias in the estimated first eigenvalue that would result from ignoring estimation variability. For the neuronal data the hierarchical Gaussian process estimate of the first eigenvalue was much smaller than the naive estimate that ignored variability due to function estimation. The neuronal setting also illustrates the benefit of incorporating alignment parameters into the hierarchical scheme.
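The upward bias the abstract describes can be seen in a minimal, hypothetical simulation (not taken from the paper): curves sharing a single true mode of variation are observed with estimation noise, and the first eigenvalue of the sample covariance computed from the noisy curves exceeds that of the true curves.

```python
# Illustrative sketch only: estimation noise inflates the first eigenvalue
# of a sample covariance computed from estimated (noisy) curves.
# All names and parameter values here are invented for the example.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
phi = np.sin(2 * np.pi * t)          # the single true mode of variation
n_curves, sigma_noise = 200, 2.0

scores = rng.normal(0.0, 1.0, n_curves)          # true functional variation
true_curves = scores[:, None] * phi[None, :]
# "Estimated" curves = true curves plus estimation noise
noisy_curves = true_curves + rng.normal(0.0, sigma_noise, true_curves.shape)

def first_eigenvalue(curves):
    cov = np.cov(curves, rowvar=False)   # 50x50 covariance across time points
    return np.linalg.eigvalsh(cov)[-1]   # largest eigenvalue

lam_true = first_eigenvalue(true_curves)
lam_noisy = first_eigenvalue(noisy_curves)
print(f"first eigenvalue, true curves:  {lam_true:.2f}")
print(f"first eigenvalue, noisy curves: {lam_noisy:.2f}")
```

The gap between the two eigenvalues is exactly the kind of estimation-induced inflation that the paper's hierarchical models are designed to remove.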

Original language: English (US)
Pages (from-to): 419-434
Number of pages: 16
Journal: Biometrika
ISSN: 0006-3444
Volume: 92
Issue number: 2
DOI: 10.1093/biomet/92.2.419
State: Published - Jun 2005
Externally published: Yes

Keywords

  • Bayesian adaptive regression spline
  • Bayesian functional data analysis
  • Curve fitting
  • Free-knot spline
  • Functional data analysis
  • Hierarchical Gaussian process
  • Neuron spike train
  • Nonparametric regression
  • Reversible-jump Markov chain Monte Carlo
  • Smoothing

ASJC Scopus subject areas

  • Agricultural and Biological Sciences (all)
  • Agricultural and Biological Sciences (miscellaneous)
  • Statistics and Probability
  • Mathematics (all)
  • Applied Mathematics

Cite this

Behseta, S., Kass, R. E., & Wallstrom, G. L. (2005). Hierarchical models for assessing variability among functions. Biometrika, 92(2), 419-434. https://doi.org/10.1093/biomet/92.2.419
