The generalized coherence (GC) estimate is a well-studied statistic for the detection of a common but unknown signal on several noisy channels. In this paper, it is shown that the GC detector arises naturally from a Bayesian perspective. Specifically, it is derived as a test of the hypothesis that the signals in the channels are independent Gaussian processes against the hypothesis that the processes are arbitrarily correlated. This is achieved by introducing suitable non-informative priors for the covariance matrices across the channels under the two hypotheses. Reduced likelihoods are then obtained by marginalizing the joint distribution of the data and the covariance matrix in each case, and the resulting likelihood ratio is shown to be a monotonic function of the GC detection statistic. The derivation extends to the case of time-correlated signals, allowing comparison with the generalized likelihood ratio test (GLRT) recently proposed by Ramírez et al.
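For reference, the GC estimate takes the following standard form in the multichannel coherence-detection literature; the notation here is illustrative and may differ from the body of the paper. For $M$ channels with data vectors $\mathbf{x}_1,\dots,\mathbf{x}_M \in \mathbb{C}^N$, one forms the normalized sample coherence matrix and tests its determinant:

```latex
% Normalized sample coherence matrix: entry (i,j) is the sample
% correlation coefficient between channels i and j.
\[
  \hat{C}_{ij} \;=\;
  \frac{\mathbf{x}_i^{H}\mathbf{x}_j}
       {\lVert \mathbf{x}_i \rVert \, \lVert \mathbf{x}_j \rVert},
  \qquad i,j = 1,\dots,M.
\]
% The GC estimate is one minus the determinant of this matrix:
\[
  \hat{\gamma}^{2} \;=\; 1 - \det\!\bigl(\hat{C}\bigr),
  \qquad 0 \le \hat{\gamma}^{2} \le 1.
\]
% Under independence, \hat{C} is close to the identity and \det(\hat{C})
% is close to 1, so \hat{\gamma}^2 is small; strong cross-channel
% correlation drives \det(\hat{C}) toward 0 and \hat{\gamma}^2 toward 1.
% The detector declares a common signal present when \hat{\gamma}^2
% exceeds a threshold set for a desired false-alarm probability.
```

Because the likelihood ratio derived in the paper is a monotonic function of $\hat{\gamma}^{2}$, thresholding either quantity yields the same decision regions.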