Abstract
Independent data are efficiently integrated by adding their respective log-likelihoods. Instead of Bayesian updating of information, we propose to use the likelihood directly as a vehicle for coherent learning. If past data are summarised in a likelihood, that likelihood might be termed a prior likelihood component when integrated with new data. In the frequentist tradition, statistical reporting is often done in the format of confidence intervals. The confidence distribution, whose quantiles specify all possible confidence intervals, provides a more complete report than a single 95% interval, say. The concept of confidence distribution is discussed, and a new version of the Neyman-Pearson lemma is provided. Confidence distributions based on prior data represent frequentist analogues to Bayesian priors. These confidence distributions must be converted to likelihoods before they can be integrated with the new data likelihood; this can be done if their probability bases are reported. Confidence distributions resulting from the integrated analysis, along with their probability bases, represent the frequentist analogue to Bayesian posterior distributions.
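As a minimal illustration of the integration-by-addition idea, the sketch below assumes a normal model with known unit variance: prior data summarised by a log-likelihood in the mean are combined with new data by addition, and a confidence distribution for the mean (here the usual normal-CDF pivot) yields all confidence intervals via its quantiles. The sample sizes, means, and function names are hypothetical and chosen only for illustration; they do not come from the paper.

```python
import math

def loglik(mu, xbar, n, sigma=1.0):
    # Normal log-likelihood in mu, up to an additive constant:
    # -0.5 * n * ((mu - xbar) / sigma)**2
    return -0.5 * n * ((mu - xbar) / sigma) ** 2

def combined_mle(xbar1, n1, xbar2, n2):
    # The maximiser of the sum of the two log-likelihoods
    # (prior component + new data) is the precision-weighted mean.
    return (n1 * xbar1 + n2 * xbar2) / (n1 + n2)

def confidence_cdf(mu, xbar, n, sigma=1.0):
    # Confidence distribution C(mu) = Phi(sqrt(n) * (mu - xbar) / sigma);
    # its quantiles trace out all possible confidence intervals for mu.
    z = math.sqrt(n) * (mu - xbar) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical data: prior sample (n=10, mean 1.2), new sample (n=20, mean 0.9).
mu_hat = combined_mle(1.2, 10, 0.9, 20)
print(mu_hat)  # pooled estimate 1.0

# The 2.5% and 97.5% quantiles of the combined confidence distribution
# recover the familiar 95% interval endpoints.
lo = mu_hat - 1.96 / math.sqrt(30)
hi = mu_hat + 1.96 / math.sqrt(30)
print(round(confidence_cdf(lo, mu_hat, 30), 3),
      round(confidence_cdf(hi, mu_hat, 30), 3))
```

The example shows the frequentist analogue of conjugate updating: because the two samples are independent, their log-likelihoods simply add, and the combined confidence distribution plays the role the posterior would play in a Bayesian analysis.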