We formulate a risk-minimization problem for Bayesian prediction within the framework of Bayesian model averaging. The goodness of prediction is measured by a dual pair of Kullback-Leibler divergence losses. Through the duality of the loss, a duality emerges between two major principles of statistics: maximum likelihood and maximum Shannon entropy. (Author abstract)
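The dual pair of Kullback-Leibler losses corresponds to the two orders of arguments, D(p‖q) and D(q‖p), which generally disagree. As a minimal numerical sketch of this asymmetry (the distributions below are illustrative, not taken from the paper):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.

    Terms with p_i = 0 contribute 0 by the usual convention 0*log(0) = 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]
q = [0.3, 0.4, 0.3]

forward = kl_divergence(p, q)   # D(p || q)
reverse = kl_divergence(q, p)   # D(q || p)

# Both are nonnegative, but in general they differ: KL is not symmetric.
print(forward, reverse)
```

Minimizing the two directions of the divergence leads to different optimization problems, which is the sense in which the abstract speaks of a duality between the two loss functions.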
Aitchison, J. (1975). Goodness of prediction fit, Biometrika, 62, 547–554.
Akaike, H. (1973). Information theory as an extension of the maximum likelihood principle, in Second International Symposium on Information Theory, edited by B. N. Petrov and F. Csaki, pp. 267–281, Akademiai Kiado, Budapest.
Amari, S.-I. and Nagaoka, H. (2000). Methods of Information Geometry, American Mathematical Society, Providence, Rhode Island.
Buckland, S. T., Burnham, K. P. and Augustin, N. H. (1997). Model selection: An integral part of inference, Biometrics, 53, 603–618.
Callen, H. B. (1985). Thermodynamics and an Introduction to Thermostatistics (2nd ed.), Wiley, New York.