Maximising entropy efficiently

Williamson, Jon (2002) Maximising entropy efficiently. Electronic Transactions in Artificial Intelligence, 7. ISSN 1401-9841.



Determining a prior probability function via the maximum entropy principle can be a computationally intractable task. However, one can easily determine - in advance of entropy maximisation - a list of conditional independencies that the maximum entropy function will satisfy. These independencies can be used to reduce the complexity of the entropy maximisation task. In particular, one can use these independencies to construct the directed acyclic graph of a Bayesian network, and then maximise entropy with respect to the numerical parameters of this network. This can yield an efficient representation of a prior probability function, and one that may allow efficient updating and marginalisation. The computational complexity of maximising entropy can be further reduced when knowledge of causal relationships is available. Moreover, the proposed simplification of the entropy maximisation task may be exploited to construct a proof theory for probabilistic logic.
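The core idea of the abstract can be illustrated with a toy sketch (not the paper's own algorithm, and the numbers are invented for illustration): with two binary variables A and B and a single constraint P(A=1) = 0.7 that mentions only A, the maximum entropy joint is known in advance to satisfy the independence of A and B. It can therefore be represented by a Bayesian network with no edge between A and B and just two parameters, rather than by three free parameters for the full joint, and a brute-force search over admissible joints confirms that the factored distribution already attains the maximum entropy.

```python
import math
from itertools import product

def entropy(p):
    """Shannon entropy, in bits, of a probability table given as a dict."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# Parameters of the factored (Bayesian network) representation.
p_a1 = 0.7  # fixed by the constraint P(A=1) = 0.7
p_b1 = 0.5  # B is unconstrained, so maximising entropy makes it uniform

# Joint built from the factorisation P(a, b) = P(a) * P(b),
# licensed by the independence known before any optimisation.
joint = {(a, b): (p_a1 if a else 1 - p_a1) * (p_b1 if b else 1 - p_b1)
         for a, b in product([0, 1], repeat=2)}

print(entropy(joint))  # entropy of the factored maximum-entropy joint

# Sanity check by brute force over joints satisfying the constraint:
# parametrise P(A=1,B=1) = x in [0, 0.7] and P(A=0,B=1) = y in [0, 0.3],
# which fixes the remaining two cells. No admissible joint beats the
# factored one on entropy.
best = 0.0
n = 60
for i in range(n + 1):
    for j in range(n + 1):
        x = 0.7 * i / n
        y = 0.3 * j / n
        cand = {(1, 1): x, (1, 0): 0.7 - x, (0, 1): y, (0, 0): 0.3 - y}
        best = max(best, entropy(cand))

print(best <= entropy(joint) + 1e-9)  # True
```

With more variables and constraints the same pattern pays off: each independence detected in advance removes parameters from the optimisation, so entropy is maximised over the network's conditional probability tables instead of over the exponentially large full joint.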

Item Type: Article
Subjects: B Philosophy. Psychology. Religion > B Philosophy (General)
B Philosophy. Psychology. Religion > BC Logic
Q Science > QA Mathematics (inc Computing science) > QA273 Probabilities
Divisions: Faculties > Humanities > School of European Culture and Languages
Depositing User: Jon Williamson
Date Deposited: 17 Oct 2008 11:13
Last Modified: 02 Jun 2014 08:28

