Williamson, J. (2002) Maximising entropy efficiently. Electronic Transactions in Artificial Intelligence, 7. ISSN 1401-9841.
Determining a prior probability function via the maximum entropy principle can be a computationally intractable task. However, one can easily determine, in advance of entropy maximisation, a list of conditional independencies that the maximum entropy function will satisfy. These independencies can be used to reduce the complexity of the entropy maximisation task. In particular, one can use them to construct the directed acyclic graph of a Bayesian network, and then maximise entropy with respect to the numerical parameters of that network. This can yield an efficient representation of a prior probability function, and one that may permit efficient updating and marginalisation. The computational complexity of maximising entropy can be reduced further when knowledge of causal relationships is available. Moreover, the proposed simplification of the entropy maximisation task may be exploited to construct a proof theory for probabilistic logic.
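The core idea of the abstract can be illustrated on a toy case (this is an illustrative sketch, not the paper's algorithm; the variables and constraint values are invented). If the only constraints concern the marginals of two binary variables A and B, the known-in-advance conditional independency is A ⟂ B, so the maximum entropy joint factorises as P(a,b) = P(a)P(b), i.e. a Bayesian network with no edge between A and B. The sketch below builds that factorised joint and checks that every perturbation preserving the marginals has lower entropy:

```python
import itertools
import math

# Illustrative constraints (invented values): P(A=1) and P(B=1).
p_a = 0.3
p_b = 0.6

def entropy(dist):
    """Shannon entropy in bits of a distribution given as a dict of probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# The factorised joint P(a, b) = P(a) P(b): the maximum entropy solution
# when only the two marginals are constrained (since A ⟂ B holds for it).
maxent = {(a, b): (p_a if a else 1 - p_a) * (p_b if b else 1 - p_b)
          for a, b in itertools.product([0, 1], repeat=2)}

def perturbed(eps):
    """Shift mass between diagonal and off-diagonal cells.

    Any joint with the same two marginals arises this way, so comparing
    entropies over eps probes the whole constraint set.
    """
    d = dict(maxent)
    d[(1, 1)] += eps; d[(0, 0)] += eps
    d[(1, 0)] -= eps; d[(0, 1)] -= eps
    return d

h_star = entropy(maxent)
# Every marginal-preserving perturbation strictly lowers the entropy.
assert all(entropy(perturbed(e)) < h_star for e in [-0.05, -0.01, 0.01, 0.05])
print(f"maximum entropy = {h_star:.4f} bits")
```

Here the independency reduces the optimisation from three free parameters of the full joint to the two network parameters P(A=1) and P(B=1), which is the kind of complexity reduction the abstract describes at scale.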
Subjects: B Philosophy. Psychology. Religion > B Philosophy (General); B Philosophy. Psychology. Religion > BC Logic; Q Science > QA Mathematics (inc Computing science) > QA273 Probabilities
Divisions: Faculties > Humanities > School of European Culture and Languages
Depositing User: Jon Williamson
Date Deposited: 17 Oct 2008 11:13
Last Modified: 20 Jan 2012 15:04
Resource URI: http://kar.kent.ac.uk/id/eprint/7376