Prevenas, Sotirios (2024) Some Advances in Bayesian Regression Problems. Doctor of Philosophy (PhD) thesis, University of Kent. (doi:10.22024/UniKent/01.02.107363) (KAR id:107363)
PDF (3MB), Language: English
This work is licensed under a Creative Commons Attribution 4.0 International License.
Official URL: https://doi.org/10.22024/UniKent/01.02.107363
Abstract
In the first part of the thesis, we propose a novel objective Bayesian method with an application to vector autoregressive models when the normal-Wishart prior, with ν degrees of freedom, is considered. In particular, we depart from the current approach of setting ν = m + 1 in the Wishart prior of an m-dimensional covariance matrix by placing a loss-based prior on ν. By doing so, we can exploit any information about ν in the data and achieve better predictive performance than the method currently used in the literature. We show that this works well on both simulated and real data sets, where, in the latter case, we used macroeconometric as well as viral data. In addition, we explain why we believe we achieve better performance by showing that the data appear to suggest a value of ν far from the canonical m + 1.
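The exact prior construction is given in the full thesis; purely as an illustration of how a loss-based prior on ν can be computed, the sketch below assumes the Villa-Walker style construction on a discrete grid, where each candidate ν receives mass proportional to exp(KL divergence to the nearest alternative) - 1. The grid bounds, and the use of the closed-form KL divergence between two Wishart distributions sharing a scale matrix (which then cancels), are this sketch's assumptions, not necessarily the thesis's choices.

```python
import numpy as np
from scipy.special import multigammaln, psi


def multi_digamma(a, p):
    # Multivariate digamma: psi_p(a) = sum_{i=1}^{p} psi(a + (1 - i)/2).
    return sum(psi(a + (1 - i) / 2) for i in range(1, p + 1))


def kl_wishart(nu1, nu2, p):
    # KL( W_p(nu1, V) || W_p(nu2, V) ); the common scale matrix V cancels,
    # leaving only terms in the multivariate gamma and digamma functions.
    return (multigammaln(nu2 / 2, p) - multigammaln(nu1 / 2, p)
            + (nu1 - nu2) / 2 * multi_digamma(nu1 / 2, p))


def loss_based_prior(nu_grid, p):
    # Loss-based prior on a discrete grid of degrees of freedom:
    # mass proportional to exp(min KL to any other grid point) - 1.
    w = []
    for j, nu in enumerate(nu_grid):
        kls = [kl_wishart(nu, nu2, p)
               for k, nu2 in enumerate(nu_grid) if k != j]
        w.append(np.expm1(min(kls)))
    w = np.array(w)
    return w / w.sum()


p = 3                                # dimension m of the covariance matrix
nu_grid = np.arange(p + 1, p + 21)   # candidate degrees of freedom, nu > p
prior = loss_based_prior(nu_grid, p)
```

A posterior over the grid can then be formed by combining this prior with the marginal likelihood of each candidate ν, which is how the data get to "suggest" a value of ν away from m + 1.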
In the second part of the thesis, we introduce a variational Bayesian inference algorithm that performs variable selection in linear regression models. The algorithm is based on a combination of a spike-and-slab prior with a normal-gamma prior on the coefficients. The spike-and-slab prior is widely considered the "gold standard" for sparse Bayesian problems, and the normal-gamma prior is a hierarchical shrinkage prior that generalises the Bayesian LASSO. The algorithm also combines two types of updates, one for the initial iterations and one for the later ones; this hybrid updating scheme yields both high accuracy and fast convergence. Simulations and real data examples demonstrate the competitiveness of the algorithm against recent variable selection methods. We also extend our method to perform variable selection in two types of generalised linear models.
| Item Type: | Thesis (Doctor of Philosophy (PhD)) |
|---|---|
| Thesis advisor: | Zhang, Jian |
| Thesis advisor: | Bentham, James |
| DOI/Identification number: | 10.22024/UniKent/01.02.107363 |
| Uncontrolled keywords: | objective prior; vector autoregressive models; Kullback-Leibler divergence; variational inference; Bayesian variable selection; sparsity; spike-and-slab prior |
| Subjects: | Q Science > QA Mathematics (inc Computing science) |
| Institutional Unit: | Schools > School of Engineering, Mathematics and Physics > Mathematical Sciences |
| Former Institutional Unit: | Divisions > Division of Computing, Engineering and Mathematical Sciences > School of Mathematics, Statistics and Actuarial Science |
| Funders: | University of Kent (https://ror.org/00xkeyj56) |
| SWORD Depositor: | System Moodle |
| Depositing User: | System Moodle |
| Date Deposited: | 27 Sep 2024 11:10 UTC |
| Last Modified: | 01 Sep 2025 23:00 UTC |
| Resource URI: | https://kar.kent.ac.uk/id/eprint/107363 (The current URI for this page, for reference purposes) |