Kent Academic Repository

Some Advances in Bayesian Regression Problems

Prevenas, Sotirios (2024) Some Advances in Bayesian Regression Problems. Doctor of Philosophy (PhD) thesis, University of Kent. (doi:10.22024/UniKent/01.02.107363) (Access to this publication is currently restricted. You may be able to access a copy if URLs are provided.) (KAR id:107363)

PDF: 32prevenas2024phdfinal.pdf (Language: English)
Restricted to Repository staff only until August 2025.

Official URL: https://doi.org/10.22024/UniKent/01.02.107363

Abstract

In the first part of the thesis, we propose a novel objective Bayesian method with an application to vector autoregressive models under a normal-Wishart prior with ν degrees of freedom. In particular, we depart from the current approach of setting ν = m + 1 in the Wishart prior of an m-dimensional covariance matrix and instead place a loss-based prior on ν. By doing so, we exploit the information about ν contained in the data and achieve better predictive performance than the method currently used in the literature. We show that the approach works well on both simulated and real data sets; in the latter case we use macroeconometric as well as viral data. In addition, we explain why we believe the performance improves by showing that the data appear to suggest a value of ν far from the canonical value m + 1.
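To fix ideas, here is a minimal sketch of the setting; the symbols and the exact form of the prior below are generic illustrations rather than the thesis's own specification. For VAR coefficients B and an m-dimensional error covariance matrix Σ, a conjugate normal-Wishart prior takes the form

\[
\operatorname{vec}(B) \mid \Sigma \;\sim\; \mathcal{N}\!\big(\operatorname{vec}(B_0),\, \Sigma \otimes \Omega_0\big),
\qquad
\Sigma^{-1} \;\sim\; \mathcal{W}_m(\nu, S_0),
\]

with the degrees of freedom conventionally fixed at \(\nu = m + 1\). A loss-based prior instead treats \(\nu\) as unknown and assigns it mass of the generic form

\[
\pi(\nu) \;\propto\; \exp\{d(\nu)\} - 1,
\]

where \(d(\nu)\) quantifies, for instance through a Kullback-Leibler divergence, the loss incurred if the model indexed by \(\nu\) were removed from consideration; the precise construction used in the thesis may differ.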

In the second part of the thesis, we introduce a variational Bayesian inference algorithm that performs variable selection in linear regression models. The algorithm combines a spike-and-slab prior with a normal-gamma prior on the coefficients: the spike-and-slab prior is widely considered the “gold standard” for sparse Bayesian problems, and the normal-gamma prior is a hierarchical shrinkage prior that generalises the Bayesian LASSO. The algorithm also alternates between two types of update, one for the initial iterations and one for the later ones; this hybrid updating scheme yields both high accuracy and fast convergence. Simulations and real data examples show that the algorithm is competitive with recent variable selection methods. We also extend the method to perform variable selection in two types of generalised linear models.
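As a rough illustration of this kind of coordinate-ascent variational scheme, the sketch below uses a plain normal slab with fixed hyperparameters instead of the normal-gamma hierarchy and omits the hybrid updating scheme; all function names and default values are illustrative assumptions, not the thesis's implementation.

import numpy as np

def vb_spike_slab(X, y, sigma2=1.0, slab_var=1.0, pi0=0.1, n_iter=200, tol=1e-6):
    """Mean-field coordinate-ascent VB for a spike-and-slab linear model:
    y ~ N(X beta, sigma2 I), beta_j = gamma_j * b_j,
    gamma_j ~ Bernoulli(pi0), b_j ~ N(0, slab_var).
    Returns posterior inclusion probabilities alpha and slab means mu."""
    n, p = X.shape
    xtx = np.sum(X ** 2, axis=0)                 # X_j' X_j for each column
    s2 = sigma2 / (xtx + sigma2 / slab_var)      # variational slab variances
    alpha = np.full(p, pi0)                      # q(gamma_j = 1)
    mu = np.zeros(p)                             # variational slab means
    fitted = X @ (alpha * mu)                    # X E[beta] under the current q
    logit_pi0 = np.log(pi0 / (1.0 - pi0))
    for _ in range(n_iter):
        alpha_old = alpha.copy()
        for j in range(p):
            fitted -= X[:, j] * (alpha[j] * mu[j])       # remove j's contribution
            mu[j] = s2[j] / sigma2 * (X[:, j] @ (y - fitted))
            logit_j = logit_pi0 + 0.5 * np.log(s2[j] / slab_var) + mu[j] ** 2 / (2.0 * s2[j])
            alpha[j] = 1.0 / (1.0 + np.exp(-np.clip(logit_j, -30.0, 30.0)))
            fitted += X[:, j] * (alpha[j] * mu[j])       # restore with updated values
        if np.max(np.abs(alpha - alpha_old)) < tol:      # stop once inclusion probabilities stabilise
            break
    return alpha, mu

# Toy usage: three true signals among fifty predictors.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
beta_true = np.zeros(50)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.standard_normal(200)
alpha, mu = vb_spike_slab(X, y)
print(np.where(alpha > 0.5)[0])    # predictors flagged for inclusion

In the thesis the slab itself carries the normal-gamma shrinkage hierarchy and the updates switch between two schemes as iterations progress, so the actual algorithm is richer than this minimal version.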

Item Type: Thesis (Doctor of Philosophy (PhD))
Thesis advisor: Zhang, Jian
Thesis advisor: Bentham, James
DOI/Identification number: 10.22024/UniKent/01.02.107363
Uncontrolled keywords: objective prior; vector autoregressive models; Kullback-Leibler divergence; variational inference; Bayesian variable selection; sparsity; spike-and-slab prior
Subjects: Q Science > QA Mathematics (inc Computing science)
Divisions: Divisions > Division of Computing, Engineering and Mathematical Sciences > School of Mathematics, Statistics and Actuarial Science
Funders: University of Kent (https://ror.org/00xkeyj56)
SWORD Depositor: System Moodle
Depositing User: System Moodle
Date Deposited: 27 Sep 2024 11:10 UTC
Last Modified: 30 Sep 2024 14:02 UTC
Resource URI: https://kar.kent.ac.uk/id/eprint/107363 (The current URI for this page, for reference purposes)

University of Kent Author Information

Prevenas, Sotirios.
