
Comparing computational approaches to the analysis of high-frequency trading data using Bayesian methods

Cremaschi, Andrea (2017) Comparing computational approaches to the analysis of high-frequency trading data using Bayesian methods. Doctor of Philosophy (PhD) thesis, University of Kent. (KAR id:60839)

Language: English


Financial prices are usually modelled as continuous processes, often involving geometric Brownian motion with drift, leverage, and possibly jump components. An alternative modelling approach allows financial observations to take discrete values, interpreted as integer multiples of a fixed quantity, the ticksize: the monetary value associated with a single change in the asset's evolution. Such samples are usually collected at very high frequency, often with several trading operations per second. In this context, the observables are modelled in two different ways: on one hand, via the Skellam process, defined as the difference between two independent Poisson processes; on the other, via a stochastic process whose conditional law is that of a mixture of Geometric distributions. The parameters of the two stochastic processes are modelled as functions of a stochastic volatility process, which is in turn described by a discretised Gaussian Ornstein-Uhlenbeck AR(1) process.
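The discrete-price construction above can be sketched in a few lines. This is a minimal illustrative simulation, not the thesis's model: the parameter values (`mu`, `phi`, `sigma_eta`, the ticksize, and the intensity mapping `lam = exp(h)/2`) are placeholder assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 1000                                # number of high-frequency observations
mu, phi, sigma_eta = -1.0, 0.97, 0.2    # assumed AR(1) log-volatility parameters

# Discretised Gaussian Ornstein-Uhlenbeck (AR(1)) log-volatility process
h = np.empty(T)
h[0] = rng.normal(mu, sigma_eta / np.sqrt(1 - phi**2))  # stationary start
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.normal()

# Skellam increments: difference of two independent Poisson counts whose
# common intensity is driven by the volatility process (illustrative mapping)
lam = np.exp(h) / 2.0
increments = rng.poisson(lam) - rng.poisson(lam)

# Tick-level price path as integer multiples of the ticksize
ticksize = 0.01
price = 100.0 + ticksize * np.cumsum(increments)
```

Because each increment is a difference of Poisson counts, the simulated price only ever moves by whole multiples of the ticksize, matching the discrete-valued view of the data.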

The work first presents a parametric model for independent and identically distributed data, in order to motivate the algorithmic choices used as a basis for the subsequent Chapters. These include adaptive Metropolis-Hastings algorithms and the Interweaving Strategy.
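An adaptive Metropolis-Hastings step of the kind mentioned above can be sketched as follows. This is a generic Robbins-Monro-style adaptation of a random-walk proposal scale; the target density (a standard normal) and all tuning constants are illustrative assumptions, not those used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Placeholder target: standard normal log-density (up to a constant)
    return -0.5 * x**2

n_iter, target_acc = 5000, 0.44   # assumed iteration count and target rate
x, log_s = 0.0, 0.0               # current state and log proposal scale
chain = np.empty(n_iter)
for i in range(n_iter):
    prop = x + np.exp(log_s) * rng.normal()
    log_alpha = log_target(prop) - log_target(x)
    if np.log(rng.uniform()) < log_alpha:
        x = prop
    # Adapt the proposal scale towards the target acceptance rate,
    # with a decaying step size so the adaptation vanishes asymptotically
    log_s += (i + 1) ** -0.6 * (min(1.0, np.exp(log_alpha)) - target_acc)
    chain[i] = x
```

The decaying step size `(i + 1) ** -0.6` is one common way to satisfy the diminishing-adaptation condition that keeps the adapted chain ergodic.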

The central Chapters of the work are devoted to the illustration of Particle Filtering methods for MCMC posterior computations (PMCMC methods). The discussion starts by presenting the existing Particle Gibbs and Particle Marginal Metropolis-Hastings samplers. Additionally, we propose two extensions to the existing methods. Posterior inference and out-of-sample prediction obtained with the different methodologies are discussed and compared with the methods existing in the literature. To allow for more flexibility in the modelling choices, the work continues with a presentation of a semi-parametric version of the original model. Comparative inference obtained via the previously discussed methodologies is presented.
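The building block shared by the PMCMC samplers mentioned above is the bootstrap particle filter, which delivers an unbiased estimate of the likelihood that Particle Marginal Metropolis-Hastings plugs into the usual acceptance ratio. The sketch below uses a simple AR(1)-plus-noise state-space model as a stand-in; the model, function name `pf_loglik`, and all parameter values are illustrative assumptions, not the thesis's specification.

```python
import numpy as np

def pf_loglik(y, phi, sigma, tau, n_part=200, seed=0):
    """Bootstrap particle filter estimate of the log-likelihood of y
    under an illustrative AR(1) state with Gaussian observation noise."""
    rng = np.random.default_rng(seed)
    # Initialise particles from the stationary distribution of the state
    x = rng.normal(0.0, sigma / np.sqrt(1 - phi**2), n_part)
    ll = 0.0
    for obs in y:
        x = phi * x + sigma * rng.normal(size=n_part)        # propagate
        logw = -0.5 * ((obs - x) / tau) ** 2 - np.log(tau)   # weight (unnormalised)
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())                           # likelihood increment
        x = rng.choice(x, size=n_part, p=w / w.sum())        # multinomial resample
    return ll
```

Within PMMH, a proposed parameter value is accepted or rejected using `pf_loglik` in place of the intractable true likelihood; because the estimate is unbiased, the resulting chain still targets the exact posterior.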

The work concludes with a summary and an account of topics for further research.

Item Type: Thesis (Doctor of Philosophy (PhD))
Thesis advisor: Griffin, Jim
Uncontrolled keywords: High-frequency data, Bayesian inference, Computational Econometrics, Particle Filtering methods
Subjects: Q Science > QA Mathematics (inc Computing science)
Divisions: Divisions > Division of Computing, Engineering and Mathematical Sciences > School of Mathematics, Statistics and Actuarial Science
Date Deposited: 10 Mar 2017 12:00 UTC
Last Modified: 16 Feb 2021 13:43 UTC

