Kent Academic Repository

Towards the development of user tools for knowledge acquisition in digital document analysis

Fairhurst, Michael, Erbilek, Meryem (2014) Towards the development of user tools for knowledge acquisition in digital document analysis. Journal of e-Learning and Knowledge Society, 10 (2). pp. 35-52. ISSN 1826-6223. E-ISSN 1971-8829. (KAR id:43687)

The full text of this publication is not currently available from this repository. You may be able to access a copy if URLs are provided.

Abstract

Handwritten documents provide a rich source of data and, with the growth in the availability of digitised documents, it becomes increasingly important to improve our ability to analyse and extract “knowledge” from such sources. This paper describes an approach to the provision of tools which can extract information about the writer of handwritten documents, especially those which were written in earlier times and which constitute key elements in our heritage and culture. We show how the constraints inherent in such documents influence our analytical approach, and how developing appropriate “knowledge extraction” techniques can also be essential in other, more general, important application scenarios.

Item Type: Article
Subjects: T Technology
Divisions: Divisions > Division of Computing, Engineering and Mathematical Sciences > School of Engineering and Digital Arts
Depositing User: Tina Thompson
Date Deposited: 24 Oct 2014 15:43 UTC
Last Modified: 17 Aug 2022 10:57 UTC
Resource URI: https://kar.kent.ac.uk/id/eprint/43687 (The current URI for this page, for reference purposes)

University of Kent Author Information

Fairhurst, Michael.


Erbilek, Meryem.

