Background and Challenge
Human daily activity analysis and recognition has proven to be of key importance within healthcare monitoring. For example, determining a person's level of physical activity can help physicians treat lifestyle-dependent diseases, such as diabetes and heart-related conditions. Sensor technologies able to record a wide range of measurements, such as activity levels, vital signs and proximity measures, have been used for data collection because of the diversity of information required to gain a full insight into a user's behaviour. Traditionally, one of the key research issues in the activity recognition process has been the optimal fusion of heterogeneous sensor-based information.
Efforts which have considered sensor fusion have, however, not fully recognised that the sensing inputs may not be independent and could instead be causally related to each other in a network. Taking this into consideration, fusion should not be treated as a simple process of combination, but rather as one which requires an implicit "causal reasoning" process within the overall fusion scheme.
Aim and Objectives
This project will make an innovative contribution to the field of activity recognition by introducing state of the art modelling and computational intelligence techniques to address the key challenges of activity recognition. These techniques include data fusion, knowledge-based systems, network science and optimisation in complex systems, in addition to sensor-based activity recognition strategies.
The aim of this project is to investigate the role of a causal network learning framework based on heterogeneous multi-sensor data and based on this to develop a causal network system for activity recognition. More specifically, the objectives of this project are to:
*Investigate and develop an effective causal network-based knowledge representation framework for capturing and visualising complex human activities through both qualitative and quantitative exploration of scenarios
*Explore and evaluate algorithms to determine the causal network from multi-sensor data
*Develop a network-based activity recognition model by incorporating different inference mechanisms (fusion and causal reasoning) in an integrated framework
*Implement this decision model through a computer-supported tool for visualisation, manipulation, analysis and evaluation
*Evaluate the developed methodology and system using openly available data sets in addition to data sets collected from a real smart environment.
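To make the second and third objectives concrete, a minimal sketch of the idea is given below. It simulates three hypothetical sensor streams (the names "motion", "light" and "door" and the dependence threshold are illustrative assumptions, not part of the project specification) and applies the skeleton step of a constraint-based structure learner: an edge between two sensors is retained only when their readings show a sufficiently strong statistical dependence. A full causal network learner would additionally test conditional independencies and orient the edges.

```python
import random
import itertools

random.seed(0)

# Hypothetical sensor streams: "light" is causally driven by "motion",
# while "door" is independent of both (illustrative assumption).
n = 500
motion = [random.gauss(0, 1) for _ in range(n)]
light = [0.8 * m + random.gauss(0, 0.3) for m in motion]
door = [random.gauss(0, 1) for _ in range(n)]

def corr(x, y):
    """Pearson correlation coefficient between two equal-length streams."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

sensors = {"motion": motion, "light": light, "door": door}

# Skeleton step of a constraint-based learner: keep an edge only where
# the dependence between two sensor streams exceeds a chosen threshold.
edges = [(a, b) for a, b in itertools.combinations(sensors, 2)
         if abs(corr(sensors[a], sensors[b])) > 0.3]

print(edges)  # only the motion-light edge should survive
```

Only the dependent pair survives the pruning step; in the project, the resulting network would then serve as the backbone for the fusion and causal reasoning mechanisms named in the objectives above.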
Anticipated Research Outcome/Impact
The main domain of application is Smart Homes for Ambient Assistive Living. The research will be undertaken in the context of Smart Environments within the School of Computing. The research results will be evaluated and tested in activity recognition and assistance scenarios. It is anticipated that upon the completion of this project, a practical causal network based system will be developed that can offer innovative solutions to activity recognition which will be transferable to a broad range of applications.
This project brings together a variety of experience from three supervisors in decision analytics, information fusion, sensor networks, knowledge-based systems, optimisation, and ambient assisted living.
If the University receives a large number of applicants for the project, the following desirable criteria may be applied to shortlist applicants for interview.
The scholarship will cover tuition fees at the Home rate and a maintenance allowance of £14,777 per annum for three years. EU applicants will only be eligible for the fees component of the studentship (no maintenance award is provided). Non-EU nationals must be "settled" in the UK to be eligible.
As Senior Engineering Manager of Analytics at Seagate Technology I utilise the learning from my PhD every day.
Adrian Johnston - PhD in Informatics
Monday 6 August 2018
mid August 2018
When applying for this PhD opportunity please quote reference number: