Summary

A social interaction is an exchange between two or more individuals through verbal and non-verbal communication channels. By interacting with one another, and by observing and acting upon each other's paralinguistic and non-verbal cues, people establish social norms. Examples of paralinguistic cues are tone, prosody and loudness of speech; examples of non-verbal cues are facial expression, gesture, posture and body movement. The meta-communication channels for these cues are the face, body and voice: the face conveys facial expressions, the body conveys gesture and posture (i.e. upper-body movement), and the voice carries paralinguistic cues beyond the words themselves. Gestures are often considered a supplement to, or clarification of, speech. In environments where humans and robots cohabit, the ability to communicate and interact with each other seamlessly is integral to successful cohabitation.

Sociable robots should be capable of proactively engaging with people within pre-defined social norms to enhance the interaction process, e.g. improving their social interaction capability via mechanisms such as reinforcement learning from humans. Currently, this is a paradigm with which robots struggle: owing to their limited perceptual, cognitive and behavioural abilities, robots do not understand many of the paralinguistic and non-verbal cues of humans.

This project aims to address this shortcoming by developing sociable robots that utilise artificial intelligence to interact socially with humans in a more meaningful manner.

The objective is to develop a computational model for human-robot social interaction using paralinguistic and non-verbal cues. Using robotic sensory information, these cues will be identified by extracting key features associated with each cue. A dataset of paralinguistic and non-verbal cues will be developed during this phase of the project, which can be disseminated to and utilised by the wider research community. A multimodal deep learning model will then be developed to analyse the extracted features and identify and classify the various paralinguistic and non-verbal social cues, so that appropriate onward action can be executed. Endowing robots with the ability to conduct human-robot social interactions will contribute to advancing the integration of robot systems into human-centric environments.
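As a rough illustration of the classification step described above, the sketch below shows one common approach to multimodal classification, late fusion: per-modality class scores (e.g. from separate face, body and voice feature extractors) are converted to probabilities and averaged, and the highest-probability cue is selected. The cue classes, score values and fusion-by-averaging choice here are illustrative assumptions, not specifics of the project's eventual model.

```python
import math
from typing import Dict, List

# Hypothetical cue classes for illustration only.
CUES = ["greeting", "agreement", "confusion"]

def softmax(scores: List[float]) -> List[float]:
    """Convert raw class scores into a probability distribution."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_and_classify(per_modality_scores: Dict[str, List[float]]) -> str:
    """Late fusion: average class probabilities across modalities,
    then return the cue with the highest fused probability."""
    n = len(per_modality_scores)
    fused = [0.0] * len(CUES)
    for scores in per_modality_scores.values():
        for i, p in enumerate(softmax(scores)):
            fused[i] += p / n
    return CUES[fused.index(max(fused))]

# Toy per-modality class scores, as might come from separate
# face/body/voice networks.
scores = {
    "face":  [0.1, 0.2, 2.0],   # facial expression strongly suggests confusion
    "body":  [0.5, 0.4, 0.6],   # posture is ambiguous
    "voice": [0.2, 0.1, 1.1],   # prosody mildly suggests confusion
}
print(fuse_and_classify(scores))  # prints "confusion"
```

In practice the per-modality scores would come from learned feature extractors rather than hand-set values, and fusion could equally be done earlier (feature-level concatenation) or with learned weights; this sketch only shows the shape of the decision step.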

This project will make use of the Pepper robot, which will be made available in the Intelligent Systems Research Centre robotics lab. Pepper robots include a range of sensing capabilities, including speech, touch/tactile, infrared, bumpers, inertial, 2D and 3D cameras, and sonar, together with a number of perception modules to recognise and interact with people. Being humanoid, they lend themselves very well to social interaction activities and are thus a suitable choice for the work to be undertaken.


Essential criteria

Applicants should hold, or expect to obtain, a First or Upper Second Class Honours Degree in a subject relevant to the proposed area of study.

We may also consider applications from those who hold equivalent qualifications, for example, a Lower Second Class Honours Degree plus a Master’s Degree with Distinction.

In exceptional circumstances, the University may consider a portfolio of evidence from applicants who have appropriate professional experience which is equivalent to the learning outcomes of an Honours degree in lieu of academic qualifications.

  • Experience using research methods or other approaches relevant to the subject domain
  • Research proposal of 1500 words detailing aims, objectives, milestones and methodology of the project
  • A demonstrable interest in the research area associated with the studentship

Desirable criteria

If the University receives a large number of applicants for the project, the following desirable criteria may be applied to shortlist applicants for interview.

  • First Class Honours (1st) Degree
  • Master's at 70%
  • For VCRS Awards, Master's at 75%
  • Experience using research methods or other approaches relevant to the subject domain
  • Work experience relevant to the proposed project
  • Publications - peer-reviewed
  • Experience of presentation of research findings

Funding and eligibility

The University offers the following levels of support:

Vice-Chancellor's Research Studentship (VCRS)

Full award (full-time PhD fees + DfE level of maintenance grant + RTSG for 3 years).

This scholarship will cover full-time PhD tuition fees and provide the recipient with £15,840 (tbc) maintenance grant per annum for three years (subject to satisfactory academic performance). This scholarship also comes with £900 per annum for three years as a research training support grant (RTSG) allocation to help support the PhD researcher.

Applicants who already hold a doctoral degree or who have been registered on a programme of research leading to the award of a doctoral degree on a full-time basis for more than one year (or part-time equivalent) are NOT eligible to apply for an award.

Vice-Chancellor’s Research Bursary (VCRB)

Part award (full-time PhD fees + 50% DfE level of maintenance grant + RTSG for 3 years).

This scholarship will cover full-time PhD tuition fees and provide the recipient with £8,000 maintenance grant per annum for three years (subject to satisfactory academic performance). This scholarship also comes with £900 per annum for three years as a research training support grant (RTSG) allocation to help support the PhD researcher.

Applicants who already hold a doctoral degree or who have been registered on a programme of research leading to the award of a doctoral degree on a full-time basis for more than one year (or part-time equivalent) are NOT eligible to apply for an award.

Vice-Chancellor’s Research Fees Bursary (VCRFB)

Fees only award (PhD fees + RTSG for 3 years).

This scholarship will cover full-time PhD tuition fees for three years (subject to satisfactory academic performance). This scholarship also comes with £900 per annum for three years as a research training support grant (RTSG) allocation to help support the PhD researcher.

Applicants who already hold a doctoral degree or who have been registered on a programme of research leading to the award of a doctoral degree on a full-time basis for more than one year (or part-time equivalent) are NOT eligible to apply for an award.

Department for the Economy (DfE)

The scholarship will cover tuition fees at the Home rate and a maintenance allowance of £15,840 (tbc) per annum for three years (subject to satisfactory academic performance). This scholarship also comes with £900 per annum for three years as a research training support grant (RTSG) allocation to help support the PhD researcher.

  • Candidates with pre-settled or settled status under the EU Settlement Scheme, who also satisfy a three-year residency requirement in the UK prior to the start of the course for which a Studentship is held, MAY receive a Studentship covering fees and maintenance.
  • Republic of Ireland (ROI) nationals who satisfy three years' residency in the UK prior to the start of the course MAY receive a Studentship covering fees and maintenance (ROI nationals don't need to have pre-settled or settled status under the EU Settlement Scheme to qualify).
  • Other non-ROI EU applicants are classed as 'International' and are not eligible for this source of funding.
  • Applicants who already hold a doctoral degree or who have been registered on a programme of research leading to the award of a doctoral degree on a full-time basis for more than one year (or part-time equivalent) are NOT eligible to apply for an award.

Due consideration should be given to financing your studies; further information on the cost of living is available.


Recommended reading

  1. Breazeal, C. (2003) Toward sociable robots, Robotics and Autonomous Systems, 42, pp. 167–175.
  2. Das, S.; Fime, A. A.; Siddique, N.; Hashem, M. M. A. (2021) Estimation of Road Boundary for Intelligent Vehicles Based on DeepLabV3+ Architecture, IEEE Access, DOI: 10.1109/ACCESS.2021.3107353.
  3. Frischen, A.; Bayliss, A. P.; Tipper, S. P. (2007) Gaze cueing of attention: visual attention, social cognition, and individual differences, Psychological Bulletin, 133, pp. 694–724.
  4. Hossain, M. R.; Hoque, M. M.; Dewan, M. A. A.; Siddique, N.; Islam, M. N.; Sarker, I. H. (2021) Authorship Classification in a Resource Constraint Language Using Convolutional Neural Networks, IEEE Access, Vol. 9, pp. 100319–100338, DOI: 10.1109/ACCESS.2021.3095967.
  5. Hossain, M. R.; Hoque, M. M.; Siddique, N.; Sarker, I. H. (2021) Bengali text document categorization based on very deep convolution neural network, Expert Systems with Applications, Vol. 184, 115394, DOI: 10.1016/j.eswa.2021.115394.
  6. Krauss, R. M. and Hadar, U. (1999) The role of speech-related arm/hand gestures in word retrieval, in R. Campbell & L. Messing (Eds.), Gesture, Speech, and Sign, Oxford University Press, Oxford, UK, pp. 93–116.
  7. Trager, G. L. (1961) The typology of paralanguage, Anthropological Linguistics, 3(1), pp. 17–21.

The Doctoral College at Ulster University