Funded by: European Commission Horizon 2020
Website of the project: http://aria-agent.eu/
Partners: University of Nottingham, Augsburg University, Imperial College London, University of Twente, Paris Telecom, CereProc, Cantoche
PI: Dr. Michel Valstar
The ARIA-VALUSPA project, whose acronym stands for Artificial Retrieval of Information Assistants - Virtual Agents with Linguistic Understanding, Social skills, and Personalised Aspects, aims at creating a new generation of virtual humans. The scenario considered by the project is a virtual agent that acts as an information retrieval system. The system will be able to maintain a natural interaction with the user and react to any question from the user by producing an adequate search query. This involves endowing the agent with the ability to understand both verbal and non-verbal cues. Finally, the retrieved result will be transmitted to the user in a realistic manner through natural language enriched with non-verbal behaviour.
SSPNet: Social Signal Processing Network
Funded by: European Commission FP7
The main focus of the SSPNet is the automatic assessment and synthesis of human social behaviour, which has been predicted to be the crux of next-generation computing. The mission of the SSPNet is to create sufficient momentum by integrating the large amount of existing knowledge and available resources in SSP research domains, including cognitive modelling, machine understanding, and synthesis of social behaviour, and so:
- Enable the creation of the European and world research agenda in SSP,
- Provide efficient and effective access to SSP-relevant tools and data repositories to the research community within and beyond the SSPNet, and
- Further develop complementary and multidisciplinary expertise necessary for pushing forward the cutting edge of the research in SSP.
Mahnob: Multimodal Analysis of Human Nonverbal Behaviour in Real-World Settings.
Funded by: European Research Council, ERC Starting Grant
The main aim of the project was to address the problem of automatic analysis of human expressive behaviour found in real-world settings. The core technical aim was to develop a set of tools, based on findings in the cognitive sciences, for the automatic analysis of spontaneous (as opposed to posed and exaggerated) patterns of human behaviour. To this end, a range of audiovisual spatiotemporal events needed to be automatically detected. These cues include, for example, facial expressions, head poses and gestures (e.g. head nods and shakes), hand and body movements, and vocal outbursts (e.g. laughter or yawning).
EmoPain. Pain Rehabilitation: E/Motion based automatic coaching
Funded by: Engineering and Physical Sciences Research Council (EPSRC), U.K.
The main aim of the project is the design and development of an intelligent system that will enable ubiquitous monitoring and assessment of patients' pain-related mood and movements inside (and, in the longer term, outside) the clinical environment. Specifically, the focus of the project is twofold: (i) the development of a set of methods for automatic recognition of audiovisual cues related to pain, behavioural patterns typical of low back pain, and affective states influencing pain, and (ii) the integration of these methods into a system that provides appropriate feedback and prompts to the patient based on his/her behaviour measured during self-directed physical therapy sessions. The Imperial College team is responsible for visual analysis of facial behaviour and audiovisual recognition of affective states.