My main research goal is to build artificial characters---mainly robots---that people can interact with using natural, face-to-face conversation (yes, like C-3PO and R2-D2). Supporting this sort of interaction means that a robot must be able both to produce a wide range of appropriate social signals (speech, facial expressions, body language, gestures) and to understand the social signals produced by its conversational partners. To make this possible, we must (1) observe the signals used by humans in a range of situations, (2) build models of those behaviours that can be used by a robot for both input processing and output generation, and then (3) test how well the models perform when the robot must interact with humans.
I would welcome applications from Ph.D. students who are interested in working on any of the above areas. You can find out more general information about the Ph.D. programme on the School web pages, or if you want more specific information, feel free to email me directly.
Since March 2016, I have been coordinating the MuMMER project, a four-year Horizon 2020 project in which we will develop and test a socially intelligent robot in a public shopping mall in Finland. The partners are:
- University of Glasgow (coordinator)
- Heriot-Watt University
- Idiap Research Institute
- Aldebaran Robotics
- VTT Technical Research Centre of Finland
- Ideapark (shopping mall)
Watch this space for more details as the project progresses!
Since October 2016, I have also been a co-investigator on the SoCoRo project, which aims to build a socially competent robot training buddy for adults with autism spectrum disorder (ASD).
Before coming to Glasgow, I worked on a number of projects in the general area of embodied conversational agents, including the following:
- EMOTE (EU FP7; 2012-2015), which developed empathy-based robotic tutors.
- JAMES (EU FP7; 2011-2014), where we developed a robot agent capable of socially appropriate, multi-party, situated interaction in the bartender domain.
- ECHOES (EPSRC/ESRC TEL; 2009-2012), which developed a technology-enhanced learning environment in which both typically developing children and children with Asperger's Syndrome could explore and improve their social interaction and collaboration skills.
- JAST (EU FP6; 2006-2009), which investigated the cognitive, neural, and communicative aspects of joint action.
- COMIC (EU FP5; 2002-2005), where we developed an embodied multimodal dialogue system.
Here are a few online videos of me talking about my research.
- You can see a short clip of me talking about MuMMER around 1/3 of the way through this highlights video from The SICSA DEMOfest 2016.
- Me discussing AI and emotions at the Edinburgh Digital Entertainment Festival in August 2016.
- A short description of the MuMMER project, recorded by RoboHub at the European Robotics Forum in Ljubljana.
- An invited talk given at the Idiap Research Institute in Martigny, Switzerland, describing the results of a number of the above projects.
- A short description of the JAMES project from the 2012 SICSA DEMOfest.
- A guest lecture from the 2009 ShanghAI Lectures series, describing the results of the JAST project.
Dr Mary Ellen Foster
How to find my office:
- Enter the Sir Alwyn Williams Building
- Go to the main stairwell and climb up to the floor marked "F" (Lilybank Gardens)
- Walk along the hallway all the way to the end (past the teaching office) -- you will see rooms numbered F101 and F102
- Go down the stairs to your right to the first landing -- that's M101 (my office)