Irene Pepperberg

Interspecies communication: a means of studying the cognitive and communicative abilities of Grey parrots

Pepperberg has been studying the cognitive and communicative abilities of Grey parrots for over 40 years. She will briefly describe the history of research on avian abilities, the training techniques that she has used to establish two-way communication with parrots, and the highlights of her work with Alex that this communication system made possible. She will present data from her most recent research, on topics such as probabilistic learning and Piagetian tasks, that has been carried out with her current subjects, Griffin and Athena, showing how their intelligence compares with that of human children.

Biography

Pepperberg (S.B., MIT, ’69; Ph.D., Harvard, ’76) is a Research Associate and Lecturer at Harvard. She has been a visiting Assistant Professor at Northwestern University, a tenured Associate Professor at the University of Arizona, a visiting Associate Professor at the MIT Media Lab, and an adjunct Associate Professor at Brandeis University. She has received John Simon Guggenheim, Whitehall, Harry Frank Guggenheim, and Radcliffe Fellowships; was an alternate for the Cattell Award for Psychology; won the 2000 Selby Fellowship (Australian Academy of Science) and the 2005 Frank Beach Award for best paper in comparative psychology; was nominated for the 2000 Weizmann, L’Oreal, and Grawemeyer Awards and for the Animal Behavior Society’s 2001 Quest Award and 2015 Exemplar Award; and was renominated for the 2001 L’Oreal Award and the 2017 and 2018 Grawemeyer Awards. She won the 2013 Clavius Award for research from St. John’s University. Her research has been supported by the National Science Foundation (US). Her book The Alex Studies, describing over 20 years of peer-reviewed experiments on Grey parrots, was favorably reviewed in publications as diverse as the New York Times and Science. Her memoir, Alex & Me, a New York Times bestseller, won a Christopher Award. She has published over 100 scholarly articles in peer-reviewed journals and as book chapters. She is a Fellow of the Animal Behavior Society, the American Psychological Association, the American Psychological Society, the American Ornithologists’ Union, AAAS, the Midwestern Psychological Society, and the Eastern Psychological Association. She serves as consulting editor for four journals and previously served as associate editor for The Journal of Comparative Psychology.


Gabriel Skantze

Towards Real-time Coordination in Human-robot Interaction

When humans interact and collaborate with each other, they coordinate their behaviours using verbal and non-verbal signals, expressed in the face and voice. If robots of the future are to engage in social interaction with humans, it is essential that they can generate and understand these behaviours. In this talk, I will give an overview of several studies that show how humans interacting with a human-like robot make use of the same coordination signals typically found in studies on human-human interaction, and that it is possible to automatically detect and combine these cues to facilitate real-time coordination. The studies also show that humans react naturally to such signals when used by a robot, without being given any special instructions. They follow the gaze of the robot to disambiguate referring expressions, they conform when the robot selects the next speaker using gaze, and they respond naturally to subtle cues such as gaze aversion, breathing, facial gestures, and hesitation sounds.

Biography

Gabriel Skantze is an associate professor in speech technology at the Department of Speech, Music and Hearing at KTH (Royal Institute of Technology), Stockholm, Sweden. He has an M.Sc. in cognitive science and a Ph.D. in speech technology. His primary research interests are in multi-modal real-time dialogue processing, speech communication, and human-robot interaction, and he is currently leading several research projects in these areas. He is also co-founder of the company Furhat Robotics, which develops a social robotics platform to be used in areas such as health care, education and entertainment.


Arik Kershenbaum

Animals, humans, computers, and aliens. Is there anything in common between all their languages?

It is often said that one of the greatest unsolved mysteries in biology is the evolution of human language. Somehow, our ancestors made a quantum leap from having no language (like all other animals) to having an infinitely complex communication medium - which no other species displays, even in part. But how can we be sure that animals have no language? What is the fundamental difference between non-human communication and fully fledged linguistic ability? Is there some kind of “languageness” that can be quantified and measured? Some researchers claim that animal communication is nothing more than an instinctive execution of a set of neural commands. But then, at what point does autonomous computer communication become a language, rather than just a deterministic execution of commands? In the Search for Extraterrestrial Intelligence, this question becomes crucial - would we recognise an alien language even if we heard it? Is it possible that there are languages so alien that we could never recognise them as such? In this talk, I will explore these ideas using examples from animal communication and human language (but without examples of alien language) and show how the statistical properties of these communication systems may - or may not - help distinguish language from nonsense.

Biography

Arik Kershenbaum is the Herchel Smith Research Fellow in Zoology at the University of Cambridge, where he researches animal communication from both theoretical and empirical angles, combining field studies of wolves and dolphins with computer simulations of cooperative behaviour. He received his BA in Natural Sciences from Cambridge and his PhD in behavioural ecology from the University of Haifa, before going on to be a research fellow at the National Institute for Mathematical and Biological Synthesis in the USA. He has also worked on developing image processing and artificial intelligence systems for the Israeli aerospace industry.


Véronique Aubergé

The socio-affective glue: how to deal with humans’ empathic illusion towards robots?

Is the social robot a product of artificial intelligence, or of the natural intelligence of human perception? One defining phylogenetic feature of humans is the continuous extension of their bodily and environmental competences, both cognitively and technologically. The “augmented self” paradigm is as old as humanity itself. Technologies that extend humans’ social space, that “augment the others”, are a very old dream, one that can be traced through the early history of speech synthesis. It is only recently, however, that the social robot has become a societal desire, and we still lack strong hypotheses to explain how a smart object can come to be perceived as a subject. In this talk we will propose how certain non-verbal speech primitives, collected through explorations in the human sciences, can dynamically shape the human’s relation with the robot by building socio-affective glue, against a background of ethical challenges and risks. We will explore ecological experimental methods that bring together members of society, industrial partners, and researchers around the users, in order to co-construct models, smart data, and technologies within the constraints of responsible research and innovation.

Biography

Véronique Aubergé is a CNRS researcher in human sciences at the LIG Lab (Computer Science Lab in Grenoble, France), where she heads the Domus Living Lab platform, and she is an Associate Professor at the University of Grenoble-Alpes (UGA), where she directs the I3L department. She heads the Robo’Ethics Chair at the Grenoble National Polytechnic Institute. She has a PhD in Language Sciences and in Computer Sciences. She was a research engineer at the French company OROS, and a researcher at the ICP Lab and then at the GIPSA Lab until 2012, where she developed cognitive models, experiments, and applications in phonetics, prosody, and expressive text-to-speech synthesis. At the LIG Lab, she focuses on social robots as instruments for observing and modeling human interactional behaviors. She develops co-construction methods for Living Lab experiments on real-life situations of social vulnerability (the elderly, children in hospital), for which the robot could serve as a transitory aid, raising ethical issues. In particular, she is involved in the LIG Social-Touch-RobAir robotic platform developed within the LIG fablab, and in the Emox (Awabot Inc.) and Diya One (Partnering Robotics Inc.) robots.