In the first week of February we have two distinguished speakers visiting the University of Helsinki and Aalto University: Rosalind Picard from the MIT Media Lab and Hans Gellersen from Lancaster University.
Rosalind Picard, MIT Media Lab
3rd of February, 15:15, at Pieni Juhlasali, Main Building, Fabianinkatu 33
(part of the Helsinki Distinguished Lecture Series on Future Information Technology)
Register here (free of charge).
Years ago, I set out to create technology with emotional intelligence, demonstrating the ability to sense, recognize, and respond intelligently to human emotion. At MIT, we designed studies and developed signal processing and machine learning techniques to see what affective insights could be reliably obtained. In this talk I will highlight the most surprising findings during this adventure. These include new insights about the “true smile of happiness,” discovering new ways cameras (and your smartphone, even in your handbag) can compute your bio-signals, finding electrical signals on the wrist that reveal insight into deep brain activity, and learning surprising implications of wearable sensing for autism, anxiety, sleep, memory, epilepsy, and more. What is the grand challenge we aim to solve next?
Bio: Rosalind Picard, ScD, FIEEE, is founder and director of the Affective Computing Research Group at the MIT Media Laboratory; co-founder of Affectiva, providing emotional intelligence technology used by a third of the Global Fortune 100; and co-founder and Chief Scientist of Empatica, improving lives with clinical-quality wearable sensors and analytics. Picard is the author of over 250 articles in computer vision, pattern recognition, machine learning, signal processing, affective computing, and human-computer interaction. She is known internationally for her book, Affective Computing, which helped launch the field by that name. Picard holds a bachelor’s degree in Electrical Engineering (EE) from Georgia Tech and master’s and doctorate degrees in EE and CS from MIT. Picard’s inventions have been twice named to “top ten” lists, including the New York Times Magazine’s Best Ideas of 2006 for the Social Cue Reader, and 2011’s Popular Science Top Ten Inventions for a Mirror that Monitors Vital Signs.
Hans Gellersen, Lancaster University
February 2, 2017, at 11:00, TUAS building, room AS2, Otaniemi
Rethinking eye gaze for human-computer interaction
Eye movements are central to most of our interactions. We use our eyes to see and guide our actions and they are a natural interface that is reflective of our goals and interests. At the same time, our eyes afford fast and accurate control for directing our attention, selecting targets for interaction, and expressing intent. Even though our eyes play such a central part to interaction, we rarely think about the movement of our eyes and have limited awareness of the diverse ways in which we use our eyes — for instance, to examine visual scenes, follow movement, guide our hands, communicate non-verbally, and establish shared attention.
Bio: Hans Gellersen is Professor of Interactive Systems at Lancaster University. Hans’ research interest is in sensors and devices for ubiquitous computing and human-computer interaction. He has worked on systems that blend physical and digital interaction, methods that infer context and human activity, and techniques that facilitate spontaneous interaction across devices. In recent work he is focusing on eye movement as a source of context information and a modality for interaction.