Pupils' faces may soon act as a remote control for robotic teachers
June 26th, 2008 - 3:32 pm ICT by ANI
Washington, June 26 (ANI): A computer science Ph.D. student at the University of California, San Diego, has shown that it is possible to create machines that speed up or slow down video lectures by reading a person's facial expressions to judge whether he or she is understanding the lesson.
Jacob Whitehill has revealed that the proof-of-concept demonstration is part of a larger project aimed at using automated facial expression recognition to make robots more effective teachers.
In a recent study, Whitehill and his colleagues have shown that it is possible to use the facial expressions people make while watching recorded video lectures to predict their preferred viewing speed, as well as how difficult they find the lecture at each moment in time.
He says the team's new study sits at the intersection of facial expression recognition research and automated tutoring systems.
“If I am a student dealing with a robot teacher and I am completely puzzled and yet the robot keeps presenting new material, that’s not going to be very useful to me. If, instead, the robot stops and says, ‘Oh, maybe you’re confused,’ and I say, ‘Yes, thank you for stopping,’ that’s really good,” said Whitehill, who will present his work at the Intelligent Tutoring Systems conference on June 25.
In a pilot study, Whitehill and his colleagues observed that the facial movements the eight participants made when they perceived the lecture to be difficult varied widely from person to person.
However, most of the subjects blinked less frequently during difficult parts of the lecture than during easier portions, a pattern consistent with findings in psychology.
Whitehill said that one of the project's next steps is to determine what facial movements a person naturally makes when exposed to difficult or easy lecture material. That information would help train a user-specific model to predict, from spontaneous facial expressions, when a lecture should be sped up or slowed down.
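To make the idea of a user-specific model concrete, here is a minimal, purely illustrative sketch. It assumes a single feature, blink rate, following the pilot study's observation that most subjects blinked less during difficult material; the calibration procedure, thresholds, and speed values are invented for illustration and are not from the researchers' system.

```python
# Hypothetical sketch of a per-user playback controller.
# One feature (blink rate) stands in for the richer facial-expression
# signals the researchers describe; all numbers here are illustrative.

def train_user_model(samples):
    """samples: list of (blink_rate, is_difficult) pairs for one user,
    collected during a calibration lecture.
    Returns a blink-rate threshold halfway between the user's average
    blink rate on easy material and on difficult material."""
    easy = [rate for rate, hard in samples if not hard]
    hard = [rate for rate, hard in samples if hard]
    return (sum(easy) / len(easy) + sum(hard) / len(hard)) / 2

def playback_speed(blink_rate, threshold, slow=0.75, normal=1.0):
    # Blinking less than the learned threshold suggests the viewer is
    # struggling, so slow the video down; otherwise play at normal speed.
    return slow if blink_rate < threshold else normal

# Calibration data for one user: (blinks per minute, perceived as difficult?)
calibration = [(18, False), (20, False), (9, True), (11, True)]
threshold = train_user_model(calibration)  # midpoint of 19 and 10 -> 14.5

print(playback_speed(8, threshold))   # low blink rate: slow down (0.75)
print(playback_speed(22, threshold))  # typical blink rate: normal (1.0)
```

A real system would combine many facial-movement features per frame rather than a single threshold, but the per-user calibration step sketched here is the core of what a "user-specific model" implies.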
The researcher said that to collect examples of the kinds of facial expressions involved in teaching and learning, his team used video-conferencing software to record a session in which a group of people was taught German grammar.
“I wanted to see the kinds of cues that students and teachers use to try to modulate or enrich the instruction. To me, it’s about understanding and optimising interactions between students and teachers,” he said.
“I can see you nodding right now, for instance. That suggests to me that you’re understanding, that I can keep going with what I am saying. If you give me a puzzled look, I might back up for a second,” he added. (ANI)