About Keith

Keith Brawner works in the simulation industry for the DoD, and did so before, during, and after earning a Master's in Intelligent Systems. Sadly, he is not yet a Doctor.

Friday, October 8, 2010

“Yes!”: Using Tutor and Sensor Data to Predict Moments of Delight during Instructional Activities

“Yes!”: Using Tutor and Sensor Data to Predict Moments of Delight during Instructional Activities was written out of Arizona State by Muldner, Burleson, and VanLehn.  It can be found for free online here.

Nothing particularly fancy is going on.  The combined Regular Day Off and Columbus Day have granted a glorious 4-day weekend.  This, combined with today's high of 85, will make for a relaxing weekend of light reading, sushi, and entertainment.

I/ITSEC, the international conference on modeling and simulation, will be in Orlando next month.  It will be a good time, offering everything from paper presentations on how to train nurses, to the latest advancements in computer graphics, to 3-dimensional printing, to firing grenade launchers (mockups with recoil) at insurgents (simulated).


The AI in Education field, in general, has been focusing on keeping students 'in the zone', or keeping them from getting frustrated through the use of hints, easy questions, scaffolding, or a number of other methods.  However, this paper postulates that the most important moments in education are the "yes!" moments of great success.  Unfortunately, we don't know how to detect these moments, and we may be interfering with their occurrence.

Most everyone has had a "yes!" moment in their education.  If you think back, this is the moment when you had a sudden realization of a concept, or when you had just answered a particularly difficult problem.  This moment probably involved significant work followed by sudden reward.  As my high school calculus teacher would say, "These are the moments when the student finally understands, and the moments I live for."

"Yes!" data was gathered using the Example Analogy (EA) Coach with Newtonian physics problems.  Interactions with the interface were recorded, and students were asked to think aloud.  Additionally, a posture chair, skin-conductance bracelet, eye tracker, and pressure mouse were used to gather data on the student's current state.  The "yes!" moments were labeled by an expert, and the system was trained to recognize their occurrence based upon that information.

Posture Chair Data
Logistic regression was used to attempt to make sense out of the sensor data.  As has been found in other studies (such as Automatic prediction of frustration), the data from the posture chair was not particularly usable.  However, through the use of time-based models which included pupil response and input from the other sensors, they were able to correctly predict 60% of the "yes!" events, while incorrectly predicting non-"yes!" events only 13% of the time.  Obviously there is some work left to be done in the field, but these results are promising.
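For the curious, the core technique is simple enough to sketch.  Below is a toy logistic-regression classifier over made-up per-window sensor features (pupil dilation, skin conductance, mouse pressure); the feature names, values, and training setup are my own invention for illustration, not the study's actual pipeline or data.

```python
import math
import random

# Hypothetical per-window sensor features: [pupil dilation, skin conductance,
# mouse pressure].  Synthetic, well-separated data purely for illustration.
random.seed(0)
yes_events = [[0.8 + random.gauss(0, 0.05),
               0.7 + random.gauss(0, 0.05),
               0.6 + random.gauss(0, 0.05)] for _ in range(30)]
non_events = [[0.3 + random.gauss(0, 0.05),
               0.4 + random.gauss(0, 0.05),
               0.3 + random.gauss(0, 0.05)] for _ in range(30)]
X = yes_events + non_events
y = [1] * 30 + [0] * 30   # 1 = labeled "yes!" moment, 0 = no event

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.5, epochs=500):
    """Fit logistic-regression weights with plain stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            err = sigmoid(b + sum(wj * xj for wj, xj in zip(w, xi))) - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Return 1 if the model calls this window a 'yes!' moment, else 0."""
    return int(sigmoid(b + sum(wj * xj for wj, xj in zip(w, x))) >= 0.5)

w, b = train(X, y)
accuracy = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(y)
```

On data this clean the classifier separates the two classes easily; the hard part in the paper is that real sensor streams are noisy and the "yes!" windows are rare, which is why their hit rate tops out around 60%.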

Why do you care? 
1 - ITS systems can keep students optimally challenged if students are reporting a high frequency of "yes!" events.  This is just as important as, if not more important than, predicting frustration.
2 - Detection of such events is possible, and should be further investigated.
