A team of researchers at the University of California, San Diego, has developed an artificial intelligence system that tracks and analyzes the eye movements of teachers and students in the classroom, with the goal of improving virtual education in the future.
Researcher Shlomo Dubnov, an expert in computer-based music education at the university's Entertainment and Learning Research Center, began developing the new system to address the shortcomings of teaching music over the Zoom application during the pandemic.
The technology website TechExplore quoted Dubnov as saying: "In the music classroom, non-verbal means of communication such as physical gestures and facial expressions are extremely important for following students' performance, coordinating playing, and communicating ideas," adding that "these non-verbal aspects of communication were severely affected in virtual classrooms, because the teacher and students are not present in the same physical space."
The new system tracks the teacher's attention during the lesson, determines which person the teacher is looking at, and lets students know when they are the focus of the teacher's attention at a given moment during the explanation.
The research team built a prototype of the system and tested it in a virtual music classroom run over Zoom at the University of California, San Diego.
Researcher Ross Greer says: "The new system uses a camera to record the teacher's eye movements and determine where they are directed, and we devised an algorithm to accurately estimate the direction of the teacher's gaze, which allows us to identify the student the teacher is looking at or addressing." When the system detects a change in the teacher's viewing angle, it identifies the new student being looked at and displays an on-screen message naming that student.
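The article does not describe the researchers' algorithm in detail, but the core idea of mapping an estimated gaze point to a student in a video-call gallery can be sketched simply. The following is a hypothetical illustration, assuming the estimated gaze has already been converted to screen coordinates and that student tiles are laid out in a uniform grid; all names and dimensions here are invented for the example, not taken from the actual system.

```python
# Hypothetical sketch: map an estimated on-screen gaze point to the
# student tile it falls on in a Zoom-style gallery grid.
# The real UC San Diego system is not public; this layout is an assumption.

from dataclasses import dataclass, field


@dataclass
class GalleryGrid:
    rows: int
    cols: int
    width: int                      # gallery width in pixels
    height: int                     # gallery height in pixels
    names: list = field(default_factory=list)  # row-major student names

    def student_at(self, x: float, y: float):
        """Return the student whose tile contains the gaze point (x, y),
        or None if the gaze falls outside the gallery."""
        if not (0 <= x < self.width and 0 <= y < self.height):
            return None
        col = int(x / (self.width / self.cols))
        row = int(y / (self.height / self.rows))
        return self.names[row * self.cols + col]


grid = GalleryGrid(rows=2, cols=3, width=1920, height=1080,
                   names=["Ana", "Ben", "Chloe", "Dan", "Eve", "Finn"])

# Gaze point as it might come from an eye tracker (illustrative values):
focus = grid.student_at(1400.0, 300.0)
print(f"Teacher is looking at: {focus}")  # -> Teacher is looking at: Chloe
```

In a real system, the hard part is the gaze estimation itself; once a screen coordinate is available, notifying the student when the focus changes is a simple lookup like the one above.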