SSP Forum: Meng Guo and Shreya Gupta (M.S. Candidates)

Tuesday, April 18, 2023
Margaret Jacks Hall (Bldg. 460), Room 126

The
Symbolic Systems Forum
presents

Facilitate Science Learning and Knowledge Retention with Augmented Reality and Narratives
Meng Guo (M.S. Candidate)
Symbolic Systems Program


and

EEG Signals for Robot Learning
Shreya Gupta (M.S. Candidate)
Symbolic Systems Program

Tuesday, April 18, 2023
4:30-5:20 pm
Margaret Jacks Hall (Bldg. 460), Room 126

ABSTRACTS:

Meng Guo, "Facilitate Science Learning and Knowledge Retention with Augmented Reality and Narratives" (primary advisor: James Landay, Computer Science)
     Maintaining children's motivation to learn is a crucial challenge in K-12 education, where declining motivation contributes to disengagement and dropout. Educational games incorporating Augmented Reality (AR) have been proposed as a promising way to enhance students' learning experience, promote content comprehension, and facilitate spatial structure learning, along with long-term memory retention. Furthermore, narratives have demonstrated efficacy in engaging children in the learning experience, particularly in STEM domains, by providing relatable contexts for concepts and evoking emotional attachment and immersion. We present "Mission Earth Rescue," an AR-enhanced interactive narrative aimed at promoting learning in natural and environmental science among elementary school students. We aim to demonstrate how narrative-based educational activities integrating AR technology can enhance learning engagement and knowledge retention.

Shreya Gupta, "EEG Signals for Robot Learning" (supervisor: Anthony Norcia, Psychology)
     The field of robotics suffers from a lack of generalization across different skills and actions. As machine learning moves toward Artificial General Intelligence, we begin to consider how communication between humans and robots can be enhanced to make robot training more robust. Recently, gestures, gaze, facial expressions, and language have been employed to aid this learning. In this work, we further the goal of shared autonomy by tapping directly into the source of intelligence: the human brain. We collect and use non-invasive electroencephalogram (EEG) recordings to decode human intent and aid robot learning. More concretely, we use Steady-State Visual Evoked Potentials and Event-Related Potentials, and propose a plan for using the more complex Error-Related Potentials (ErrPs), three different kinds of EEG signals, to choose the robot's action space and trajectory. Preliminary results from the study provide a proof of concept that brain signals may be usable for elementary tasks. The ongoing research plan to use the relatively complex ErrP signals will also be discussed.

A NOTE ON THE RECORDING OF EVENTS:

If a decision has been made in advance to record an event and to make it available for later public viewing, the event announcement will usually state this. In many cases, however, decisions to record, and/or to make a recording available publicly, are not finalized before an event is announced. Availability decisions for recordings are often subject to what speakers prefer after an event has concluded, among other considerations that may include usage rights for material used in an event, as well as the need for, and practicality of, editing. When recordings are made publicly available, they will be linked within the original event announcement on the Symsys website in the days or weeks following an event. Unfortunately, we cannot follow up on individual requests for more information about whether and when a recording may become available if it is not yet posted publicly.