User Engagement Across Instructional Format

Project: Research project

Project Details

Description

Goals: The Advancing Next Generation Learning Environments (ANGLE) at ASU proposes to conduct a research study to identify whether students' level of engagement, excitement, and learning varies across three instructional formats: a PDF text file, a video-based lecture, or an interactive website.

Research Study: Below is a description of the research study.

1. Overview
   a. The following research questions will be addressed:
      i. Are there differences in users' level of Engagement across the three types of instructional content (text, video, or interactive) used in CourseConnect 3.0, as captured by eye tracking and a neuro-signal headset?
      ii. Are there differences in users' level of Instantaneous Excitement across the three types of instructional content (text, video, or interactive) used in CourseConnect 3.0, as captured by eye tracking and a neuro-signal headset?
      iii. Are there differences in users' level of Long-Term Excitement across the three types of instructional content (text, video, or interactive) used in CourseConnect 3.0, as captured by eye tracking and a neuro-signal headset?
      iv. Are there differences in users' learning performance across the three types of instructional content (text, video, or interactive) used in CourseConnect 3.0, as measured by a brief, 10-item pencil-and-paper survey?
   b. Clear statement of causality and/or association.
   c. Clear and concise explanation of why the research is relevant and important.

2. Participants and Design
   a. Participants will be 75 undergraduate college students who range in age from approximately 18 to 40 years. Students will be recruited from pools of ASU undergraduates who attend either an introductory psychology class or an introductory education technology class and will be paid $25 or provided course credit to participate.
   b. The study design is a pretest-posttest, one-factor design with three levels of instructional content type: text, video, or interactive. Participants will be randomly assigned to the three levels such that there will be equal numbers in each cell of the design (n = 25). (The first sketch following this description illustrates this assignment and the corresponding group comparison.)

3. Learning Materials and Measures
   a. The learning content for this study will be based on a CourseConnect 3.0 lesson. The learning content will be delivered in three instructional formats: a PDF file, a video lecture, and an interactive website.
   b. A demographic questionnaire will be used to gather information about gender, age, year in school, first language, and ethnicity.
   c. A 10-item learning performance measure will be used to assess learning. The assessment will be administered as a pretest and a posttest.
   d. HARDWARE COMPONENT 1: A Tobii eye-tracker monitor will be used to detect eye-gaze patterns and fixation durations. To the participant, the device will look like a standard computer monitor. The participant will not be required to wear headgear of any kind, and there will be no other hardware components associated with the eye tracker to cause distraction. The Tobii eye tracker samples the position of the participant's eyes on average every 20 ms (i.e., 50 Hz) and is characterized by the unobtrusive addition of the eye-tracking hardware (e.g., a high-resolution camera and near-infrared light-emitting diodes) to the monitor frame. (The second sketch following this description illustrates how fixation durations can be derived from gaze samples at this rate.)
   e. HARDWARE COMPONENT 2: Emotiv's EPOC headset, a high-resolution neuro-signal acquisition and processing wireless headset, will be used to gather affective data. The headset is in the form of a lightweight molded plastic cap with 14 padded nodes that contact the participant's scalp and serve to transmit EEG activity (Emotiv, 2010).
   f. SOFTWARE COMPONENT 1: Emotiv's EPOC headset transmits EEG waves emitted by the brain that are interpreted by a software program and transformed by tested algorithms (cluster analyses) into Engagement, Instantaneous Excitement, and Long-Term Excitement. The constructs are described below (Emotiv, 2010):
      i. Engagement is experienced as alertness and the conscious direction of attention towards task-relevant stimuli. It is characterized by increased physiological arousal and beta waves (a well-known type of EEG waveform) along with attenuated alpha waves (another type of EEG waveform). The opposite pole of this detection is referred to as Boredom in the Emotiv Control Panel and the Emotiv API. Related emotions: alertness, vigilance, concentration, stimulation, interest. Scoring behavior: The greater the attention, focus, and cognitive workload, the greater the output score reported by the detection. Examples of engaging video-game events that result in a peak in the detection are difficult tasks requiring concentration, discovering something new, and entering a new area. Writing something on paper or typing typically increases the engagement score, while closing the eyes almost always rapidly decreases the score. (The third sketch following this description illustrates a generic beta/alpha band-power ratio in this spirit.)
      ii. Instantaneous Excitement is experienced as an awareness or feeling of physiological arousal with a positive value. Excitement is characterized by activation in the sympathetic nervous system, which results in a range of physiological responses including pupil dilation, eye widening, sweat-gland stimulation, increases in heart rate and muscle tension, blood diversion, and digestive inhibition. Related emotions: titillation, nervousness, agitation. Scoring behavior: In general, the greater the increase in physiological arousal, the greater the output score for the detection. The Instantaneous Excitement detection is tuned to provide output scores that more accurately reflect short-term changes in excitement over time periods as short as several seconds.
      iii. Long-Term Excitement is experienced and defined in the same way as Instantaneous Excitement, but the detection is designed and tuned to be more accurate when measuring changes in excitement over longer time periods, typically measured in minutes.
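Sketch 1. As a planning illustration of the design in item 2b, the snippet below randomly assigns 75 participants to the three content conditions with equal cells (n = 25) and compares a score, such as pretest-to-posttest gain on the 10-item measure, across conditions with a one-way ANOVA. All variable names and the simulated scores are hypothetical and are not part of the study materials.

```python
# Minimal sketch: equal-cell random assignment and a one-way ANOVA across
# the three instructional-content conditions. Data are simulated; names
# and values are illustrative assumptions only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=42)

conditions = ["text", "video", "interactive"]
n_per_cell = 25

# Randomly assign 75 participants so each condition receives exactly 25.
assignment = np.repeat(conditions, n_per_cell)
rng.shuffle(assignment)

# Simulated pretest-to-posttest gain scores on the 10-item measure
# (purely illustrative values, not study data).
gains = {c: rng.normal(loc=2.0, scale=1.5, size=n_per_cell) for c in conditions}

# One-way ANOVA: do mean gains differ across the three conditions?
f_stat, p_value = stats.f_oneway(gains["text"], gains["video"], gains["interactive"])
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```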
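Sketch 2. Item 3d notes that the eye tracker samples gaze position roughly every 20 ms (50 Hz) and that fixation durations are of interest. The snippet below shows one common, generic way to derive fixation durations from such samples, a dispersion-threshold (I-DT) detector. It is not Tobii's own fixation filter; the thresholds and the sample gaze trace are assumptions for illustration.

```python
# Minimal sketch of a dispersion-threshold (I-DT) fixation detector over
# 50 Hz gaze samples (one sample every 20 ms). This is a generic textbook
# approach, not Tobii's fixation filter; thresholds are illustrative.
import numpy as np

SAMPLE_INTERVAL_MS = 20        # 50 Hz sampling rate
MIN_FIXATION_MS = 100          # assumed minimum fixation duration
DISPERSION_THRESHOLD_PX = 35   # assumed maximum spread of a fixation, in pixels

def detect_fixations(x, y):
    """Return (start_index, duration_ms) pairs for detected fixations."""

    def dispersion(a, b):
        # Spread of the gaze points in window [a, b): (max - min) in x plus y.
        return (x[a:b].max() - x[a:b].min()) + (y[a:b].max() - y[a:b].min())

    fixations = []
    min_samples = MIN_FIXATION_MS // SAMPLE_INTERVAL_MS
    i, n = 0, len(x)
    while i + min_samples <= n:
        j = i + min_samples
        if dispersion(i, j) <= DISPERSION_THRESHOLD_PX:
            # Grow the window while the points stay within the dispersion limit.
            while j < n and dispersion(i, j + 1) <= DISPERSION_THRESHOLD_PX:
                j += 1
            fixations.append((i, (j - i) * SAMPLE_INTERVAL_MS))
            i = j
        else:
            i += 1
    return fixations

# Illustrative 50 Hz gaze trace: a steady fixation followed by a jump (saccade).
x = np.array([500] * 10 + [505] * 5 + [900] * 3, dtype=float)
y = np.array([300] * 10 + [298] * 5 + [120] * 3, dtype=float)
print(detect_fixations(x, y))  # expected: one fixation of roughly 300 ms
```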
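Sketch 3. The Engagement construct in item 3f.i is described in terms of increased beta-wave power and attenuated alpha-wave power. The snippet below computes a generic beta/alpha band-power ratio for a single EEG channel using SciPy's Welch method, as a rough illustration of that idea only; it is not Emotiv's proprietary detection, and the sampling rate and simulated signal are assumptions.

```python
# Minimal sketch: a generic beta/alpha band-power ratio as a rough engagement
# proxy, reflecting the description of increased beta and attenuated alpha.
# This is NOT Emotiv's proprietary algorithm; the sampling rate and the
# simulated signal are illustrative assumptions.
import numpy as np
from scipy.signal import welch

FS = 128  # assumed EEG sampling rate in Hz

def band_power(freqs, psd, low, high):
    """Sum the power spectral density over [low, high) Hz (equal-width bins)."""
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].sum()

def engagement_proxy(eeg_channel):
    """Beta (13-30 Hz) power divided by alpha (8-13 Hz) power for one channel."""
    freqs, psd = welch(eeg_channel, fs=FS, nperseg=FS * 2)
    beta = band_power(freqs, psd, 13.0, 30.0)
    alpha = band_power(freqs, psd, 8.0, 13.0)
    return beta / alpha

# Illustrative signal: mixed alpha (10 Hz) and beta (20 Hz) sine waves plus noise.
t = np.arange(0, 10, 1 / FS)
rng = np.random.default_rng(0)
signal = (1.0 * np.sin(2 * np.pi * 10 * t)
          + 0.5 * np.sin(2 * np.pi * 20 * t)
          + rng.normal(0, 0.2, t.size))
print(f"beta/alpha ratio: {engagement_proxy(signal):.2f}")
```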
Status: Finished
Effective start/end date: 10/1/13 - 1/31/14

Funding

  • INDUSTRY: Domestic Company: $84,938.00
