Summary: This article presents concurrent multimodal data, comprising EEG, eye-tracking, and behavioral data (cursor movements, clicks, and keypresses), acquired from individuals (N = 22) while they completed several German language lessons in the web-based Duolingo interface. Lessons were restricted to visual learning (audio and speech components were excluded) and comprised reading and writing vocabulary words and sentences and matching vocabulary to images. EEG data were collected with the open-source OpenBCI device using dry Ag/AgCl electrodes, and eye-tracking data were recorded with the Gazepoint GP3 system. Timestamped screen captures, mouse click and keypress events, and cursor movements were acquired using AutoHotkey macro scripts. Together, these data link neural (EEG), gaze (eye-tracking), and behavioral (mouse movements, clicks, and keypresses) recordings to the presented language-learning media (Duolingo screen captures), supporting a wide range of scientific analyses and methods development.

Keywords: Electroencephalography, EEG, Eye-tracking, Mouse-tracking, Duolingo, Naturalistic language learning, Multimodal human data
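Because all three streams are timestamped, they can be aligned on a shared clock for joint analysis. The snippet below is a minimal sketch of such an alignment using nearest-timestamp merging; the file names, column names, and tolerance values are hypothetical placeholders (not taken from the article) and would need to be adapted to the actual layout and sampling rates of the released dataset.

```python
# Minimal sketch: align EEG, gaze, and behavioral event streams by timestamp.
# File names, column names, and tolerances are hypothetical; adapt to the dataset.
import pandas as pd

# Hypothetical per-participant exports.
eeg = pd.read_csv("sub-01_eeg.csv")       # e.g., timestamp, ch1..ch8
gaze = pd.read_csv("sub-01_gaze.csv")     # e.g., timestamp, x, y, pupil
events = pd.read_csv("sub-01_events.csv") # e.g., timestamp, event, screenshot

# merge_asof requires both frames to be sorted on the merge key.
for df in (eeg, gaze, events):
    df.sort_values("timestamp", inplace=True)

# Attach the nearest gaze sample to each EEG sample (within 20 ms),
# then the most recent behavioral event (within 1 s).
aligned = pd.merge_asof(eeg, gaze, on="timestamp",
                        direction="nearest", tolerance=0.02)
aligned = pd.merge_asof(aligned, events, on="timestamp",
                        direction="backward", tolerance=1.0)

print(aligned.head())
```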