See simple examples of what you can do with Metacog by exploring a series of demo widgets here.
The Metacog® platform is designed to capture trainee activity at a granular level of detail while trainees interact with training hardware and software applications. Data from biosensors may also be captured where applicable. As a result, Metacog can track not only principal actions (e.g., answering a question on a test or deploying landing gear on a flight simulator) but also secondary actions such as opening and closing help pop-ups, changing views, starting to drag an item without dropping it on a target, and even idle periods.
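To make the distinction concrete, the event stream described above can be pictured as a series of timestamped records. The sketch below is purely illustrative: the field names and event shapes are assumptions for this example, not Metacog's actual schema.

```typescript
// Hypothetical event records illustrating the kind of granular data
// described above; field names are illustrative only, not Metacog's API.
interface TraineeEvent {
  timestamp: number;                  // ms since session start
  type: "principal" | "secondary";
  action: string;                     // e.g. "answer_question", "open_help"
  detail?: Record<string, unknown>;   // optional action-specific payload
}

// A short sample stream: one principal action surrounded by secondary
// actions, including an abandoned drag and help-panel usage.
const events: TraineeEvent[] = [
  { timestamp: 1200, type: "secondary", action: "open_help" },
  { timestamp: 4800, type: "secondary", action: "close_help" },
  { timestamp: 6100, type: "secondary", action: "drag_start", detail: { item: "gear_lever" } },
  { timestamp: 6900, type: "secondary", action: "drag_abandoned" },
  { timestamp: 9500, type: "principal", action: "deploy_landing_gear" },
];

// Even simple queries over such a stream are informative, e.g. how much
// of the session elapsed before the first principal action.
const firstPrincipal = events.find(e => e.type === "principal");
const hesitationMs = firstPrincipal ? firstPrincipal.timestamp : 0;
```

The point of capturing the secondary events is visible even in this toy stream: the abandoned drag and the help-panel visits before the principal action are exactly the kind of signal a later analysis could use.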
The rich data set that results from this detailed monitoring can be filtered and retrieved at any time for advanced modeling and analysis using machine learning. For an overview of the Metacog platform and how it works, start here. To drill deeper into demo widgets, documentation, and developer tutorials, try one of the paths below.
We'll walk you through an example of how Metacog works using an air traffic control simulation tool and a Nervanix EEG headset. Watch in real time as Metacog captures the click-stream events associated with managing takeoffs and landings and fuses them with sensor data captured from the headset.
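Conceptually, fusing a click-stream with headset data is a timestamp-alignment problem: each discrete event is paired with the sensor reading nearest to it in time. The sketch below illustrates that idea only; the types, the `attention` metric, and the `fuse` function are invented for this example and are not the Metacog or Nervanix API.

```typescript
// Hypothetical fusion of click-stream events with EEG samples by
// nearest timestamp; all names here are illustrative assumptions.
interface ClickEvent { timestamp: number; action: string; }
interface EegSample  { timestamp: number; attention: number; } // 0..1

// Attach to each event the sensor sample closest to it in time.
function fuse(events: ClickEvent[], samples: EegSample[]) {
  return events.map(e => {
    const nearest = samples.reduce((best, s) =>
      Math.abs(s.timestamp - e.timestamp) < Math.abs(best.timestamp - e.timestamp) ? s : best
    );
    return { ...e, attention: nearest.attention };
  });
}

// One takeoff-clearance event fused with two surrounding EEG samples:
// the sample at t=3100 is closer to the event at t=3000 than t=2500 is.
const fused = fuse(
  [{ timestamp: 3000, action: "clear_for_takeoff" }],
  [
    { timestamp: 2500, attention: 0.4 },
    { timestamp: 3100, attention: 0.8 },
  ],
);
```

A linear scan per event is fine for a sketch; a real pipeline over a long session would sort both streams and merge them in a single pass, or window the sensor data around each event instead of taking a single nearest sample.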