Information for developers, researchers, and anyone curious about what's under the hood


The Metacog® platform is designed to capture trainee activity at a granular level while trainees interact with training hardware and software applications. Data from biosensors may also be captured where applicable. As a result, Metacog can track not only principal actions (e.g., answering a question on a test or deploying landing gear on a flight simulator) but also secondary actions such as opening and closing help pop-ups, changing views, starting to drag an item without dropping it on a target, and even idle periods.
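As a rough illustration of what capturing principal and secondary actions might look like from an instrumented application, here is a minimal sketch. The names (`EventLogger`, `logPrincipal`, `logSecondary`) and the event shape are assumptions for illustration only, not the actual Metacog client API.

```javascript
// Hypothetical sketch only -- class and method names are illustrative
// assumptions, not the real Metacog JavaScript client library.
class EventLogger {
  constructor() {
    this.events = [];
  }

  // Record any trainee action with a timestamp and a kind tag.
  logEvent(kind, type, detail) {
    this.events.push({ kind, type, detail, timestamp: Date.now() });
  }

  // Principal actions: first-class steps in the training task.
  logPrincipal(type, detail) {
    this.logEvent('principal', type, detail);
  }

  // Secondary actions: help pop-ups, view changes, abandoned drags, idling.
  logSecondary(type, detail) {
    this.logEvent('secondary', type, detail);
  }
}

const log = new EventLogger();
log.logPrincipal('answer_submitted', { questionId: 'q17', answer: 'B' });
log.logSecondary('help_opened', { topic: 'landing-gear' });
log.logSecondary('drag_abandoned', { item: 'flap-lever' });
```

The point of tagging each event with a kind is that downstream analysis can filter principal task steps from surrounding behavior without discarding either.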

The large data set that results from this detailed monitoring can be filtered and retrieved at any time to perform advanced modeling and analysis using machine learning. For an overview of the Metacog platform and how it works, start here. To drill deeper into demo widgets, documentation, and developer tutorials, try one of the paths below.




See simple examples of what you can do with Metacog by exploring a series of demo widgets here.




Check out documentation for Metacog's architecture, JavaScript client library, and more here.




See how easy it is to instrument your training devices and applications for data capture by following our tutorials here.

Let us show you

We'll walk you through an example of how Metacog works using an air traffic control simulation tool and a Nervanix EEG headset. Watch in real time as Metacog captures the clickstream associated with managing takeoffs and landings and fuses it with sensor data captured from the headset.
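The fusion step described above can be sketched as timestamp alignment: each clickstream event is paired with the most recent sensor reading at or before it. The function name `fuseStreams` and the record shapes below are illustrative assumptions, not the actual Metacog pipeline.

```javascript
// Hypothetical sketch: pair each click event with the latest sensor
// sample taken at or before it. Both streams are assumed sorted by
// timestamp (ms). Not the real Metacog fusion implementation.
function fuseStreams(clicks, sensorSamples) {
  return clicks.map((click) => {
    let latest = null;
    for (const sample of sensorSamples) {
      if (sample.timestamp <= click.timestamp) {
        latest = sample; // still the most recent sample so far
      } else {
        break; // later samples cannot precede this click
      }
    }
    return { ...click, sensor: latest };
  });
}

const clicks = [
  { action: 'clear_for_takeoff', timestamp: 1000 },
  { action: 'reroute_inbound', timestamp: 2500 },
];
const eeg = [
  { attention: 0.62, timestamp: 900 },
  { attention: 0.48, timestamp: 2400 },
];
const fused = fuseStreams(clicks, eeg);
// Each click now carries the headset reading in effect when it happened.
```

Aligning on the nearest preceding sample is one simple policy; interpolation or windowed averaging would be reasonable alternatives depending on the sensor's sampling rate.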



[Demo screenshot: air traffic control software on the left; headset readings and activity on the right]
