I am using the old version of MonkeyLogic and had a question about the accuracy of timestamps.
I am trying to find the latency of saccades relative to the onset of a visual stimulus. In the timing file, the onset of the relevant stimulus is marked with an eventmarker (passed as an argument to the toggleobject function). I then use BhvInfo.CodeNumbers and BhvInfo.CodeTimes to find when the visual stimulus was presented on each trial, timelock BhvInfo.AnalogData.EyeSignal to the corresponding value in BhvInfo.CodeTimes, and process the eye signal to detect saccades. My question is whether this method of timelocking will give me a reliable latency measure.
To elaborate, we are using a screen with a refresh rate of 60 Hz and are sampling eye data at 240 Hz. We cannot recall how the timestamps in CodeTimes relate to what actually happened on each trial. For instance, will there be jitter in the actual stimulus presentation relative to the timestamp because of the monitor's refresh rate (i.e. up to 1000/60 ≈ 17 ms of jitter)? Or does MonkeyLogic only trigger the event code once the stimulus has actually been drawn on the screen? Has this been extensively tested, or would I need to verify my configuration with a photodiode?
A short summary of the eventmarkers in the timing file, and the analysis, is shown here:
eventmarker(12); %Fixation finished
toggleobject([Cue,stopCue],'EventMarker',62); %%% Turn on relevant stimulus
On an example trial, code 62 has a timestamp of 2078 ms. I therefore take the eye signal as BhvInfo.AnalogData.EyeSignal(2078:end,:)
There is a 4-21 ms delay (i.e. ~17 ms of jitter) between when code 12 and code 62 are produced on each trial. I am aligning my eye-tracking responses to code 62, on the assumption that it is a reliable measure of when the stimulus appeared on the screen.
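For clarity, here is a minimal sketch of the alignment step I described, in MATLAB. I am assuming the data were loaded with bhv_read, that the fields are per-trial cell arrays as in our file, and that the analog data are stored at 1 ms resolution so CodeTimes values can be used directly as sample indices (if that last assumption is wrong, the indexing would need a sampling-rate conversion):

t = 1; % example trial
codes = BhvInfo.CodeNumbers{t};
times = BhvInfo.CodeTimes{t};
stimOn = times(find(codes == 62, 1, 'first')); % timestamp of stimulus-on code (ms)
eye = BhvInfo.AnalogData{t}.EyeSignal; % [nSamples x 2] eye trace (x, y)
eyeLocked = eye(round(stimOn):end, :); % eye trace timelocked to code 62
% ...saccade detection then runs on eyeLocked, so any offset between
% code 62 and the actual screen update propagates into the latency.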
Thanks for your advice,