Jaewon (Administrator) - #301

Video is always recorded if a webcam is configured. To extract it, please try the attached function. I am not sure yet what the best way to implement it is, so it may change later. Be careful: you can easily run out of memory when handling movie data.

Regarding the general input, let me think about it. The file was not recorded in simulation mode, was it?

crponce (Junior Member) - #302
Hi Jaewon! First of all, some feedback: the latest ML2 versions have been working great for us, and the Bluetooth mode is particularly useful. Now we can have monkeys playing happily with tablets in their home cages, which has made life a lot easier (and they seem pretty entertained themselves).

Every few sessions, though, our task crashes with the following message. It's infrequent enough that we've ignored it, and we can't reproduce it by hand in lab, but it may be related to a discrepancy in the internal count of analog touch samples. Any pointers for debugging?

"<<< MonkeyLogic >>> Product of known dimensions, 2, not divisible into total number of elements, 5417. (forageTablet_runtime/end_trial, Line 1432)
Error using reshape
Product of known dimensions, 2, not divisible into total number of elements, 5417.
 
Error in forageTablet_runtime/end_trial (line 1432)
ml_touch=EyeCal.deg2pix(reshape(ml_touch,2,[])');"
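
For reference, the reshape failure itself is easy to reproduce in isolation; it happens whenever the vector length is odd (5417 here is just the count from our crash log):

ml_touch = zeros(1,5417);   % odd number of raw touch samples, as in the error
reshape(ml_touch,2,[]);     % errors: 5417 is not divisible by 2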


Jaewon (Administrator) - #303
Thanks for letting me know, crponce. I think I know where to look. Please give me some time to fix it.
Jaewon (Administrator) - #304
* Changes in NIMH MonkeyLogic 2 (Mar 23, 2019)

 + An error that occurred when the touch trace was displayed is fixed.


* Changes in NIMH MonkeyLogic 2 (Apr 1, 2019)

 + mlplayer is improved to correctly replay data files produced by recent ML
   versions. (Thanks to Daniel Hahnke)

 + Now you can choose which eye (or joystick), #1 or #2, to import the
   previous calibration from.

DHaehnke (Junior Member) - #305
Quote:
Originally Posted by Jaewon

Video is always recorded if a webcam is configured. To extract it, please try the attached function. I am not sure yet what is the best way to implement the function, so it can be changed later. Be careful since you can easily run out of memory when handling movie data.

Thanks for the function. I'm trying to get an idea of whether it's feasible for us to record videos for entire sessions. Is the movie data compressed somehow in the bhv file? What impact does video recording have on MonkeyLogic's performance?

Quote:
Originally Posted by Jaewon

Regarding to the general input, let me think about it. The file was not recorded in the simulation mode, was it?

No, it was not recorded in simulation mode.




Jaewon (Administrator) - #306
The video is not compressed. The function will return the data as a Y-by-X-by-3-by-N matrix.

Video capture consumes additional resources, such as CPU time, and increases the time needed to save data during intertrial intervals.
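
To get a feel for the memory footprint, here is a rough sketch (the 320 x 240 resolution and 30 fps are hypothetical webcam settings, not ML defaults):

Y = 240; X = 320; fps = 30;      % hypothetical webcam settings
bytes_per_frame = Y * X * 3;     % uint8 pixels, Y-by-X-by-3 per frame
MB_per_min = bytes_per_frame * fps * 60 / 2^20;   % roughly 395 MB per minute

So uncompressed video for a whole session adds up quickly.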
DHaehnke (Junior Member) - #307
I have a small bug in my custom adapter that uses General1 input in its analyze method. Just to clarify, what exactly are the values in p.DAQ.General.Gen1? I see that the size of this vector is variable and fluctuates between 50 and 70 for me. I also have the impression that this vector starts with the more recent voltage values, meaning that older values are more towards the end. Is this correct?
Jaewon (Administrator) - #308
p.DAQ.General is a cell matrix by design, so it can't have a field named Gen1, but I got what you meant.

It contains data acquired during the previous frame (or data acquired before the scene started, if it is the very first frame of the scene). 50-70 samples is more than expected; I guess you checked the number on the very first frame of the task.

No, the first number is the oldest. New values are added to the back.
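
For example, a threshold check in an adapter's analyze method could look roughly like this (the Threshold property and the use of input #1 are just placeholders):

data = p.DAQ.General{1};        % samples from the previous frame, oldest first
if ~isempty(data)
    latest = data(end);         % the most recent voltage value
    crossed = latest > obj.Threshold;
end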
DHaehnke (Junior Member) - #309
Quote:
Originally Posted by Jaewon
p.DAQ.General must be a cell matrix, so it can't have a field, Gen1. But I got what you meant.

It contains data acquired during the previous frame (or data acquired before the scene start, if it is the very first frame of the scene). 50-70 are too many, but I guess you checked the number for the very first frame of the task.

No, the first number is the oldest. New values are added to the back.


Oh, yes, thanks. The representation of the AI data in the bhv was still in my mind, hence the struct syntax [wink]

So the number of elements should be ai_acquisition_rate/frame_rate? Thus, for a frame_rate of 60 Hz and an ai_acquisition_rate of 1000 Hz, this would be around 16-17?


I also wonder what some properties of the runtime variable p (class RunSceneParam) mean. Sorry if you explained this elsewhere. Is p.LastFlipTime the draw time of the current frame or of the frame prior to it? Does p.LastAcquisitionTime refer to the tracker's (or other input's) most recent sample (so, for my adapter, p.DAQ.General{1}(1))?
Jaewon (Administrator) - #310
The AI acquisition rate is always 1 kHz internally in NIMH ML, no matter what you choose on the menu. The rate chosen on the menu affects data saving only. Yes, you can expect 16-17 samples per frame at 60 Hz.
ftp://helix.nih.gov/lsn/monkeylogic/ML2_doc/runtimefunctions.html#Trackers

Those two fields are only there to calculate elapsed times for display purposes and don't really need to exist; they may be removed in the future. p.LastFlipTime is the time when the current frame is presented. p.LastAcquisitionTime is close to the time of the most recent sample, p.DAQ.General{1}(end), but it is not really associated with data acquisition, so please don't rely on it.
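
In other words, the expected sample count per frame is simply the internal rate over the refresh rate:

ai_rate = 1000;                          % internal AI rate, always 1 kHz
frame_rate = 60;                         % display refresh rate
samples_per_frame = ai_rate/frame_rate;  % ~16.7, i.e., 16 or 17 per frame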
DHaehnke (Junior Member) - #311
Thanks for the clarification.

Please don't remove LastAcquisitionTime. I use it to compute an online estimate of the reaction time (the difference between SceneStartTime and LastAcquisitionTime). I know this is not accurate, but I use a different algorithm for offline analysis.
Jaewon (Administrator) - #312
I left most of those properties accessible for coding convenience, but users are not supposed to access some of them directly, so I can't guarantee they will stay. Also, there is a better way to calculate RT online.

First of all, you are not supposed to access p.DAQ.General directly. In your case it is unavoidable, because there is no tracker for general input, but repeated access to p.DAQ.General can slow down performance.

After you read from p.DAQ.General, read p.DAQ.LastSamplePosition immediately and store the value somewhere. It gives you the number of samples accumulated (from trial start) before you read p.DAQ.General. So, if you find that the 10th sample in p.DAQ.General passes the threshold, the time of threshold cross is p.DAQ.LastSamplePosition(1) + 10. Save it somewhere and, after run_scene(), subtract the return value of run_scene() from it.

So, in your adapter,

data = p.DAQ.General{1};                 % read the samples once
lastpos = p.DAQ.LastSamplePosition(1);   % store this immediately after reading
...
obj.TimeOfThresholdCross = lastpos + 10; % e.g., the 10th sample crossed

Then, in the timing file,

youradp = WheelAdapter(...);
...
fliptime = run_scene(scene);
RT = youradp.TimeOfThresholdCross - fliptime;

If this doesn't make sense, just use p.scene_time() instead of p.LastAcquisitionTime. That variable name is misleading, so I will do something about it.

-----

In fact, I just removed it. Instead, I added a new property for the first flip time of the scene so that you can calculate the RT inside the adapter.

obj.RT = lastpos + 10 - p.FirstFlipTime;
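
Put together, a minimal sketch of the analyze method could look like this (the Threshold property and the fixed input #1 are illustrative, not part of the API):

data = p.DAQ.General{1};                       % read once
lastpos = p.DAQ.LastSamplePosition(1);         % store immediately after reading
idx = find(data > obj.Threshold, 1);           % first sample crossing the threshold
if ~isempty(idx)
    obj.RT = lastpos + idx - p.FirstFlipTime;  % RT relative to scene onset
end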
Jaewon (Administrator) - #313
* Changes in NIMH MonkeyLogic 2 (Apr 8, 2019)

 - A typo in the ImageGraphic adapter is fixed.

    Line 60 of ImageGraphic.m
       from : case {'double','unit8'}
       to   : case {'double','uint8'}

 - When BHV2 files were read on Unix or Mac platforms, an error sometimes
   occurred due to a character-encoding difference. It is fixed now.
   (Thanks to Peter Schade)

DHaehnke (Junior Member) - #314
Thanks Jaewon, I'll try your suggestion about the online reaction times.

I have a request regarding the Stimulator adapter. In my task I use two waveforms to play auditory feedback. The response scene combines a Stimulator and a custom response adapter. The Stimulator plays a go cue of, say, 200 ms length. After the response adapter reaches Success, I want to play a different waveform. However, the go cue may still be playing at that point, and setting the WaveformNumber property then gives an error.
Would it be possible to change the Stimulator's behaviour so that it stops the current output when a different WaveformNumber is set? If that violates your design principles, I'm fine with just using the bare analogoutput objects [wink]
Jaewon (Administrator) - #315
Does the attached adapter work for you?

 
Attached Files: Stimulator.m (4.45 KB)
Jaewon (Administrator) - #316
* Changes in NIMH MonkeyLogic 2 (Apr 15, 2019)

 + During the eye/joystick calibration, some UI controls remained focused and
   responded to key input undesirably.  This has been a known problem, but
   there is no effective way to prevent it, because MATLAB figures are not
   native windows and there are limits to controlling them.  As an
   alternative solution, a UI lock button is added so that the UI controls
   can be disabled when they are not needed.

 + In the 2-D spatial transformation calibration, the fixation point stayed on
   the screen once presented, unless the reward method was "On Fixation".  Now
   it can be turned off anytime by pressing the 'B' key.

 - The problem that the stimulus preview failed when a GEN function returned a
   movie without the alpha channel is fixed. (Thanks to Keith Purpura)

 
Attached Files: NIMH_MonkeyLogic_2_(Apr-15-2019).mlappinstall (38.42 MB),
NIMH_MonkeyLogic_2_(Apr-15-2019).zip (34.07 MB)