Jaewon
Administrator
Posts: 933
#301

Video is always recorded if a webcam is configured. To extract it, please try the attached function. I am not sure yet what the best way to implement the function is, so it may change later. Be careful: you can easily run out of memory when handling movie data.

Regarding the general input, let me think about it. The file was not recorded in simulation mode, was it?

crponce
Junior Member
Posts: 23
#302
Hi Jaewon! First of all, some feedback: the latest ML2 versions have been working great for us, and the Bluetooth mode is particularly useful. Now we can have monkeys playing happily with tablets in their home cages, which has made life a lot easier (and they seem pretty entertained themselves).

Every few sessions, though, our task crashes with the following message. It's infrequent enough that we've ignored it, and we can't reproduce it by hand in the lab, but it may be related to a discrepancy in the internal count of analog touch samples. Any pointers for debugging?

"<<< MonkeyLogic >>> Product of known dimensions, 2, not divisible into total number of elements, 5417. (forageTablet_runtime/end_trial, Line 1432)
Error using reshape
Product of known dimensions, 2, not divisible into total number of elements, 5417.
 
Error in forageTablet_runtime/end_trial (line 1432)
ml_touch=EyeCal.deg2pix(reshape(ml_touch,2,[])');"
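In the meantime, one defensive workaround might be to trim the vector to an even length before the reshape. This is just a sketch using the variable names from the trace (ml_touch and EyeCal come from the runtime); it wouldn't address the underlying sample-count mismatch:

```matlab
% Sketch: guard the reshape against an odd element count (e.g., 5417) by
% dropping any trailing partial (x,y) pair before reshaping.
n = 2 * floor(numel(ml_touch) / 2);   % largest even number of elements
if n < numel(ml_touch)
    warning('Dropping %d leftover touch sample(s).', numel(ml_touch) - n);
end
ml_touch = EyeCal.deg2pix(reshape(ml_touch(1:n), 2, [])');
```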


Jaewon
Administrator
Posts: 933
#303
Thanks for letting me know, crponce. I think I know where to look. Please give me some time to fix this.
Jaewon
Administrator
Posts: 933
#304
* Changes in NIMH MonkeyLogic 2 (Mar 23, 2019)

 + An error that occurred when the touch trace was displayed is fixed.


* Changes in NIMH MonkeyLogic 2 (Apr 1, 2019)

 + mlplayer is improved to correctly replay data files produced by recent ML
   versions. (Thanks to Daniel Hahnke)

 + Now you can choose which eye (or joystick), #1 or #2, to import a previous
   calibration from.

DHaehnke
Member
Posts: 32
#305
Quote:
Originally Posted by Jaewon

Video is always recorded if a webcam is configured. To extract it, please try the attached function. I am not sure yet what the best way to implement the function is, so it may change later. Be careful: you can easily run out of memory when handling movie data.

Thanks for the function. I'm trying to get an idea of whether it's feasible for us to record videos for entire sessions. Is the movie data somehow compressed in the bhv file? What impact does video recording have on MonkeyLogic's performance?

Quote:
Originally Posted by Jaewon

Regarding the general input, let me think about it. The file was not recorded in simulation mode, was it?

Correct, it was not recorded in simulation mode.




Jaewon
Administrator
Posts: 933
#306
The video is not compressed. The function returns the data as a Y-by-X-by-3-by-N matrix.

Video capture takes more resources, such as CPU time, and increases the time needed to save data during intertrial intervals.
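For a rough sense of scale, here is a back-of-envelope estimate of the memory an uncompressed session would need. All numbers below are assumed examples, not measurements from NIMH ML:

```matlab
% RAM estimate for uncompressed Y-by-X-by-3-by-N uint8 video data.
Y = 480; X = 640;               % assumed webcam resolution
fps = 30;                       % assumed capture rate
minutes = 10;                   % assumed recording length
N = fps * 60 * minutes;         % number of frames
bytes = Y * X * 3 * N;          % one byte per uint8 element
fprintf('%.1f GiB for %d min of video\n', bytes / 2^30, minutes);  % ~15.4 GiB
```

Even ten minutes at modest resolution runs into tens of gigabytes, which is why whole-session recording can easily exhaust memory.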
DHaehnke
Member
Posts: 32
#307
I have a small bug in my custom adapter that uses General1 input in its analyze method. Just to clarify, what exactly are the values in p.DAQ.General.Gen1? I see that the size of this vector is variable and fluctuates between 50 and 70 for me. I also have the impression that this vector starts with the most recent voltage values, meaning that older values are towards the end. Is this correct?
Jaewon
Administrator
Posts: 933
#308
p.DAQ.General is a cell matrix, so it can't have a field named Gen1, but I understood what you meant.

It contains the data acquired during the previous frame (or the data acquired before the scene start, if it is the very first frame of the scene). 50-70 samples are too many; I guess you checked the number on the very first frame of the task.

No, the first number is the oldest. New values are added to the back.
DHaehnke
Member
Posts: 32
#309
Quote:
Originally Posted by Jaewon
p.DAQ.General is a cell matrix, so it can't have a field named Gen1, but I understood what you meant.

It contains the data acquired during the previous frame (or the data acquired before the scene start, if it is the very first frame of the scene). 50-70 samples are too many; I guess you checked the number on the very first frame of the task.

No, the first number is the oldest. New values are added to the back.


Oh, yes, thanks. The representation of the AI data in the bhv file was still on my mind, hence the struct syntax [wink]

So the number of elements should be (1 / frame_rate) * ai_acquisition_rate? Thus, for a frame rate of 60 Hz and an AI acquisition rate of 1000 Hz, this would be around 16-17?


I also wonder what some properties of the runtime variable p (class RunSceneParam) mean. Sorry if you explained this elsewhere. Is p.LastFlipTime the draw time of the current frame or of the frame prior to the current frame? Does p.LastAcquisitionTime refer to the tracker's (or other input's) most recent sample (so, for my adapter, p.DAQ.General{1}(1))?
Jaewon
Administrator
Posts: 933
#310
The AI acquisition rate is always 1 kHz internally in NIMH ML, no matter what you choose on the menu; the rate chosen on the menu affects data saving only. Yes, you can expect 16-17 samples per frame at 60 Hz.
ftp://helix.nih.gov/lsn/monkeylogic/ML2_doc/runtimefunctions.html#Trackers
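As a quick sanity check of those numbers (a sketch; the 60 Hz refresh rate is taken from the question above):

```matlab
% The AI sampling rate is fixed at 1 kHz internally, so the expected number
% of General-input samples delivered per video frame is simply the ratio.
ai_rate = 1000;                             % Hz, internal acquisition rate
frame_rate = 60;                            % Hz, monitor refresh rate
samples_per_frame = ai_rate / frame_rate;   % ~16.67, i.e., 16 or 17 samples
```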

Those two fields exist just to calculate elapsed times for display purposes and don't really need to be there; they may be removed in the future. p.LastFlipTime is the time when the current frame is presented. p.LastAcquisitionTime is close to the time of the most recent sample, p.DAQ.General{1}(end), but it is not really associated with data acquisition, so please don't rely on it.
DHaehnke
Member
Posts: 32
#311
Thanks for the clarification.

Please don't remove LastAcquisitionTime. I use it to give an online estimate of the reaction time (the difference between SceneStartTime and LastAcquisitionTime). I know this is not accurate, but I use a different algorithm for offline analysis.
Jaewon
Administrator
Posts: 933
#312
I left most of those properties accessible for coding convenience, but users are not supposed to access some of them directly, so I can't guarantee they will stay. And there is a better way to calculate RT online.

First of all, you are not supposed to access p.DAQ.General directly. In your case it is unavoidable, because there is no tracker for general input, but multiple accesses to p.DAQ.General can slow down performance.

After you read from p.DAQ.General, read p.DAQ.LastSamplePosition immediately and store the value somewhere. It gives you the number of samples accumulated (from trial start) before you read p.DAQ.General. So, if you find that the 10th sample in p.DAQ.General crosses the threshold, the time of the threshold crossing is p.DAQ.LastSamplePosition(1) + 10. Save it somewhere and, after run_scene(), subtract the return value of run_scene() from it.

So, in your adapter,

data = p.DAQ.General{1};
lastpos = p.DAQ.LastSamplePosition(1);
...
obj.TimeOfThresholdCross = lastpos + 10;

Then, in the timing file,

youradp = WheelAdapter(...);
...
fliptime = run_scene(scene);
RT = youradp.TimeOfThresholdCross - fliptime;

If this doesn't make sense, just use p.scene_time() instead of p.LastAcquisitionTime. That variable name is misleading, so I will do something about it.

-----

In fact, I just removed it. Instead, I added a new property for the first flip time of the scene, so that you can calculate the RT inside the adapter.

obj.RT = lastpos + 10 - p.FirstFlipTime;
Jaewon
Administrator
Posts: 933
#313
* Changes in NIMH MonkeyLogic 2 (Apr 8, 2019)

 - A typo in the ImageGraphic adapter is fixed.

    Line 60 of ImageGraphic.m
       from : case {'double','unit8'}
       to   : case {'double','uint8'}

 - When BHV2 files were read on the Unix or Mac platform, an error occurred
   sometimes due to the character encoding difference. It is fixed now.
   (Thanks to Peter Schade)

DHaehnke
Member
Posts: 32
#314
Thanks Jaewon, I'll try your suggestion about the online reaction times.

I have a request regarding the Stimulator adapter. In my task I use two waveforms to play auditory feedback. The response scene combines a Stimulator and a custom response adapter. The Stimulator plays a go cue of, say, 200 ms length. After the response adapter reaches Success, I want to play a different waveform. However, it is possible that the go cue is still playing at that point, and then setting the WaveformNumber property gives an error.
Would it be possible to change the Stimulator's behaviour so that it stops sending when you set a different WaveformNumber? If that violates your design principles, I'm fine with just using the bare analogoutput objects [wink]
Jaewon
Administrator
Posts: 933
#315
Does the attached adapter work for you?

 
Attached Files
Stimulator.m (4.45 KB)

Jaewon
Administrator
Posts: 933
#316
* Changes in NIMH MonkeyLogic 2 (Apr 15, 2019)

 + During the eye/joystick calibration, some UI controls remained focused and
   responded to key input undesirably.  This has been a known problem, but
   there is no effective way to prevent it, because MATLAB figures are not
   native windows and there are limits to controlling them.  As an
   alternative solution, a UI lock button is added so that the UI controls can
   be disabled when they are not needed.

 + In the 2-D spatial transformation calibration, the fixation point stayed on
   the screen once presented, unless the reward method was "On Fixation".  Now
   it can be turned off anytime by pressing the 'B' key.

 - The problem that the stimulus preview failed when a GEN function returned a
   movie without the alpha channel is fixed. (Thanks to Keith Purpura)

hakan
Junior Member
Posts: 2
#317
Hi Jaewon,

I was wondering if there is a way to loop video stimuli in ML2? With ML1, we were able to loop a mov TaskObject. 

I am also receiving an error when I set a block frequency in the conditions file to 0. I did not have this issue with ML1. 

Thank you in advance for any guidance!
Jaewon
Administrator
Posts: 933
#318
Please explain what you want to do by looping your video. Are you repeatedly presenting the same image sets? Then how do you determine the intervals between images?

What is the purpose of setting the frequency to 0? Is it that you don't want to run that condition? Please explain why you need to set it to 0.
hakan
Junior Member
Posts: 2
#319
Hi Jaewon, 

In ML1, we have a video of a moving gradient that is roughly half a second long. When we toggled the TaskObject on for a certain amount of time, the video would automatically loop until we toggled it off. In ML2, the video plays once and does not loop.

For the frequency: in ML1, we would set the frequency to 0 for flexibility, rather than creating a new block. I mention this primarily because the error thrown by ML2 says 'the frequency must be 0 or higher'.

Thank you again!


Jaewon
Administrator
Posts: 933
#320
Thanks for the information.

In ML1, looping movies is the only option, but not everyone wants to loop movies all the time.

To loop a movie defined in the conditions file, you can change it to a GEN object and set the loop option there. This is described in the TaskObject manual (Remark #5).
ftp://helix.nih.gov/lsn/monkeylogic/ML2_doc/taskobjects.html

So, instead of mov(movfile,0,0), put gen(movfile,0,0) in your conditions file and create a gen function like the following.

function [imdata,info] = movfile(TrialRecord,MLConfig)
    imdata = 'movfile.avi';
    info.Looping = 1;
end

I am working on a way to set the loop option directly in the conditions file, so it will be easier soon.

However, one thing you should know is that the original MonkeyLogic (ML1) processes movies frame by frame without respecting their frame rates, so the speed (and length) of a movie differs depending on the refresh rate of the monitor. By default, NIMH ML (ML2) plays movies at the frame rate given by the movie file, so you may not get the same results in ML2, depending on your purpose in using movies. I am adding another option to play movies the same way as ML1.
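To make the difference concrete, here is a rough calculation (the movie length and rates below are assumed example numbers, not taken from this thread):

```matlab
% Frame-by-frame playback (ML1 style) shows each movie frame for exactly one
% monitor refresh, ignoring the frame rate encoded in the file.
movie_fps   = 30;                          % assumed frame rate of the file
movie_len_s = 0.5;                         % assumed intended duration
n_frames    = movie_fps * movie_len_s;     % 15 frames
refresh_hz  = 60;                          % assumed monitor refresh rate
ml1_len_s   = n_frames / refresh_hz;       % 0.25 s: twice as fast as intended
ml2_len_s   = n_frames / movie_fps;        % 0.5 s: plays at the encoded rate
```

On a 120 Hz monitor the same movie would finish in 0.125 s under frame-by-frame playback, which is why ML1 results depend on the display hardware.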

Regarding the frequency, it was my mistake to write the error message like that. If you looked at the code, you would see that it actually checks whether the frequency is a natural number. I changed it to accept 0 and will release a new version soon, along with the other changes I am working on.
Jaewon
Administrator
Posts: 933
#321
* Changes in NIMH MonkeyLogic 2 (May 9, 2019)

 + MATLAB is slow on the very first run. A warming-up procedure is reintroduced
   to improve performance in the first trial.

 + The TaskObject type, MOV, provides an option to loop the movie when the end
   is reached.

    mov(filename, Xdeg, Ydeg, looping)

 + Now there is an explicit way to play movies like the original ML.

    gen(moviefile,0,0)     % in the conditions file

    function [imdata,info] = moviefile(TrialRecord,MLConfig)  % gen script
        imdata = 'moviefile.avi';
        info.FrameByFrame = 1;
    end

 + The Frequency column of the conditions file can be 0.

 - Minor fixes


DHaehnke
Member
Posts: 32
#322
Quote:
Originally Posted by Jaewon
Does the attached adapter work for you?


Hi Jaewon,

thanks for the adapter, and sorry that I couldn't test it earlier.

The adapter doesn't throw an error, but it plays the first waveform if I set a new waveform while the old one is still sending.
This can easily be solved by restarting the analogoutput object inside the issending check:
if issending(obj.AO)
 stop(obj.AO);
 start(obj.AO);
end

Are you going to implement that in the next version?

Thanks,
Daniel
Jaewon
Administrator
Posts: 933
#323
I don't understand the situation. Can you explain in a little more detail, or show me the relevant part of your timing file code?
Jaewon
Administrator
Posts: 933
#324
* Changes in NIMH MonkeyLogic 2 (May 21, 2019)

 + TCP/IP support for ISCAN DQW software (ver. 1.21E or later)

Visit https://monkeylogic.nimh.nih.gov/ for downloads.
DHaehnke
Member
Posts: 32
#325
Quote:
Originally Posted by Jaewon
I don't understand the situation. Can you explain in a little more detail, or show me the relevant part of your timing file code?


Sure. I have one Stimulator adapter in the timing file. I assign two waveforms to the adapter and use it in two different scenes. The first is the response scene, in which the adapter should play a go sound. The second is a reward feedback scene, in which the adapter should play a sound associated with reward. I switch between the two waveforms prior to running the respective scene.
The reward feedback scene comes directly after the response scene, so if the go sound is still playing (i.e. "sending"), the next waveform can't be set. With your patched version, MonkeyLogic doesn't crash when the next waveform is set. However, the reward feedback scene plays not the waveform that was set but the first waveform (i.e. the go sound instead of the reward sound). I see that after the issending(obj.AO) check the ManualTriggerNextWF flag is set, so the behaviour I get doesn't make sense.
When I was using the analogoutput object directly a while back, I also switched between waveforms. For that, I had to stop the object first and then restart it. That's what I tried for the Stimulator adapter, and it works, i.e. it stops the go sound and plays the reward sound.

The code in the timing file would look something like this:

stim = Stimulator(null_);
stim.Channel = 1;
stim.Frequency = 44100;
stim.Waveform = {go_sound reward_sound};
scene_sound = create_scene(stim);
resp = SomeResponseAdapter(null_);
resp.Properties (...)
tc = TimeCounter(stim);
tc.Duration = response_time;
oa = OrAdapter(resp);
oa.add(tc);
scene_response = create_scene(oa);
stim.Waveform = 1;
run_scene(scene_response);
if resp.Success
  stim.Waveform = 2;
  run_scene(scene_sound);
end
