Page 4 of 5
ebos

Junior Member
Registered:
Posts: 15
Reply with quote  #76 
Dear ryklin,

thanks for looking into this. To answer your previous questions: the last version I used before updating to 1.2.26 was 1.0.83, and acquirefix worked fine in that version. I've now tried both 1.2.26 and your latest release, and I still get the single 1x1 variable. (MATLAB version 2014a, if that is of any help.)

The empty matrix [] in the code was not a typo. I used it because, according to the online ML documentation, passing an empty matrix as the threshold for a button press sets it to the default threshold (in this case, 0.5 for a digital button). The button is assigned to a digital I/O (port 0, line 1) in the GUI. I don't have access to an eye tracker at the moment, so I cannot test/debug the multicontroller with anything other than touchscreen + button.


ryklin

Administrator
Registered:
Posts: 237
Reply with quote  #77 
Okay, so then I need to test with a button press and touchscreen. There's probably some conflict between the buttons and the digital USB stream.

However, I need to know how to replicate your setup. You just have a button with no voltage that completes a circuit? It's mapped to port 0, line 1. Then you use eyejoytrack as specified in the previous comment? Anything else I need to know?
ebos

Junior Member
Registered:
Posts: 15
Reply with quote  #78 
Thanks. My button is a simple touch sensor that returns 1 when it is not touched and 0 when it is touched. Hence, until we write a wrapper, I have to use acquiretouch and holdtouch counterintuitively for the time being, since they use rising and falling thresholds respectively. Other than the mapping to digital I/O described in my previous post, I can't think of anything else in the setup that might be relevant.

The idea of the code I am trying to use is to ensure that the subject touches the sensor while the sample is presented in order to start the trial; a touch to the screen instead of the sensor, or a failure to touch the sensor at all, results in an error that aborts the trial.
ryklin

Administrator
Registered:
Posts: 237
Reply with quote  #79 
Sure, your logic makes absolute sense and I can envisage many cases in which this would be necessary.
ebos

Junior Member
Registered:
Posts: 15
Reply with quote  #80 
I've come across another issue that I had glossed over before because it seemed secondary at the time: the touch sensitivity seems rather unreliable. Sometimes just brushing against the target is enough for detection; most other times I need to hold steadily or press on the target for it to be detected. I thought it was just an issue with the touchscreen, but I've now tried a different touchscreen with the same result. I see that another member had a similar issue a few pages back and updating the ML version helped, but not in my case (I'm using 1.2.39 right now). It doesn't seem to be an issue with the screen or Windows, as the screen is nice and sensitive when using various calibration/drawing tools and moving the cursor. Is this related to the frequency at which ML samples from the screen? And if so, can it be adjusted?
ryklin

Administrator
Registered:
Posts: 237
Reply with quote  #81 
I'm not sure at this point, and would need much more detail since I don't have your screen(s) in front of me. If I were you, since you probably can't easily debug the data stream, I would change your conditions file to detect a touch and then wait for a hold touch. That might give you more consistency. It's possible some artifact is being generated by your screen and sent to ML. Again, I can't yet say how to isolate the problem and determine whether it comes from the screen or the software.
ebos

Junior Member
Registered:
Posts: 15
Reply with quote  #82 
Hi Ryklin,

we have done some more work today trying to narrow down the problem. So far we have found that when 'gettouch' in mlvideo.m is called, the 'getmousebuttons' function it uses often does not report a button-down state (and therefore does not detect a touch). The 'if' statement on line 284 is not true for either the right or the left button, so x and y are not collected.

Code:

 

if ( (left_button == 1) || (right_button == 1) ) % update touch location if either mouse button is down
    x_touch = (pos(1) - obj.sub_offset_x)/obj.sub_ppd_x;
    y_touch = -(pos(2) - obj.sub_offset_y)/obj.sub_ppd_y;
else
    x_touch = -180; % out of bounds
    y_touch = -180; % out of bounds
end

 



This does not seem to be related to how hard or how long we press on the screen, and often, if we try several times in the allotted time window, the touch is eventually detected. We are happy to continue doing some debugging of our own, but it would be helpful to know where to look next, in particular how the signal is fed to ML up to this point.
ryklin

Administrator
Registered:
Posts: 237
Reply with quote  #83 

Hi,

The code difference between a touchscreen and a mouse is that the mouse does not require clicking to respond: you can just drag the mouse around and the ML cursor follows it. The touchscreen behaves like a mouse with the additional requirement that a left OR right mouse button is depressed. This is an important distinction, because otherwise a participant releasing the touchscreen would be equivalent to leaving the mouse cursor in place, and that could inadvertently trigger a state change. By adding this requirement to the touchscreen, we effectively move the cursor outside the video display by assigning it the -180 location.

Okay, so how is that implemented... 

mouse_state = mlvideo('getmousebuttons') -> xglgetcursor_buttonstate -> xglhanders.cpp (xglgetcursor_buttonstate), which is implemented as:

if (!(GetAsyncKeyState(VK_LBUTTON) & 0x8000))
{
    //cout << "Line " << line++ << ": " << "Left Mouse Up" << endl;
    returnStateLeft = 0;
}
else
{
    //cout << "Line " << line++ << ": " << "Left Mouse Down" << endl;
    returnStateLeft = 1;
}

if (!(GetAsyncKeyState(VK_RBUTTON) & 0x8000))
{
    //cout << "Line " << line++ << ": " << "Right Mouse Up" << endl;
    returnStateRight = 0;
}
else
{
    //cout << "Line " << line++ << ": " << "Right Mouse Down" << endl;
    returnStateRight = 1;
}

I don't recall at the moment why I chose to use the asynchronous version of the key state function. It probably had better performance.

I would recommend you try this:

1) use your mouse instead of the touch screen to test your task and see if you experience the same problem

2) if the mouse works fine, then check to see if your touchscreen has any settings to change the latency of the touch time. 

3) In ML, map the mouse adapter in place of the touchscreen, then use your touchscreen to drag the response cursor around. Even though it won't detect depression of the screen, you will be able to test for performance lag, which can help us home in on the problem.

While you do that, I will look at the synchronous version of the GetKeyState function and see if I can add it to the system so you can try and compare both versions.



ryklin

Administrator
Registered:
Posts: 237
Reply with quote  #84 
I also want to explain the reason I used -180 as a magic number.

When the touchscreen is not depressed, there is no data value. However, the way ML is structured, some numerical value must be placed into the data stream. If I used a value of 0, that would be indistinguishable from a touch at 0.0 degrees of visual angle, and unfortunately MATLAB (to my knowledge) does not have nullable types. So I am forced to use some number, and I chose one that is far outside an animal's visual range. This allows us to filter for the value -180 post hoc, as I do in the function touchscreen_dataclean.m.
Jaewon

Administrator
Registered:
Posts: 423
Reply with quote  #85 
I am not following this thread closely, but regarding the use of -180 as a magic number: I use NaN in similar cases. Filtering out NaN is easy with isnan().
ryklin

Administrator
Registered:
Posts: 237
Reply with quote  #86 
Good point. I could have used NaN; that IS MATLAB's nullable type. I will look into updating this, which might make the process better (or at least clearer). Thanks for the advice, Jaewon!
ryklin

Administrator
Registered:
Posts: 237
Reply with quote  #87 
Yes, NaN is a better choice than -180. This obviates touchscreen_dataclean.m and simplifies the code (I'm leaving the file in the repo for now in case we need to refer back to it in the near future).

This update is committed as version 151.
ryklin

Administrator
Registered:
Posts: 237
Reply with quote  #88 
One more thing I want to add: the magic-number issue should not have any impact on ebos's complaint that the button state is not reliably detected. We still need to look into that problem, which is in the C++ xgl code.
ryklin

Administrator
Registered:
Posts: 237
Reply with quote  #89 
I've added new changes to the touchscreen subroutines that might improve your performance; take a look at build 161.
Jaewon

Administrator
Registered:
Posts: 423
Reply with quote  #90 
Those who use the touchscreen version of MonkeyLogic,

Can you post the Max Latency and Cycle-Rate of your MonkeyLogic when you run the 'combo' task (touch_combo_free_choice.txt) included in the package? I think ebos's problem is simply that the touchscreen routine slows MonkeyLogic down. The left figure below shows the performance of the touchscreen version on my computer; the right shows another task running on the old non-touchscreen MonkeyLogic. Same MATLAB (R2014a), same computer.


[Attached screenshots: ryklin_monkeylogic.png (touchscreen ML performance), typical_cyclerate.png (typical cycle rate on the old non-touchscreen ML)]
ebos

Junior Member
Registered:
Posts: 15
Reply with quote  #91 
Thanks Jaewon, well spotted. I had never noticed the latency before. I couldn't try the combo task because I don't have access to an eye tracker at the moment, but I tried running my touchscreen task and the latency is very high (between 13 and 20 ms, depending on the trial), with a cycle rate around 2900 Hz. The only other task I could try with the equipment I have was the mouse task that comes with the package. The latency there is also very high (around the same as for the touchscreen, which I guess is to be expected if they run off the same routine), but, unlike the touchscreen task, its cycle rate was extremely low (<1000 Hz).

Also running on R2014a.
Jaewon

Administrator
Registered:
Posts: 423
Reply with quote  #92 
The delay caused by this issue increases geometrically as a trial gets longer. Whatever task you run, you will experience a significant drop of performance after 10 sec.
ryklin

Administrator
Registered:
Posts: 237
Reply with quote  #93 
I'm still investigating this and don't have a complete answer yet. However, one notable problem is that almost all of the sample tasks contain the following call, which adds significantly to the "Max Latency" shown above.

showcursor('on');

By *removing* the above function call ('off' is the default), you should reduce your Max Latency and increase your cycle rate to ~10 kHz.

This is not the only issue, however; something else is wrong, and I should be able to identify it this week. It is not the way data is sampled or stored; I backed that out and still get an increase in Max Latency when running touch/mouse tasks. Also keep in mind that the combo task is unique: it samples from two data channels, so its Max Latency can be 2x greater. The combo task accentuates any bugs that might exist.

Finally, on my system, when running non-touchscreen branches of ML, I get a Max Latency of ~3.0 ms and a cycle rate of 10-30 kHz. I can't explain the variability at this time.
ctestard

Junior Member
Registered:
Posts: 20
Reply with quote  #94 
Hey guys,

On my touchscreen task, too, the latencies are usually in the red zone (about 20 ms). The cycle rate is usually around 8000. Running MATLAB 2013b with ML 1.2.55.

Best,
ryklin

Administrator
Registered:
Posts: 237
Reply with quote  #95 
A new release has been posted.

https://github.com/ryklin/MonkeyLogic_stable/releases/tag/1.2.85

ebos

Junior Member
Registered:
Posts: 15
Reply with quote  #96 
Thanks Ryklin, trying the new release now. I see in the release notes that eyejoytrack is no longer required for touchscreen responses. What is the new syntax to use for the touchscreen? The tasks included in the release all still seem to use the eyejoytrack syntax.
0
ryklin

Administrator
Registered:
Posts: 237
Reply with quote  #97 
Hi ebos,

Eyejoytrack can be used as before to track responses. However, what I meant by this release note is that it is no longer necessary to use eyejoytrack in your code in order to sample and record data. Meaning, if you assign the touchscreen to USB inputs 1 and 2, data from the touchscreen will be recorded during the entire trial even if eyejoytrack is not explicitly coded into the timing file. This may not impact your task, but other users may want to passively monitor touchscreen behavior, or collect data from the touchscreen after eyejoytrack has confirmed a correct touch but did not immediately terminate the trial.

Does that make sense?
stremblay

Member
Registered:
Posts: 65
Reply with quote  #98 
Hi Ryklin,

This is a nice new feature. Can you let us know in what variable/structure the touchscreen data is being saved?

Thanks!
ebos

Junior Member
Registered:
Posts: 15
Reply with quote  #99 
Thanks! I've tried it and it now works almost flawlessly. The cycle rate has doubled (or more) and touches are now detected very well. The latency is still in the red zone (around 10-12 ms), but this doesn't seem to impact interaction with the task. In which situations might the latency be a concern?

On another note, I have just noticed that ML now also runs in 64-bit MATLAB. I had so far been using 32-bit MATLAB, so I tried running the same version of ML in 64-bit, but I can't get past the first touch in any task, because I get the same xglflip error reported by ctestard in another thread (http://forums.monkeylogic.org/post/xglflip-error-8046819?pid=1292138793). In my case, however, this happens immediately on the first trial, touching the stimulus and not the edges of the screen, and is immediately followed by:

 

Code:

<<<*** MonkeyLogic ***>>> Task Loop Execution Error
Error using xglmex
XGL has not been initialized

Error in xglshowcursor (line 12)
xglmex (29, rhs1, rhs2);

Error in mlvideo (line 241)
xglshowcursor(devicenum, val);

Error in monkeylogic>error_escape (line 2184)
mlvideo('showcursor', ScreenInfo.Device, 1);

Error in monkeylogic (line 1143)
error_escape(ScreenInfo, DaqInfo, fidbhv);

Error in mlmenu (line 2323)
monkeylogic(condfile, datafile, testflag);




Not sure what could be causing this in 64-bit but not 32-bit?

 

ryklin

Administrator
Registered:
Posts: 237
Reply with quote  #100 
Are you sure the resolution is set correctly? It sounds to me like a configuration problem.