stremblay (Member) #26
This is wonderful news, Ryklin.

I'm really looking forward to your beta version.

Thank you.

ebos (Junior Member) #27
Thank you very much for the updates, I'm looking forward to the beta version too. Do you have an estimated time frame by which you think it will be ready?

Thanks.

ryklin (Administrator) #28
Another update for everyone and a question:

1) I'm in the testing stage right now. Only a few small rough details remain before a release.
2) The beta release will be posted next week for certain. I will post a link here for those who want to test it. It will include a sample touchscreen task.
3) There has been some confusion about where to get the latest ML distribution, with a few different sites mentioned. We will reorganize our code and update the community sometime next week, I hope. Things were slowed down by the SFN conference.

QUESTION:

Should ML display a pointer or some image when/where the touch is made? I'm thinking of making this an option, but initially there will not be any cue on the screen when a touch is made. Users will need to program their tasks to be touch responsive, i.e., use eyejoytrack() to toggle an object. Please comment below.
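
To make that concrete, here is a minimal timing-file sketch of a touch-responsive object, assuming TaskObject #1 is the touchable stimulus and that the touch position is routed through the eye-signal channel (as described later in this thread); the window size and wait time are placeholder values:

    % Sketch only: TaskObject #1 is assumed to be the touchable stimulus.
    target   = 1;       % TaskObject number of the touchable stimulus
    win_size = 3;       % acceptance window, in degrees of visual angle
    max_wait = 5000;    % how long to wait for a touch, in ms

    toggleobject(target);                                             % show the target
    ontarget = eyejoytrack('acquirefix', target, win_size, max_wait); % wait for a touch
    if ontarget
        idle(500);      % touched inside the window: hold briefly, for example
    end
    toggleobject(target);                                             % hide the target again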



stremblay (Member) #29
Thank you, Ryklin.

We are looking forward to your release.

Regarding your question, this would be a good option. In our case, we would not want the monkey to view this image when he touches the screen. It would be interesting for the experimenter to have access to this feedback, however.

Best, 

Sebastien

ebos (Junior Member) #30

Thank you very much for the update, it's great news.

As for the question, I do think it would be good to have an option so one can choose whether or not to display a pointer. For our current setup, I don't think we would want to use the pointer right now, but it might come in useful on other occasions.

Thanks again for your hard work on this. Looking forward to the release.


ryklin (Administrator) #31
Hi Everyone,

After a long debugging session today, I have a relatively clean alpha release of touchscreen support for you to try out.

Included with this distribution is a sample task, tasks\touchscreen\tracking.m, which contains further instructions.

Follow the link below for release notes and download instructions:

https://github.com/ryklin/MonkeyLogic_stable/releases/tag/v1.0.65



ryklin (Administrator) #32
I will update showcursor() to work with touchscreens as well as joysticks.
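
For context, this is roughly how showcursor() is used in a timing file today with a joystick (a sketch; the touchscreen case is what the update would add):

    showcursor('on');    % display the cursor on the subject screen
    idle(1000);          % leave it visible for 1000 ms, as an example
    showcursor('off');   % hide it again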

ebos (Junior Member) #33
Hi, thanks for the release! 

I've been testing it just now and it works fine with the mouse. However, I can't get the touchscreen to work because it sends a digital signal via the PCI card and Eye Signal in ML accepts analog input only. Is this something that will be possible in a future release, where I see you mention a separate I/O adapter for the touchscreen, or will this solution work only with analog touchscreens?

Thanks again.

ryklin (Administrator) #34

Hi Ebos,

Great, thanks for being the first person to test and post a review! I will help you as much as I can via this forum.

First, let's discuss how the touchscreen interface is designed to work, and then how your specific touchscreen may need to be configured in order to work.

1) A typical Windows touchscreen may connect to the computer via USB, parallel port, or even a PCI card.
2) The touchscreen should come with a driver that needs to be installed first. This driver will typically do two things: a) allow you to calibrate the touchscreen by touching it in a few places, and b) override the mouse when touching the screen.
Note: if you have Windows 8 with a multitouch monitor, I do not know how this will work; I have not tried it (yet). This is on my list.

So, when using your touchscreen you should be able to control the Windows mouse pointer when clicking on the screen. If you cannot do this, you need to review your touchscreen documentation first before moving on to ML.
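
One quick way to check this from the MATLAB command window, before involving ML at all (a small sketch using standard MATLAB; not part of ML itself):

    % If the driver is overriding the mouse, touching the screen should move
    % the system pointer reported by MATLAB.
    for k = 1:20
        pos = get(0, 'PointerLocation');   % current pointer [x y] in screen pixels
        fprintf('pointer at x = %d, y = %d\n', round(pos(1)), round(pos(2)));
        pause(0.5);                        % touch the screen while this loop runs
    end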

Next, once you can control the Windows mouse with your touchscreen, start ML. I mentioned in the release notes that one of the caveats in the current version is that the touchscreen does not have its own adapter in ML. By this I mean that there are Eye Signal x,y; Joystick x,y; buttons; photodiode; etc., but no touchscreen. That's okay, because the touchscreen's signal is being piped (aka routed) through the Eye Signal adapter. This does not mean that ML is sampling the analog inputs on the NIDAQ card. It is not! Instead, ML is ignoring the analog inputs and using the touchscreen signal in place of the analog signal, "piggy-backing" on the eye signal data structures (this is only temporary, by the way).

So, it does not matter if your touchscreen is analog or digital. In fact, if you have an analog touchscreen (I believe that VPixx makes a great one), you could channel it through the current eye signal or joystick signal adapter. If you have a digital touchscreen, like the kind it sounds like you have, you just need to configure your driver software to override the Windows mouse -- I think that's what you're missing. Furthermore, the goal of this update is to interface a digital touchscreen, NOT an analog one. So, your PCI card should definitely work.

Please let us know if/when you find a solution, or if you need more help!



ryklin (Administrator) #35
I also see no harm in users testing other public commits in my repository branch. You do not need to limit yourselves to pre-releases.

ebos (Junior Member) #36
Dear Ryklin,

Thanks for your thorough advice.

I have been playing around with it and I have now noticed that the touchscreen was indeed already overriding the mouse all along. The reason the experiment crashes is that Windows seems to detect any touch made on the touchscreen (Display #2) as a touch on the regular monitor (Display #1), and so ML detects it as a touch outside the stimulus screen window and crashes the program. This happens when using the "Extend these displays" option in Windows 7 (ML does not detect the second monitor with the other options anyway), and regardless of whether I position the cursor on Display 1 or 2. So it appears to be something about the way Windows 7 organizes touchscreens when using multiple monitors, rather than an issue with ML.

I will keep trying and keep you updated, but do let me know if you think I'm missing something very obvious.

Thanks again! 

ryklin (Administrator) #37
Questions:

1. Does this happen when you use your mouse instead of the touchscreen?
2. Are you sure your touchscreen is mapped to display #2? This should be configured by the driver, not Windows 7.
3. What kind of touchscreen do you have?

You should be using extended display; that's the only way. However, this does not answer question #2.

ebos (Junior Member) #38
I solved the problem! The touchscreen is a 3M Microtouch MT7, and I had previously mapped the touchscreen to Display 2 using its own driver. I reinstalled the driver, this time selecting the option to use Windows' built-in touch functionality instead, then used the Control Panel to configure Pen and Touch displays, and it now works fine. I guess for some reason the driver didn't interface well with the extended display option (maybe it's an old driver/touchscreen?). Regardless, it now works and no longer detects touches on the touchscreen as touches on the main display.

As for ML itself, the things I'm able to do now with both mouse and touchscreen are to touch the red dot (which becomes filled) and drag it to one of the two colored squares on either side (which also become filled). The one thing I noticed is that on every trial I get the warning "Attempt reposition object #2 failed. Target outside screen boundary" in the control screen, and "Warning: Desired ITI exceeded (ITI ~= 550 ms)" in the MATLAB command window. I'm not sure whether this is expected, since this is a test program.

ryklin (Administrator) #39
That's great!

Don't worry about those two warnings, I get them too.

I suggest you upgrade to my latest build, even though it's not a pre-release. You will have another sample task to look at. It works even better and is much simpler to use. Then you should start modifying it to create your own task. All future updates will have only a minor impact on your task code.

ctestard (Junior Member) #40

Hello again,

I've started using touchscreen input in ML and it works (great!) but, unfortunately, not all of the time. Although mouse input is always correct (i.e., ML marks the event of a correct/incorrect response adequately when using a mouse), touch input (using my finger) is not 100% accurate. The cursor is at the right position when I touch the screen (so it is not a screen calibration problem), but the click seems to be detected only part of the time. I've also tried changing the touchscreen configuration to allow for different types of 'clicks', but it doesn't change much. Do you have any idea why this might occur?

Thank you,
Camille 


ryklin (Administrator) #41
Too many variables at this point. I haven't a clue why. I suggest you keep testing until you narrow down the reasons. Also, I recommend you update to my latest build because it features a better sample task.

ryklin (Administrator) #42
I have published another touchscreen pre-release:

https://github.com/ryklin/MonkeyLogic_stable/releases/tag/v1.0.75

ryklin (Administrator) #43
Hi Everyone,

I was able to correct the last bug preventing me from saving and recalling touch data. So, yet another pre-release update has been posted on GitHub.

https://github.com/ryklin/MonkeyLogic_stable/releases/tag/v1.0.79



ctestard (Junior Member) #44
Hi again,

The new version works beautifully, thank you! Quick question: I noticed that I couldn't hide the mouse pointer on the touchscreen without hiding it on our main screen as well (it is quite impractical to get the pointer back once the task is over...). It is a distraction when the subject is performing the task; would it be possible to remove it?

Thanks,
Camille

ctestard (Junior Member) #45
Hi,

I tried your newest version! It is super useful that we can track where the subject has been touching the screen, thank you. However, I have encountered one small bug. The tracking is done perfectly when there is a response (either correct or incorrect); however, when there is no response, the task signals an error:

" Subscript indices must either be real positive integers or logicals.

Error in touchscreen_dataclean (line 26)
updateto = [updateto updateto(length(updateto))];

Error in screen_task_1_runtime>end_trial (line 2089)" 

Is this an error I can correct in my task code, or could it be something in the main script?

Thank you!
Camille

ebos (Junior Member) #46
Dear Ryklin,

I've tested the new release, and I too find that it works fine for both tasks, except for the same errors Camille is reporting when no response is made. I also find that I can't pause the task by pressing ESC as I used to be able to do in the previous release. If I press ESC, nothing happens until eventually it just detects a no-response and crashes, giving the errors reported by Camille.

I also had a question: all my responses are coded as errors. I assume this is because these tasks don't have correct and incorrect responses? Am I supposed to get correct responses too?


ryklin (Administrator) #47
1) The issue concerning the Windows mouse pointer is at the top of my list. It sometimes hides and restores itself properly, and other times does strange things.

2) I will look into this crash bug. It sounds like you're running out of memory due to the lack of a response.

3) Please take the time to review the sample timing file I wrote. It is very basic. You may need to edit it in order to get the exact behavior you need.
 -- No, it doesn't code errors, hits, misses, etc. I can add that in so you're not confused (a sketch of what that might look like is below).
 -- The reason pressing escape doesn't stop the program is that it's a forced-choice task -- you must complete the task for the program to stop. So if you press escape, you still need to select both targets; then the trial will end gracefully, the pointer should be restored, and you can view your data afterwards.
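
In the meantime, a sketch of what outcome coding in a timing file could look like, assuming the usual ML trialerror() convention (0 = correct, 6 = incorrect, 1 = no response; please check the codes in your own copy) and placeholder object numbers:

    target = 1; distractor = 2;    % TaskObject numbers (placeholders)
    win = 3; max_wait = 5000;      % acceptance window (deg) and wait time (ms)
    chosen = eyejoytrack('acquirefix', [target distractor], [win win], max_wait);
    if chosen == 0
        trialerror(1);    % nothing was touched within max_wait: no response
    elseif chosen == 1
        trialerror(0);    % the first listed object (the target) was touched: correct
    else
        trialerror(6);    % the distractor was touched: incorrect
    end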


ryklin (Administrator) #48
I just committed a new version that fixes the bug related to updateto = [updateto updateto(length(updateto))];

The version number is 1.0.81; it's available now.

This is a temporary fix. I need to reprogram how no-touches are handled by the data file. As a result, you will see some strange "lines" shooting out of the lower left-hand corner. These are caused by data points generated when no contact with the screen was made.
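
For anyone patching an older copy by hand: the crash comes from indexing an empty vector when no touch was recorded, so a guard of roughly this shape avoids it (an assumption about the fix, not necessarily what 1.0.81 actually does):

    % On a no-response trial updateto is empty, so updateto(length(updateto))
    % becomes updateto(0), which MATLAB rejects.
    if ~isempty(updateto)
        updateto = [updateto updateto(length(updateto))];   % original behavior
    end
    % else: leave updateto empty and let the downstream code tolerate it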

I also updated the task files with trialerror information.

ryklin (Administrator) #49
Dear All,

I have published a new beta (pre-release) on GitHub, version 1.0.83; please read the release notes on GitHub.

https://github.com/ryklin/MonkeyLogic_stable/releases/tag/v1.0.83

Please take note that I will be committing additional code to this repository. DO NOT update to any version later than 1.0.83 until further notice; it will most certainly break your system.

Thank you.


 


stremblay (Member) #50
Thank you Ryklin. Great work!

The release notes say "limited functionality". Is this relative to the previous beta or relative to the final upcoming release?

Thanks!