ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #1 
In order to add a touch screen to ML, we may need to add a new class of adapters. Currently, an ML adapter can be a NIDAQ board, a parallel port, or a sound card. We would add a new adapter called PS2/USB, or something along those lines, then capture data from this new adapter, transform it appropriately, and finally map it to 'Touch X' and 'Touch Y'. The rest of the subsystems should (I'm guessing) integrate this new signal seamlessly. So, for example, behavioral data would save touch screen signals coming from the USB adapter the same way it saves eye signals coming from the NIDAQ adapter.

From what I can tell, adapters are detected automatically by the MATLAB DAQ Toolbox when you call hwinfo = daqhwinfo; so I'm not sure whether we can append our own adapter to the resulting hwinfo structure, or whether there's a better strategy.
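For reference, this is roughly how the (legacy, pre-session) DAQ Toolbox reports its adapters; the adapter names in the comment are only examples and will differ per machine:

    hwinfo = daqhwinfo;                % struct describing the DAQ Toolbox installation
    disp(hwinfo.InstalledAdaptors)     % e.g. {'nidaq'  'parallel'  'winsound'}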
 
Also, I'm wondering: what are the technical differences between mapping EyeX vs. JoyX to an analog input? Fundamentally, we know the difference between using a joystick versus an eye tracker, but are these signals implemented any differently in ML, or are these just separate (but redundant) labels used for the sake of user clarity? If that's the case, and we continue along this route, we will have Eye, Joystick, Touch, and Mouse signals that all need mapping. Why don't we just collapse them into a single pair of labels, 'X Input' and 'Y Input'? This also brings up the possibility that a user may want to record gaze from two eyes, or record gaze and joystick responses concomitantly. In that case, we wouldn't want to collapse the labels but rather add even more, such as Left Eye Signal X and Right Eye Signal X... which quickly gets very complicated and will be confusing for the majority of users.
Jaewon

Administrator
Registered:
Posts: 382
Reply with quote  #2 
I think you are talking about separating hardware abstraction from data processing, and I like the idea. In fact, the software I used before ML was designed that way, and it provided a great degree of freedom in the types of data acquisition devices you could add. As long as a hardware adapter provided the interface the software required, you could plug that hardware into the existing behavior detection code, so you could reuse the same task code when switching from eye signals to a joystick.

I am not sure about implementing a mouse adapter as you described, though. Mouse/touchscreen inputs are translated into screen coordinates by their driver software, and it is up to the hardware makers how to implement the details, so we cannot assume that a PS2/USB device provides voltage signals the way an eye tracker or joystick does. If we make an adapter for the mouse/touchscreen, I think it may need to be built on APIs provided by the OS. For Windows, something like DirectInput.
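(As an aside, and not something discussed elsewhere in this thread: even plain MATLAB can read the OS-maintained cursor position through the root object, which illustrates the point that these coordinates come from software, not from a sampled voltage.)

    pos = get(0, 'PointerLocation');   % [x y] in pixels, from the lower-left corner of the primary screen
    fprintf('cursor at (%g, %g)\n', pos(1), pos(2));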
ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #3 
PS2/USB is just a suggestion for a label. What would you call it? Yes, DirectInput is the only way to capture it.


Jaewon

Administrator
Registered:
Posts: 382
Reply with quote  #4 
My old software used pretty general names like XYAcquirer for 2-channel devices and XAcquirer for 1-channel devices, but naming is not important. I just thought you were suggesting implementing the mouse/touchscreen adapter as a voltage sampler.

XGL has a function that retrieves the mouse position (xglgetcursor), which is a wrapper around a Windows API, GetCursorPos(). I don't know how quickly GetCursorPos() updates the current position compared to DirectInput, but I have seen many OpenGL projects use this function. We can build an adapter with it, although it is not perfect.
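A minimal polling sketch of the idea (not the attached task; it assumes XGL has already been initialized by ML and that xglgetcursor returns the cursor position in screen pixels as a 1x2 vector):

    for i = 1:1000
        pos = xglgetcursor;                         % wraps the Win32 GetCursorPos()
        fprintf('cursor: x = %d, y = %d\n', pos(1), pos(2));
        pause(0.001);                               % crude ~1 ms polling interval
    end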

Attached is an ML task that I made. During the task, a red dot and a blue dot follow the mouse cursor on the control screen and the subject screen, respectively. Calling xglgetcursor enables the mouse functions, so clicking outside the subject screen will abort MonkeyLogic. MouseTracker.m (included in the zip file) must be on the MATLAB path to run the task.

 
Attached Files
mouse_tracking.zip (1.88 KB)

ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #5 
This looks useful. I already built a basic mouse and touchscreen adapter, so maybe I can grab data from the mouse using your code.

You say it's not perfect. Since I haven't tested it yet, can you tell me what limitations it has, and what I need to be cautious about?
Jaewon

Administrator
Registered:
Posts: 382
Reply with quote  #6 
So far, xglgetcursor is the only interface that XGL provides related to mouse actions, so we need to build something else to detect clicks/touches. I hope your adapter has a solution for this. Plus, as I mentioned, calling xglgetcursor activates the mouse functions that are disabled by ML, so clicking outside the subject screen stops the running task.
ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #7 
Well, that's why I asked in another post if / how we can modify XGL. 
Jaewon

Administrator
Registered:
Posts: 382
Reply with quote  #8 
The XGL package included in ML has all of the library's source code, so we can modify it and recompile. I am not sure whether distributing the source code is permitted by the XGL developers, though, since the package on their website does not include the source.
ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #9 
That's true, the C source code is not part of the package they distribute on their website -
http://svi.cps.utexas.edu/xgltoolbox-1.1.0.zip

However, it is part of the ML package.

I've already started tinkering with it.
ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #10 
I was able to get your mouse_tracking.zip to work; however, for some odd reason only the horizontal motion of the mouse is being tracked. The vertical position remains stationary. I'm debugging it now.
ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #11 
I figured out what was wrong with the vertical mouse position: some dimensions were hard-coded and off scale. I'll need to auto-detect them for the stimulus display. I know you tried to do this, but something went wrong. I may post an update to MouseTracker.zip here later.
Jaewon

Administrator
Registered:
Posts: 382
Reply with quote  #12 

The manually coded numbers are just for testing in my setup. Did you run it in MonkeyLogic? Some dimensions depend on MonkeyLogic's figure sizes as well as your screen sizes, and they will be set correctly when MouseTracker is initialized in a timing file with ScreenInfo. See tracking.m for an example.
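In other words, the intended use inside a timing file looks roughly like this (only the constructor call with ScreenInfo is taken from this post; anything beyond it would be hypothetical):

    mouse = MouseTracker(ScreenInfo);   % ScreenInfo supplies the figure and screen geometry
    % ...then poll/update the object each frame; see tracking.m in the zip for the actual usage.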

ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #13 
Yes, I did run it inside of MonkeyLogic as a timing file (and also outside, for debugging, by creating a dummy ScreenInfo structure). I will be debugging that today before moving forward.
ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #14 
What's important to note here is that ScreenInfo contains the variables for converting from pixels to degrees of visual angle (DVA). When I first observed the lack of vertical motion, I couldn't tell whether the XGL API was reporting a null value for MOUSE_Y or whether the problem was further along the pipeline, such as when the value is passed to MATLAB, or even later when it is converted to DVA. It turned out to be the latter. In the process I made progress reviewing and compiling XGL from C, which will prove very useful down the road when it comes time to enhance XGL with mouse up/down notifications. Until then, I still need to find where in the DVA transformation the error occurs on my system.

Then I will integrate the MouseTracker object into my PS2/USB adapter, with its underlying call to GetCursorPos(), thereby making this a kernel behavior rather than a timing file that displays and re-positions an object. It will be visualized exactly as the eye position disk appears now, only controlled by the mouse instead of an analog input. Then we tack on gestures, perhaps with some API controls, and we have ourselves a touch screen adapter.

We can also add DirectX mouse notifications to XGL, in place of (or in addition to) Win32, and then compare the results. I think we may need to do this for the final product, because currently the mouse position reported is actually relative to the ML control window's client area, not the subject's visual display. This might be a minor problem for now, but it needs to be addressed; perhaps it again has something to do with ScreenInfo. It's actually strange, because XGL creates the video screen window, so presumably it should report the mouse coordinates relative to that client area. This is a detail that might be too esoteric to discuss in this thread.
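For concreteness, the pixel-to-DVA step being debugged is essentially a shift to the screen center followed by a scale. The ScreenInfo field names below (Xsize, Ysize, PixelsPerDegree) are assumptions for illustration, not verified names:

    x_dva =  (x_pix - ScreenInfo.Xsize/2) / ScreenInfo.PixelsPerDegree;
    y_dva = -(y_pix - ScreenInfo.Ysize/2) / ScreenInfo.PixelsPerDegree;   % sign flipped so that 'up' is positive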
ebos

Junior Member
Registered:
Posts: 15
Reply with quote  #15 
I am a brand new monkeylogic user, and I came across this thread while I was looking up information about how to interface ML with a touchscreen, as we have so far been collecting all our behavioral data this way.

Apologies if my question isn't directly related to your current discussion, as I'm no programmer or developer, but am I correct in inferring from this thread that, as of now, there isn't any way to use touchscreens with ML? I thought I would be able to just map the X-Y coordinates from the touchscreen onto the X-Y eye-tracking/joystick inputs, but since the signal from the touchscreen is digital/serial, this isn't possible. Are there any workarounds to this problem at all?

Thanks and sorry again if I'm taking this OT.
ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #16 
That's essentially what we are talking about doing.
Wael.Asaad

Administrator
Registered:
Posts: 53
Reply with quote  #17 
I agree that adding a software abstraction layer between the hardware and ML task execution would be a good way to work toward this goal. We had previously contemplated adding a function analogous to "mlvideo" (e.g., an "mldaq") that would convert local DAQ-related commands to hardware-specific commands, rather than having tasks go straight to the DAQ hardware (via the MATLAB DAQ toolbox). That way, modifying this single function could easily re-map DAQ-related commands to new or updated hardware.
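A skeleton of what such a wrapper could look like, modeled on how mlvideo dispatches on a command string; "mldaq" and the command names here are purely hypothetical:

    function out = mldaq(command, varargin)
    % Hypothetical hardware-abstraction wrapper: every DAQ call in ML would go
    % through here, so re-mapping to new hardware means editing only this file.
    out = [];
    switch lower(command)
        case 'init'
            % open the device (DAQ toolbox, a DirectInput wrapper, etc.)
        case 'getsample'
            % return the latest X/Y sample from whatever device is mapped
        case 'release'
            % close the device
        otherwise
            error('mldaq:unknownCommand', 'Unknown command "%s".', command);
    end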
ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #18 
I will be releasing a beta this week.
ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #19 
Found the problem! If your computer's second (or third...) monitor has negative coordinates (the monitor is to the 'left' of, or 'below', the main monitor), then xglrect(2) - where '2' is the monitor index - will return erroneous values. That's because the GetScreenRect function (xgl.cpp, line 67) has its pointers declared as unsigned: void Session::GetScreenRect(unsigned d, unsigned *x, unsigned *y, unsigned *w, unsigned *h). Therefore, negative values cannot be properly passed back to ML. This starts an entire chain of errors that prevents proper tracking of the mouse or touch screen under these conditions.
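To illustrate the symptom (the numbers are hypothetical, for a monitor whose left edge sits at x = -1920): the negative coordinate comes back through the unsigned pointers as a wrapped 32-bit value. Until GetScreenRect is changed to use signed types, a MATLAB-side check like this can undo the wraparound:

    rect = xglrect(2);            % e.g. rect(1) comes back as 4294965376 instead of -1920
    x = double(rect(1));
    if x >= 2^31                  % crude check for 32-bit unsigned wraparound
        x = x - 2^32;             % recover the intended negative coordinate (-1920)
    end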

I will take care of this along with my other additions to XGL.
ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #20 
FYI, mlvideo.m already handles a message called 'getmouse', which returns the xglgetcursor position, the same call that MouseTracker.m makes. However, it doesn't scale the result to the ScreenInfo parameters. I need to think about what to do next. I'll probably incorporate MouseTracker into mlvideo and then have the new PS2/USB adapter grab scaled cursor positions from mlvideo('getmouse'), or something along those lines.

Also, somewhat related to this: mlvideo handles a message called 'showcursor' as well, which is not related to the MLHELPER program. It is indeed implemented by XGL and can show/hide the mouse cursor. It's called by the video test procedure and other subroutines.
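In code, the two messages would be called roughly like this ('devicenum' stands for whatever device index mlvideo expects; the argument lists are assumptions, so check mlvideo.m for the real calling convention):

    xy_pix = mlvideo('getmouse');             % raw cursor position in pixels, not yet scaled by ScreenInfo
    mlvideo('showcursor', devicenum, 0);      % show/hide the mouse cursor on the subject display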
ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #21 
I will probably need the source to mlhelper.exe.
ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #22 
I'm almost done with the touch screen update (beta). The main thing left is to figure out how to mex XGL.
stremblay

Member
Registered:
Posts: 65
Reply with quote  #23 

Hi ryklin, 

I am a postdoctoral fellow at McGill University.

I would like to couple a touch screen with MonkeyLogic. I saw your conversation about this issue on the forum.

In your last post, you said you would be releasing a beta solution soon. Have you released this beta yet, or are you still working on it?

Thank you for your help!

Best,

-Sébastien

ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #24 
We are working on it; please feel free to check back here for a status update.
ryklin

Administrator
Registered:
Posts: 250
Reply with quote  #25 
Status update on touch screen integration...

As of now, I'm able to inject a signal from the mouse/touch screen into ML so that it controls the gaze point and actuates the eyejoytrack function, as well as detect holds and releases. I've also constructed an adapter that appends a USB device to the list of XY input instruments. This will allow users to map a touch screen to the XY input the same way they are accustomed to doing with a joystick or eye tracker.

Here's what remains:
1) Finish the adapter so that the touch screen is completely independent of the eye tracker
2) Scaling issue (converting from pixels to DVA)
3) Clean up any temporary code
4) Test

Barring any disruptions, we should have a beta available soon.