Multiple Monitors on Windows 7

Jan 4, 2010 at 7:36 PM
Edited Jan 4, 2010 at 7:38 PM

Is there a way to work with more than one monitor?

In my tests I've tried using two monitors: a touch on the first monitor affects the left half of the primary monitor, and a touch on the second monitor affects the right half of the primary monitor.


I've noticed in HidContactInfo.cs that:

     const ushort MaxSize = 32767;
     static readonly double XRatio = SystemParameters.VirtualScreenWidth / MaxSize;
     static readonly double YRatio = SystemParameters.VirtualScreenHeight / MaxSize;

So SystemParameters.VirtualScreenWidth = 2048 when I work with two monitors, each configured at 1024x768.
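
For example (rough arithmetic, assuming the driver reports contact coordinates in the 0-32767 range):

     // XRatio = 2048 / 32767 ≈ 0.0625
     ushort hidX = 16384;                        // roughly MaxSize / 2
     double screenX = hidX * (2048.0 / 32767);   // ≈ 1024, the boundary between my two monitors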

Shouldn't MaxSize be 65535 instead of 32767?

Is there a way to force a touch onto the secondary monitor?


I need some help.





Feb 2, 2010 at 6:49 AM

I don't think getting touch on both monitors is possible; at least this code base appears to be a long way from handling it. I had this odd scaling problem as well, and I made a fix that got touch working on my primary monitor while the secondary was enabled. You are close to the point where I had to make the fix. I changed that code to:

<font size="2">



     const ushort MaxSize = 32767;

     // Scale factor for the 125% "large fonts" setting; set to 1.0 for the default 96 DPI.
     static readonly double fontSize = 1.25;

     static readonly double XRatio = (SystemParameters.PrimaryScreenWidth * fontSize) / MaxSize;
     static readonly double YRatio = (SystemParameters.PrimaryScreenHeight * fontSize) / MaxSize;

This fixed the scale problem: the virtual screen width spans both monitors, but the Win7 driver seems to be limited to mapping touch coordinates onto a single monitor, so I figured the code should only scale to the width of the primary monitor. The other problem I hit is that I run with 'large fonts', or more accurately the default zoom level set to 125%. I have not figured out how to obtain this setting programmatically, so I just hardcoded it for now. Set it back to 1.0 if you run with small fonts. I'll try to figure out how to read it programmatically.
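
If anyone wants to avoid the hardcoded value, something like this might work to read the system DPI scale (an untested sketch; it uses System.Drawing, so a WPF project would need a reference to that assembly):

     using (var g = System.Drawing.Graphics.FromHwnd(IntPtr.Zero))
     {
         // 96 DPI -> 1.0 ("small fonts"), 120 DPI -> 1.25 (the 125% zoom level)
         double dpiScale = g.DpiX / 96.0;
     }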

Note that the MultiMouse input provider has some improper constraining for multi-monitor setups as well. In RawDevicesManager.cs, UpdateMouse constrains the limits incorrectly. The code as-is would constrain the touch point to the primary monitor if it is located on the right; it would allow touch points (red dots) onto the secondary monitor only if the secondary is on the right (primary on left). By checking against virtualScreen.Left/Top rather than zero, you can get touch points generated across the whole virtual screen (all monitors). But even after doing this, the Win7 side of things constrains the touch points it generates to the primary monitor. The proper thing is probably to constrain the points in MultiMouse to the primary screen, so it needs a fix for the case where the primary monitor is on the left.
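
The shape of that change was roughly this (illustrative only; the variable names are not the actual RawDevicesManager members):

     // Clamp against the virtual screen origin instead of zero, so points
     // can land on a monitor positioned left of (or above) the primary.
     double left   = SystemParameters.VirtualScreenLeft;
     double top    = SystemParameters.VirtualScreenTop;
     double right  = left + SystemParameters.VirtualScreenWidth;
     double bottom = top  + SystemParameters.VirtualScreenHeight;

     x = Math.Max(left, Math.Min(x, right));
     y = Math.Max(top,  Math.Min(y, bottom));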

Note that I'm using the 29484 build. On later builds I was having a problem where the offsets within child windows were off, but it could be that I was in a bad state while searching for a fix to the scaling issue. I'll update to the latest build, apply my patch, and report if I find any problems.


Feb 4, 2010 at 8:14 PM

dongi, I have tried your solution with the HID driver and it works. Thank you :)
But there is still a problem with the WPF application if DPI is set to 125%. Unfortunately I don't have much time to look for a solution. Do you have any ideas?