OK, so I recently purchased the Acer T232HL touch screen to connect to my MacBook Pro as a secondary monitor. To give you an idea, here is my setup.
OS X doesn't support touch input from this monitor at all, so as you can see in the screenshot I'm actually running Windows 8 through VMware, which proxies the USB connection to Windows, where touch events are supported. But obviously this is far from ideal.
There is at least one third-party driver for OS X that looked somewhat promising, but it doesn't seem to support multitouch from this device, it's expensive, and in my limited experimentation it was generally a pain to work with. There is also mt4j, but as best I can tell after running its examples, it doesn't support this device at all.
So here is my question: what would I be getting myself into if I wanted to write a driver for this thing? I'm primarily a web developer with years of experience in Ruby, Objective-C (and a little C), JavaScript, etc. I have never ventured into any kind of hardware programming, so from the surface this seems like an interesting, if intimidating, challenge.
I know that at some level I need to read data from USB. I know this will probably mean trying to reverse engineer whatever protocol they use for touch events (which may be entirely custom?). But I have no clue where to start: would this be a kernel extension? Written in C, I assume? I would love a high-level overview of the moving parts involved here.
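From the reading I've done so far, it sounds like a first step might just be dumping the raw USB reports from user space with IOKit's HID Manager, something like the sketch below. I'm assuming the monitor enumerates as a USB HID device, and I haven't verified any of this against the actual hardware:

```c
// Sketch: dump raw HID reports from the touchscreen in user space with
// IOKit's HID Manager (no kext needed just to observe the data).
#include <CoreFoundation/CoreFoundation.h>
#include <IOKit/hid/IOHIDManager.h>
#include <stdint.h>
#include <stdio.h>

static uint8_t report_buf[64];

// Fires on every incoming report; printing the raw bytes is the starting
// point for reverse engineering the touch protocol by hand.
static void report_callback(void *context, IOReturn result, void *sender,
                            IOHIDReportType type, uint32_t reportID,
                            uint8_t *report, CFIndex length) {
    printf("report %u:", reportID);
    for (CFIndex i = 0; i < length; i++)
        printf(" %02x", report[i]);
    printf("\n");
}

static void device_attached(void *context, IOReturn result, void *sender,
                            IOHIDDeviceRef device) {
    IOHIDDeviceRegisterInputReportCallback(device, report_buf,
                                           sizeof(report_buf),
                                           report_callback, NULL);
}

int main(void) {
    IOHIDManagerRef mgr = IOHIDManagerCreate(kCFAllocatorDefault,
                                             kIOHIDOptionsTypeNone);
    // NULL matches every HID device; once the Acer's vendor/product IDs
    // are known (e.g. from system_profiler SPUSBDataType), a matching
    // dictionary would narrow this to just the touchscreen.
    IOHIDManagerSetDeviceMatching(mgr, NULL);
    IOHIDManagerRegisterDeviceMatchingCallback(mgr, device_attached, NULL);
    IOHIDManagerScheduleWithRunLoop(mgr, CFRunLoopGetCurrent(),
                                    kCFRunLoopDefaultMode);
    IOHIDManagerOpen(mgr, kIOHIDOptionsTypeNone);
    CFRunLoopRun(); // print reports until interrupted
    return 0;
}
```

My hope is that building this with `clang dump_reports.c -framework IOKit -framework CoreFoundation` and watching the bytes change as I touch the screen would at least tell me whether the device speaks a standard HID digitizer protocol or something custom.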
Ultimately, I want to use the touch screen to control a specialized web interface (running in Chrome), so ideally I could proxy touch events straight to Chrome without the OS actually moving the mouse cursor to the touch position (so the UI would behave like an iPad). But regardless of whether that is technically possible, I would love to start with just getting something basic working.
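To make that goal concrete, here is a rough sketch of the forwarding step I have in mind once coordinates have been parsed out of the USB data: push each touch point to a local bridge (say, a small WebSocket relay) that the Chrome page subscribes to, so nothing ever goes through the OS cursor. The port, the JSON shape, and the relay itself are all hypothetical:

```c
// Sketch: forward one parsed touch point as a JSON datagram to a local
// relay on port 9999 (port, message shape, and the relay itself are
// placeholders I made up).
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <sys/socket.h>
#include <unistd.h>

// Send one touch point (id, x, y in screen pixels) to the relay.
static void send_touch(int fd, const struct sockaddr_in *relay,
                       int id, int x, int y) {
    char msg[128];
    int n = snprintf(msg, sizeof(msg),
                     "{\"touch\":%d,\"x\":%d,\"y\":%d}", id, x, y);
    sendto(fd, msg, n, 0, (const struct sockaddr *)relay, sizeof(*relay));
}

int main(void) {
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    struct sockaddr_in relay = {0};
    relay.sin_family = AF_INET;
    relay.sin_port = htons(9999); // placeholder port
    inet_pton(AF_INET, "127.0.0.1", &relay.sin_addr);
    send_touch(fd, &relay, 0, 512, 384); // fake single touch for testing
    close(fd);
    return 0;
}
```

In a real version, `send_touch` would be called from the HID report callback above instead of from `main`, but I have no idea yet whether this is a sane architecture, which is partly why I'm asking.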