On 17 August 2010 13:21, Marco Martin wrote:
On Monday 16 August 2010, Stefan Majewsky wrote:
> The reason why I'm writing this is because I started work on libkgame, a
> collection of libraries which shall, at some point, supersede libkdegames
> which is currently used by most games in the kdegames module. In the
> beginnings of the design process, I've identified as a main weakness of our
> applications the fact that they are designed for mouse and keyboard
> interaction and for desktop form factors. They do not scale to the mobile
> form-factors which are becoming increasingly important in casual gaming.

> This effort should also cover input methods, in order
> to make it dead-easy to integrate multitouch support in existing
> applications.

For input: basically, if you are limited to single touch, mouse events are usually good enough. If you want to react to multi-touch, you should -also- implement touch events, whose semantics are almost identical to mouse events, except that a single event carries an arbitrary number of touch points. There are also higher-level gestures, but those are probably -too- high-level to be used in games...
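(For reference, a minimal sketch of what that looks like with the QTouchEvent API introduced in Qt 4.6; the GameView class and handlePoint() are purely illustrative, not part of any existing library:)

#include <QWidget>
#include <QTouchEvent>

// Illustrative widget; assumes Qt 4.6+, where QTouchEvent is available.
class GameView : public QWidget
{
public:
    GameView(QWidget *parent = 0) : QWidget(parent)
    {
        // Without this attribute the widget only receives mouse events.
        setAttribute(Qt::WA_AcceptTouchEvents);
    }

protected:
    bool event(QEvent *event)
    {
        switch (event->type()) {
        case QEvent::TouchBegin:
        case QEvent::TouchUpdate:
        case QEvent::TouchEnd: {
            QTouchEvent *touch = static_cast<QTouchEvent*>(event);
            // Unlike a mouse event, one touch event carries an arbitrary
            // number of points, each with its own id and state.
            foreach (const QTouchEvent::TouchPoint &p, touch->touchPoints())
                handlePoint(p.id(), p.pos(), p.state());
            return true;
        }
        default:
            return QWidget::event(event);
        }
    }

private:
    // Hypothetical game-side handler, just to show where the points end up.
    void handlePoint(int id, const QPointF &pos, Qt::TouchPointState state)
    {
        Q_UNUSED(id); Q_UNUSED(pos); Q_UNUSED(state); // game logic goes here
    }
};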


The problem with mouse events is that they were created for point-and-click applications in desktop environments, and they don't always translate well to mobile devices. Events like hover or key presses are quite difficult to reproduce on touch screens. Games aggravate this problem, since they introduce a much wider variety of interactions.

Also keep in mind that other non-standard input devices will likely become more popular, such as accelerometers, pressure and proximity sensors, or position tracking (now found in the Wii, Microsoft Natal and the PS3 Move). These will be used frequently in casual games as they are added to mobile devices, and open games would benefit greatly from a library that provides a standardized way to access them.
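(As a point of comparison, and assuming the Qt Mobility 1.x sensors module, reading an accelerometer already looks roughly like the sketch below; this is not a proposal for the libkgame API, just an illustration of how little standardized plumbing exists for games today:)

#include <QAccelerometer>

QTM_USE_NAMESPACE   // Qt Mobility puts its classes in the QtMobility namespace

// Sketch only: a game could call this once per frame and map the values
// to steering, scrolling, etc.
void sampleTilt(qreal &x, qreal &y, qreal &z)
{
    static QAccelerometer sensor;
    if (!sensor.isActive())
        sensor.start();

    QAccelerometerReading *reading = sensor.reading();
    if (reading) {
        x = reading->x();   // acceleration in m/s^2 along the device axes
        y = reading->y();
        z = reading->z();
    }
}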

For low-level interaction that supports all of those devices, I'd recommend basing the APIs on the concept of crossing-based interfaces [1], which have been poorly supported in traditional widget toolkits, but which I think would make a good basis for touch- and position-based interfaces. A crossing-based interface keeps track not only of the object under the pointer, but also of which lines are crossed by the input gesture.
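(As a rough illustration of the idea, not an existing API: detecting that a drag crossed an edge boils down to a segment-intersection test between the pointer's movement and the tracked line, here with QLineF from Qt 4. All names are illustrative:)

#include <QLineF>
#include <QPointF>

// Sketch only: report whether the pointer's movement from 'previous' to
// 'current' crossed the tracked 'edge', and where.
bool crossedEdge(const QPointF &previous, const QPointF &current,
                 const QLineF &edge, QPointF *where)
{
    QLineF movement(previous, current);
    // BoundedIntersection means the two finite segments really intersect,
    // i.e. the gesture crossed the edge during this step.
    return movement.intersect(edge, where) == QLineF::BoundedIntersection;
}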

Imagine what a game could do if, instead of a plain mouse-out event, it received an event saying "the pointer has exited the button heading 38° north-east at a speed of 20 in/s". In my opinion, this mid-level API would make it easier to program games tailored to several different input/output methods.
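(A crude sketch of what such an event could carry, computed from two pointer samples and the time between them; the CrossingEvent struct and makeCrossingEvent() are hypothetical:)

#include <QLineF>
#include <QPointF>

// Hypothetical payload of a crossing event: where the edge was crossed,
// in which direction, and how fast the pointer was moving.
struct CrossingEvent
{
    QPointF where;   // intersection point on the crossed edge
    qreal angle;     // direction of travel in degrees, counter-clockwise from +x
    qreal speed;     // pointer speed in pixels per second
};

CrossingEvent makeCrossingEvent(const QPointF &previous, const QPointF &current,
                                const QPointF &where, qreal elapsedSeconds)
{
    QLineF movement(previous, current);
    CrossingEvent e;
    e.where = where;
    e.angle = movement.angle();   // QLineF gives the direction directly
    e.speed = elapsedSeconds > 0 ? movement.length() / elapsedSeconds : 0;
    return e;
}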


* [1] http://en.wikipedia.org/wiki/Crossing-based_interface