
List:       kde-devel
Subject:    Re: GSoC enthusiast hoping to integrate multi-touch user-interface
From:       Jordi Polo <mumismo () gmail ! com>
Date:       2009-03-24 1:54:46
Message-ID: a4162420903231854x42728f72n7b99a91797395a7b () mail ! gmail ! com


You may be looking for this:

http://code.google.com/p/sparsh-ui/


On Sun, Mar 22, 2009 at 5:52 AM, Ashish Kumar Rai
<mr.ashish.rai@gmail.com> wrote:

> Hi,
>
> Thanks for your enthusiastic replies. I have been working on the basic
> building blocks of the framework so that I could come up with a concrete
> idea. Here it goes:
>
> The project envisages the development of a package that could be used in
> conjunction with Tbeta/Touchlib to develop and debug complex multi-touch
> applications quickly and efficiently. Tbeta/Touchlib (from NUIGroup) will
> encapsulate the hardware of the multi-touch setup and will send touch-event
> data in the TUIO protocol through TCP/UDP packets. The functionality
> provided by the package will start from this step.
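[Editor's note: as a sketch of what this first step amounts to — TUIO
messages are OSC messages sent over UDP, so receiving them boils down to
decoding OSC's binary layout. The parser below is illustrative only
(stdlib Python, names invented here); it is not part of Tbeta/Touchlib
or OSCPack.]

```python
import struct

def _read_osc_string(data, offset):
    # OSC strings are NUL-terminated and padded to a 4-byte boundary.
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    offset = (end + 4) & ~3  # skip NUL and padding
    return s, offset

def parse_osc_message(data):
    """Decode one binary OSC message into (address, args).

    A TUIO cursor update arrives as e.g. address "/tuio/2Dcur" with a
    "set" string followed by a session id and float coordinates.
    """
    address, offset = _read_osc_string(data, 0)
    typetags, offset = _read_osc_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "i":          # 32-bit big-endian int
            args.append(struct.unpack(">i", data[offset:offset + 4])[0])
            offset += 4
        elif tag == "f":        # 32-bit big-endian float
            args.append(struct.unpack(">f", data[offset:offset + 4])[0])
            offset += 4
        elif tag == "s":
            s, offset = _read_osc_string(data, offset)
            args.append(s)
    return address, args
```

In a real receiver these bytes would come out of a UDP socket bound to the
tracker's port; the decoding itself is independent of the transport.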
>
> Multi-touch support needs to be in the windowing system. We really need a
> way to convey events from a touch device to the correct client application
> when a number of multi-touch applications are running at the same time on
> the MT surface. A server based on MPX will also be designed to serve this
> purpose. The MPX server can be configured to deliver the multiMouseEvent
> that Marco suggests. (Support for relaying events from the MPX server also
> needs to be built into the Qt layer.)
>
> This will help in building the required framework wherein *many*
> multi-touch applications can run on the desktop/surface alongside
> conventional single-touch applications, all at the same time. It will thus
> support not only multiple cursors but also multiple cursors within a
> single application, with the MPX server responsible for relaying the
> touch-events to the proper application. Once the computer restarts, the
> surface will be fully MT compatible.
>
> Gestures depend a lot on the context. A gesture in one context can mean
> something different in another. And the only thing that knows the context
> is the application. This is very similar to a button press: pressing a
> mouse button can mean a zillion different things, depending on where and
> when it happens. That's why all an X server does is relay the button press
> to a client application, which then does the right thing.
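[Editor's note: the "only the application knows the context" point can be
made concrete with a tiny dispatch table. The class and names below are
hypothetical, not an existing KDE/Qt API — the same gesture name resolves
to different actions depending on which context receives it.]

```python
class GestureDispatcher:
    """Map (context, gesture) pairs to handlers, mirroring how an X/MPX
    server stays dumb and leaves interpretation to the client."""

    def __init__(self):
        self._handlers = {}  # (context, gesture) -> callable

    def register(self, context, gesture, handler):
        self._handlers[(context, gesture)] = handler

    def dispatch(self, context, gesture):
        handler = self._handlers.get((context, gesture))
        if handler is None:
            return None  # gesture has no meaning in this context; ignore
        return handler()

# The same "pinch" means different things in different applications:
d = GestureDispatcher()
d.register("photo_viewer", "pinch", lambda: "zoom")
d.register("document", "pinch", lambda: "change-font-size")
```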
>
> In the implementation, the touch-events will likewise be received by a
> particular application, relayed by the MPX server, and then mapped onto
> the scene (which is divided into regions of various sizes, with only the
> affected region being updated when a touch-event occurs inside it). The
> touch-event data for each region will be grouped and sent to the gesture
> recognition module in a particular format, which will then, as per the
> application's requirements, pass commands to the scene-manager to update
> the required region(s). A dedicated graphics engine could be used for the
> scene-manager.
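[Editor's note: the per-region grouping step above can be sketched as a
simple hit test. This is an illustration under the assumption of
non-overlapping axis-aligned regions; the real scene-manager would be more
elaborate.]

```python
from collections import defaultdict

def group_by_region(regions, touches):
    """Assign touch points to scene regions and group them per region,
    so only the affected regions need to be updated.

    regions: {name: (x0, y0, x1, y1)} axis-aligned rectangles
    touches: [(x, y), ...] touch-event positions
    """
    grouped = defaultdict(list)
    for x, y in touches:
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x < x1 and y0 <= y < y1:
                grouped[name].append((x, y))
                break  # regions assumed non-overlapping
        # touches outside every region are simply dropped
    return dict(grouped)
```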
>
> The gesture recognition system and the widgets will be developed so that
> automated debugging of the code is possible, in conjunction with the
> QMTSim simulator.
>
> Based on the widgets the developer uses to build the application and the
> gestures defined, test code can be generated automatically to exercise the
> application with those gestures and some variants.
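[Editor's note: one minimal reading of "those gestures and some variants"
is to derive deterministic perturbations of a recorded gesture path for
automated replay through the simulator. The function below is a sketch of
that idea only; it is not part of QMTSim.]

```python
def gesture_variants(path, offsets=((5, 0), (0, 5)), scales=(0.9, 1.1)):
    """From one recorded gesture path, produce translated and scaled
    variants suitable for automated replay in tests."""
    variants = []
    for dx, dy in offsets:
        variants.append([(x + dx, y + dy) for x, y in path])
    for s in scales:
        variants.append([(x * s, y * s) for x, y in path])
    return variants
```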
>
> The whole work can be summarized in the following five sequential stages:
>
> 1. Design and testing of an ANN-based gesture recognition system and its
> integration with OSCPack and QMTSim.
> 2. Integration of the gesture recognition with the scene manager.
> 3. Development of some primitive MT widgets which can be easily used in
> the scenes.
> 4. Development of demo applications and testing using the automated
> debugging facilities provided with the widgets and the gesture recognition
> package in conjunction with QMTSim.
> 5. Integration with the MPX server.
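[Editor's note: this is not the proposed ANN itself — just a stand-in
showing the interface stage 1 would expose: a touch path goes in, a
gesture label comes out. A trivial rule-based classifier takes the place
of the trained network here.]

```python
def classify_swipe(path):
    """Placeholder gesture recognizer: label a path by its net direction.

    Uses screen coordinates (y grows downward), so positive dy means
    "down". A real stage-1 implementation would feed path features to a
    trained neural network instead of these hard-coded rules.
    """
    dx = path[-1][0] - path[0][0]
    dy = path[-1][1] - path[0][1]
    if abs(dx) >= abs(dy):
        return "swipe-right" if dx > 0 else "swipe-left"
    return "swipe-down" if dy > 0 else "swipe-up"
```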
>
> I am hopeful that with this framework, fast and complex MT applications
> can be developed very easily, with the developer focusing on core
> functionality rather than basic MT plumbing - just as GUI design is done
> these days with Qt.
>
>
> On Fri, Mar 20, 2009 at 4:14 PM, Marco Martin <notmart@gmail.com> wrote:
>
>> On Thursday 19 March 2009, Stefan Majewsky wrote:
>> > On Thursday 19 March 2009 17:54:21 Ashish Kumar Rai wrote:
>> > > I am a GSoC enthusiast hoping to integrate a multi-touch
>> > > user interface with the help of MPX (Multi-Pointer X,
>> > > http://en.wikipedia.org/wiki/MPX) in an MT-enabled desktop/surface.
>> > >
>> > > [...]
>> > >
>> > > I am looking forward to a positive reply from your side, and I feel
>> > > that you will help me understand my prospects for this project under
>> > > your guidance. I am hoping to make a full proposal with a timeline
>> > > and the deliverables, and to start working if you think that this
>> > > could be a viable GSoC project.
>> >
>> > I'm concerned whether KDE is the right project to talk to at this time.
>> > While I would still like MPX support, I do not want it through some
>> > kind of additional KDE library. Instead, support should go directly
>> > into the Qt layer (QMouseEvent), but Qt is not a GSoC project...
>>
>> What I would love is something like a multiMouseEvent, where event->pos
>> is a list of positions.
>> A higher-level library on top of that could then be handy (e.g. for
>> multitouch gesture support), but I fear support in Qt is pretty much a
>> hard requirement before thinking about something like that.
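[Editor's note: a rough sketch of Marco's multiMouseEvent idea, written
in plain Python for brevity. Qt's real QMouseEvent carries a single
pos(); the class and method names below are hypothetical, showing only
the "one entry per active pointer" shape he describes.]

```python
class MultiMouseEvent:
    """Hypothetical multi-pointer event: like a mouse event, but pos()
    yields a list of positions, one per active pointer."""

    def __init__(self, positions):
        self._positions = list(positions)  # [(x, y), ...] per pointer

    def pos(self):
        # Analogous to QMouseEvent.pos(), but a list of points.
        return list(self._positions)

    def pointer_count(self):
        return len(self._positions)
```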
>>
>> > Greetings
>> > Stefan
>>
>>
>>
>> >> Visit http://mail.kde.org/mailman/listinfo/kde-devel#unsub to
>> unsubscribe <<
>>
>
>
>
>
>
> --
> Cheers,
>
> Ashish Kumar Rai
> Electronics Engineering Department,
> IT-BHU
>
>
>
>


-- 
Jordi Polo Carres
NLP laboratory - NAIST
http://www.bahasara.org

