List: freenx-knx
Subject: Re: [FreeNX-kNX] Forcing the graphical applications to use the server-side graphics abilities
From: Murray Trainer <mtrainer () central-data ! net>
Date: 2009-03-20 1:16:44
Message-ID: 683103.1012281237511803976.JavaMail.root () mailstore01 ! gopc ! net
[Download RAW message or body]
Hi Prakash,
There was a thread on this list titled "[FreeNX-kNX] NX performance issue" starting 3/10/07. I am not sure whether there were any results; perhaps a developer can update us all on the status of this.
Thanks
Murray
> Hi there. I develop an open source application (http://www.virtualgl.org)
> which adds hardware-accelerated 3D capabilities to X proxies such as NX,
> VNC, etc. It works by rerouting the 3D rendering into a Pbuffer on the
> server's graphics card, reading it back, and drawing it into the X proxy
> using XShmPutImage() or using X pixmap drawing (if XShm isn't available.)
> The essential workload that VirtualGL produces is a stream of full-screen
> (usually 1280x1024), back-to-back calls to XShmPutImage() or XCopyArea(),
> depending on whether the MIT-SHM extension is available.
> And now, the issue -- in either case, what I observe with NX is that this
> workload will perform great for about 3-5 seconds, then it will slow to a
> crawl and remain slow. I've tried various quality settings,
> enabling/disabling the bitmap caches, etc., and nothing seems to improve the
> situation. Is there something else I could try? Or is there perhaps a way
> to profile that system and determine what's causing it to slow down? I
> don't observe this slow-down with other X servers (TurboVNC, specifically,
> or even just sending the pixels via remote X on a gigabit connection.)
> Darrell Commander
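A rough sense of the data volume in the workload Darrell describes (my own back-of-envelope sketch, not from the thread); the x11perf options mentioned are the standard benchmark's names for similar blit patterns:

```shell
# Back-to-back full-frame XShmPutImage() blits at 1280x1024 move a lot of
# pixel data per frame. At 32 bits per pixel:
WIDTH=1280; HEIGHT=1024; BYTES_PER_PIXEL=4
FRAME_BYTES=$((WIDTH * HEIGHT * BYTES_PER_PIXEL))
echo "uncompressed bytes per frame: $FRAME_BYTES"   # 5242880 (~5 MB)

# The stock x11perf benchmark can approximate this pattern inside an NX
# session, independent of VirtualGL (run it within the nxagent display):
#   x11perf -shmput500      # repeated 500x500 XShmPutImage calls
#   x11perf -copywinwin500  # repeated XCopyArea between windows
```

Reproducing the slowdown with x11perf alone would confirm whether the throttling happens inside nxagent rather than in VirtualGL itself.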
----- Original Message -----
From: "Prakash Velayutham" <prakash.velayutham@cchmc.org>
To: "User Support for FreeNX Server and kNX Client" <freenx-knx@kde.org>
Sent: Wednesday, March 18, 2009 12:15:01 PM GMT +08:00 Beijing / Chongqing / Hong Kong / Urumqi
Subject: [FreeNX-kNX] Forcing the graphical applications to use the server-side graphics abilities
Hello,
If this question does not belong on this list, I would appreciate it if you could direct me to the right one.
I use FreeNX and NoMachine NX Client to connect to my cluster head
node. Everything works fine, except that I want to make the following
work:
How can I make graphical applications utilize the graphics processor on the server instead of the one on my NX client (which, in X terms, is the X server)?
Thanks,
Prakash
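For what Prakash is asking, VirtualGL (which Darrell mentions above) is the usual approach: it redirects an application's OpenGL rendering to the server's GPU and sends only the rendered 2D images to the X proxy. A minimal usage sketch, assuming VirtualGL is installed on the server and the server's GPU-attached X server is display :0 (the display number and application name are illustrative):

```shell
# Inside the NX session on the server, launch the OpenGL app through
# VirtualGL's 'vglrun' wrapper so GL rendering happens on the server GPU:
#   vglrun -d :0 glxgears
# VGL_DISPLAY is the environment variable vglrun consults for the 3D
# ("GPU") X server when -d is not given:
VGL_DISPLAY=:0
export VGL_DISPLAY
echo "3D rendering will target display $VGL_DISPLAY"
```

Without something like VirtualGL, indirect GLX through the NX proxy typically ends up rendering on the client side, which is exactly the behavior Prakash wants to avoid.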
________________________________________________________________
Were you helped on this list with your FreeNX problem?
Then please write up the solution in the FreeNX Wiki/FAQ:
http://openfacts2.berlios.de/wikien/index.php/BerliosProject:FreeNX_-_FAQ
Don't forget to check the NX Knowledge Base:
http://www.nomachine.com/kb/
________________________________________________________________
FreeNX-kNX mailing list --- FreeNX-kNX@kde.org
https://mail.kde.org/mailman/listinfo/freenx-knx
________________________________________________________________