List: kde-core-devel
Subject: Re: KIO: Mass Copy of Files from Different Sources to Different
From: "Dawit A." <adawit () kde ! org>
Date: 2009-09-16 17:22:41
Message-ID: 200909161322.41578.adawit () kde ! org
David,
Yes, now that I understand how the high level jobs work, I completely see your
concern about the potential for a deadlock condition.
Right now I am working on a solution to prevent this deadlock from occurring
in the scheduler. There is a way to do this by pairing the requests from high
level jobs so that the scheduler can take that pairing into account when it
schedules jobs.
More about that once I refine and test the solution to see whether it is
viable and actually solves the deadlock problem...
On Wednesday 16 September 2009 11:52:44 David Faure wrote:
> On Tuesday 08 September 2009, Dawit A. wrote:
[snipped]
> > Can you give an example of how to trigger this
> > deadlock? I suppose I can simply start copying files from remote
> > locations (sftp/ftp) until the max instances limit is reached, no?
>
> Maybe. I admit I didn't actually try, but it seems logical to me, with the
> above reasoning. To get many filecopyjobs started, I recommend copying a
> whole directory of files. That gives time to start another directory copy
> while it's happening. Each file being copied will start a FileCopyJob.
Just for clarification: the deadlock condition can only occur if both ends of
the high level job are remote URLs, correct? That is, both the put and the get
operation must be remote; otherwise they are handled differently and the
scheduling does not come into the equation... Or did I get that wrong?