
List:       wget
Subject:    How to continue a recursive downloading process without checking for updates?
From:       hoho hoho <luomengyu2000 () yahoo ! com ! cn>
Date:       2007-05-28 10:51:07
Message-ID: 163709.89889.qm () web15006 ! mail ! cnb ! yahoo ! com

Hi everybody:
   This is my first time posting here.
      I want to download a site to my computer, and the
situation is described below:
  1. My Internet connection is time-limited; I can't get
online at night.
  2. On top of that, the power supply is cut off at
night, so I can't keep my computer running all the time.
  3. The connection to that site is not very stable, and
its latency is rather long.
  4. I tried to download the site like this:
    wget -r -T 10 -c -S http://some.site
    
      My problem is: after days of retrieving, every time I
restart wget it spends a long time checking whether the
files that already exist locally are complete or have been
updated, even though that checking now takes much longer
than the actual transfers.
      As time goes by, given the large number of files on
that site, I'm afraid wget will spend all its time checking
existing files and never transmit anything new.
      Is there any trick to avoid that situation?
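
      The closest thing I have found is wget's -nc
(--no-clobber) option, which, as I understand the manual,
makes a recursive run skip any file that already exists
locally without asking the server about it at all. It
cannot be combined with -c, so partially downloaded files
would not be resumed. A sketch of what I mean:

    # Resume the recursive mirror, skipping any file that already
    # exists locally instead of asking the server about it (-nc).
    # Note: -nc and -c cannot be combined, so partial files are
    # not resumed by this invocation.
    wget -r -nc -T 10 -S http://some.site

Is that the right approach, or is there something better?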

      My second problem is: I want to write a script that
downloads with several wget processes in parallel while
avoiding race conditions. How should I do that?
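
      What I have in mind is something like the sketch
below (urls.txt, with one URL per line, and the worker
count of 4 are just placeholders I made up). The idea is
that if the URL list is deduplicated first, and -x maps
each URL to its own local path, then no two wget processes
ever write the same file, so there should be no race:

    #!/bin/sh
    # Sketch: parallel downloading with several wget processes.

    # Deduplicate first: if every URL appears exactly once, each
    # worker writes to a distinct local path and the processes
    # never collide on the filesystem.
    sort -u urls.txt > urls.uniq

    # -n 1: one URL per wget invocation; -P 4: up to four at once.
    # -nc skips files finished in an earlier run; -x keeps the
    # site's directory layout.
    xargs -n 1 -P 4 wget -nc -x -T 10 < urls.uniq

Is this a sound way to do it?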

Can anyone help me?
Thanks, everybody.

   Luo Mengyu!


