List:       apache-modproxy-dev
Subject:    Re: Streaming and a slow client
From:       Chuck Murcko <chuck@topsail.org>
Date:       2001-08-31 21:21:01

On Friday, August 31, 2001, at 04:42 PM, Ian Holsman wrote:

> On Fri, 2001-08-31 at 11:32, Graham Leggett wrote:
>> Ian Holsman wrote:
>>
>>> so are you suggesting we download the file first into our local cache,
>>> and then serve the file from there?
>>
>> Almost.
>>
>>> that won't work for streaming, as the client would have to wait for the
>>> whole file to be present before it would get a single byte.
>>
>> Not at all.
>>
>> You download the file to the local cache, and *while* this is happening,
>> you send the file downstream to the browser *at the same time*.
>>
>> Effectively you read from the backend and write to the file, and at the
>> same time you read from the file and write to the network.
>
> Is this working at the moment?
>
> So all I need is a way to tell the cache not to actually 'cache' the
> result for the next request, and I'll be happy.
>
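
What Graham describes above is basically a tee: read a block from the
backend, append it to the cache file, and push the same block out to the
client. His description has a separate reader following the growing cache
file; for the client that triggered the fetch this collapses into a single
loop. A rough sketch, with placeholder descriptors rather than the real
mod_proxy/cache interfaces:

#include <unistd.h>
#include <sys/types.h>

/* write the whole buffer, retrying on short writes */
static int write_all(int fd, const char *buf, size_t len)
{
    while (len > 0) {
        ssize_t n = write(fd, buf, len);
        if (n <= 0)
            return -1;
        buf += n;
        len -= n;
    }
    return 0;
}

static int tee_response(int backend_fd, int cache_fd, int client_fd)
{
    char buf[8192];
    ssize_t n;

    while ((n = read(backend_fd, buf, sizeof(buf))) > 0) {
        if (write_all(cache_fd, buf, (size_t)n) < 0)   /* store for later hits */
            return -1;
        if (write_all(client_fd, buf, (size_t)n) < 0)  /* serve this client now */
            return -1;
    }
    return (n < 0) ? -1 : 0;    /* 0: backend closed the connection */
}

A later client would instead read from the (possibly still growing) cache
file and write to its own socket.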

If it's really a stream, and there's no Content-Length (there can't be),
then we really need to do a bit more. You don't want the cached stream
to hog your cache, so the most robust solution is probably to circularly
buffer streams to some configurable length, whether you're caching or
not, and let the clients keep up as best they can. Heck, that's what the
clients themselves do.
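
Roughly what I mean by circular buffering, sketched with illustrative
names and sizes: a single backend reader appends into a fixed-size ring,
each client keeps its own read position, and a client that falls more
than one buffer behind simply loses the overwritten bytes and skips
ahead.

#include <stddef.h>

#define STREAM_BUF_SIZE (256 * 1024)       /* "some configurable length" */

struct stream_buf {
    char          data[STREAM_BUF_SIZE];
    unsigned long written;                 /* total bytes ever written */
};

/* the backend reader appends bytes, silently overwriting the oldest ones */
static void stream_append(struct stream_buf *sb, const char *p, size_t len)
{
    size_t i;
    for (i = 0; i < len; i++)
        sb->data[(sb->written + i) % STREAM_BUF_SIZE] = p[i];
    sb->written += len;
}

/* a client copies out whatever it hasn't seen yet; one that has fallen
 * more than a buffer behind skips ahead and loses that data */
static size_t stream_read(const struct stream_buf *sb,
                          unsigned long *client_pos,
                          char *out, size_t out_len)
{
    size_t n, i;

    if (sb->written - *client_pos > STREAM_BUF_SIZE)
        *client_pos = sb->written - STREAM_BUF_SIZE;

    n = sb->written - *client_pos;
    if (n > out_len)
        n = out_len;
    for (i = 0; i < n; i++)
        out[i] = sb->data[(*client_pos + i) % STREAM_BUF_SIZE];
    *client_pos += n;
    return n;
}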

None of this is HTTP/1.1 compliant AFAIK, so it'd be an extension,
perhaps keyed off the Content-Type.

I think Graham's cache solution is a most welcome improvement over 1.3 
behavior.

Chuck Murcko
Topsail Group
http://www.topsail.org/
